Search results for: synthetic data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24971

24851 An Enhanced SAR-Based Tsunami Detection System

Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah

Abstract:

Tsunami early detection and warning systems have proved to be of ultimate importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any tsunami risk and of the degree of its danger, so that the right decisions can be made and the public notified of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the warning system is simulated using the data acquisition toolbox of Matlab, with measurements acquired from specified internet pages in lieu of the required real-life seismic and hydrologic sensors, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region, which are masked by speckle noise. This enables us to conduct a post-tsunami damage-extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through a migration to IP-based networks and fiber optic links.
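By way of illustration of the speckle-filtering step, the sketch below applies a basic Lee filter (a local-statistics relative of the Wiener filter named in the keywords) to a synthetic speckled image. The window size, noise variance, and test image are illustrative assumptions, not values from the paper:

```python
import numpy as np

def lee_filter(img, win=3, noise_var=0.05):
    """Basic Lee filter for multiplicative speckle noise (sketch)."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (win, win))
    local_mean = windows.mean(axis=(-2, -1))
    local_var = windows.var(axis=(-2, -1))
    # Weight: fraction of local variance attributed to signal rather than speckle
    signal_var = np.maximum(local_var - noise_var, 0.0)
    weight = signal_var / (signal_var + noise_var)
    return local_mean + weight * (img - local_mean)

rng = np.random.default_rng(0)
clean = np.ones((32, 32)); clean[:, 16:] = 2.0       # step "scene"
speckled = clean * rng.gamma(10, 1 / 10, clean.shape)  # multiplicative noise
filtered = lee_filter(speckled)
```

A smaller `noise_var` preserves more texture; a larger value smooths more aggressively, at the cost of blurring genuine edges.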

Keywords: detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, wiener filter

Procedia PDF Downloads 356
24850 Paper-Based Detection Using Synthetic Gene Circuits

Authors: Vanessa Funk, Steven Blum, Stephanie Cole, Jorge Maciel, Matthew Lux

Abstract:

Paper-based synthetic gene circuits offer a new paradigm for programmable, fieldable biodetection. We demonstrate that by freeze-drying gene circuits with in vitro expression machinery, we can use complementary RNA sequences to trigger colorimetric changes upon rehydration. We have successfully utilized both green fluorescent protein and luciferase-based reporters for easy visualization in solution. Through several efforts, we aim to use this new platform technology to address a variety of needs in portable detection by demonstrating several more expression and reporter systems for detection functions on paper. In addition to RNA-based biodetection, we are exploring the use of various mechanisms that cells use to respond to environmental conditions, moving towards all-hazards detection. Examples include explosives, heavy metals for water quality, and toxic chemicals.

Keywords: cell-free lysates, detection, gene circuits, in vitro

Procedia PDF Downloads 365
24849 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit

Authors: Ahmed Elrewainy

Abstract:

Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials present in the scene, called “endmembers”, contribute to each pixel's spectrum in different proportions called “abundances”. Unmixing the data cube is an important task for identifying the endmembers present in the cube and for the analysis of these images. Unsupervised unmixing is performed with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem known as “basis pursuit” can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. The optimization problem is solved using a proximal method, iterative thresholding. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
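The iterative-thresholding solver mentioned in the abstract can be sketched generically as ISTA applied to the l1-regularized least-squares (basis-pursuit denoising) problem. The random dictionary, sparsity level, and λ below are illustrative, not the authors' setup:

```python
import numpy as np

def ista(A, b, lam=0.01, n_iter=2000):
    """Iterative soft-thresholding (ISTA) for the basis-pursuit-denoising
    problem min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        z = x - grad / L                   # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))         # toy dictionary (40 bands, 100 atoms)
x_true = np.zeros(100); x_true[[5, 30, 77]] = [1.5, -2.0, 1.0]  # 3 "abundances"
b = A @ x_true                             # observed mixed spectrum
x_hat = ista(A, b)
```

With only three active atoms and forty measurements, the sparse coefficient vector is recovered almost exactly; in the unmixing setting, the nonzero entries would indicate which endmembers are present.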

Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets

Procedia PDF Downloads 175
24848 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising

Authors: Jianwei Ma, Diriba Gemechu

Abstract:

In seismic data processing, attenuation of random noise is a basic step in improving data quality for further application in exploration and development across the gas and oil industry. The signal-to-noise ratio largely determines the quality of seismic data, and this factor affects the reliability as well as the accuracy of the seismic signal during interpretation. To use seismic data for further application and interpretation, the signal-to-noise ratio must be improved while random noise is attenuated effectively and the important features and information of the seismic signal are preserved. To this end, we introduce an anisotropic total fractional-order denoising algorithm. The anisotropic total fractional-order variation model, defined in the space of fractional-order bounded variation, is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional-order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised result is compared with F-X deconvolution and the non-local means denoising algorithm.
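For readers unfamiliar with split Bregman, the sketch below applies it to ordinary (integer-order) 1-D total-variation denoising; the paper's model replaces the first-order difference operator with a fractional-order one, and all parameter values here are illustrative:

```python
import numpy as np

def tv_denoise_split_bregman(f, lam=0.5, mu=5.0, n_iter=100):
    """Split Bregman for 1-D TV denoising: min_u 0.5*||u - f||^2 + lam*||Du||_1,
    with D the forward-difference operator. (Integer-order TV shown for brevity;
    the paper's model uses a fractional-order derivative instead.)"""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # (n-1) x n difference matrix
    M = np.eye(n) + mu * D.T @ D            # fixed u-step system matrix
    d = np.zeros(n - 1); b = np.zeros(n - 1)
    u = f.copy()
    for _ in range(n_iter):
        u = np.linalg.solve(M, f + mu * D.T @ (d - b))      # u-subproblem
        Du = D @ u
        d = np.sign(Du + b) * np.maximum(np.abs(Du + b) - lam / mu, 0.0)  # shrink
        b = b + Du - d                                       # Bregman update
    return u

rng = np.random.default_rng(2)
clean = np.repeat([0.0, 1.0, 0.0], 50)      # piecewise-constant "trace"
noisy = clean + 0.2 * rng.standard_normal(150)
denoised = tv_denoise_split_bregman(noisy)
```

The split Bregman scheme decouples the non-smooth l1 term (handled by the closed-form shrinkage) from the quadratic data term (a linear solve), which is what makes it attractive for the variation models discussed in the abstract.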

Keywords: anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, split Bregman algorithm

Procedia PDF Downloads 185
24847 Quest for an Efficient Green Multifunctional Agent for the Synthesis of Metal Nanoparticles with Highly Specified Structural Properties

Authors: Niharul Alam

Abstract:

The development of energy-efficient, economical, and eco-friendly synthetic protocols for metal nanoparticles (NPs) with tailor-made structural properties and biocompatibility is a highly cherished goal for researchers working in the field of nanoscience and nanotechnology. In this context, green chemistry is highly relevant, and the 12 principles of Green Chemistry can be explored to develop such practically implementable synthetic protocols. One of the most promising green chemical synthetic methods that can serve this purpose is the biogenic synthetic protocol, which utilizes non-toxic multifunctional reactants derived from natural, biological sources ranging from unicellular organisms to higher plants, often characterized as “medicinal plants”. Over the past few years, a plethora of medicinal plants have been explored as sources of this kind of multifunctional green chemical agent. In this presentation, we focus on the syntheses of stable monometallic Au and Ag NPs and also bimetallic Au/Ag alloy NPs with highly efficient catalytic properties, using an aqueous extract of the leaves of the Indian curry leaf plant (Murraya koenigii Spreng.; Fam. Rutaceae), extensively used in Indian traditional medicine and cuisine, as the green multifunctional agent. We have also studied the interaction between the synthesized metal NPs and the surface-adsorbed fluorescent moieties quercetin and quercetin glycoside, which are chemical constituents of the plant. This helped us to understand the surface properties of the metal NPs synthesized by this plant-based biogenic route and to predict a plausible mechanistic pathway, which may help in fine-tuning green chemical methods for the controlled synthesis of various metal NPs in the future. We observed that simple experimental parameters, e.g. the pH and temperature of the reaction medium and the concentrations of the multifunctional agent and precursor metal ions, play important roles in the biogenic synthesis of Au NPs with finely tuned structures.

Keywords: green multifunctional agent, metal nanoparticles, biogenic synthesis

Procedia PDF Downloads 402
24846 A Synthetic Strategy to Attach 2,6-Dichlorophenolindophenol onto Multi Walled Carbon Nanotubes and Their Application for Electrocatalytic Determination of Sulfide

Authors: Alireza Mohadesi, Ashraf Salmanipour

Abstract:

A chemically modified glassy carbon electrode for the electrocatalytic determination of sulfide was developed using multiwalled carbon nanotubes (MWCNTs) covalently immobilized with 2,6-dichlorophenolindophenol (DPIP). The immobilization of DPIP on the MWCNTs was performed with a new synthetic strategy and characterized by UV-visible absorption spectroscopy, Fourier transform infrared spectroscopy, and cyclic voltammetry. The cyclic voltammetric response of DPIP grafted onto MWCNTs indicated that it promotes the low-potential, sensitive, and stable determination of sulfide. The dependence of the response currents on the concentration of sulfide was examined and was linear in the range of 10-1100 µM. The detection limit was 5 µM, and the RSD for 100 and 500 µM sulfide was 1.8% and 1.3%, respectively. Many interfering species had little or no effect on the determination of sulfide. The procedure was applied to the determination of sulfide in water samples.

Keywords: functionalized carbon nanotubes, sulfide, biological samples, 2, 6-dichlorophenolindophenol

Procedia PDF Downloads 264
24845 An Analysis of Classification of Imbalanced Datasets by Using Synthetic Minority Over-Sampling Technique

Authors: Ghada A. Alfattni

Abstract:

Analysing imbalanced datasets is one of the challenges that practitioners in the machine learning field face. Much research has been carried out to determine the effectiveness of the synthetic minority over-sampling technique (SMOTE) in addressing this issue. The aim of this study was therefore to compare the effectiveness of SMOTE across different models on imbalanced datasets. Three classification models (logistic regression, support vector machine, and nearest neighbour) were tested with multiple datasets; the same datasets were then oversampled using SMOTE and applied again to the three models to compare the differences in performance. The results of the experiments show that the highest number of nearest neighbours gives the lowest error rates.
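SMOTE itself is simple to sketch: each synthetic sample is an interpolation between a minority-class point and one of its k nearest minority neighbours. The following minimal NumPy version (toy data, illustrative parameters) shows the core idea:

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: new minority samples are interpolated between a
    random minority point and one of its k nearest minority neighbours."""
    rng = rng or np.random.default_rng(0)
    # Pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]        # k nearest neighbours per point
    synth = np.empty((n_new, X_min.shape[1]))
    for i in range(n_new):
        a = rng.integers(len(X_min))
        b = nn[a, rng.integers(k)]
        gap = rng.random()
        synth[i] = X_min[a] + gap * (X_min[b] - X_min[a])  # interpolate
    return synth

rng = np.random.default_rng(3)
X_min = rng.standard_normal((20, 2)) + 5.0   # 20 minority samples
X_new = smote(X_min, n_new=40, k=3, rng=rng)
```

Because each synthetic point is a convex combination of two existing minority points, the new samples stay inside the region the minority class already occupies, rather than being naive duplicates.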

Keywords: imbalanced datasets, SMOTE, machine learning, logistic regression, support vector machine, nearest neighbour

Procedia PDF Downloads 315
24844 Seismic Analysis of Structurally Hybrid Wind Mill Tower

Authors: Atul K. Desai, Hemal J. Shah

Abstract:

Tall windmill towers are typically designed as monopole or lattice towers. In the present research, a 125-meter-high hybrid tower, a combination of the lattice and monopole types, is proposed. The response of the hybrid tower is compared with that of a conventional monopole tower. The towers were analyzed in finite element software under nonlinear seismic time history loading. Synthetic seismic time histories for different soils were derived using the SeismoArtif software. From the present research, it is concluded that the hybrid tower avoids the resonance condition, and its base shear is lower than that of the monopole tower for the different soil conditions considered.

Keywords: dynamic analysis, hybrid wind mill tower, resonance condition, synthetic time history

Procedia PDF Downloads 119
24843 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structure, grouping similar data objects in the same cluster and dissimilar objects in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships over all clusters equals one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization plays a role in the fuzzy c-means clustering technique: it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where our optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because appropriate membership degrees lead to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
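The standard FCM updates (without the relative-entropy regularizer proposed in the paper) can be sketched as follows; the cluster count, fuzzifier m, initialization, and toy data are all illustrative:

```python
import numpy as np

def fcm(X, c=2, m=2.0, n_iter=50):
    """Standard fuzzy c-means (sketch; the paper adds a relative-entropy term).
    Memberships lie in [0, 1] and sum to one over the clusters for each point."""
    # Naive deterministic init: spread initial centres over the data rows
    idx = np.linspace(0, len(X) - 1, c).astype(int)
    centers = X[idx].astype(float)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)       # membership update
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]   # centre update
    return U, centers

rng = np.random.default_rng(4)
X = np.vstack([rng.standard_normal((30, 2)),
               rng.standard_normal((30, 2)) + 6.0])    # two toy clusters
U, centers = fcm(X, c=2)
```

The sum-to-one constraint the abstract discusses is visible in the membership update: each row of U is normalized over the clusters, which is exactly what makes noisy outliers problematic and motivates the regularized variant.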

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 240
24842 Inversion of Gravity Data for Density Reconstruction

Authors: Arka Roy, Chandra Prakash Dubey

Abstract:

Inverse problems are generally used for recovering hidden information from available external data. We use the vertical component of the gravity field to calculate the underlying density structure. The ill-posed nature of the problem is the main obstacle for any inverse problem, so linear regularization using the Tikhonov formulation is applied, with an appropriate choice of SVD and GSVD components. When handling real data, the noise level should be low for a reliable solution. In our study, 2D and 3D synthetic models with rectangular grids are used for the gravity field calculation and its corresponding inversion for density reconstruction. A fine grid is also considered in order to capture irregular structures. Keeping in mind the algebraic ambiguity of the problem, the number of observation points should exceed the number of model parameters. A Picard plot is presented for choosing the appropriate, or main controlling, eigenvalues for a regularized solution. Another important tool is the depth resolution plot (DRP), which is generally used for studying how the inversion is influenced by regularization and discretization. Our study further involves the inversion of real gravity data from the Vredefort Dome, South Africa. We apply our method to these data, and the resulting density structure is in good agreement with the known formations in the region, which lends additional support to our method.
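The Tikhonov-SVD machinery described above can be sketched on a toy smoothing kernel standing in for the gravity forward operator; the kernel, noise level, and regularization parameter α are illustrative assumptions:

```python
import numpy as np

def tikhonov_svd(G, d, alpha):
    """Tikhonov-regularized solution via the SVD of the forward operator G:
    m_alpha = sum_i s_i/(s_i^2 + alpha^2) * (u_i . d) * v_i.
    The filter factors s_i^2/(s_i^2 + alpha^2) damp the small singular values
    that would otherwise amplify noise (the effect a Picard plot reveals)."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    coeffs = s / (s ** 2 + alpha ** 2) * (U.T @ d)
    return Vt.T @ coeffs

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 60)
# Severely ill-conditioned toy "gravity kernel": smooth Gaussian rows
G = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)
m_true = np.sin(2 * np.pi * x)                  # smooth density profile
d = G @ m_true + 1e-3 * rng.standard_normal(60)  # noisy observations
m_naive = np.linalg.lstsq(G, d, rcond=None)[0]  # unregularized, noise-dominated
m_reg = tikhonov_svd(G, d, alpha=1e-2)
```

Even at a noise level of 0.1%, the unregularized solution is swamped by amplified noise, while the filtered solution recovers the smooth model; choosing α is exactly the role of the Picard plot discussed in the abstract.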

Keywords: depth resolution plot, gravity inversion, Picard plot, SVD, Tikhonov formulation

Procedia PDF Downloads 181
24841 Impact Location From Instrumented Mouthguard Kinematic Data In Rugby

Authors: Jazim Sohail, Filipe Teixeira-Dias

Abstract:

Mild traumatic brain injury (mTBI) in non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted into head kinematics in non-helmeted contact sports utilizing instrumented mouthguards that allow researchers to record the accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, or its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler’s and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test on a dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it requires preprocessing of live data, which is currently done by cross-referencing data timestamps with video footage. The machine learning technique aims to eliminate this preprocessing step by establishing trends within the time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals from the mouthguards are converted to the frequency domain before a clustering algorithm groups similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins.
The same Hybrid III dummy head finite element model is used to create impacts that closely replicate on-field impacts in order to create synthetic time-series datasets consisting of impacts in varying locations. These time-series datasets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish the accurate impact location of signals that have already been labeled as true impacts and filtered out of the entire time series, whereas the machine learning technique can be applied to long time-series signal data but resolves the impact location only to predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by the sensors, saving further time for data scientists using instrumented mouthguard kinematic data, as validating true impacts against video footage would not be required.
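The frequency-domain clustering step can be sketched with a toy two-location example: impacts from different "locations" are modeled as damped sinusoids with different dominant frequencies, and a tiny 2-means groups their FFT-magnitude features. Everything here (the signal model, frequencies, and cluster count) is an illustrative stand-in for real mouthguard data:

```python
import numpy as np

def two_means(X, n_iter=20):
    """Tiny 2-means clusterer (sketch); naively initialized from the first
    and last samples so both clusters start non-empty."""
    centers = X[[0, -1]].astype(float)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
        labels = d.argmin(axis=1)                       # assign to nearest centre
        centers = np.array([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 256, endpoint=False)
signals = []
for freq in [8] * 10 + [40] * 10:                       # two impact "locations"
    signals.append(np.sin(2 * np.pi * freq * t) * np.exp(-5 * t)
                   + 0.1 * rng.standard_normal(256))
X = np.abs(np.fft.rfft(np.array(signals), axis=1))      # frequency-domain features
labels = two_means(X)
```

Because the two impact types concentrate their energy in different frequency bins, the FFT-magnitude features separate cleanly and the clusterer recovers the two groups, mirroring the abstract's idea of binning impacts by region from an unlabeled time series.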

Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI

Procedia PDF Downloads 173
24840 Review of Comparison of Subgrade Soil Stabilised with Natural, Synthetic, and Waste Fibers

Authors: Jacqueline Michella Anak Nathen

Abstract:

Subgrade soil is an essential component in the design of road structures as it provides lateral support to the pavement. One of the main reasons for the failure of the pavement is the settlement of the subgrade and the high susceptibility to moisture, which leads to a loss of strength of the subgrade. Construction over weak or soft subgrade affects the performance of the pavement and causes instability of the pavement. If the mechanical properties of the subgrade soils are lower than those required, the soil stabilisation method can be an option to improve the soil properties of the weak subgrade. Soil stabilisation is one of the most popular techniques for improving poor subgrade soils, resulting in a significant improvement in the subgrade soil’s tensile strength, shear strength, and bearing capacity. Soil stabilisation encompasses the various methods used to alter the properties of soil to improve its engineering properties. Soil stabilisation can be broadly divided into four types: thermal, electrical, mechanical, and chemical. The most common method of improving the physical and mechanical properties of soils is stabilisation using binders such as cement and lime. However, soil stabilisation with conventional methods using cement and lime has become uneconomical in recent years, so there is a need to look for an alternative, such as fiber. Although not a new technique, adding fiber is a very practical alternative to soil stabilisation. Various types of fibers, such as natural, synthetic, and waste fibers, have been used as stabilising agents to improve the strength and durability of subgrade soils. This review provides a comprehensive comparison of the effectiveness of natural, synthetic, and waste fibers in stabilising subgrade soils.

Keywords: subgrade, soil stabilisation, pavement, fiber, stabiliser

Procedia PDF Downloads 57
24839 Estimating the Ladder Angle and the Camera Position From a 2D Photograph Based on Applications of Projective Geometry and Matrix Analysis

Authors: Inigo Beckett

Abstract:

In forensic investigations, it is often the case that the most potentially useful recorded evidence derives from coincidental imagery, recorded immediately before or during an incident, and that during the incident (e.g. a ‘failure’ or fire event) the evidence is changed or destroyed. To an image analysis expert involved in photogrammetric analysis for civil or criminal proceedings, traditional computer vision methods involving calibrated cameras are often not appropriate because the image metadata cannot be relied upon. This paper presents an approach for resolving this problem, considering by way of a case study the angle of a simple ladder shown in a photograph. The UK Health and Safety Executive (HSE) guidance document published in 2014 (INDG455) advises that a leaning ladder should be erected at 75 degrees to the horizontal. Personal injury cases can arise in the construction industry because a ladder is too steep or too shallow, and ad-hoc photographs of such ladders in their incident position provide a basis for analysis of their angle. This paper presents a direct approach for ascertaining the position of the camera and the angle of the ladder simultaneously from the photograph(s), by way of a workflow that encompasses a novel application of projective geometry and matrix analysis. Mathematical analysis shows that, for a given pixel ratio of directly measured collinear points (i.e. features that lie on the same line segment) in the 2D digital photograph with respect to a given viewing point, we can constrain the 3D camera position to the surface of a sphere in the scene. Depending on what we know about the ladder, we can enforce another independent constraint on the possible camera positions, which narrows them down even further. Experiments were conducted using synthetic and real-world data. The synthetic data modeled a ladder standing on a horizontal floor plane and resting against a vertical wall.
The real-world data was captured using an Apple iPhone 13 Pro together with 3D laser scan survey data, with a ladder placed at a known location and angle to the vertical axis. For each case, we calculated the camera positions and ladder angles using this method and cross-compared them against their respective ‘true’ values.
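The paper's sphere-constraint method is not reproduced here, but a simpler, related computation illustrates how projective geometry can recover a ladder angle from a photograph: if four reference points on the wall plane are known, a homography from image to metric plane coordinates can be estimated by the direct linear transform, and the ladder endpoints mapped through it. All coordinates below are made up for illustration:

```python
import numpy as np

def homography_from_points(src, dst):
    """Direct linear transform: homography H with dst ~ H @ src (4 point pairs)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    return Vt[-1].reshape(3, 3)              # null vector, reshaped to 3x3

def apply_h(H, pt):
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]                      # back to inhomogeneous coordinates

# Hypothetical setup: four known wall-plane points and their image positions.
plane = [(0, 0), (4, 0), (4, 3), (0, 3)]                  # metres
image = [(100, 800), (700, 780), (650, 200), (150, 230)]  # pixels (made up)
H = homography_from_points(image, plane)                  # image -> plane map
foot_px, top_px = (180, 770), (600, 260)                  # ladder endpoints (px)
foot, top = apply_h(H, foot_px), apply_h(H, top_px)
angle = np.degrees(np.arctan2(top[1] - foot[1], abs(top[0] - foot[0])))
```

The recovered angle can then be compared against the 75-degree figure from INDG455; the paper's contribution is precisely that it avoids needing such known reference points by constraining the camera position instead.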

Keywords: image analysis, projective geometry, homography, photogrammetry, ladders, forensics, mathematical modeling, planar geometry, matrix analysis, collinearity, cameras, photographs

Procedia PDF Downloads 17
24838 Convolutional Neural Networks-Optimized Text Recognition with Binary Embeddings for Arabic Expiry Date Recognition

Authors: Mohamed Lotfy, Ghada Soliman

Abstract:

Recognizing Arabic dot-matrix digits is a challenging problem due to the unique characteristics of dot-matrix fonts, such as irregular dot spacing and varying dot sizes. This paper presents an approach for recognizing Arabic digits printed in dot-matrix format. The proposed model is based on Convolutional Neural Networks (CNNs) that take the dot-matrix image as input and generate embeddings that are rounded to produce binary representations of the digits. The binary embeddings are then used to perform Optical Character Recognition (OCR) on the digit images. To overcome the limited availability of dotted Arabic expiration date images, we developed a True Type Font (TTF) for generating synthetic images of Arabic dot-matrix characters. The model was trained on a synthetic dataset of 3,287 images and tested on 658 synthetic images, representing realistic expiration dates from 2019 to 2027 in the yyyy/mm/dd format. Our model achieved an accuracy of 98.94% on expiry date recognition in the Arabic dot-matrix format using fewer parameters and less computational resources than traditional CNN-based models. By investigating and presenting our findings comprehensively, we aim to contribute substantially to the field of OCR and pave the way for advances in Arabic dot-matrix character recognition. Our proposed approach is not limited to Arabic dot-matrix digit recognition but can also be extended to other text recognition tasks, such as text classification and sentiment analysis.
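The rounding-to-binary idea can be sketched without the CNN itself: a real-valued embedding is binarized and matched to per-digit reference codes by Hamming distance. The 8-bit codes below are invented for illustration and are not the codes learned by the authors' model:

```python
import numpy as np

# Each digit class gets a reference binary code (illustrative: its 4-bit
# binary representation repeated twice, giving 8 bits).
CODES = {d: np.array([int(b) for b in f"{d:04b}"] * 2) for d in range(10)}

def decode(embedding):
    """Round a real-valued embedding to bits, then return the digit whose
    reference code is nearest in Hamming distance (sketch of the paper's idea)."""
    bits = (embedding >= 0.5).astype(int)               # binarize the embedding
    dists = {d: int(np.sum(bits != code)) for d, code in CODES.items()}
    return min(dists, key=dists.get)

# A noisy embedding whose rounded bits match digit 7's code (0111 0111)
noisy = np.array([0.1, 0.9, 0.8, 0.7, 0.2, 0.95, 0.6, 0.85])
```

In the paper's pipeline, the CNN is trained so that its output embedding rounds to the correct code; the Hamming-distance lookup then makes the final classification cheap and robust to small embedding errors.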

Keywords: computer vision, pattern recognition, optical character recognition, deep learning

Procedia PDF Downloads 50
24837 Developing Value Chain of Synthetic Methane for Net-zero Carbon City Gas Supply in Japan

Authors: Ryota Kuzuki, Mitsuhiro Kohara, Noboru Kizuki, Satoshi Yoshida, Hidetaka Hirai, Yuta Nezasa

Abstract:

About fifty years have passed since Japan's gas supply industry became the first in the world to switch from coal and oil to LNG as a city gas feedstock. Since the Japanese government's target of net-zero carbon emissions in 2050 was announced in October 2020, the industry has entered a new era of challenges in committing to the requirements of decarbonization. This paper describes how synthetic methane, produced from renewable-energy-derived hydrogen and recycled carbon, has become a promising element of national policy in the transition toward a net-zero society. In November 2020, the Japan Gas Association announced the 'Carbon Neutral Challenge 2050' as a vision to contribute to the decarbonization of society by converting the city gas supply to carbon neutral. The key technology is methanation. This paper shows that methanation is a realistic solution that can contribute to the decarbonization of the whole country at a lower social cost by utilizing the supply chain that already exists, from LNG terminals to burner tips. For the transition period (2030-2050), during which CO2 captured from the exhaust of thermal power plants and industrial factories is expected to be used, we propose that a system of guarantees of origin (GO) for H2 and CO2 be established and that international rules for calculating and allocating greenhouse gas emissions in the supply chain be harmonized; a platform is also needed to manage tracking information on certified environmental values.

Keywords: synthetic methane, recycled carbon fuels, methanation, transition period, environmental value transfer platform

Procedia PDF Downloads 80
24836 Synthetic Classicism: A Machine Learning Approach to the Recognition and Design of Circular Pavilions

Authors: Federico Garrido, Mostafa El Hayani, Ahmed Shams

Abstract:

The exploration of the potential of artificial intelligence (AI) in architecture is still embryonic; however, its latent capacity to change the design disciplines is significant. 'Synthetic Classicism' is a research project that questions the underlying aspects of classically organized architecture, not just in aesthetic terms but also from a geometrical and morphological point of view, intending to generate new architectural information using historical examples as source material. The main aim of this paper is to explore the uses of artificial intelligence and machine learning algorithms in architectural design while creating a coherent narrative to be contained within a design process. The purpose is twofold: on one hand, to develop and train machine learning algorithms to produce architectural information about small pavilions, and on the other, to synthesize new information from previous architectural drawings. These algorithms are intended to 'interpret' graphical information from each pavilion and then generate new information from it. Once these algorithms are trained, the procedure is the following: starting from a line profile, a synthetic 'front view' of a pavilion is generated; using it as source material, an isometric view is created from it; and finally, a top view is produced. Thanks to GAN algorithms, it is also possible to generate front and isometric views without any graphical input at all. The final intention of the research is to produce isometric views out of historical information, such as the pavilions of Sebastiano Serlio, James Gibbs, or John Soane. The idea is to create and interpret new information not just in terms of historical reconstruction but also to explore AI as a novel tool in the narrative of a creative design process.
This research also challenges the idea that algorithmic design is associated only with efficiency or fitness, embracing instead the possibility of a creative collaboration between artificial intelligence and a human designer. Hence the double feature of this research, both analytical and creative: first synthesizing images based on a given dataset, and then generating new architectural information from historical references. We find that the possibility to creatively understand and manipulate historic (and synthetic) information will be a key feature in future innovative design processes. Finally, the main question that we propose is whether an AI could be used not just to create an original and innovative group of simple buildings, but also to explore the possibility of fostering a novel architectural sensibility grounded in the specificities of the architectural dataset, be it historic, human-made, or synthetic.

Keywords: architecture, central pavilions, classicism, machine learning

Procedia PDF Downloads 113
24835 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data

Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan

Abstract:

Clinical data analysis and forecasting have made great contributions to disease control, prevention, and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we target binary imbalanced datasets, where the positive samples make up only the minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine both of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole; the other is to split up the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger data sizes, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets, and we find it more consistent with the practice of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and shortening the running time compared with the brute-force method.
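A minimal particle swarm optimizer, of the kind used here to tune SMOTE's two key parameters, can be sketched as follows; the toy quadratic objective stands in for the (expensive) classifier-performance function, and the PSO coefficients are common textbook defaults rather than the paper's settings:

```python
import numpy as np

def pso(f, bounds, n_particles=20, n_iter=60, rng=None):
    """Minimal particle swarm optimizer (sketch). In the paper's setting, f
    would wrap classifier error as a function of SMOTE's two parameters."""
    rng = rng or np.random.default_rng(0)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    g = pbest[np.argmin(pbest_val)]                      # global best
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # Inertia + cognitive (personal best) + social (global best) terms
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        g = pbest[np.argmin(pbest_val)]
    return g

# Toy objective standing in for "classifier error vs. (k_neighbours, ratio)"
obj = lambda p: (p[0] - 5.0) ** 2 + (p[1] - 2.0) ** 2
best = pso(obj, (np.array([0.0, 0.0]), np.array([10.0, 10.0])))
```

Because each objective evaluation here would mean retraining a classifier on SMOTE-resampled data, the appeal of the swarm search over a brute-force grid is precisely the smaller number of evaluations the abstract reports.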

Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data

Procedia PDF Downloads 413
24834 Bio Composites for Substituting Synthetic Packaging Materials

Authors: Menonjyoti Kalita, Pradip Baishya

Abstract:

In recent times, the world has been facing serious environmental concerns, such as sustainability and cost, due to the overproduction of synthetic materials and their role in degrading the environment through industrial waste and their non-biodegradable character. Biocomposites can help ease such troubles. Bio-based composites are promising materials for future applications in substituting synthetic packaging materials. The challenge of making packaging materials lighter, safer, and cheaper leads to the investigation of advanced materials with the desired properties. Awareness of environmental issues also pushes researchers and manufacturers to invest effort in the fields of composite and bio-composite materials. This paper explores and tests some nature-friendly materials that can replace low-density plastics. The materials selected were sugarcane bagasse, areca palm sheath, and bamboo leaves as the primary natural fibres for testing. These products were processed, and the tensile strength of the processed parts was tested in a micro UTM. It was found that areca palm can serve as a good building material in replacement of polypropylene and, with the help of epoxy resin, could even be used in the production of furniture. For bamboo, it was found that bamboo and cotton, when blended in a 50:50 ratio, have great tensile strength. Areca fibres were thus found to be a good substitute for polypropylene, usable as a binding material in building construction as well as in other products.

Keywords: biodegradable characteristics, bio-composites, areca palm sheath, polypropylene, micro UTM

Procedia PDF Downloads 63
24833 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images

Authors: A. Nachour, L. Ouzizi, Y. Aoura

Abstract:

Recent developments in multi-agent systems have opened a new research field in image processing, in which several algorithms are used simultaneously and improved across different applications while new methods are investigated. This paper presents a new automatic method for edge detection using several agents with many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial edges, hidden edges, and edge linking are solved through cooperation between agents. The presented method was implemented and evaluated on several synthetic and medical images. The experimental results confirm the efficiency and accuracy of the detected edges.
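The paper's agent actions and Vector Field Convolution environment are not reproduced here, but the local gradient evidence an agent might perceive at a pixel can be sketched with a plain Sobel operator. The kernels and the test image below are illustrative assumptions, not the authors' method:

```python
def sobel_magnitude(img):
    """Gradient magnitude of a 2D intensity image via 3x3 Sobel kernels;
    borders are left at zero for simplicity."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# Synthetic image: dark left half, bright right half -> one vertical edge
img = [[0] * 4 + [255] * 4 for _ in range(8)]
mag = sobel_magnitude(img)
```

In the multi-agent setting, such a local response would be one piece of the environmental information guiding an agent toward an edge, with the Vector Field Convolution providing the attracting field.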

Keywords: edge detection, medical MR images, multi-agent systems, vector field convolution

Procedia PDF Downloads 360
24832 Topology-Based Character Recognition Method for Coin Date Detection

Authors: Xingyu Pan, Laure Tougne

Abstract:

For recognizing coins, the engraved release date is important information for precisely identifying the monetary type. However, reading characters on coins faces far more obstacles than traditional character recognition tasks in other fields, such as reading scanned documents or license plates. To address this challenging issue in a numismatic context, we propose a training-free approach dedicated to the detection and recognition of a coin's release date. In the first step, the date zone is detected by comparing histogram features; in the second step, a topology-based algorithm is introduced to recognize coin numerals of various font types, represented by a binary gradient map. Our method obtained a recognition rate of 92% on synthetic data and of 44% on real noised data.
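As a rough illustration of the histogram-comparison idea behind the date-zone detection step, one common similarity measure is histogram intersection. This sketch is our own, not the authors' exact feature:

```python
def hist_intersection(h1, h2):
    """Similarity in [0, 1]: overlap of two histograms with the same
    bin layout, normalized by the mass of the first histogram."""
    overlap = sum(min(a, b) for a, b in zip(h1, h2))
    return overlap / max(sum(h1), 1)

# Identical histograms overlap fully; disjoint ones not at all
same = hist_intersection([1, 2, 3], [1, 2, 3])   # 1.0
none = hist_intersection([4, 0], [0, 4])         # 0.0
```

A candidate window whose histogram intersects strongly with a reference date-zone profile would be retained, which is the general shape of a comparison-based detection step.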

Keywords: coin, detection, character recognition, topology

Procedia PDF Downloads 225
24831 Natural and Synthetic Antioxidant in Beef Meatball

Authors: Abul Hashem

Abstract:

The experiment was conducted to find out the effect of different levels of Moringa oleifera leaf extract and a synthetic antioxidant, butylated hydroxyanisole (BHA), on fresh and preserved beef meatballs. For this purpose, ground beef samples were divided into five treatment groups: control, synthetic antioxidant, and 0.1%, 0.2%, and 0.3% Moringa oleifera leaf extract, designated T1, T2, T3, T4, and T5, respectively. Five kinds of meatballs were made; crushed biscuit and egg albumin were mixed into the beef, and the meatballs were cooked properly. Proximate analysis, sensory tests (color, flavor, tenderness, juiciness, overall acceptability), cooking loss, pH value, free fatty acids (FFA), thiobarbituric acid reactive substances (TBARS), peroxide value (POV), and microbiological examinations were carried out to evaluate the antioxidant and antimicrobial activities of Moringa oleifera leaf extract in comparison to BHA, both on the first day before freezing and over the shelf life of beef meatballs stored for 60 days at -20 °C. Samples were examined at 0, 15, 30, and 60 days. Dry matter (DM) content differed significantly (p<0.05) among all treatment groups and increased significantly (p<0.05) over the storage intervals. Crude protein (CP) content increased significantly (p<0.05) among the treatment groups. Ether extract (EE) and ash contents also differed significantly (p<0.05) across treatment levels. FFA values, TBARS, and POV decreased significantly (p<0.05) at the different treatment levels, while color, odor, tenderness, juiciness, overall acceptability, raw pH, and cooked pH increased significantly (p<0.05). Cooking loss (%) differed significantly (p<0.05) across treatment levels. TVC (log CFU/g), TCC (log CFU/g), and TYMC (log CFU/g) decreased significantly (p<0.05) at the different treatment levels in comparison to the control. Considering CP, tenderness, juiciness, overall acceptability, cooking loss, FFA, POV, TBARS, and the microbial parameters, it can be concluded that Moringa oleifera leaf extract at 0.1%, 0.2%, or 0.3% can be used instead of 0.1% synthetic antioxidant BHA in beef meatballs.

Keywords: antioxidant, beef meatball, BHA, moringa leaf extract, quality

Procedia PDF Downloads 276
24830 Generation of High-Quality Synthetic CT Images from Cone Beam CT Images Using A.I. Based Generative Networks

Authors: Heeba A. Gurku

Abstract:

Introduction: Cone beam CT (CBCT) images play an integral part in proper patient positioning for cancer patients undergoing radiation therapy, but these images are low in quality. The purpose of this study is to generate high-quality synthetic CT images from CBCT using generative models. Material and Methods: This study utilized two datasets from The Cancer Imaging Archive (TCIA): 1) a lung cancer dataset of 20 patients (with full-view CBCT images) and 2) a pancreatic cancer dataset of 40 patients (only the 27 patients with limited-view images were included in the study). Cycle Generative Adversarial Networks (Cycle GAN) and its variant, Attention Guided Generative Adversarial Networks (AGGAN), were used to generate the synthetic CTs. Models were evaluated visually and on four metrics, Structural Similarity Index Measure (SSIM), Peak Signal-to-Noise Ratio (PSNR), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE), comparing the synthetic CT and original CT images. Results: For the pancreatic dataset with limited-view CBCT images, the Cycle GAN model improved MAE from 12.57 to 8.49, RMSE from 20.94 to 15.29, and PSNR from 21.85 to 24.63, but structural similarity increased only marginally, from 0.78 to 0.79. Similar results were achieved with AGGAN, with no improvement over Cycle GAN. However, for the lung dataset with full-view CBCT images, Cycle GAN reduced MAE significantly from 89.44 to 15.11, while AGGAN reduced it to 19.77. Similarly, RMSE decreased from 92.68 to 23.50 with Cycle GAN and to 29.02 with AGGAN. SSIM and PSNR also improved significantly, from 0.17 to 0.59 and from 8.81 to 21.06 with Cycle GAN, respectively, while with AGGAN, SSIM increased to 0.52 and PSNR to 19.31. In both datasets, the GAN models reduced artifacts and noise and yielded better resolution and contrast enhancement. Conclusion and Recommendation: Both Cycle GAN and AGGAN significantly reduced MAE and RMSE and improved PSNR in both datasets. However, the full-view lung dataset showed more improvement in SSIM and image quality than the limited-view pancreatic dataset.
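Three of the four evaluation metrics are simple pixel-wise quantities; a minimal sketch of MAE, RMSE, and PSNR on flat intensity lists is shown below (the values and the 8-bit dynamic range are illustrative; SSIM, which needs windowed local statistics, is omitted):

```python
import math

def image_metrics(ref, synth, max_val=255.0):
    """MAE, RMSE and PSNR between a reference CT and a synthetic CT,
    both given as flat lists of pixel intensities."""
    n = len(ref)
    mae = sum(abs(a - b) for a, b in zip(ref, synth)) / n
    mse = sum((a - b) ** 2 for a, b in zip(ref, synth)) / n
    rmse = math.sqrt(mse)
    # PSNR in dB relative to the maximum possible intensity
    psnr = float("inf") if mse == 0 else 10 * math.log10(max_val ** 2 / mse)
    return mae, rmse, psnr

mae, rmse, psnr = image_metrics([10.0, 20.0, 30.0, 40.0],
                                [12.0, 18.0, 33.0, 40.0])
```

Lower MAE/RMSE and higher PSNR indicate a synthetic image closer to the reference CT, which is why the reported improvements move in opposite directions for the two kinds of metric.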

Keywords: CT images, CBCT images, cycle GAN, AGGAN

Procedia PDF Downloads 55
24829 Investigation Particle Behavior in Gas-Solid Filtration with Electrostatic Discharge in a Hybrid System

Authors: Flávia M. Oliveira, Marcos V. Rodrigues, Mônica L. Aguiar

Abstract:

Synthetic fibers are widely used in gas filtration. Previous attempts to optimize the filtration process have employed mixed fibers as the filter medium in gas-solid separation; some of the materials most frequently used for this purpose are composed of polyester, polypropylene, and glass fibers. In order to improve the retention of cement particles in bag filters, the present study investigates the use of synthetic glass fiber and polypropylene fiber filters for particle filtration, with an electrostatic discharge of 0 to -2 kV applied to the cement particles. The filtration curves obtained showed that charging increased the particle collection efficiency and lowered the pressure drop. Particle diameter had a direct influence on the formation of the dust cake, and applying the electrostatic discharge to the particles resulted in the retention of more particles, hence increasing the lifetime of fabric filters.

Keywords: glass fiber filter, particle, electrostatic discharge, cement

Procedia PDF Downloads 357
24828 A Geosynchronous Orbit Synthetic Aperture Radar Simulator for Moving Ship Targets

Authors: Linjie Zhang, Baifen Ren, Xi Zhang, Genwang Liu

Abstract:

Ship detection is of great significance for both military and civilian applications. Synthetic aperture radar (SAR), with its all-day, all-weather, ultra-long-range characteristics, has been used widely. In view of the low time resolution of low-orbit SAR and the need for high-time-resolution SAR data, geosynchronous orbit (GEO) SAR is attracting more and more attention. Since GEO SAR has a short revisit period and a large coverage area, it is expected to be well suited to monitoring marine ship targets. However, the height of the orbit increases the integration time by almost two orders of magnitude, so the utility and efficacy of GEO SAR for moving marine vessels remain uncertain. This paper examines the feasibility of GEO SAR by presenting a GEO SAR simulator for moving ships. The presented simulator is a geometry-based radar imaging simulator that focuses on geometric quality rather than high radiometric fidelity. Its inputs are a 3D ship model (.obj format, produced by most 3D design software, such as 3D Max), the ship's velocity, and the parameters of the satellite orbit and SAR platform. Its outputs are simulated GEO SAR raw signal data and a SAR image. The simulation proceeds in four steps. (1) Reading the 3D model, including the ship's rotation (pitch, yaw, and roll) and velocity (speed and direction) parameters, and extracting the primitives (triangles) that are visible from the SAR platform. (2) Computing the radar scattering from the ship with the physical optics (PO) method. In this step, the vessel is sliced into many small rectangular primitives along the azimuth, and the radiometric calculation of each primitive is carried out separately. Since the simulator focuses only on the complex structure of ships, only single-bounce and double-bounce reflections are considered. (3) Generating the raw data with GEO SAR signal modeling. Since the normal 'stop-and-go' model is not valid for GEO SAR, the range model must be reconsidered. (4) Finally, generating the GEO SAR image with an improved Range-Doppler method. Numerical simulations of a fishing boat and a cargo ship are given, and GEO SAR images for different postures, velocities, satellite orbits, and SAR platforms are simulated. By analyzing these simulated results, the effectiveness of GEO SAR for detecting moving marine vessels is evaluated.
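The reason the 'stop-and-go' assumption fails is that the target moves appreciably during the long GEO integration time, so the slant range has to be re-evaluated pulse by pulse rather than treated as frozen per pulse. A toy sketch of such a range history follows; the fixed satellite position, ship track, and sampling interval are purely illustrative, not the paper's signal model:

```python
import math

def slant_range(sat_pos, ship_pos0, ship_vel, t):
    """Instantaneous slant range at time t from a fixed (illustrative)
    satellite position to a ship moving with constant velocity."""
    ship = [p + v * t for p, v in zip(ship_pos0, ship_vel)]
    return math.sqrt(sum((s - q) ** 2 for s, q in zip(sat_pos, ship)))

# Satellite at geosynchronous altitude (~35786 km) directly overhead;
# the ship starts 1 km off nadir and sails away at 5 m/s
sat = (0.0, 0.0, 35_786_000.0)
r0 = slant_range(sat, (1000.0, 0.0, 0.0), (5.0, 0.0, 0.0), 0.0)
r1 = slant_range(sat, (1000.0, 0.0, 0.0), (5.0, 0.0, 0.0), 600.0)
```

Evaluating this range per transmitted pulse, instead of once per pulse under a stop-and-go assumption, is the kind of reconsidered range model step (3) refers to.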

Keywords: GEO SAR, radar, simulation, ship

Procedia PDF Downloads 137
24827 Kantian Epistemology in Examination of the Axiomatic Principles of Economics: The Synthetic a Priori in the Economic Structure of Society

Authors: Mirza Adil Ahmad Mughal

Abstract:

In the Critique of Pure Reason, transcendental analytics combines space and time, the conditions of the possibility of the phenomenon from the transcendental aesthetic, with the pure notion of magnitude-intuition. Continuity, as a qualitative result of additive magnitude, opens the possibility of connecting with experience, even if only potentially, because of the a priori necessity of the assumption. This follows from the syntheticity of the a priori task of a scientific method of philosophy given by Kant, which precludes the application of categories to anything not empirically reducible to the content of the corresponding category's possible object. Continuity as the qualitative result of the a priori constructed notion of magnitude is a fundamental assumption and property of what microeconomic theory calls 'choice rules', which combine the potentially empirical and practical budget-price pairs with preference relations. This latter result is the purely qualitative side of the otherwise autonomously quantitative nature of choice rules. The theoretical, as opposed to empirical, nature of this qualitative result is a synthetic a priori truth, which it should be if the axiomatic structure of economic theory is held to be correct. It has potentially verifiable content as its possible object in the form of quantitative price-budget pairs, yet the object that serves the respective Kantian category, utility, is itself qualitative. This article explores the validity of the Kantian qualifications for this application of 'categories' to the economic structure of society.

Keywords: categories of understanding, continuity, convexity, psyche, revealed preferences, synthetic a priori

Procedia PDF Downloads 65
24826 Radio-Frequency Technologies for Sensing and Imaging

Authors: Cam Nguyen

Abstract:

Rapid, accurate, and safe sensing and imaging of physical quantities or structures has many applications and is of significant interest to society. Sensing and imaging using radio-frequency (RF) techniques, in particular, has undergone significant development and established itself as a unique territory in the sensing world. RF sensing and imaging has played a critical role in providing sensing and imaging abilities beyond human capabilities, benefiting both civilian and military applications, for example, from sensing abnormal conditions underneath structural surfaces to detecting and classifying concealed items, hidden activities, and buried objects. We present the development of several sensing and imaging systems implementing RF technologies such as ultra-wideband (UWB) impulse, synthetic-pulse, and interferometric techniques. These systems are fabricated entirely with RF integrated circuits. The UWB impulse system operates over multiple pulse durations from 450 to 1170 ps with 5.5-GHz RF bandwidth and performs well in tests on various samples, demonstrating its usefulness for subsurface sensing. The synthetic-pulse system operating from 0.6 to 5.6 GHz can accurately assess subsurface structures. The synthetic-pulse system operating from 29.72 to 37.7 GHz demonstrates various surface and near-surface sensing abilities such as profile mapping, liquid-level monitoring, and anti-personnel mine locating. The interferometric system operating at 35.6 GHz demonstrates multi-functional capability for measuring displacements and slow velocities. These RF sensors are attractive and useful for various surface and subsurface sensing applications. This paper was made possible by NPRP grant # 6-241-2-102 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors.

Keywords: RF sensors, radars, surface sensing, subsurface sensing

Procedia PDF Downloads 289
24825 A Method for Reduction of Association Rules in Data Mining

Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa

Abstract:

The use of association rule algorithms within data mining is recognized as being of great value in knowledge discovery in databases. Very often the number of rules generated is high, sometimes even in databases of small volume, so the sheer quantity can hamper the analysis of results. The purpose of this research is to present a method for reducing the quantity of rules generated by association algorithms. To this end, a computational algorithm was developed using the Weka Application Programming Interface, which allows the method to be executed on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where even the worst case presented a gain of more than 50%, using support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the association rules generated by such algorithms.
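The three measures used to score and filter rules are easy to state. This sketch, our own rather than the paper's Weka-based implementation, computes them for a single rule over a toy transaction set:

```python
def rule_measures(transactions, antecedent, consequent):
    """Support, confidence and lift for the rule antecedent -> consequent,
    where transactions and itemsets are Python sets."""
    n = len(transactions)
    a = sum(1 for t in transactions if antecedent <= t)
    c = sum(1 for t in transactions if consequent <= t)
    both = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = both / n
    confidence = both / a if a else 0.0
    lift = confidence / (c / n) if c else 0.0  # >1 means positive correlation
    return support, confidence, lift

txns = [{"bread", "milk"}, {"bread", "butter"},
        {"milk", "butter"}, {"bread", "milk", "butter"}]
s, conf, lift = rule_measures(txns, {"bread"}, {"milk"})
```

A reduction method of the kind described would discard rules falling below chosen thresholds on these measures, shrinking the rule set before human analysis.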

Keywords: data mining, association rules, rules reduction, artificial intelligence

Procedia PDF Downloads 130
24824 Markov Random Field-Based Segmentation Algorithm for Detection of Land Cover Changes Using Uninhabited Aerial Vehicle Synthetic Aperture Radar Polarimetric Images

Authors: Mehrnoosh Omati, Mahmod Reza Sahebi

Abstract:

Information on land use/land cover change plays an essential role in environmental assessment, planning, and management for regional development. Remotely sensed imagery is widely used to provide information in many change detection applications. Polarimetric synthetic aperture radar (PolSAR) imagery, with its capability to discriminate between different scattering mechanisms, is a powerful tool for environmental monitoring applications. This paper proposes a new boundary-based segmentation algorithm as a fundamental step toward land cover change detection. In this method, two PolSAR images are first segmented by integrating a marker-controlled watershed algorithm with a coupled Markov random field (MRF). Then, object-based classification is performed to determine changed/unchanged image objects. Compared with a pixel-based support vector machine (SVM) classifier, this segmentation algorithm significantly reduces the speckle effect in PolSAR images and improves the accuracy of binary classification at the object level. Experimental results on Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) polarimetric images show a 3% and 6% improvement in overall accuracy and kappa coefficient, respectively. The proposed method also correctly distinguishes homogeneous image parcels.
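The kappa coefficient reported above corrects overall accuracy for chance agreement; a minimal sketch of its computation from a changed/no-changed confusion matrix follows (the counts are illustrative, not the paper's results):

```python
def kappa(confusion):
    """Cohen's kappa coefficient from a square confusion matrix
    (rows: reference classes, columns: predicted classes)."""
    n = sum(sum(row) for row in confusion)
    # Observed agreement: fraction on the diagonal
    observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    # Expected chance agreement from row and column marginals
    expected = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion)
        for i in range(len(confusion))
    ) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative changed / no-changed counts for a binary change map
cm = [[45, 5],
      [10, 40]]
k = kappa(cm)
```

Here the overall accuracy is 0.85 but kappa is 0.70, showing how chance agreement deflates the raw accuracy figure; the paper's 6% kappa improvement should be read against this chance-corrected scale.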

Keywords: coupled Markov random field (MRF), environment, object-based analysis, polarimetric SAR (PolSAR) images

Procedia PDF Downloads 194
24823 Removal of Aromatic Fractions of Natural Organic Matter from Synthetic Water Using Aluminium Based Electrocoagulation

Authors: Tanwi Priya, Brijesh Kumar Mishra

Abstract:

The occurrence of aromatic fractions of natural organic matter (NOM) leads to the formation of carcinogenic disinfection by-products, such as trihalomethanes, in chlorinated water. In the present study, the efficiency of aluminium-based electrocoagulation in removing prominent aromatic groups, such as phenol, hydrophobic auxochromes, and carboxyl groups, from NOM-enriched synthetic water has been evaluated using various spectral indices. The effect of electrocoagulation on turbidity is also discussed, along with the variation in coagulation performance as a function of pH. Our results suggest that electrocoagulation can be considered an appropriate remediation approach for reducing trihalomethane formation in water, as it effectively reduced the hydrophobic fractions in NOM-enriched, low-turbidity water. Charge neutralization and the enmeshment of dispersed colloidal particles inside metallic hydroxides are the likely mechanisms at work in electrocoagulation.

Keywords: aromatic fractions, electrocoagulation, natural organic matter, spectral indices

Procedia PDF Downloads 246
24822 Use of Acid Mine Drainage as a Source of Iron to Initiate the Solar Photo-Fenton Treatment of Municipal Wastewater: Circular Economy Effect

Authors: Tooba Aslam, Efthalia Chatzisymeon

Abstract:

Untreated municipal wastewater (MWW) is among the most harmful sources of water pollution due to its high load of nutrients and organic contaminants. The removal of chemical oxygen demand (COD) from synthetic as well as municipal wastewater is investigated using acid mine drainage as a source of iron to initiate solar photo-Fenton treatment. In this study, acid mine drainage (AMD) and several iron-rich minerals, namely goethite, hematite, magnetite, and magnesite, were used as iron sources to initiate the photo-Fenton process, and the co-treatment of real municipal wastewater with AMD or minerals is examined in detail. The effects of parameters such as mineral recovery from AMD, AMD as an iron source, H₂O₂ concentration, and COD concentration on the percentage COD removal are studied. The results show that, of the four minerals, only hematite (1 g/L) could remove 30% of the pollutants, in about 100 minutes with 1000 ppm of H₂O₂. AMD as an iron source was then tested and compared for both synthetic and real wastewater from South Africa under the same conditions, i.e., 1000 ppm of H₂O₂, ambient temperature, pH 2.8, and a solar simulator. For synthetic wastewater, the maximum removal (56%) was achieved with 50 ppm of iron (from AMD) at 160 minutes. For real wastewater, the removal efficiency was 99% with 30 ppm of iron at 90 minutes and 96% with 50 ppm of iron at 120 minutes. In conclusion, the co-treatment of AMD and MWW by solar photo-Fenton treatment appears to be an effective and promising method for removing organic material from municipal wastewater.

Keywords: municipal wastewater treatment, acid mine drainage, co-treatment, COD removal, solar photo-Fenton, circular economy

Procedia PDF Downloads 59