Search results for: feature extraction multispectral
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3304

2644 A Technique for Image Segmentation Using K-Means Clustering Classification

Authors: Sadia Basar, Naila Habib, Awais Adnan

Abstract:

The paper presents a technique for image segmentation using k-means clustering classification. Existing segmentation algorithms tend to be domain-specific, miss neighboring information, and require high-speed machines to run. Clustering is the process of partitioning a group of data points into a small number of clusters. The proposed method is a content-aware feature extraction method that is able to run on low-end machines; it uses a simple algorithm, tolerates low-quality input streams, is efficient, and is suited to security purposes. It has the capability to highlight both boundaries and objects. First, the user enters the input data. The digital image is then partitioned into clusters, and the clusters are divided into regions: clusters with the same features are assembled within one group, while differing clusters are placed in other groups. Finally, clusters with similar features are combined and represented in the form of segments. The clustered image depicts a clear representation of the digital image, highlighting its regions and boundaries. The final image is presented as a set of segments in which all colors of the image are separated into clusters.
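
For readers who want the gist in code, the following is a minimal sketch of colour-based k-means segmentation (illustrative only, not the authors' implementation; the file names are placeholders):

```python
# Minimal sketch of k-means colour segmentation: pixels are clustered
# by colour and each pixel is repainted with its cluster centre, so
# regions sharing a colour collapse into one visible segment.
import cv2
import numpy as np

def kmeans_segment(image_bgr, k=4):
    # Flatten the H x W x 3 image into a list of 3-D colour points.
    pixels = image_bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5,
                                    cv2.KMEANS_RANDOM_CENTERS)
    # Repaint every pixel with the centre of its cluster.
    segmented = centers[labels.flatten()].reshape(image_bgr.shape)
    return segmented.astype(np.uint8)

segments = kmeans_segment(cv2.imread("input.jpg"), k=4)
cv2.imwrite("segments.jpg", segments)
```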

Keywords: clustering, image segmentation, K-means function, local and global minimum, region

Procedia PDF Downloads 376
2643 Money Laundering and Governance in Cryptocurrencies: The Double-Edged Sword of Blockchain Technology

Authors: Jiaqi Yan, Yani Shi

Abstract:

With the growing popularity of bitcoin transactions, criminals have exploited bitcoin-like cryptocurrencies, and cybercrimes such as money laundering have thrived. Unlike traditional currencies, these Internet-based virtual currencies can be used anonymously via the underpinning blockchain technology. In this paper, we analyze the double-edged sword features of blockchain technology in the context of money laundering. In particular, the traceability feature of blockchain-based systems facilitates a level of governance, while the decentralization feature of blockchain-based systems may bring governing difficulties. Based on the analysis, we propose guidelines for policy makers in governing blockchain-based cryptocurrency systems.

Keywords: cryptocurrency, money laundering, blockchain, decentralization, traceability

Procedia PDF Downloads 202
2642 Effect of Ethanol Concentration and Enzyme Pre-Treatment on Bioactive Compounds from Ginger Extract

Authors: S. Lekhavat, T. Kajsongkram, S. Sang-han

Abstract:

Dried ginger was extracted, and the effects of ethanol concentration and enzyme pre-treatment on its bioactive compounds were investigated in a solvent extraction process. Sliced fresh ginger was dried in an oven dryer at 70 °C for 24 hours and ground to a powder whose particle size was controlled by passing it through a 20-mesh sieve. In the enzyme pre-treatment process, ginger powder was sprayed with 1% (w/w) cellulase and incubated at 45 °C for 2 hours, followed by extraction using ethanol at concentrations of 0, 20, 40, 60 and 80% (v/v), respectively. The ratio of ginger powder to ethanol was 1:9, and the extraction conditions were controlled at 80 °C for 2 hours. The bioactive compounds extracted from ginger, from either enzyme-treated or non-enzyme-treated samples, namely total phenolic content (TPC), 6-gingerol (6G) and 6-shogaol (6S), were examined along with antioxidant activity (IC50 using the DPPH assay). Regardless of enzyme treatment, the results showed that 60% ethanol provided the highest TPC (20.36 mg GAE/g dried ginger), 6G (0.77%) and 6S (0.036%) and the lowest IC50 (625 μg/ml) compared to the other ethanol concentrations. Considering the effect of the enzyme on bioactive compounds and antioxidant activity, it was found that enzyme-treated samples contained more 6G (0.17-0.77%) and 6S (0.020-0.036%) than non-enzyme-treated samples (0.13-0.77% 6G, 0.015-0.036% 6S). However, non-enzyme-treated extracts provided a higher TPC (6.76-20.36 mg GAE/g dried ginger) and lower IC50 (625-1494 μg/ml) than enzyme-treated extracts (TPC 5.36-17.50 mg GAE/g dried ginger, IC50 793-2146 μg/ml).

Keywords: antioxidant activity, enzyme, extraction, ginger

Procedia PDF Downloads 256
2641 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset

Authors: Adrienne Kline, Jaydip Desai

Abstract:

Electroencephalography (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of the neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under the cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands.
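
The Simulink model itself is not reproduced in the abstract; as a rough Python analogue, band-power features of the kind used for left/right motor classification can be computed as below (sampling rate and band edges are assumptions, not taken from the paper):

```python
# Hypothetical sketch of band-power feature extraction from one EEG
# channel via Welch's PSD estimate; the mu band (8-12 Hz) over the
# sensorimotor cortex is the classic marker of arm movement.
import numpy as np
from scipy.signal import welch

FS = 128  # Emotiv consumer headsets sample at 128 Hz

def band_power(eeg, lo=8.0, hi=12.0, fs=FS):
    """Mean power in [lo, hi] Hz, integrated from the Welch PSD."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    band = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[band], freqs[band])

# Example: compare mu-band power over two channels (fake 4 s signals).
left_ch, right_ch = np.random.randn(2, FS * 4)
features = [band_power(left_ch), band_power(right_ch)]
```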

Keywords: brain-machine interface, EEGLAB, Emotiv EEG Neuroheadset, OpenViBE, Simulink

Procedia PDF Downloads 502
2640 From User's Requirements to UML Class Diagram

Authors: Zeineb Ben Azzouz, Wahiba Ben Abdessalem Karaa

Abstract:

The automated extraction of a UML class diagram from natural language requirements is a highly challenging task. Many approaches, frameworks and tools have been presented in this field. Nonetheless, experiments with these tools have shown that no single approach works best all the time. In this context, we propose a new, accurate approach to facilitate the automatic mapping from textual requirements to a UML class diagram. Our approach integrates the best properties of statistical Natural Language Processing (NLP) techniques to reduce ambiguity when analysing natural language requirements text. In addition, it follows the best practices defined by conceptual modelling experts to determine patterns indispensable for the extraction of the basic elements and concepts of the class diagram. Once the relevant information for the class diagram is captured, an XMI document is generated and imported into a CASE tool to build the corresponding UML class diagram.
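
As an illustration of the kind of pattern-based extraction described (not the paper's own rules), a simple noun-frequency heuristic over requirements text can propose candidate classes:

```python
# Illustrative sketch: a common conceptual-modelling heuristic treats
# recurring nouns in the requirements as candidate classes.
import spacy

nlp = spacy.load("en_core_web_sm")

def candidate_classes(requirements_text, min_count=2):
    doc = nlp(requirements_text)
    counts = {}
    for chunk in doc.noun_chunks:
        name = chunk.root.lemma_.capitalize()
        counts[name] = counts.get(name, 0) + 1
    # Nouns mentioned repeatedly are likely domain concepts (classes).
    return [n for n, c in counts.items() if c >= min_count]

text = ("A customer places an order. An order contains order lines. "
        "The customer has a name and an address.")
print(candidate_classes(text))  # e.g. ['Customer', 'Order']
```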

Keywords: class diagram, user’s requirements, XMI, software engineering

Procedia PDF Downloads 471
2639 Analysis of Vocal Fold Vibrations from High-Speed Digital Images Based on Dynamic Time Warping

Authors: A. I. A. Rahman, Sh-Hussain Salleh, K. Ahmad, K. Anuar

Abstract:

Analysis of vocal fold vibration is essential for understanding the mechanism of voice production and for improving the clinical assessment of voice disorders. This paper presents a Dynamic Time Warping (DTW) based approach to analyze and objectively classify vocal fold vibration patterns. The proposed technique was designed and implemented on a Glottal Area Waveform (GAW) extracted from high-speed laryngeal images by delineating the glottal edges for each image frame. Feature extraction from the GAW was performed using Linear Predictive Coding (LPC). Several types of voice reference templates from simulations of clear, breathy, fry, pressed and hyperfunctional voice productions were used. The patterns of the reference templates were first verified using the analytical signal generated through Hilbert transformation of the GAW. Samples from normal speakers’ voice recordings were then used to evaluate and test the effectiveness of this approach. The classification of the voice patterns using LPC and DTW gave an accuracy of 81%.
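
A minimal NumPy rendering of the DTW distance used for template matching follows (the paper applies it to LPC feature sequences rather than the toy signals shown here):

```python
# Dynamic time warping distance by classic dynamic programming, plus
# nearest-template classification as described in the abstract.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of match, insertion, deletion.
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

# Classification: assign the template with the smallest DTW distance.
templates = {"clear": np.sin(np.linspace(0, 6, 50)),
             "breathy": 0.5 * np.sin(np.linspace(0, 6, 50))}
sample = np.sin(np.linspace(0, 6, 60))
label = min(templates, key=lambda k: dtw_distance(sample, templates[k]))
```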

Keywords: dynamic time warping, glottal area waveform, linear predictive coding, high-speed laryngeal images, Hilbert transform

Procedia PDF Downloads 239
2638 Development of Web-Based Iceberg Detection Using Deep Learning

Authors: A. Kavya Sri, K. Sai Vineela, R. Vanitha, S. Rohith

Abstract:

Large pieces of ice that break off from glaciers are known as icebergs. The threat that icebergs pose to navigation, offshore oil and gas production, and underwater pipelines makes their detection crucial. In this project, an automated iceberg tracking method using deep learning techniques and satellite images of icebergs is developed. With a temporal resolution of 12 days and a spatial resolution of 20 m, Sentinel-1 (SAR) images can be used to track iceberg drift over the Southern Ocean. In contrast to multispectral images, SAR images can be analyzed regardless of meteorological conditions. This project develops a web-based graphical user interface to detect and track icebergs using Sentinel-1 images. The movement of the icebergs is tracked across temporal images based on their latitude and longitude values and by comparing the center and area of all detected icebergs. Accuracy is evaluated using precision and recall measures.
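
A hedged sketch of the tracking step described above: detected icebergs in consecutive scenes are matched by nearest centre with a similar area. The deep-learning detection itself is omitted, and all names and thresholds are illustrative:

```python
# Match icebergs between two acquisition dates by centre distance and
# area similarity; prev/curr are lists of dicts with 'lat', 'lon' and
# 'area' (km^2) from the detector.
import math

def match_icebergs(prev, curr, max_dist_km=50, area_tol=0.3):
    matches = []
    for p in prev:
        best, best_d = None, max_dist_km
        for c in curr:
            # Coarse equirectangular distance; fine for short drifts.
            dx = (c["lon"] - p["lon"]) * 111.3 * math.cos(math.radians(p["lat"]))
            dy = (c["lat"] - p["lat"]) * 111.3
            d = math.hypot(dx, dy)
            similar = abs(c["area"] - p["area"]) <= area_tol * p["area"]
            if d < best_d and similar:
                best, best_d = c, d
        if best is not None:
            matches.append((p, best))
    return matches
```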

Keywords: synthetic aperture radar (SAR), icebergs, deep learning, spatial resolution, temporal resolution

Procedia PDF Downloads 91
2637 Magnetic Nano-Composite of Self-Doped Polyaniline Nanofibers for Magnetic Dispersive Micro Solid Phase Extraction Applications

Authors: Hatem I. Mokhtar, Randa A. Abd-El-Salam, Ghada M. Hadad

Abstract:

An improved nano-composite of self-doped polyaniline nanofibers and silica-coated magnetite nanoparticles was prepared and evaluated for suitability for magnetic dispersive micro solid-phase extraction. The work focused on optimizing the capacity of the composite to extract four fluoroquinolone (FQ) antibiotics, ciprofloxacin, enrofloxacin, danofloxacin, and difloxacin, from water, and on improving the stability of the composite towards acid and atmospheric degradation. Self-doped polyaniline nanofibers were prepared by oxidative co-polymerization of aniline with anthranilic acid. Magnetite nanoparticles were prepared by alkaline co-precipitation and coated with silica by silicate hydrolysis on the magnetite nanoparticle surface at pH 6.5. The composite was formed by self-assembly on mixing dispersions of self-doped polyaniline nanofibers and silica-coated magnetite nanoparticles in ethanol. The composite structure was confirmed by transmission electron microscopy (TEM). The chemical structures of the self-doped polyaniline nanofibers and magnetite were confirmed by FT-IR, while the silica coating of the magnetite was confirmed by Energy Dispersive X-ray Spectroscopy (EDS). The improved stability of the magnetic component of the composite was evidenced by its resistance to degradation in 2 N HCl solution. The adsorption capacity of the self-doped polyaniline nanofiber-based composite was higher than that previously reported for the corresponding composite prepared from polyaniline nanofibers. The adsorption-pH profile of the studied FQs on the prepared composite revealed that the best pH for adsorption was in the range of 6.5 to 7. The best extraction recovery values were obtained at pH 7 using phosphate buffer. The best solvent for FQ desorption was found to be 0.1 N HCl in a methanol:water (8:2; v/v) mixture. 20 mL water samples spiked with the studied FQs were preconcentrated using 4.8 mg of composite, and the resulting extracts were analysed by an HPLC-UV method. The prepared composite represents a suitable adsorbent phase for magnetic dispersive micro solid-phase extraction applications.

Keywords: fluoroquinolones, magnetic dispersive micro extraction, nano-composite, self-doped polyaniline nanofibers

Procedia PDF Downloads 122
2636 Application of Liquid Emulsion Membrane Technique for the Removal of Cadmium(II) from Aqueous Solutions Using Aliquat 336 as a Carrier

Authors: B. Medjahed, M. A. Didi, B. Guezzen

Abstract:

In the present work, the emulsion liquid membrane (ELM) technique was applied to the extraction of cadmium(II) from aqueous samples. Aliquat 336 (tri-N-octylmethylammonium chloride) was used as the carrier to extract cadmium(II). The main objective of this work is to investigate the influence of various parameters affecting ELM formation and stability, and to test the performance of the prepared ELM on the removal of cadmium using synthetic solutions of different concentrations. Experiments were conducted to optimize the pH of the feed solution, and it was found that cadmium(II) can be extracted at pH 6.5. The influence of the carrier concentration and treat ratio on the extraction process was investigated. The results showed that the optimal values are 3% (Aliquat 336) and a feed:emulsion ratio of 1:1, respectively.

Keywords: cadmium, carrier, emulsion liquid membrane, surfactant

Procedia PDF Downloads 406
2635 Using Scale Invariant Feature Transform Features to Recognize Characters in Natural Scene Images

Authors: Belaynesh Chekol, Numan Çelebi

Abstract:

The main purpose of this work is to recognize individual characters extracted from natural scene images using scale invariant feature transform (SIFT) features as input to a K-nearest neighbor (KNN) classification learner. For this task, 1,068 and 78 images of English alphabet characters taken from the Chars74k data set are used to train and test the classifier, respectively. For each character image, descriptive features were generated using the SIFT algorithm. This set of features is fed to the learner so that it can recognize and label new images of English characters. Two types of KNN (fine KNN and weighted KNN) were trained, and the resulting classification accuracies are 56.9% and 56.5%, respectively. The training time was the same for both fine and weighted KNN.
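
A minimal sketch of the pipeline under stated assumptions (OpenCV's SIFT, a mean descriptor per image, and scikit-learn's KNN standing in for the MATLAB-style fine/weighted KNN learners):

```python
# Aggregate the 128-D SIFT descriptors of each character image into a
# single feature vector, then train a (uniform or distance-weighted)
# k-nearest-neighbor classifier on those vectors.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

sift = cv2.SIFT_create()

def sift_feature(gray):
    """Mean SIFT descriptor of a grayscale character image."""
    _, desc = sift.detectAndCompute(gray, None)
    if desc is None:                      # no keypoints found
        return np.zeros(128, np.float32)
    return desc.mean(axis=0)

def train_knn(images, labels, weighted=False):
    X = np.array([sift_feature(im) for im in images])
    knn = KNeighborsClassifier(
        n_neighbors=10, weights="distance" if weighted else "uniform")
    return knn.fit(X, labels)
```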

Keywords: character recognition, KNN, natural scene image, SIFT

Procedia PDF Downloads 281
2634 Iterative Panel RC Extraction for Capacitive Touchscreen

Authors: Chae Hoon Park, Jong Kang Park, Jong Tae Kim

Abstract:

The electrical characteristics of a capacitive touchscreen need to be accurately analyzed to achieve better performance in multi-channel capacitance sensing. In this paper, we extracted the panel resistances and capacitances of the touchscreen by comparing measurement data with model data. Employing a lumped RC model for the driver-to-receiver paths in the touchscreen, we estimated resistance and capacitance values from the physical lengths of the channel paths, to which the lumped RC parameters are proportional. As a result, we obtained a model that matches the measurement data with 95.54% accuracy.
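
The extraction idea can be sketched as a small least-squares problem: if per-unit-length resistance and capacitance are the unknowns, each driver-to-receiver path of length L contributes a lumped delay proportional to (rL)(cL). The numbers below are placeholders, not measurements from the paper:

```python
# Fit per-unit-length r and c so the lumped RC delays of paths of
# different lengths best match the measured delays.
import numpy as np
from scipy.optimize import least_squares

lengths = np.array([10.0, 20.0, 30.0, 40.0])   # path lengths (mm)
measured = np.array([0.8, 3.1, 7.2, 12.6])     # measured delays (us)

def residual(params):
    r, c = params                               # per-unit R and C
    model = (r * lengths) * (c * lengths)       # lumped RC delay ~ RC
    return model - measured

fit = least_squares(residual, x0=[1.0, 1.0])
r_unit, c_unit = fit.x
```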

Keywords: electrical characteristics of capacitive touchscreen, iterative extraction, lumped RC model, physical lengths of channel paths

Procedia PDF Downloads 334
2633 Application of Deep Learning and Ensemble Methods for Biomarker Discovery in Diabetic Nephropathy through Fibrosis and Propionate Metabolism Pathways

Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei

Abstract:

Diabetic nephropathy (DN) is a major complication of diabetes, with fibrosis and propionate metabolism playing critical roles in its progression. Identifying biomarkers linked to these pathways may provide novel insights into DN diagnosis and treatment. This study aims to identify biomarkers associated with fibrosis and propionate metabolism in DN, analyze the biological pathways and regulatory mechanisms of these biomarkers, and develop a machine learning model to predict DN-related biomarkers and validate their functional roles. Publicly available transcriptome datasets related to DN (GSE96804 and GSE104948) were obtained from the GEO database (https://www.ncbi.nlm.nih.gov/gds), and 924 propionate metabolism-related genes (PMRGs) and 656 fibrosis-related genes (FRGs) were identified. The analysis began with the extraction of DN-differentially expressed genes (DN-DEGs) and propionate metabolism-related DEGs (PM-DEGs), followed by the intersection of these with fibrosis-related genes to identify key intersected genes. Instead of relying on traditional models, we employed a combination of deep neural networks (DNNs) and ensemble methods such as Gradient Boosting Machines (GBM) and XGBoost to enhance feature selection and biomarker discovery. Recursive feature elimination (RFE) was coupled with these advanced algorithms to refine the selection of the most critical biomarkers. Functional validation was conducted using convolutional neural networks (CNNs) for gene set enrichment and immunoinfiltration analysis, revealing seven significant biomarkers: SLC37A4, ACOX2, GPD1, ACE2, SLC9A3, AGT, and PLG. These biomarkers are involved in critical biological processes such as fatty acid metabolism and glomerular development, providing a mechanistic link to DN progression. Furthermore, a TF–miRNA–mRNA regulatory network was constructed using natural language processing models to identify 8 transcription factors and 60 miRNAs that regulate these biomarkers, while a drug–gene interaction network revealed potential therapeutic targets such as UROKINASE–PLG and ATENOLOL–AGT. This integrative approach, leveraging deep learning and ensemble models, not only enhances the accuracy of biomarker discovery but also offers new perspectives on DN diagnosis and treatment, specifically targeting the fibrosis and propionate metabolism pathways.
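
A hedged sketch of the feature-selection step (recursive feature elimination wrapped around XGBoost); the data shapes and names are placeholders, not the GEO expression matrices:

```python
# RFE repeatedly drops the least important features according to the
# boosted model's feature importances until k features remain.
import numpy as np
from sklearn.feature_selection import RFE
from xgboost import XGBClassifier

X = np.random.rand(120, 500)          # samples x candidate genes
y = np.random.randint(0, 2, 120)      # DN vs. control labels

selector = RFE(
    estimator=XGBClassifier(n_estimators=200, eval_metric="logloss"),
    n_features_to_select=7,           # e.g. the seven reported biomarkers
    step=0.1)                         # drop 10% of features per round
selector.fit(X, y)
biomarker_idx = np.where(selector.support_)[0]
```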

Keywords: diabetic nephropathy, deep neural networks, gradient boosting machines (GBM), XGBoost

Procedia PDF Downloads 9
2632 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment

Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto

Abstract:

Forest inventories are essential for assessing the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can improve on and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAStools, GIS, ENVI and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands used in the extraction of forest cover classes with ENVI 4.8 and GIS software. The diameter at breast height (DBH), above-ground biomass (AGB) and carbon stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. In the extracted canopy heights, 80% of tree heights range from 12 m to 17 m. The CS of the three forest covers based on the AGB were 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has a high percentage of forest cover and a high CS.
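
One of the derivatives mentioned, the canopy height model, reduces to a simple raster subtraction; a minimal sketch with stand-in arrays follows:

```python
# CHM = DSM - DTM: the digital surface model minus the bare-earth
# digital terrain model, with negative noise clipped to zero. The
# arrays below stand in for co-registered LiDAR rasters.
import numpy as np

def canopy_height_model(dsm, dtm):
    chm = dsm - dtm
    return np.clip(chm, 0, None)      # heights below ground are noise

dsm = np.array([[105.2, 117.8], [103.1, 102.9]])  # surface elevations (m)
dtm = np.array([[100.0, 101.0], [100.5, 103.0]])  # terrain elevations (m)
chm = canopy_height_model(dsm, dtm)   # per-pixel canopy heights (m)
```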

Keywords: carbon stock, forest inventory, LiDAR, tree count

Procedia PDF Downloads 389
2631 Assisted Prediction of Hypertension Based on Heart Rate Variability and Improved Residual Networks

Authors: Yong Zhao, Jian He, Cheng Zhang

Abstract:

Cardiovascular diseases caused by hypertension are extremely threatening to human health, and early diagnosis of hypertension can save a large number of lives. Traditional hypertension detection methods require special equipment and have difficulty tracking continuous blood pressure changes. In this regard, this paper first analyzes the principle of heart rate variability (HRV) and introduces a sliding window and power spectral density (PSD) to analyze the time-domain and frequency-domain features of HRV. It then designs an HRV-based hypertension prediction network combining ResNet, an attention mechanism, and a multilayer perceptron: frequency-domain features are extracted by a modified ResNet18, fused with time-domain features through an attention mechanism, and used for auxiliary prediction of hypertension by a multilayer perceptron. Finally, the network was trained and tested using the publicly available SHAREE dataset on PhysioNet, and the test results showed that the network achieved 92.06% prediction accuracy for hypertension and outperformed K-Nearest Neighbor (KNN), Bayes, Logistic, and traditional Convolutional Neural Network (CNN) models in prediction performance.
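
For orientation, classic frequency-domain HRV features of the kind fed to such a network can be computed from an RR-interval series as below (a generic sketch, not the paper's pipeline):

```python
# LF and HF band powers from RR intervals: resample the irregular
# beat-to-beat series onto an even grid, estimate the PSD with Welch's
# method, and integrate the standard HRV bands.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_frequency_features(rr_ms, fs=4.0):
    """LF/HF power from RR intervals (ms), resampled at fs Hz."""
    t = np.cumsum(rr_ms) / 1000.0                 # beat times (s)
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr_ms)(grid)            # evenly sampled series
    freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
    lf_band = (freqs >= 0.04) & (freqs < 0.15)
    hf_band = (freqs >= 0.15) & (freqs < 0.40)
    lf = np.trapz(psd[lf_band], freqs[lf_band])
    hf = np.trapz(psd[hf_band], freqs[hf_band])
    return lf, hf, lf / hf
```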

Keywords: feature extraction, heart rate variability, hypertension, residual networks

Procedia PDF Downloads 105
2630 A Survey of Feature-Based Steganalysis for JPEG Images

Authors: Syeda Mainaaz Unnisa, Deepa Suresh

Abstract:

Due to the increase in usage of public domain channels, such as the internet and communication technology, there is concern about the protection of intellectual property and about security threats. This interest has led to growth in researching and implementing techniques for information hiding. Steganography is the art and science of hiding information in a private manner such that its existence cannot be recognized. Communication using steganographic techniques makes not only the secret message but also the very presence of hidden communication invisible. Steganalysis is the art of detecting the presence of this hidden communication. Parallel to steganography, steganalysis is also gaining prominence, since the detection of hidden messages can prevent catastrophic security incidents from occurring. Steganalysis can also be helpful in identifying and revealing holes in current steganographic techniques that make them vulnerable to attacks. Through the formulation of new, effective steganalysis methods, further research to improve the resistance of tested steganography techniques can be developed. The feature-based steganalysis method for JPEG images calculates the features of an image using the L1 norm of the difference between a stego image and a calibrated version of the same image. This calibration helps retrieve some of the parameters of the cover image, revealing the variations between the cover and stego images and enabling more accurate detection. Applying this method to various steganographic schemes, experimental results were compared and evaluated to derive conclusions and principles for more protected JPEG steganography.
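
A simplified sketch of the calibration idea: a cropped, re-compressed copy of the JPEG approximates its cover, and bin-wise L1 differences between features of the two versions form the steganalysis feature vector. Real methods work on DCT-domain statistics; the pixel-histogram stand-in here is only illustrative:

```python
# Calibrate by cropping 4 pixels and re-saving as JPEG, then take
# absolute differences between histogram features of the original
# and the calibrated version.
import numpy as np
from PIL import Image

def calibrated_l1_feature(path, quality=75):
    img = np.asarray(Image.open(path).convert("L"))
    cropped = Image.fromarray(img[4:, 4:])      # calibration crop
    cropped.save("calib.jpg", quality=quality)  # re-compress
    calib = np.asarray(Image.open("calib.jpg"))
    h_img, _ = np.histogram(img, bins=64, range=(0, 255), density=True)
    h_cal, _ = np.histogram(calib, bins=64, range=(0, 255), density=True)
    return np.abs(h_img - h_cal)                # bin-wise L1 differences
```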

Keywords: cover image, feature-based steganalysis, information hiding, steganalysis, steganography

Procedia PDF Downloads 216
2629 Separation of Mercury(II) from Petroleum Produced Water via Hollow Fiber Supported Liquid Membrane and Mass Transfer Modeling

Authors: Srestha Chaturabul, Wanchalerm Srirachat, Thanaporn Wannachod, Prakorn Ramakul, Ura Pancharoen, Soorathep Kheawhom

Abstract:

The separation of mercury(II) from petroleum-produced water from the Gulf of Thailand was carried out using a hollow fiber supported liquid membrane (HFSLM) system. The optimum parameters were 0.2 M HCl for feed pretreatment, 4% (v/v) Aliquat 336 as the extractant, and 0.1 M thiourea as the stripping solution. The best percentages obtained were 99.73% for extraction and 90.11% for recovery. The overall separation efficiency was 94.92%, taking into account both extraction and recovery. The model for this separation was developed along a combined flux principle, i.e., convection–diffusion–kinetic. The results showed excellent agreement with theoretical data, at average standard deviations of 1.5% and 1.8%, respectively.

Keywords: separation, mercury(II), petroleum produced water, hollow fiber, liquid membrane

Procedia PDF Downloads 298
2628 Contactless Attendance System along with Temperature Monitoring

Authors: Nalini C. Iyer, Shraddha H., Anagha B. Varahamurthy, Dikshith C. S., Ishwar G. Kubasad, Vinayak I. Karalatti, Pavan B. Mulimani

Abstract:

The current pandemic scenario due to COVID-19 has raised awareness among people of the need to avoid unnecessary contact in public places. There is a need to avoid contact with physical objects to stop the spreading of infection. A contactless feature has to be included in systems in public places wherever possible; for example, attendance monitoring systems with fingerprint biometrics can be replaced with a contactless alternative. One more important protocol followed in the current situation is temperature monitoring and screening. This paper describes an attendance system with a contactless feature and temperature screening for a university. The system displays a QR code to scan, which redirects to the student login web page only if the location is valid (the location where the student scans the QR code should be the location where the QR code is displayed). Once the student logs in, the student's temperature is measured by a contactless temperature sensor (MLX90614) with an error of ±0.5 °C. If the temperature falls within the desired range (the range of normal body temperature), the attendance of the student is marked as present and stored in the database, and the door opens automatically. Otherwise, the attendance is marked as absent, an alert is raised with the temperature displayed, and the door remains closed. The door is automated with the help of a servomotor. To avoid proxy attendance, IR sensors are used to count the number of students in the classroom. The hardware system, consisting of the contactless temperature sensor and IR sensors, is implemented on a NodeMCU microcontroller.
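
A hypothetical MicroPython sketch of the NodeMCU-side logic (pin numbers, servo duty values and the temperature band are illustrative assumptions, not taken from the paper):

```python
# Read the MLX90614 object temperature over I2C (RAM register 0x07,
# LSB = 0.02 K) and drive the door servo only for a normal reading.
from machine import I2C, Pin, PWM
import struct, time

i2c = I2C(scl=Pin(5), sda=Pin(4))          # D1/D2 on NodeMCU (assumed)
servo = PWM(Pin(14), freq=50)              # D5, 50 Hz hobby servo

def object_temp_c(addr=0x5A):
    raw = struct.unpack("<H", i2c.readfrom_mem(addr, 0x07, 2))[0]
    return raw * 0.02 - 273.15             # Kelvin*50 -> Celsius

temp = object_temp_c()
if 35.0 <= temp <= 37.5:                   # assumed "normal" band
    servo.duty(110)                        # ~2 ms pulse: door open
    time.sleep(3)
    servo.duty(40)                         # ~0.8 ms pulse: door closed
else:
    print("Access denied, temperature:", temp)
```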

Keywords: NodeMCU, IR sensor, attendance monitoring, contactless, temperature

Procedia PDF Downloads 185
2627 On Exploring Search Heuristics for Improving the Efficiency in Web Information Extraction

Authors: Patricia Jiménez, Rafael Corchuelo

Abstract:

Nowadays, the World Wide Web is the most popular source of information, comprising billions of on-line documents. Web mining is used to crawl through these documents, collect the information of interest and process it by applying data mining tools, so that the gathered information can be used in the best interest of a business, enabling companies to promote themselves. Unfortunately, it is not easy to extract the information a web site provides automatically when it lacks an API that transforms the user-friendly data in web documents into a structured, machine-readable format. Rule-based information extractors are tools intended to extract the information of interest automatically and offer it in a structured format that mining tools can process. However, the performance of an information extractor strongly depends on the search heuristic employed, since bad choices about how to learn a rule may easily result in a loss of effectiveness and/or efficiency. Improving search heuristics with regard to efficiency is of the utmost importance in the field of Web Information Extraction, since typical datasets are very large. In this paper, we employ an information extractor based on a classical top-down algorithm that uses the so-called Information Gain heuristic introduced by Quinlan and Cameron-Jones. Unfortunately, the Information Gain suffers from some well-known problems, so we analyse an intuitive alternative, Termini, that is clearly more efficient; we also analyse other proposals in the literature and conclude that none of them outperforms this alternative.
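
For concreteness, the heuristic in question, Quinlan and Cameron-Jones' information gain for rule learning, scores a candidate refinement as follows (a minimal sketch):

```python
# FOIL-style information gain: a refinement is worth the positives it
# keeps, times the increase in the rule's purity (in bits).
import math

def foil_gain(p0, n0, p1, n1):
    """p/n: positive/negative examples covered before (0) and after (1)
    adding a condition to the rule; returns bits gained."""
    if p1 == 0:
        return 0.0
    before = math.log2(p0 / (p0 + n0))
    after = math.log2(p1 / (p1 + n1))
    return p1 * (after - before)

# A refinement keeping 40 of 50 positives but only 5 of 100 negatives:
print(foil_gain(50, 100, 40, 5))   # strongly positive -> good refinement
```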

Keywords: information extraction, search heuristics, semi-structured documents, web mining

Procedia PDF Downloads 335
2626 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers, so it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system that indicates whether a signature is genuine or not. The system comprises four phases: (1) the pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting of ridge breaks are applied; (2) the feature extraction phase, in which global and local features are extracted — the local features are minutiae points, curvature orientation, and curve plateau, and the global features are signature area, signature aspect ratio, and Hu moments; (3) the post-processing phase, in which false minutiae are removed; and (4) the classification phase, in which features are enhanced before being fed into the classifier — k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state of the art. The accuracy of the proposed system is 92.3%.
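
A minimal sketch of the global-feature step (Hu moments, signature area and aspect ratio) using OpenCV; the local minutiae and curvature features are beyond this snippet:

```python
# Binarize the signature, compute the 7 Hu moments plus area and
# aspect ratio, and return one global feature vector.
import cv2
import numpy as np

def global_signature_features(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    hu = cv2.HuMoments(cv2.moments(binary)).flatten()   # 7 Hu moments
    ys, xs = np.nonzero(binary)
    area = len(xs)                                      # signature area
    aspect = (xs.max() - xs.min() + 1) / (ys.max() - ys.min() + 1)
    # Log-scale the Hu moments for numerical stability.
    hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)
    return np.concatenate([hu, [area, aspect]])
```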

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 146
2625 Valorization of Sugarcane Bagasse: The Effect of Alkali Concentration, Soaking Time and Temperature on Fibre Yield

Authors: Tamrat Tesfaye, Tilahun Seyoum, K. Shabaridharan

Abstract:

The objective of this paper was to determine the effect of NaOH concentration, soaking time, soaking temperature and their interactions on the percentage yield of extracted fibre using Response Surface Methodology (RSM). A Box-Behnken design was employed to optimize the extraction of cellulosic fibre from the sugarcane by-product bagasse using a low-alkali extraction technique. The quadratic model with the optimal technological conditions resulted in a maximum fibre yield of 56.80% at 0.55 N NaOH concentration, 4 h soaking time and 60 °C soaking temperature. Among the independent variables, concentration was found to be the most significant (P < 0.005), and the interaction of concentration and soaking time was key to securing the optimized process.
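
The response-surface step amounts to fitting a second-order polynomial in the three factors by least squares; a hedged sketch with placeholder design points (not the paper's Box-Behnken runs):

```python
# Fit yield ~ quadratic(concentration, time, temperature) by ordinary
# least squares over a Box-Behnken-style design (13 runs, 10 terms).
import numpy as np

# Columns: NaOH (N), time (h), temperature (C); y: fibre yield (%).
X_raw = np.array([
    [0.3, 2, 60], [0.3, 6, 60], [0.8, 2, 60], [0.8, 6, 60],
    [0.3, 4, 40], [0.3, 4, 80], [0.8, 4, 40], [0.8, 4, 80],
    [0.55, 2, 40], [0.55, 2, 80], [0.55, 6, 40], [0.55, 6, 80],
    [0.55, 4, 60]])
y = np.array([41.2, 44.0, 47.3, 49.5, 42.8, 45.9, 48.4, 50.2,
              43.7, 50.1, 51.7, 52.9, 56.8])

def quadratic_design(X):
    c, t, T = X.T
    return np.column_stack([np.ones(len(X)), c, t, T,
                            c*t, c*T, t*T, c**2, t**2, T**2])

beta, *_ = np.linalg.lstsq(quadratic_design(X_raw), y, rcond=None)
predict = lambda x: quadratic_design(np.atleast_2d(x)) @ beta
```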

Keywords: sugarcane bagasse, low alkaline, Box-Behnken, fibre

Procedia PDF Downloads 246
2624 The Outcome of Using Machine Learning in Medical Imaging

Authors: Adel Edwar Waheeb Louka

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19, mainly owing to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. Research in this field has nonetheless suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and data: The dataset used is the COVID-19 Radiography Database, which includes images and masks of chest X-rays under the labels COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4,035 images and validated on 807 images separate from those used for training. The images used to train the classification model share an important feature: they are cropped beforehand to eliminate distractions during training. The image segmentation model uses an improved U-Net architecture and is used to extract the lung mask from the chest X-ray image; it is trained on 8,577 images and validated on a 20% validation split. The models are evaluated on an external dataset, and their accuracy, precision, recall, f1-score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to improve the experience of medical professionals and provide insight into the future of the methods used.
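
A hedged Keras sketch of the transfer-learning classifier described (a frozen DenseNet201 backbone plus a small classification head; the input size and head layers are assumptions):

```python
# Transfer learning: reuse ImageNet-pretrained DenseNet201 features,
# freeze the backbone, and train only a small 3-class head.
import tensorflow as tf

base = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg")
base.trainable = False                      # freeze the backbone

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(3, activation="softmax"),  # COVID/pneumonia/normal
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```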

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 73
2623 DCDNet: Lightweight Document Corner Detection Network Based on Attention Mechanism

Authors: Kun Xu, Yuan Xu, Jia Qiao

Abstract:

Document detection plays an important role in optical character recognition and text analysis. Because traditional detection methods have weak generalization ability, while deep neural networks have complex structures and large numbers of parameters that cannot be readily deployed on mobile devices, this paper proposes a lightweight Document Corner Detection Network (DCDNet). DCDNet is a two-stage architecture. The first stage, with an encoder-decoder structure, adopts depthwise separable convolutions to greatly reduce the number of network parameters. After introducing the Feature Attention Union (FAU) module, the second stage enhances the feature information of the spatial and channel dimensions and adaptively adjusts the size of the receptive field to enhance the feature expression ability of the model. To address the large imbalance in pixel distribution between corner and non-corner regions, a Weighted Binary Cross-Entropy Loss (WBCE Loss) is proposed that casts corner detection as a classification problem, making the training process more efficient. To make up for the lack of datasets for document corner detection, a dataset containing 6,620 images, named the Document Corner Detection Dataset (DCDD), was created. Experimental results show that the proposed method can obtain fast, stable and accurate detection results on DCDD.
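
A plausible PyTorch rendering of the WBCE idea — up-weighting the rare corner pixels so they are not swamped by the background — though the paper's exact weighting scheme may differ:

```python
# Weighted binary cross-entropy: the positive (corner) term is scaled
# by the negative-to-positive pixel ratio of the current target map.
import torch

def weighted_bce_loss(pred, target, eps=1e-7):
    """pred: predicted corner probabilities; target: {0,1} corner map.
    Both have shape (N, 1, H, W)."""
    pos = target.sum()
    neg = target.numel() - pos
    w_pos = neg / (pos + eps)          # rare corners get a large weight
    loss = -(w_pos * target * torch.log(pred + eps)
             + (1 - target) * torch.log(1 - pred + eps))
    return loss.mean()
```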

Keywords: document detection, corner detection, attention mechanism, lightweight

Procedia PDF Downloads 354
2622 Producing Lutein Powder from Algae by Extraction and Drying

Authors: Zexin Lei, Timothy Langrish

Abstract:

Lutein is a carotenoid believed to be beneficial to the eyes. This study aims to explore the possibility of using a closed-cycle spray drying system to produce lutein. The system contains a spray dryer, a condenser, a heater, and a pressure seal. Hexane, ethanol, and isopropanol will be used as organic solvents to compare their extraction effects, and several physical and chemical methods of cell disruption will be compared. By continuously sweeping the system with nitrogen, the oxygen content will be controlled below 2%, keeping the concentration of organic solvent below the explosion limit and preventing the lutein from being oxidized. Lutein powder will be recovered in the collection device. The volatile organic solvent will be cooled in the condenser and will accumulate in the bottom until it is discharged from the bottom of the condenser.

Keywords: closed cycle spray drying system, Chlorella vulgaris, organic solvent, solvent recovery

Procedia PDF Downloads 137
2621 Real-Time Pedestrian Detection Method Based on Improved YOLOv3

Authors: Jingting Luo, Yong Wang, Ying Wang

Abstract:

Pedestrian detection in image or video data is a very important and challenging task in security surveillance. The difficulty of this task is to accurately locate and detect pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. Firstly, a deep residual network is added to extract pedestrian features. Then, six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid that performs the pedestrian detection task. This method can better characterize pedestrians. In order to further improve the accuracy and generalization ability of the model, a hybrid pedestrian data set training method is used: pedestrian data are extracted from the VOC data set and trained together with the INRIA pedestrian data set. Experiments show that the proposed RT-YOLOv3 method achieves 93.57% mAP (mean average precision) at 46.52 f/s (frames per second). In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. The method reduces the missed detection rate and the false detection rate, improves positioning accuracy, and meets the requirements of real-time detection of pedestrian objects.
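
On the evaluation side, the mAP figure rests on IoU-based matching of predictions to ground truth; a minimal sketch of that matching and the resulting precision/recall:

```python
# Greedy matching of predicted to ground-truth boxes at an IoU
# threshold; boxes are (x1, y1, x2, y2), preds sorted by confidence.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def precision_recall(preds, gts, thr=0.5):
    matched, tp = set(), 0
    for p in preds:
        best = max(range(len(gts)), key=lambda i: iou(p, gts[i]),
                   default=None)
        if best is not None and best not in matched and iou(p, gts[best]) >= thr:
            matched.add(best)
            tp += 1
    return tp / max(len(preds), 1), tp / max(len(gts), 1)
```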

Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3

Procedia PDF Downloads 142
2620 Transfer of Constraints or Constraints on Transfer? Syntactic Islands in Danish L2 English

Authors: Anne Mette Nyvad, Ken Ramshøj Christensen

Abstract:

In the syntax literature, it has standardly been assumed that relative clauses and complement wh-clauses are islands for extraction in English and that constraints on extraction from syntactic islands are universal. However, the Mainland Scandinavian languages have been known to provide counterexamples. Previous research on Danish has shown that neither relative clauses nor embedded questions are strong islands in Danish. Instead, extraction from this type of syntactic environment is degraded due to structural complexity, and it interacts with non-structural factors such as the frequency of occurrence of the matrix verb, the possibility of temporary misanalysis leading to semantic incongruity, and exposure over time. We argue that these facts can be accounted for by parametric variation in the availability of CP-recursion, resulting in the patterns observed, as Danish would then “suspend” the ban on movement out of relative clauses and embedded questions. Given that Danish does not seem to adhere to allegedly universal syntactic constraints, such as the Complex NP Constraint and the Wh-Island Constraint, what happens in L2 English? We present results from a study investigating how native Danish speakers judge extractions from island structures in L2 English. Our findings suggest that Danes transfer their native-language parameter setting when asked to judge island constructions in English. This is compatible with the Full Transfer Full Access Hypothesis, as the latter predicts that Danish speakers would have difficulty resetting their [+/- CP-recursion] parameter in English because they are not exposed to negative evidence.

Keywords: syntax, islands, second language acquisition, Danish

Procedia PDF Downloads 127
2619 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance

Authors: Sokkhey Phauk, Takeo Okazaki

Abstract:

The challenging task in educational institutions is to maximize the number of high-performing students and minimize the failure rate of poor-performing students. An effective way to address this task is to learn students' learning patterns, with their highly influential factors, and obtain early predictions of learning outcomes at a timely stage so that policies for improvement can be set up. Educational data mining (EDM) is an emerging disciplinary field of data mining, statistics, and machine learning concerned with extracting useful knowledge and information for the sake of improvement and development of the education environment. The aim of this work is to propose techniques in EDM and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted, and high-performing models are subsequently developed to obtain higher performance. The hybrid random forest (Hybrid RF) produces the most successful classification. In the context of intervention and improving learning outcomes, a feature selection method, MICHI, which combines the mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves the prediction performance and provides information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is then developed to give educational stakeholders early predictions of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system, which is used to help educational stakeholders and related individuals intervene and improve student performance.
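
A sketch of the MICHI idea as described — combining mutual-information and chi-square rankings to pick a dominant feature set; the paper's exact combination rule may differ:

```python
# Rank features by MI and chi-square separately, average the two rank
# positions, and keep the top-k features as the dominant set.
import numpy as np
from sklearn.feature_selection import chi2, mutual_info_classif

def michi_select(X, y, k=10):
    mi = mutual_info_classif(X, y)
    ch, _ = chi2(np.abs(X), y)            # chi2 requires non-negative X
    # Rank of each feature under each criterion (0 = best).
    mi_rank = np.argsort(np.argsort(-mi))
    ch_rank = np.argsort(np.argsort(-ch))
    combined = (mi_rank + ch_rank) / 2.0
    return np.argsort(combined)[:k]       # indices of dominant features
```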

Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance

Procedia PDF Downloads 106
2618 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model

Authors: Youngjae Jin, Daeshik Kim

Abstract:

This paper describes the cycle-accurate simulation results for weight values learned by an auto-encoder behavior model in pre-route simulation. Given the results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, in the process of handling such huge amounts of data, the computational complexity and run time of the learning methods have limited advanced research. These limitations come from the fact that these algorithms have been computed using only single-core CPUs. For this reason, parallel hardware such as FPGAs was seen as a possible solution to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing the hardware. With pre-route simulation of the auto-encoder behavior model, we obtained the cycle-accurate results for the parameters of each hidden layer using ModelSim. Cycle-accurate results are a very important factor in designing parallel digital hardware. Finally, this paper demonstrates the appropriate operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
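
For reference, the algorithm being modelled in Verilog is a plain one-hidden-layer auto-encoder; a software sketch (with assumed sizes) whose W1 corresponds to the visualized first-layer weights:

```python
# One-hidden-layer auto-encoder trained to reconstruct its input;
# the encoder weights W1 are what get visualized as image patches.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 64, 25                     # e.g. 8x8 patches, 25 units
W1 = rng.normal(0, 0.1, (n_hid, n_in))   # encoder weights (visualized)
W2 = rng.normal(0, 0.1, (n_in, n_hid))   # decoder weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def train_step(x, lr=0.1):
    global W1, W2
    h = sigmoid(W1 @ x)                  # encode
    y = W2 @ h                           # decode (linear output)
    err = y - x                          # reconstruction error
    W2 -= lr * np.outer(err, h)          # backprop, output layer
    dh = (W2.T @ err) * h * (1 - h)      # backprop through sigmoid
    W1 -= lr * np.outer(dh, x)
    return float((err ** 2).mean())
```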

Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, Unsupervised feature learning

Procedia PDF Downloads 446
2617 Agent-Base Modeling of IoT Applications by Using Software Product Line

Authors: Asad Abbas, Muhammad Fezan Afzal, Muhammad Latif Anjum, Muhammad Azmat

Abstract:

The Internet of Things (IoT) links up real objects that interact over the internet. IoT applications allow equipment to be handled and operated in accordance with environmental needs, for instance in transportation and healthcare. IoT devices are linked together via a number of agents that act as middlemen for communications. Because agent applications work with environmental variables, the operation of, say, a heat sensor differs indoors and outdoors. In this article, we suggest using Software Product Line (SPL) techniques to model the features of IoT agents and applications on an XML basis, as sketched below. XML-based feature modelling can handle the contextual diversity within the same application domain and increases the reusability of features. For the purpose of managing contextual variability, we have embraced XML for modelling IoT applications, agents, and internet-connected devices.
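
An illustrative toy feature model (element and attribute names invented for the example) and the few lines needed to read mandatory versus optional features:

```python
# Parse a small XML feature model for an IoT agent and split its
# features into mandatory and optional sets.
import xml.etree.ElementTree as ET

FEATURE_MODEL = """
<agent name="HeatSensorAgent">
  <feature name="TemperatureSensing" mandatory="true"/>
  <feature name="OutdoorCalibration" mandatory="false"/>
  <feature name="IndoorCalibration" mandatory="false"/>
</agent>
"""

root = ET.fromstring(FEATURE_MODEL)
mandatory = [f.get("name") for f in root.findall("feature")
             if f.get("mandatory") == "true"]
optional = [f.get("name") for f in root.findall("feature")
            if f.get("mandatory") == "false"]
print(root.get("name"), mandatory, optional)
```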

Keywords: IoT agents, IoT applications, software product line, feature model, XML

Procedia PDF Downloads 94
2616 Comparative Analysis of Oil Extracts from Cotton and Watermelon Seeds

Authors: S. A. Jumare, A. O. Tijani, M. F. Siraj, B. V. Babatunde

Abstract:

This research carried out a comparative analysis of the oils extracted from cotton and watermelon seeds using a solvent extraction process, with normal ethyl ether as the solvent. The AOAC methods of analysis were employed in determining the physicochemical properties of the oils. The chemical properties determined include the saponification value, free fatty acid, iodine value, peroxide value and acid value. The physical properties determined include specific gravity, refractive index, colour, odour, taste and pH. The values obtained for cottonseed oil are saponification value (187 mg KOH/g), free fatty acid (5.64 mg KOH/g), iodine value (95.2 g/100 g), peroxide value (9.33 meq/kg), acid value (11.22 mg KOH/g), pH (4.62), refractive index (1.46) and specific gravity (0.9); it has a bland odour, a reddish-brown colour and a mild taste. The values obtained for watermelon seed oil are saponification value (83.3 mg KOH/g), free fatty acid (6.58 mg KOH/g), iodine value (122.6 g/100 g), peroxide value (5.3 meq/kg), acid value (3.74 mg KOH/g), pH (6.3), refractive index (1.47) and specific gravity (0.9); it has a nutty flavour, a golden-yellow colour and a mild taste. The results show that cottonseed oil has a high acid value, which reflects on the stability of the oil with respect to rancidity, whereas watermelon seed oil behaves otherwise.

Keywords: extraction, solvent, cotton seeds, watermelon seeds

Procedia PDF Downloads 363
2615 Competition in Petroleum Extraction and the Challenges of Climate Change

Authors: Saeid Rabiei Majd, Motahareh Alvandi, Bahareh Asefi

Abstract:

Maximizing the extraction of natural resources is a common government policy, especially for petroleum resources, which have high economic and strategic value. The incentive for governments and international oil companies to access and maintain profitable oil markets causes them to neglect environmental principles and sustainable development, which in turn drives environmental degradation and climate change. Significant damage to the environment can cause severe harm to citizens and indigenous people, such as the compulsory evacuation of their zone due to the contamination of water and air resources and the destruction of animals and plants. The Hawizeh Marshes are a shared aquatic and environmental ecosystem along the Iran-Iraq border that also holds oil resources; the marshes have been very rich in animal, plant, and oil resources. Since 1990, political motives, the strategic importance of oil extraction, and the disregard of the Iraqi and Iranian governments for environmental rights in the region have caused the loss of 90% of the marshes and the forced migration of indigenous people. In this paper, we examine the factors of environmental degradation resulting from the policies and practices adopted by the governments in this region, based on the principles of environmental rights and sustainable development. Revising the implementation of government policies and natural resource utilization systems can prevent the spread of climate change, which is a serious international challenge today.

Keywords: climate change, indigenous rights, petroleum operations, sustainable development principles, sovereignty over resources

Procedia PDF Downloads 112