Search results for: automated teller machine
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3483

933 Orthophthalic Polyester Composite Reinforced with Sodium Alginate-Treated Anahaw (Saribus rotundifolius) Fibers

Authors: Terence Tumolva, Johannes Kristoff Vito, Joanna Crystelle Ragasa, Renz Marion Dela Cruz

Abstract:

Natural fiber reinforced polymer (NFRP) composites have been the focus of various research projects due to their advantages over synthetic fiber-reinforced composites. For this study, anahaw is used as the fiber source due to its abundance throughout the Philippines. A problem addressed in this study is the need for an environment-friendly method of fiber treatment. The use of sodium alginate to treat fibers was thus investigated. The fibers were immersed in a sodium alginate solution and then in a calcium chloride solution. The treated fibers were used to reinforce orthophthalic unsaturated polyester (ortho-UP) resin. The mechanical properties were tested using a universal testing machine (UTM), and the fracture surfaces were characterized using a scanning electron microscope (SEM). Results showed that the sodium alginate treatment increased the tensile and flexural strength of the composite. Increasing the fiber load was also found to increase the stiffness of the composite. However, sodium alginate treatment did not provide any significant improvement in the wet mechanical properties of the NFRP. The composite is comparable to some commercially available polymeric materials.

Keywords: NFRP, composite, alginate, anahaw, polymer

Procedia PDF Downloads 310
932 Implementing a Neural Network on a Low-Power and Mobile Cluster to Aid Drivers with Predictive AI for Traffic Behavior

Authors: Christopher Lama, Alix Rieser, Aleksandra Molchanova, Charles Thangaraj

Abstract:

New technologies like Tesla's Dojo have made high-performance embedded computing more available. Although automobile computing has developed and benefited enormously from these recent technologies, the costs are still high, prohibitively so in some cases for broader adoption, particularly in the after-market and enthusiast markets. This project aims to implement a Raspberry Pi-based, low-power (under one hundred watts), highly mobile computing cluster for a neural network. The computing cluster, built from off-the-shelf components, is more affordable and therefore makes wider adoption possible. The paper describes the design of the neural network, the Raspberry Pi-based cluster, and the applications the cluster will run. The neural network will use input data from sensors and cameras to project a live view of the road state as the user drives. The neural network will be trained to predict traffic behavior and generate warnings when potentially dangerous situations are predicted. The significant outcomes of this study are twofold: first, to implement and test the low-cost cluster, and second, to ascertain the effectiveness of the predictive AI implemented on the cluster.

Keywords: CS pedagogy, student research, cluster computing, machine learning

Procedia PDF Downloads 68
931 Urinalysis by Surface-Enhanced Raman Spectroscopy on Gold Nanoparticles for Different Diseases

Authors: Leonardo C. Pacheco-Londoño, Nataly J. Galan-Freyle, Lisandro Pacheco-Lugo, Antonio Acosta, Elkin Navarro, Gustavo Aroca-Martínez, Karin Rondón-Payares, Samuel P. Hernández-Rivera

Abstract:

At the Life Science Research Center (LSRC) of the University Simon Bolivar, one of the focuses is the diagnosis and prognosis of different diseases, and we have been implementing the use of gold nanoparticles (Au-NPs) for various biomedical applications. In this case, Au-NPs were used for Surface-Enhanced Raman Spectroscopy (SERS) in the diagnosis of different diseases, such as lupus nephritis (LN), hypertension (H), preeclampsia (PC), and others. The following methodology is proposed for the diagnosis of each disease. First, good signals of the different metabolites were obtained by SERS from a mixture of urine samples and Au-NPs. Second, PLS-DA models based on the SERS spectra were built to discriminate each disease and were able to differentiate between sick and healthy patients. Finally, the sensitivity and specificity of the different models were determined to be on the order of 0.9. In addition, a second methodology was developed using machine learning models on the combined data from the different diseases, and, as a result, a discriminant spectral map of the diseases was generated. These studies were made possible by joint research between two university research centers and two health-sector entities, and the patient samples were handled with ethical rigor and patient consent.
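
A minimal sketch of the PLS-DA step described above, using scikit-learn's PLSRegression with a class-label response; the spectra, labels, train/test split, and number of latent variables are illustrative assumptions, not the authors' data or settings.

```python
# Minimal PLS-DA sketch for discriminating disease classes from SERS spectra.
# Hypothetical data shapes; the authors' spectra and parameters are not given.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

X = np.random.rand(120, 1024)          # 120 urine SERS spectra, 1024 wavenumber bins
y = np.random.randint(0, 2, 120)       # 1 = diseased, 0 = healthy (binary example)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=8)    # number of latent variables: an assumption
pls.fit(X_tr, y_tr)

y_pred = (pls.predict(X_te).ravel() >= 0.5).astype(int)  # threshold the PLS score

tp = np.sum((y_pred == 1) & (y_te == 1))
tn = np.sum((y_pred == 0) & (y_te == 0))
sensitivity = tp / max(np.sum(y_te == 1), 1)
specificity = tn / max(np.sum(y_te == 0), 1)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```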

Keywords: SERS, Raman, PLS-DA, diseases

Procedia PDF Downloads 105
930 Mapping Iron Content in the Brain with Magnetic Resonance Imaging and Machine Learning

Authors: Gabrielle Robertson, Matthew Downs, Joseph Dagher

Abstract:

Iron deposition in the brain has been linked with a host of neurological disorders such as Alzheimer's, Parkinson's, and multiple sclerosis. While some treatment options exist, there are no objective measurement tools that allow for monitoring iron levels in the brain in vivo. An emerging Magnetic Resonance Imaging (MRI) method has recently been proposed to deduce iron concentration through quantitative measurement of magnetic susceptibility. This is a multi-step process that involves repeated modeling of physical processes via approximate numerical solutions. For example, the last two steps of this Quantitative Susceptibility Mapping (QSM) method involve (I) mapping magnetic field into magnetic susceptibility and (II) mapping magnetic susceptibility into iron concentration. Process I involves solving an ill-posed inverse problem using regularization via injection of prior belief. The end result of Process II depends strongly on the model used to describe the molecular content of each voxel (type of iron, water fraction, etc.). Due to these factors, the accuracy and repeatability of QSM have been an active area of research in the MRI and medical imaging community. This work aims to estimate iron concentration in the brain in a single step. A synthetic numerical model of the human head was created by automatically and manually segmenting the human head on a high-resolution grid (640x640x640, 0.4mm³), yielding detailed structures such as microvasculature and subcortical regions as well as bone, soft tissue, cerebrospinal fluid, sinuses, arteries, and eyes. Each segmented region was then assigned tissue properties such as relaxation rates, proton density, electromagnetic tissue properties, and iron concentration. These tissue property values were randomly selected from probability distribution functions derived from a thorough literature review. In addition to having unique tissue property values, different synthetic head realizations also possess unique structural geometry, created by morphing the boundary regions of different areas within normal physical constraints. This model of the human brain is then used to create synthetic MRI measurements. This is repeated thousands of times for different head shapes, volumes, tissue properties, and noise realizations. Collectively, this constitutes a training set that is similar to in vivo data but larger than datasets available from clinical measurements. A 3D convolutional U-Net neural network architecture was used to train data-driven deep learning models to solve for iron concentrations from raw MRI measurements. The performance was then tested both on synthetic data not used in training and on real in vivo data. Results showed that the model trained on synthetic MRI measurements is able to learn iron concentrations in areas of interest directly and more effectively than other existing QSM reconstruction methods. For comparison, models trained on random geometric shapes (as proposed in the Deep QSM method) are less effective than models trained on realistic synthetic head models. Such an accurate method for the quantitative measurement of iron deposits in the brain would be of significant value in clinical studies aiming to understand the role of iron in neurological disease.
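
A minimal PyTorch sketch of a 3D convolutional U-Net-style network of the kind described above, mapping an MRI volume to a voxelwise iron-concentration map; the channel counts, depth, and patch size are assumptions, since the abstract does not specify the exact architecture.

```python
# Tiny 3D U-Net-style network for voxelwise regression (one encoder level).
# Channel counts, depth, and input size are illustrative assumptions.
import torch
import torch.nn as nn

def block(c_in, c_out):
    return nn.Sequential(
        nn.Conv3d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv3d(c_out, c_out, kernel_size=3, padding=1), nn.ReLU(inplace=True))

class TinyUNet3D(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = block(1, 8)
        self.pool = nn.MaxPool3d(2)
        self.enc2 = block(8, 16)
        self.up = nn.ConvTranspose3d(16, 8, kernel_size=2, stride=2)
        self.dec1 = block(16, 8)                     # 8 upsampled + 8 skip channels
        self.head = nn.Conv3d(8, 1, kernel_size=1)   # iron-concentration map

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d1)

net = TinyUNet3D()
mri = torch.randn(2, 1, 64, 64, 64)     # batch of synthetic MRI patches
target = torch.randn(2, 1, 64, 64, 64)  # matching synthetic iron maps
loss = nn.functional.mse_loss(net(mri), target)
loss.backward()                          # gradient for one training step
```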

Keywords: magnetic resonance imaging, MRI, iron deposition, machine learning, quantitative susceptibility mapping

Procedia PDF Downloads 103
929 Comparative Study of Impact Strength and Fracture Morphology of Nano-CaCO3 and Nanoclay Reinforced HDPE Nanocomposites

Authors: Harun Sepet, Necmettin Tarakcioglu

Abstract:

The present study investigated the impact strength and fracture mechanism of nano-CaCO3 and nanoclay reinforced HDPE nanocomposites using the Charpy impact test. The nano-CaCO3 and nanoclay reinforced HDPE granules were prepared by the melt blending method using an industrial-scale compounder system consisting of an industrial Banbury mixer, a single-screw extruder, and a granule cutter. The granules were injection-molded into plates, and impact samples were then cut from the nanocomposite plates using a punching die. The impact experiments showed that the nano-CaCO3 and nanoclay reinforced HDPE nanocomposites have lower impact energy levels than neat HDPE. The impact strength of HDPE also decreased further with the addition of nanoclay compared to nano-CaCO3. The fracture surfaces produced by impact were examined by SEM. The fracture surface morphology was found to change as the nano-CaCO3 and nanoclay ratios increase. These fracture surface changes were examined to determine the fracture mechanism of the nano-CaCO3 and nanoclay reinforced HDPE nanocomposites.

Keywords: Charpy, HDPE, industrial scale, nano-CaCO3, nanoclay, nanocomposite

Procedia PDF Downloads 384
928 Development of a Highly Flexible, Sensitive and Stretchable Polymer Nanocomposite for Strain Sensing

Authors: Shaghayegh Shajari, Mehdi Mahmoodi, Mahmood Rajabian, Uttandaraman Sundararaj, Les J. Sudak

Abstract:

Although several strain sensors based on carbon nanotubes (CNTs) have been reported, the stretchability and sensitivity of these sensors remain a challenge. Highly stretchable and sensitive strain sensors are in great demand for human motion monitoring and human-machine interfaces. This paper reports the fabrication and characterization of a new type of strain sensor based on a stretchable fluoropolymer/CNT nanocomposite system made via a melt-mixing technique. Electrical and mechanical characterizations were performed. The results showed that this nanocomposite sensor has high stretchability, up to 280% strain, at an optimum level of filler concentration. The piezoresistive properties and the strain-sensing mechanism of the sensor were investigated using Electrochemical Impedance Spectroscopy (EIS). High sensitivity was obtained (gauge factor as large as 12000 under 120% applied strain), in particular at concentrations above the percolation threshold. Due to the tunneling effect, non-linear piezoresistivity was observed at high CNT loadings. The nanocomposites, with good conductivity and low weight, could be a promising candidate for strain-sensing applications.
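
As a quick sanity check on the reported figure, the gauge factor definition can be worked through numerically; the resistance values in this sketch are invented for illustration and are not measurements from the paper.

```python
# Gauge factor arithmetic behind the reported sensitivity: GF = (dR/R0) / strain.
# The resistance values below are made-up illustrations, not measured data.
R0 = 1.0e3                 # unstrained resistance, ohms (assumed)
R = 1.5e7                  # resistance at 120% strain (assumed)
strain = 1.2               # 120% applied strain
gf = ((R - R0) / R0) / strain
print(f"gauge factor ~= {gf:.0f}")   # on the order of the ~12000 reported
```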

Keywords: carbon nanotubes, fluoropolymer, piezoresistive, strain sensor

Procedia PDF Downloads 275
927 A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection Using Medical Infrared Thermography from an Image Processing Viewpoint

Authors: Mrinal Kanti Bhowmik, Shawli Bardhan Jr., Debotosh Bhattacharjee

Abstract:

Human skin, being at a temperature above absolute zero, emits infrared radiation related to body temperature. Differences in the infrared radiation from the skin surface reflect abnormalities present in the human body. Accordingly, detecting and forecasting temperature variations of the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. MIT is a non-invasive imaging technique that records and monitors the temperature distribution of the body by receiving the infrared radiation emitted from the skin and representing it as a thermogram. The intensity of the thermogram measures inflammation at the skin surface related to pain in the human body. Analysis of thermograms provides automated detection of anomalies associated with suspicious pain regions through several image processing steps. This paper presents a rigorous, study-based survey of the processing and analysis of thermograms, based on previous works published in the area of infrared thermal imaging for detecting inflammatory pain diseases such as arthritis, spondylosis, and shoulder impingement. The study also explores the performance analysis of thermogram processing, together with thermogram acquisition protocols, thermography camera specifications, and the types of pain detected by thermography, in a summarized tabular format. The tabular format provides a clear structural view of the past works. The major contribution of the paper is a new thermogram acquisition standard for inflammatory pain detection in the human body, intended to enhance the performance rate. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis. The survey of previous research highlights that intensity-distribution-based comparison of comparable, symmetric regions of interest, together with their statistical analysis, yields adequate results in identifying and detecting physiological disorders related to inflammatory diseases.
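
The ROI comparison that the survey identifies as effective can be sketched in a few lines: compare the temperature distribution of a suspect region against its contralateral (symmetric) counterpart and test the difference statistically. The thermogram, ROI coordinates, and choice of test here are illustrative assumptions.

```python
# Sketch of a symmetric-ROI statistical comparison on a thermogram.
# The image and ROI positions are dummies, not acquired data.
import numpy as np
from scipy import stats

thermogram = np.random.normal(33.0, 0.5, (480, 640))  # per-pixel temperature, deg C
thermogram[200:260, 400:460] += 1.2                    # simulated warmer (inflamed) patch

roi_right = thermogram[200:260, 400:460].ravel()       # suspect ROI
roi_left = thermogram[200:260, 180:240].ravel()        # mirrored healthy ROI

t_stat, p_val = stats.ttest_ind(roi_right, roi_left, equal_var=False)
print(f"dT = {roi_right.mean() - roi_left.mean():.2f} C, p = {p_val:.3g}")
```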

Keywords: acquisition protocol, inflammatory pain detection, medical infrared thermography (MIT), statistical analysis

Procedia PDF Downloads 323
926 Feature Extraction and Impact Analysis for Solid Mechanics Using Supervised Finite Element Analysis

Authors: Edward Schwalb, Matthias Dehmer, Michael Schlenkrich, Farzaneh Taslimi, Ketron Mitchell-Wynne, Horen Kuecuekyan

Abstract:

We present a generalized feature extraction approach for supporting Machine Learning (ML) algorithms that perform tasks similar to Finite Element Analysis (FEA). We report results for estimating the Head Injury Criterion (HIC) of vehicle engine compartments across various impact scenarios. Our experiments demonstrate that models learned using features derived with a simple discretization approach provide a reasonable approximation of a full simulation. We observe that Decision Trees can be as effective as Neural Networks for the HIC task. The simplicity and performance of the learned Decision Trees offer multiple orders of magnitude improvement in speed and cost over full simulation, in exchange for a reasonable approximation. When used as a complement to full simulation, the approach enables rapid approximate feedback to engineering teams before submission for full analysis. The approach produces mesh-independent features and is further agnostic of the assembly structure.
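
A minimal sketch of the surrogate-modeling idea, assuming (hypothetically) that each design is discretized into a fixed-length occupancy feature vector and that HIC is the regression target; the data are synthetic, not the paper's simulation outputs.

```python
# Decision-tree surrogate for FEA outputs: discretized geometry features in,
# HIC estimate out. Feature layout and data are hypothetical stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((2000, 64))        # 64 cell-occupancy features per discretized design
y = 500 + 400 * X[:, :8].sum(axis=1) + 20 * rng.standard_normal(2000)  # synthetic HIC

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeRegressor(max_depth=8).fit(X_tr, y_tr)
print("R^2 on held-out designs:", round(tree.score(X_te, y_te), 3))
# Inference takes microseconds per design versus hours for a full crash simulation.
```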

Keywords: mechanical design validation, FEA, supervised decision tree, convolutional neural network

Procedia PDF Downloads 110
925 Deformation Behavior of Virgin and Polypropylene Modified Bituminous Mixture

Authors: Noor Zainab Habib, Ibrahim Kamaruddin, Madzlan Napiah

Abstract:

This paper presents part of a research project conducted to investigate the creep behavior of well-graded bituminous concrete mixtures using the dynamic creep test. The samples were prepared from an unmodified control mix and a polypropylene-modified bituminous mix. The unmodified (control) mix was prepared with 80/100 grade bitumen, while the polypropylene-modified mix was prepared using polypropylene (PP) polymer as a modifier, blended with 80/100 Pen bitumen. The concentration of polymer in the blend was kept at 1%, 2%, and 3% by weight of bitumen content. For the dynamic creep test, Marshall specimens were prepared at optimum bitumen content and then tested using an IPC Global Universal Testing Machine (UTM) in order to investigate the creep stiffness of both the modified and control mixes. From the results obtained, it was found that the 1% and 2% PP-modified bituminous mixes offer better results than the control and 3% PP-modified mix samples. The results confirm the findings of the empirical and viscosity tests, which indicate that polymer modification induces a stiffening effect in the binder. The enhanced viscous component of the binder was considered responsible for this change, which ultimately enhances the mechanical strength of the modified bituminous mixes.

Keywords: polymer modified bitumen, stiffness, creep, viscosity

Procedia PDF Downloads 396
924 Human Action Recognition Using Wavelets of Derived Beta Distributions

Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel

Abstract:

In the framework of enhancing human-machine interaction systems, this paper focuses on human behavior analysis and action recognition. Human behavior is characterized by a duality of actions and reactions (movements, psychological changes, verbal and emotional expression). Much information is hidden in gestures, sudden motions, point trajectories, and speeds, and many research works have framed this as an information retrieval problem. In our work, we focus on motion extraction, tracking, and action recognition using wavelet network approaches. Our contribution uses foreground (human) extraction by Gaussian Mixture Model (GMM) background subtraction and body movement tracking through trajectory models of motion constructed from a Kalman filter. These models allow noise removal by extracting the main motion features, and they constitute a stable basis for identifying the evolution of human activity. Each modality is used to recognize a human action using a wavelets-of-derived-beta-distributions approach. The proposed approach has been validated successfully on subsets of the KTH and UCF Sports datasets.
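
A sketch of the motion-extraction front end under the approach described, using OpenCV's GMM background subtractor and a constant-velocity Kalman filter; the video file name and parameter choices are placeholders, and the wavelet-network classifier itself is not reproduced.

```python
# GMM background subtraction to find the moving person, and a constant-velocity
# Kalman filter to smooth the foreground centroid trajectory (OpenCV).
import cv2
import numpy as np

cap = cv2.VideoCapture("action_clip.avi")       # placeholder file name
bg = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

kf = cv2.KalmanFilter(4, 2)                     # state: x, y, vx, vy; measurement: x, y
kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)

trajectory = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                      # foreground (person) mask
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        kf.predict()
        est = kf.correct(np.array([[cx], [cy]], np.float32))
        trajectory.append((float(est[0, 0]), float(est[1, 0])))
# `trajectory` is the denoised motion feature fed to the wavelet-network classifier.
```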

Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet

Procedia PDF Downloads 386
923 A Unified Approach for Digital Forensics Analysis

Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles

Abstract:

Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the digital footprints that result, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT), databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and a good deal of data sources have little to no specialist forensic tooling. Increasingly, it also becomes essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely fashion without having to trawl through the data and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to work cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies which forensic analyses could be used. For example, in a child abduction, an investigation team might have evidence from a range of sources, including computing devices (mobile phone, PC), CCTV (potentially a large number of cameras), ISP records, and mobile network cell tower data, in addition to third-party databases such as the national sex offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.

Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool

Procedia PDF Downloads 165
922 Hyperspectral Image Classification Using Tree Search Algorithm

Authors: Shreya Pare, Parvin Akhter

Abstract:

Remote sensing image classification becomes a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to take into account the spatial structure of an image. Therefore, to improve classification performance, spatial information can be integrated into the classification process. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to perform the segmentation of hyperspectral images. The fuzzy parameters of the MFE function have been optimized using a new meta-heuristic based on the tree search algorithm. The segmented image is classified by a large margin distribution machine (LDM) classifier. Experimental results are shown on a hyperspectral image dataset. The experimental outputs indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images when compared to state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, and is thus well suited to the classification of images with large spatial structures.
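
For orientation, a generic bi-level fuzzy-entropy thresholding criterion can be sketched as below; this is a simplified stand-in (exhaustive search, a linear membership ramp), not the authors' modified fuzzy entropy function or their tree-search optimizer.

```python
# Generic fuzzy-entropy thresholding sketch (bi-level for brevity). The ramp
# width and search range are assumptions; the paper's MFE function and
# tree-search meta-heuristic are not reproduced here.
import numpy as np

def fuzzy_entropy(hist, t):
    """Entropy of a two-class fuzzy partition with a linear membership ramp at t."""
    levels = np.arange(256)
    mu_dark = np.clip((t + 32 - levels) / 64.0, 0, 1)   # ramp width 64: an assumption
    p = hist / hist.sum()
    p_dark = np.sum(p * mu_dark)
    p_bright = 1.0 - p_dark
    eps = 1e-12
    return -(p_dark * np.log(p_dark + eps) + p_bright * np.log(p_bright + eps))

img = np.random.randint(0, 256, (64, 64))               # stand-in for one image band
hist = np.bincount(img.ravel(), minlength=256).astype(float)
best_t = max(range(32, 224), key=lambda t: fuzzy_entropy(hist, t))
segments = (img > best_t).astype(np.uint8)              # segmentation map for the classifier stage
```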

Keywords: classification, hyperspectral images, large margin distribution machine, modified fuzzy entropy function, multilevel thresholding, tree search algorithm

Procedia PDF Downloads 142
921 System and Method for Providing Web-Based Remote Application Service

Authors: Shuen-Tai Wang, Yu-Ching Lin, Hsi-Ya Chang

Abstract:

With the development of virtualization technologies, a new type of service, the cloud computing service, has emerged. Cloud users often encounter the problem of how to use the virtualized platform easily over the web without plug-ins or the installation of special software. The objective of this paper is to develop a system and a method that enable process interfacing within an automation scenario for accessing remote applications using a web browser. To meet this challenge, we have devised a web-based interface that shifts GUI applications from the traditional local environment to the cloud platform, where they are stored on remote virtual machines. We designed the web interface following the cloud virtualization concept, seeking to enable communication and collaboration among users. We describe the design requirements of the remote application technology and present implementation details of the web application and its associated components. We conclude that this effort has the potential to provide an elastic and resilient environment for several application services. Users are no longer burdened with system maintenance, and the overall cost of software licenses and hardware is reduced. Moreover, this remote application service represents a next step toward the mobile workplace, letting users access remote applications from virtually anywhere.

Keywords: virtualization technology, virtualized platform, web interface, remote application

Procedia PDF Downloads 255
920 Effect of Coffee Grounds on Physical and Heating Value Properties of Sugarcane Bagasse Pellets

Authors: K. Rattawan, W. Intagun, W. Kanoksilapatham

Abstract:

The objective of this research is to study the effect of coffee grounds on the physical and heating value properties of sugarcane bagasse pellets. Coffee grounds were tested as an additive in the pelletizing process of bagasse pellets. Pelletizing was performed using a flat-die pellet mill. The moisture content of the raw materials was controlled at 10-13%, and the die temperature during the process was 75-80 °C. The physical characteristics (bulk density and durability) of the bagasse pellets and of pellets with 1-5% coffee grounds were determined following the standard assigned by the Pellet Fuel Institute (PFI). The results revealed increasing values of 648±3.4, 659±3.1, 679±3.3, and 685±3.1 kg/m3 for pellet bulk density, and 98.7±0.11, 99.2±0.26, 99.3±0.19, and 99.4±0.07% for pellet durability, respectively. In addition, the heating values of the coffee-ground-supplemented pellets (15.9±1.16, 17.0±1.23, and 18.8±1.34 MJ/kg) were improved compared to the non-supplemented control (14.9±1.14 MJ/kg). The results indicate that both the bulk density and the durability of the bagasse pellets increase with an increasing proportion of the coffee ground additive.

Keywords: bagasse, coffee grounds, pelletizing, heating value, sugarcane bagasse

Procedia PDF Downloads 145
919 Development of a 3D Model of Real Estate Properties in Fort Bonifacio, Taguig City, Philippines Using Geographic Information Systems

Authors: Lyka Selene Magnayi, Marcos Vinas, Roseanne Ramos

Abstract:

As the real estate industry continues to grow in the Philippines, Geographic Information Systems (GIS) provide advantages in generating spatial databases for the efficient delivery of information and services. The real estate sector not only provides qualitative data about real estate properties but also utilizes various spatial aspects of these properties for applications such as hazard mapping and assessment. In this study, a three-dimensional (3D) model and a spatial database of real estate properties in Fort Bonifacio, Taguig City are developed using GIS and SketchUp. Spatial datasets include political boundaries, buildings, the road network, a digital terrain model (DTM) derived from Interferometric Synthetic Aperture Radar (IFSAR) imagery, Google Earth satellite imagery, and hazard maps. Multiple model layers were created based on property listings from a partner real estate company, including existing and future property buildings. Actual building dimensions, building facades, and building floor plans are incorporated in these 3D models for geovisualization. Hazard model layers are determined through spatial overlays, and different hazard scenarios are also presented in the models. Animated maps and walkthrough videos were created for company presentation and evaluation. Model evaluation was conducted through client surveys requiring scores for the appropriateness, information content, and design of the 3D models. Survey results show very satisfactory ratings, with the highest average evaluation score being 9.21 out of 10. The output maps and videos obtained passing rates based on the criteria and standards set by the intended users at the partner real estate company. The methodologies presented in this study were found useful and offer remarkable advantages to the real estate industry. This work may be extended to automated mapping and the creation of online spatial databases for better storage of and access to real property listings, and to an interactive platform using web-based GIS.

Keywords: geovisualization, geographic information systems, GIS, real estate, spatial database, three-dimensional model

Procedia PDF Downloads 139
918 Experimental Investigation of Damaged Reinforced Concrete Beams Repaired with Carbon Fibre Reinforced Polymer (CFRP) Strip under Impact Loading

Authors: M. Al-Farttoosi, M. Y. Rafiq, J. Summerscales, C. Williams

Abstract:

Many buildings and bridges are damaged by impact loading, explosions, terrorist attacks, and wars. Most damaged structural members, such as beams, columns, and slabs, have not totally failed and can be repaired. Nowadays, carbon fibre reinforced polymer (CFRP) is widely used for strengthening and retrofitting structural members. CFRP can restore the load-carrying capacity of damaged structural members and make them serviceable again. An experimental investigation was conducted to investigate the impact behaviour of damaged beams repaired with CFRP. The tested beams had different degrees of damage, and the near-surface mounted (NSM) technique was used to install the CFRP. A heavy drop-weight impact test machine was used to conduct the experimental work. The study investigated the impact strength, stiffness, cracking, and deflection of the CFRP-repaired beams. The results show that CFRP significantly increased the impact resistance of the damaged beams, increased their stiffness, and reduced their deflection. The results also showed that the NSM technique is more effective in repairing beams and preventing debonding of the CFRP.

Keywords: damaged, concrete, impact, repaired

Procedia PDF Downloads 317
917 Sorting Maize Haploids from Hybrids Using Single-Kernel Near-Infrared Spectroscopy

Authors: Paul R Armstrong

Abstract:

Doubled haploids (DHs) have become an important breeding tool for creating maize inbred lines, although several bottlenecks in the DH production process limit wider development, application, and adoption of the technique. DH kernels are typically sorted manually and represent about 10% of the seeds in a much larger pool in which the remaining 90% are hybrid siblings. This places time constraints on DH production, and manual sorting is often inaccurate. Automated sorting based on the chemical composition of the kernel can be effective, but devices, namely NMR, have not achieved the sorting speed needed to be a cost-effective replacement for manual sorting. This study evaluated a single-kernel near-infrared reflectance spectroscopy (skNIR) platform to accurately identify DH kernels based on oil content. The skNIR platform is a higher-throughput device, approximately 3 seeds/s, that uses spectra to predict the oil content of each kernel from maize crosses intentionally developed to create larger-than-normal oil differences, 1.5%-2%, between DH and hybrid kernels. Spectra from the skNIR were used to construct partial least squares (PLS) regression models for oil and for a categorical reference value of 1 (DH kernel) or 2 (hybrid kernel), which were then used to sort several crosses to evaluate performance. Two approaches were used for sorting. The first used a general PLS model developed from all crosses to predict oil content, which was then used for sorting each induction cross. The second was the development of a specific model from a single induction cross, in which approximately fifty DH and one hundred hybrid kernels were used. This second approach used the categorical reference value of 1 or 2, instead of oil content, for the PLS model, and the kernels selected for the calibration set were manually referenced based on traditional commercial methods using the coloration of the tip cap and germ areas. The generalized PLS oil model statistics were R2 = 0.94 and RMSE = 0.93% for kernels spanning an oil content of 2.7% to 19.3%. Sorting with this model extracted 55% to 85% of the haploid kernels from the four induction crosses. The second method, generating a model for each cross, yielded model statistics ranging from R2 = 0.96 to 0.98 and RMSE from 0.08 to 0.10. Sorting in this case resulted in 100% correct classification but required cross-specific models. In summary, the first, generalized oil-model method could be used to sort a significant number of kernels from a kernel pool but did not approach the accuracy of developing a sorting model from a single cross. The penalty of the second method is that a PLS model must be developed for each individual cross. In conclusion, both methods could find useful application in the sorting of DH from hybrid kernels.
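
A sketch of the cross-specific sorting approach, assuming hypothetical spectra: fit a PLS model to the categorical 1/2 reference values and threshold the predicted score at their midpoint.

```python
# PLS model fit on a categorical reference (1 = haploid, 2 = hybrid) from
# single-kernel NIR spectra, then thresholded to sort. Spectra are synthetic
# stand-ins; the component count is an assumption.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X_cal = rng.random((150, 300))              # ~50 DH + ~100 hybrid calibration spectra
y_cal = np.array([1] * 50 + [2] * 100)      # manual reference from tip-cap/germ color

pls = PLSRegression(n_components=10).fit(X_cal, y_cal.astype(float))

X_new = rng.random((1000, 300))             # unsorted kernel pool from the same cross
scores = pls.predict(X_new).ravel()
is_haploid = scores < 1.5                   # midpoint of the 1/2 reference values
print(f"kept {is_haploid.sum()} kernels as putative haploids")
```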

Keywords: NIR, haploids, maize, sorting

Procedia PDF Downloads 281
916 FEM and Experimental Modal Analysis of Computer Mount

Authors: Vishwajit Ghatge, David Looper

Abstract:

Over the last few decades, oilfield service rolling equipment has significantly increased in weight, primarily because of emissions regulations, which require larger and heavier engines, larger cooling systems, and, in some cases, emissions after-treatment systems. Larger engines cause more vibration and shock loading, leading to failure of electronics and control systems. If the vibrating frequency of the engine matches the system frequency, high resonance is observed in structural parts and mounts. One such existing automated control equipment system, comprising wire rope mounts used for mounting computers, was designed approximately 12 years ago. It includes an industrial-grade computer to control the system operation. The original computer had a smaller, lighter enclosure. After a few years, a newer computer version was introduced, which was 10 lbm heavier. Some failures of internal computer parts have been documented for cases in which the old mounts were used. Because of the added weight, there is a possibility of the two brackets impacting each other under off-road conditions, causing a high shock input to the computer parts. This added failure mode requires validating the existing mount design against the new, heavier computer. This paper discusses the modal finite element method (FEM) analysis and experimental modal analysis conducted to study the effects of vibration on the wire rope mounts and the computer. The existing mount was modeled in ANSYS software, and the resulting mode shapes and frequencies were obtained. The experimental modal analysis was conducted, and the actual frequency responses were observed and recorded. The results clearly revealed that, at the resonance frequency, the brackets collide and potentially damage the computer parts. To solve this issue, spring mounts of different stiffnesses were modeled in ANSYS software, and the resonant frequency was determined. Increasing the stiffness of the system shifted the resonant frequency away from the frequency window in which the engine exhibits heavy vibration. After multiple iterations in ANSYS software, the stiffness of the spring mount was finalized and then validated experimentally.
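
The retuning logic can be illustrated with the single-degree-of-freedom natural-frequency relation f_n = sqrt(k/m)/(2*pi): increasing the spring stiffness k raises the resonance. The mass and stiffness values below are illustrative, not the actual mount or computer properties.

```python
# Back-of-envelope check behind the mount retuning: stiffer springs push the
# resonance upward, away from the engine's excitation band. Values are assumed.
import math

m = 20.0 / 2.2                  # ~20 lbm computer converted to kg (assumed mass)
for k in (5e3, 2e4, 8e4):       # candidate spring stiffnesses, N/m (assumed)
    f_n = math.sqrt(k / m) / (2 * math.pi)
    print(f"k = {k:9.0f} N/m  ->  f_n = {f_n:5.1f} Hz")
```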

Keywords: experimental modal analysis, FEM Modal Analysis, frequency, modal analysis, resonance, vibration

Procedia PDF Downloads 303
915 Blend of Polyamide 6 with Polybutylene Terephthalate Compatibilized with Epoxidized Natural Rubber (ENR-25) and N Butyl Acrylate Glycidyl Methacrylate Ethylene (EBa-GMA)

Authors: Ramita Vongrat, Pornsri Sapsrithong, Manit Nithitanakul

Abstract:

In this work, blends of polyamide 6 (PA6) and polybutylene terephthalate (PBT) were successfully prepared. The effect of epoxidized natural rubber (ENR-25) and n-butyl acrylate glycidyl methacrylate ethylene (EBa-GMA) as compatibilizers on the properties of PA6/PBT blends was investigated by varying the amounts of ENR-25 and EBa-GMA, i.e., 0, 0.1, 0.5, 5, and 10 phr. All blends were prepared and shaped using a twin-screw extruder at 230 °C and an injection molding machine, respectively. All test specimens were characterized for phase morphology, impact strength, tensile and flexural properties, and hardness. The results showed that the phase morphology of the PA6/PBT blend without compatibilizer was incompatible, which could be attributed to poor interfacial adhesion between the two polymers. SEM micrographs showed that the addition of ENR-25 and EBa-GMA improved the compatibility of the PA6/PBT blends. With the addition of ENR-25 as a compatibilizer, uniformity and the maximum reduction of the dispersed-phase size were observed. Additionally, the results indicate that, as the amounts of ENR-25 and EBa-GMA increased, the mechanical properties, including stress at peak, tensile modulus, and Izod impact strength, also improved.

Keywords: EBa-GMA, epoxidized natural rubber-25, polyamide 6, polybutylene terephthalate

Procedia PDF Downloads 146
914 Synthetic Aperture Radar Remote Sensing Classification Using the Bag of Visual Words Model to Land Cover Studies

Authors: Reza Mohammadi, Mahmod R. Sahebi, Mehrnoosh Omati, Milad Vahidi

Abstract:

Classification of high-resolution polarimetric Synthetic Aperture Radar (PolSAR) images plays an important role in land cover and land use management. Recently, classification algorithms based on the Bag of Visual Words (BOVW) model have attracted significant interest among scholars and researchers in and beyond the field of remote sensing. In this paper, the BOVW model with pixel-based low-level features has been implemented to classify a subset of a San Francisco Bay PolSAR image acquired by RADARSAT-2 in C-band. We used a segment-based decision-making strategy and compared the result with that of a traditional Support Vector Machine (SVM) classifier. The 90.95% overall classification accuracy of the proposed algorithm shows that it is comparable with state-of-the-art methods. In addition to the increase in classification accuracy, the proposed method decreased the undesirable speckle effect of SAR images.
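
A compact sketch of the BOVW pipeline described above: cluster pooled low-level descriptors into a visual vocabulary, encode each segment as a normalized word histogram, and classify the histograms. The descriptors, vocabulary size, and labels are random placeholders for the PolSAR features used in the paper.

```python
# BOVW pipeline sketch: k-means vocabulary -> per-segment word histograms -> SVM.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(2)
descriptors = rng.random((5000, 9))         # low-level features pooled over all segments

k = 64                                      # vocabulary size: a tunable assumption
vocab = KMeans(n_clusters=k, n_init=4, random_state=0).fit(descriptors)

def bovw_histogram(segment_descriptors):
    words = vocab.predict(segment_descriptors)
    h = np.bincount(words, minlength=k).astype(float)
    return h / h.sum()                      # normalized word histogram per segment

segments = [rng.random((rng.integers(50, 200), 9)) for _ in range(300)]
X = np.array([bovw_histogram(s) for s in segments])
y = rng.integers(0, 4, 300)                 # land-cover class per segment (placeholder)

clf = SVC(kernel="rbf").fit(X[:200], y[:200])
print("held-out accuracy:", clf.score(X[200:], y[200:]))
```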

Keywords: Bag of Visual Words (BOVW), classification, feature extraction, land cover management, Polarimetric Synthetic Aperture Radar (PolSAR)

Procedia PDF Downloads 181
913 Application of Rapid Prototyping to Create Additive Prototype Using Computer System

Authors: Meftah O. Bashir, Fatma A. Karkory

Abstract:

Rapid prototyping is a new group of manufacturing processes that allows the fabrication of physical parts of any complexity using a layer-by-layer deposition technique driven directly from a computer system. The rapid prototyping process greatly reduces the time and cost necessary to bring a new product to market. The prototypes made by these systems are used in a range of industrial applications, including design evaluation, verification, and testing, and as patterns for casting processes. These processes employ a variety of materials and mechanisms to build up the layers that form the part. The present work was to build an FDM prototyping machine that could control the X-Y motion and material deposition to generate two-dimensional and three-dimensional complex shapes. This study focused on the deposition of wax, and the properties of the wax materials were investigated in order to enable better control of the FDM process. The work also looks at the integration of a computer-controlled electro-mechanical system with the traditional FDM additive prototyping process. The characteristics of the wax were analysed in order to optimize the model production process; these included the wax phase-change temperature, wax viscosity, and wax droplet shape during processing.

Keywords: rapid prototyping, wax, manufacturing processes, shape

Procedia PDF Downloads 440
912 Mechanism of Sinkhole Development on Water-Bearing Soft Ground Tunneling

Authors: H. J. Kim, K. H. Kim, N. H. Park, K. T. Nam, Y. H. Jung, T. H. Kim, J. H. Shin

Abstract:

Underground excavations in urban areas can cause various geotechnical problems, such as ground loss and lowering of the groundwater level. When the ground loss becomes uncontrollably large, sinkholes can develop up to the ground surface. A sinkhole is commonly known as a natural phenomenon associated with limestone areas; however, sinkholes in urban areas due to pressurized sewers and/or tunneling are also frequently reported. In this study, the mechanism of a sinkhole that developed at site 'A', where tunneling work was underway, is investigated. The sinkhole occurred in sand strata with a high groundwater level during the excavation of a tunnel 3.6 m in diameter. The sinkhole progressed in two steps. The first step began with a local failure around the tunnel face followed by a massive inflow of groundwater; the second step was triggered by the opening of the TBM (Tunnel Boring Machine) chamber, which led to progressive general failure. The possibility of the sinkhole was evaluated using the Limit Equilibrium Method (LEM), and the critical height was evaluated using an empirical stability chart. It was found that the lowering of the face pressure and the inflow of groundwater into the tunnel face were the main causes of the sinkhole.

Keywords: limit equilibrium method, sinkhole, stability chart, tunneling

Procedia PDF Downloads 219
911 Ensemble of Deep CNN Architecture for Classifying the Source and Quality of Teff Cereal

Authors: Belayneh Matebie, Michael Melese

Abstract:

This study addresses the challenges of classifying and ensuring the quality of Eragrostis teff, a small, round grain that is the smallest of the cereal grains. Traditional classification methods are challenging to apply because of its small size and the similarity of its environmental characteristics. To overcome this, the study employs a machine learning approach to develop a source and quality classification system for teff cereal. Data were collected from various production areas in the Amhara region, considering two quality types of cereal (high and low quality) across eight classes. A total of 5,920 images were collected, with 740 images for each class. Image enhancement techniques, including scaling, data augmentation, histogram equalization, and noise removal, were applied to preprocess the data. A Convolutional Neural Network (CNN) was then used to extract relevant features and reduce dimensionality. The dataset was split into 80% for training and 20% for testing. Different classifiers, including FVGG16, FINCV3, QSCTC, EMQSCTC, SVM, and RF, were employed for classification, achieving accuracy rates ranging from 86.91% to 97.72%. An ensemble of FVGG16, FINCV3, and QSCTC using the max-voting approach outperforms the individual algorithms.
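
A sketch of the max-voting step, assuming each trained model has already produced a class prediction per test image; the predictions here are random placeholders, and only the voting mechanics are shown.

```python
# Max-voting (hard-voting) ensemble sketch: each model votes a class per image
# and the majority label wins. The three model names mirror the abstract.
import numpy as np
from scipy import stats

n_images, n_classes = 1184, 8                       # 20% test split of 5,920 images
preds = {
    "FVGG16": np.random.randint(0, n_classes, n_images),
    "FINCV3": np.random.randint(0, n_classes, n_images),
    "QSCTC": np.random.randint(0, n_classes, n_images),
}

stacked = np.stack(list(preds.values()))            # shape: (3 models, n_images)
ensemble = stats.mode(stacked, axis=0, keepdims=False).mode
# `ensemble[i]` is the majority class for image i; ties fall to the smallest label.
```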

Keywords: Teff, ensemble learning, max-voting, CNN, SVM, RF

Procedia PDF Downloads 13
910 Deep Learning and Accurate Performance Measure Processes for Cyber Attack Detection among Web Logs

Authors: Noureddine Mohtaram, Jeremy Patrix, Jerome Verny

Abstract:

As an enormous number of online services have been developed into web applications, security problems based on web applications are becoming more serious. Most intrusion detection systems rely on each individual request, rather than on user behavior, to find cyber-attacks, and these systems can only protect web applications against known vulnerabilities rather than certain zero-day attacks. In order to detect new attacks, we analyze the HTTP traffic of web servers and divide requests into two categories: normal and malicious. At the same time, the quality of the results obtained by deep learning (DL) in various areas of big data provides strong motivation to apply it to cybersecurity. Deep learning for attack detection has the potential to be a robust tool against everything from small transformations of known attacks to entirely new attacks, owing to its capability to extract higher-level features. This research takes this new approach, applying deep learning to cybersecurity, to classify these two categories in order to eliminate attacks and protect the web servers of the defense sector, which encounter web traffic different from that of other sectors (such as e-commerce or web apps). The results show that this machine learning method achieves a higher accuracy rate and a lower false alarm rate.
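
A toy sketch of request-level classification in this spirit, using character n-gram features over raw request lines and a small scikit-learn MLP standing in for the paper's deep model; the log lines and labels are invented examples.

```python
# Character n-gram features from raw HTTP request lines feeding a small
# neural network. Log lines are invented; an MLP stands in for the deep model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

requests = [
    "GET /index.html HTTP/1.1",
    "GET /search?q=books HTTP/1.1",
    "GET /item?id=1%27%20OR%20%271%27=%271 HTTP/1.1",   # SQL-injection style
    "GET /page?f=../../etc/passwd HTTP/1.1",            # path-traversal style
]
labels = [0, 0, 1, 1]                                   # 0 = normal, 1 = malicious

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),  # character n-grams
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
model.fit(requests, labels)
print(model.predict(["GET /item?id=5%20UNION%20SELECT%20*%20FROM%20users HTTP/1.1"]))
```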

Keywords: anomaly detection, HTTP protocol, logs, cyber attack, deep learning

Procedia PDF Downloads 182
909 Application of Data Mining Techniques for Tourism Knowledge Discovery

Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee

Abstract:

Five implementations of three data mining classification techniques were experimentally applied to extract important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery from data process was used as the process model, and 10-fold cross-validation was used for testing. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models for the selected algorithms were built under different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improved the efficiency of all algorithms, with the Artificial Neural Network (multilayer perceptron) showing the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge and insights that would otherwise not be discovered by simple statistical analysis, despite the mediocre accuracy achieved by the classification algorithms.
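
A sketch of the evaluation protocol described above (10-fold cross-validation, with and without information-gain-style attribute selection), using scikit-learn's mutual information selector as a stand-in and synthetic data in place of the tourism dataset.

```python
# 10-fold cross-validation of Random Forest, before and after mutual-information
# (information-gain style) attribute selection. Data is a synthetic placeholder.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.random((400, 30))                  # 30 preprocessed tourism attributes (assumed)
y = rng.integers(0, 2, 400)                # class (target) attribute

rf = RandomForestClassifier(n_estimators=100, random_state=0)
full = cross_val_score(rf, X, y, cv=10).mean()

selected = make_pipeline(SelectKBest(mutual_info_classif, k=10), rf)
reduced = cross_val_score(selected, X, y, cv=10).mean()

print(f"all attributes: {full:.2f}, top-10 attributes: {reduced:.2f}")
```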

Keywords: classification algorithms, data mining, knowledge discovery, tourism

Procedia PDF Downloads 271
908 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory

Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi

Abstract:

One of the global combinatorial optimization problems in machine learning is feature selection. It is concerned with removing irrelevant, noisy, and redundant data while keeping the original meaning of the data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs) that combine a genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy-controlled great deluge algorithm, respectively, to identify a good balance between local search and genetic search. In order to verify the proposed approaches, numerical experiments were carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.
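
A skeleton of the memetic idea, assuming a generic wrapper fitness (k-NN accuracy with a small-subset reward) rather than the rough-set dependency measure, and a plain record-to-record-travel local search without the fuzzy control described in the paper.

```python
# Memetic attribute reduction: GA over binary attribute masks with a
# record-to-record-travel local refinement. Fitness is a generic stand-in.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
X = rng.random((200, 20))
y = rng.integers(0, 2, 200)

def fitness(mask):
    if not mask.any():
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()
    return acc - 0.01 * mask.sum()          # reward small reducts

def local_search(mask, deviation=0.02, steps=10):
    """Record-to-record travel: accept bit flips not much worse than the record."""
    best, record = mask.copy(), fitness(mask)
    for _ in range(steps):
        cand = best.copy()
        cand[rng.integers(len(cand))] ^= True
        f = fitness(cand)
        if f > record - deviation:
            best = cand
            record = max(record, f)
    return best

pop = rng.random((8, 20)) < 0.5             # initial population of attribute masks
for _ in range(10):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-4:]]  # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(4)], parents[rng.integers(4)]
        cut = rng.integers(1, 19)
        child = np.concatenate([a[:cut], b[cut:]])       # one-point crossover
        child ^= rng.random(20) < 0.05                   # mutation
        children.append(local_search(child))             # memetic refinement step
    pop = np.array(children)

best = max(pop, key=fitness)
print("selected attributes:", np.flatnonzero(best))
```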

Keywords: rough set theory, attribute reduction, fuzzy logic, memetic algorithms, record to record algorithm, great deluge algorithm

Procedia PDF Downloads 424
907 Study of Heat Exchangers in Small Modular Reactors

Authors: Harish Aryal, Roger Hague, Daniel Sotelo, Felipe Astete Salinas

Abstract:

This paper presents a comparative study of different coolants, materials, and temperatures that can affect the effectiveness of heat exchangers used in small modular reactors. Corrugated plate heat exchangers were chosen from among several plate options for testing because of their accessibility and their better performance in recent years compared with other existing heat exchangers. SolidWorks was used to compare results for water and helium coolants acting on different types of conducting metals; the fluids were selected because they satisfied accessibility requirements and were compatible with the software. Although not every element, material, fluid, or method was used in the testing phase, the results are intended to support the research to come, since innovation in nuclear power is the future. The tests performed help to better understand the recurring requirements seen in heat exchangers and, through each adjustment, to identify the breaking points of, and improvements to, the equipment. For consumers and researchers, the results may provide further feedback on why different types of materials and fluids would be preferred and why documenting failures is necessary to improve future research.

Keywords: heat exchangers, SolidWorks, coolants, small modular reactors, nuclear power, nanofluids, Nusselt number, friction factor, Reynolds number

Procedia PDF Downloads 47
906 One-Step Time Series Predictions with Recurrent Neural Networks

Authors: Vaidehi Iyer, Konstantin Borozdin

Abstract:

Time series prediction problems have many important practical applications but are notoriously difficult for statistical modeling. Recently, machine learning methods have attracted significant interest as practical tools applied to a variety of problems, even though developments in this field tend to be semi-empirical. This paper explores the application of Long Short-Term Memory based Recurrent Neural Networks to the one-step prediction of time series with both trend and stochastic components. Two types of data are analyzed: daily stock prices, often considered a typical example of a random walk, and weather patterns dominated by seasonal variations. Results from both analyses are compared, and a reinforcement learning framework is used to select the more efficient of Recurrent Neural Networks and more traditional autoregression methods. It is shown that both methods are able to follow long-term trends and seasonal variations closely but have difficulty reproducing day-to-day variability. Future research directions and potential real-world applications are briefly discussed.
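
A minimal one-step-ahead LSTM sketch in Keras; the synthetic seasonal series, window length, and layer sizes are assumptions standing in for the stock-price and weather data used in the paper.

```python
# Predict the next value of a series from a sliding window of past values.
import numpy as np
import tensorflow as tf

t = np.arange(1000, dtype=np.float32)
series = np.sin(2 * np.pi * t / 50) + 0.1 * np.random.randn(1000).astype(np.float32)

window = 30
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),          # one-step prediction head
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:800], y[:800], epochs=5, verbose=0)

print("next value:", float(model.predict(X[800:801], verbose=0)[0, 0]))
```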

Keywords: long short term memory, prediction methods, recurrent neural networks, reinforcement learning

Procedia PDF Downloads 204
905 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network

Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

We consider a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. Models learned from this data that can predict future traffic speeds would benefit applications such as car navigation systems, but building predictive models for every link becomes a nontrivial job if the number of links in the network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building. Instead, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that it may suffice to look only at recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different feature sets consisting of the current and past traffic speeds of the target link and of its up- and downstream neighbor links. The performance of these models is compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
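
A sketch of the windowed k-NN predictor, assuming a hypothetical feature layout of current and past speeds for the target link and its neighbors; the key point is that only the most recent slice of the history is indexed for the neighbor search.

```python
# Windowed k-NN speed prediction: truncate the searchable history to the most
# recent records to cut prediction latency. All data is synthetic.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(5)
history = rng.uniform(10, 90, (100_000, 6))   # [target t, t-5, t-10, up-link, down-link, ...]
next_speed = history[:, 0] * 0.9 + rng.normal(0, 3, 100_000)

recent = slice(-20_000, None)                 # search only the newest 20k records
knn = KNeighborsRegressor(n_neighbors=15)
knn.fit(history[recent], next_speed[recent])

query = history[-1:]                          # current conditions on the target link
print("predicted 5-min-ahead speed:", knn.predict(query)[0])
```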

Keywords: big data, k-NN, machine learning, traffic speed prediction

Procedia PDF Downloads 334
904 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability are very important. The process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program; while the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for predicting the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the parameters influencing process stability, instead of the removed volume as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
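
A toy multi-dexel engagement calculation for a flat-end mill, assuming a single-direction dexel field and invented geometry: dexels inside the tool circle and above the tool bottom determine the depth and width of cut at one tool position.

```python
# Toy dexel engagement: the workpiece is a grid of dexel top heights; dexels
# inside the cutter circle and above the tool bottom give a_p and a_e at one
# NC-program position. Geometry values are invented for illustration.
import numpy as np

dx = 0.1                                       # dexel spacing, mm
heights = np.full((400, 400), 10.0)            # stock top at z = 10 mm
xs, ys = np.meshgrid(np.arange(400) * dx, np.arange(400) * dx, indexing="ij")

tool_x, tool_y, tool_z, tool_r = 20.0, 20.0, 8.0, 5.0   # one tool position

inside = (xs - tool_x) ** 2 + (ys - tool_y) ** 2 <= tool_r ** 2
engaged = inside & (heights > tool_z)          # dexels the cutter actually removes

depth_of_cut = (heights[engaged] - tool_z).max() if engaged.any() else 0.0
width_of_cut = engaged.any(axis=0).sum() * dx  # lateral extent of engagement, mm

heights[engaged] = tool_z                      # material removal update
print(f"a_p = {depth_of_cut:.2f} mm, a_e = {width_of_cut:.2f} mm")
```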

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 501