Search results for: bare machine computing
1086 Segmentation Using Multi-Thresholded Sobel Images: Application to the Separation of Stuck Pollen Grains
Authors: Endrick Barnacin, Jean-Luc Henry, Jimmy Nagau, Jack Molinie
Abstract:
Being able to identify biological particles such as spores, viruses, or pollens is important for health care professionals, as it allows for appropriate therapeutic management of patients. Optical microscopy is a technology widely used for the analysis of these types of microorganisms because, compared to other types of microscopy, it is not expensive. The analysis of an optical microscope slide is a tedious and time-consuming task when done manually. However, using machine learning and computer vision, this process can be automated. The first step of an automated microscope slide image analysis process is segmentation. During this step, the biological particles are localized and extracted. Very often, the use of an automatic thresholding method is sufficient to locate and extract the particles. However, in some cases, the particles are not extracted individually because they are stuck to other biological elements. In this paper, we propose a stuck particles separation method based on the use of the Sobel operator and thresholding. We illustrate it by applying it to the separation of 813 images of adjacent pollen grains. The method correctly separated 95.4% of these images.
Keywords: image segmentation, stuck particles separation, Sobel operator, thresholding
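The core of the method described above, a Sobel gradient magnitude followed by thresholding, can be sketched in a few lines. This is an illustrative pure-Python reimplementation with assumed 3×3 kernels and a toy image, not the authors' code:

```python
# Illustrative sketch of the Sobel-plus-threshold step; kernels and the
# toy image are assumptions, not the authors' pipeline.

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Gradient magnitude |Gx| + |Gy| for the interior pixels of a 2D grid."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(SOBEL_X[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            gy = sum(SOBEL_Y[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            out[i][j] = abs(gx) + abs(gy)
    return out

def threshold(mag, t):
    """Binary edge map: 1 where the gradient magnitude exceeds t."""
    return [[1 if v > t else 0 for v in row] for row in mag]

# A vertical step edge: the Sobel magnitude peaks along the boundary,
# and thresholding isolates the separation line between two regions.
img = [[0, 0, 9, 9]] * 4
edges = threshold(sobel_magnitude(img), 10)
```

In the paper's setting, the thresholded edge map would mark the boundary along which two stuck pollen grains are split apart.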
Procedia PDF Downloads 131
1085 Models, Resources and Activities of Project Scheduling Problems
Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, José J. Hernández-Flores, Edith Olaco Garcia
Abstract:
The Project Scheduling Problem (PSP) is a generic name given to a whole class of problems in which the best form, time, resources and costs for project scheduling must be determined. The PSP is an application area related to project management. This paper aims to serve as a guide to understanding the PSP by presenting a survey of its general parameters: the resources (the elements that carry out the activities of a project), the activities (the set of operations or tasks of a person or organization), the mathematical models of the main variants of the PSP, and the algorithms used to solve those variants. Project scheduling is an important task in project management. This paper presents mathematical models, resources, activities, and algorithms for project scheduling problems. The PSP has attracted researchers from the automotive industry, steel manufacturing, medical research, pharmaceutical research, the telecommunication industry, the aviation industry, software development, manufacturing management, innovation and technology management, the construction industry, government project management, financial services, machine scheduling, transportation management, and other fields. Project managers need to finish a project with the minimum cost and the maximum quality.
Keywords: PSP, combinatorial optimization problems, project management, manufacturing management, technology management
Procedia PDF Downloads 418
1084 Orthophthalic Polyester Composite Reinforced with Sodium Alginate-Treated Anahaw (Saribus rotundifolius) Fibers
Authors: Terence Tumolva, Johannes Kristoff Vito, Joanna Crystelle Ragasa, Renz Marion Dela Cruz
Abstract:
Natural fiber reinforced polymer (NFRP) composites have been the focus of various research projects due to their advantages over synthetic fiber-reinforced composites. For this study, anahaw is used as the fiber source due to its abundance throughout the Philippines. A problem addressed in this study is the need for an environment-friendly method of fiber treatment. The use of sodium alginate to treat the fibers was thus investigated. The fibers were immersed in a sodium alginate solution and then in a calcium chloride solution. The treated fibers were used to reinforce orthophthalic unsaturated polyester (ortho-UP) resin. The mechanical properties were tested using a universal testing machine (UTM), and the fracture surfaces were characterized using a scanning electron microscope (SEM). Results showed that the sodium alginate treatment increased the tensile and flexural strength of the composite. Increasing the fiber load was also found to increase the stiffness of the composite. However, sodium alginate treatment did not provide any significant improvement in the wet mechanical properties of the NFRP. The composite is comparable to some commercially available polymeric materials.
Keywords: NFRP, composite, alginate, anahaw, polymer
Procedia PDF Downloads 338
1083 Field-Free Orbital Hall Current-Induced Deterministic Switching in the MO/Co₇₁Gd₂₉/Ru Structure
Authors: Zelalem Abebe Bekele, Kun Lei, Xiukai Lan, Xiangyu Liu, Hui Wen, Kaiyou Wang
Abstract:
Spin-polarized currents offer an efficient means of manipulating the magnetization of a ferromagnetic layer for big data and neuromorphic computing. Research has shown that the orbital Hall effect (OHE) can produce orbital currents that potentially surpass the counterpart spin currents induced by the spin Hall effect. However, orbital currents alone cannot exert torque directly on a ferromagnetic layer, necessitating a conversion process from orbital to spin currents. Here, we present an efficient method for achieving perpendicularly magnetized spin-orbit torque (SOT) switching by harnessing the localized orbital Hall current generated from a Mo layer within a Mo/CoGd device. Our investigation reveals a remarkable enhancement in the interface-induced planar Hall effect (PHE) within the Mo/CoGd bilayer, resulting in the generation of a z-polarized planar current for manipulating the magnetization of the CoGd layer without the need for an in-plane magnetic field. Furthermore, the Mo layer induces an out-of-plane orbital current, boosting the in-plane and out-of-plane spin polarization by converting the orbital current into spin current within the dual-property CoGd layer. At the optimal Mo layer thickness, a low critical magnetization switching current density of 2.51×10⁶ A cm⁻² is achieved. This breakthrough opens avenues for all-electrical, energy-efficient control of magnetization switching through orbital currents, advancing the field of spin-orbitronics.
Keywords: spin-orbit torque, orbital Hall effect, spin Hall current, orbital Hall current, interface-generated planar Hall current, anisotropic magnetoresistance
Procedia PDF Downloads 57
1082 Urinalysis by Surface-Enhanced Raman Spectroscopy on Gold Nanoparticles for Different Disease
Authors: Leonardo C. Pacheco-Londoño, Nataly J. Galan-Freyle, Lisandro Pacheco-Lugo, Antonio Acosta, Elkin Navarro, Gustavo Aroca-Martínez, Karin Rondón-Payares, Samuel P. Hernández-Rivera
Abstract:
In the Life Science Research Center (LSRC) of the University Simon Bolivar, one of our focuses is the diagnosis and prognosis of different diseases, and we have been implementing the use of gold nanoparticles (Au-NPs) for various biomedical applications. In this case, Au-NPs were used for Surface-Enhanced Raman Spectroscopy (SERS) in the diagnosis of different diseases, such as lupus nephritis (LN), hypertension (H), preeclampsia (PC), and others. The following methodology is proposed for the diagnosis of each disease. First, good signals of the different metabolites were obtained by SERS from a mixture of urine samples and Au-NPs. Second, PLS-DA models based on the SERS spectra were built to discriminate each disease and were able to differentiate between sick and healthy patients. Finally, the sensitivity and specificity of the different models were determined to be on the order of 0.9. In addition, a second methodology was developed using machine learning models on the data from all of the diseases, and, as a result, a discriminant spectral map of the diseases was generated. These studies were made possible by joint research between two university research centers and two health sector entities, and the patient samples were treated with ethical rigor and with the patients' consent.
Keywords: SERS, Raman, PLS-DA, diseases
Procedia PDF Downloads 143
1081 Mapping Iron Content in the Brain with Magnetic Resonance Imaging and Machine Learning
Authors: Gabrielle Robertson, Matthew Downs, Joseph Dagher
Abstract:
Iron deposition in the brain has been linked with a host of neurological disorders such as Alzheimer’s, Parkinson’s, and Multiple Sclerosis. While some treatment options exist, there are no objective measurement tools that allow for the monitoring of iron levels in the brain in vivo. An emerging Magnetic Resonance Imaging (MRI) method has recently been proposed to deduce iron concentration through quantitative measurement of magnetic susceptibility. This is a multi-step process that involves repeated modeling of physical processes via approximate numerical solutions. For example, the last two steps of this Quantitative Susceptibility Mapping (QSM) method involve (I) mapping magnetic field into magnetic susceptibility and (II) mapping magnetic susceptibility into iron concentration. Process I involves solving an ill-posed inverse problem by using regularization via injection of prior belief. The end result of Process II depends strongly on the model used to describe the molecular content of each voxel (type of iron, water fraction, etc.). Due to these factors, the accuracy and repeatability of QSM have been an active area of research in the MRI and medical imaging community. This work aims to estimate iron concentration in the brain via a single step. A synthetic numerical model of the human head was created by automatically and manually segmenting the human head on a high-resolution grid (640×640×640, 0.4 mm³ voxels), yielding detailed structures such as microvasculature and subcortical regions as well as bone, soft tissue, cerebrospinal fluid, sinuses, arteries, and eyes. Each segmented region was then assigned tissue properties such as relaxation rates, proton density, electromagnetic tissue properties, and iron concentration. These tissue property values were randomly selected from a probability distribution function derived from a thorough literature review.
In addition to having unique tissue property values, different synthetic head realizations also possess unique structural geometry, created by morphing the boundary regions of different areas within normal physical constraints. This model of the human brain is then used to create synthetic MRI measurements. This is repeated thousands of times, for different head shapes, volumes, tissue properties and noise realizations. Collectively, this constitutes a training set that is similar to in vivo data, but larger than datasets available from clinical measurements. A 3D convolutional U-Net neural network architecture was then used to train data-driven Deep Learning models to solve for iron concentrations from raw MRI measurements. The performance was tested on both synthetic data not used in training and real in vivo data. Results showed that the model trained on synthetic MRI measurements is able to directly learn iron concentrations in areas of interest more effectively than other existing QSM reconstruction methods. For comparison, models trained on random geometric shapes (as proposed in the Deep QSM method) are less effective than models trained on realistic synthetic head models. Such an accurate method for the quantitative measurement of iron deposits in the brain would be of important value in clinical studies aiming to understand the role of iron in neurological disease.
Keywords: magnetic resonance imaging, MRI, iron deposition, machine learning, quantitative susceptibility mapping
Procedia PDF Downloads 138
1080 Comparative Study of Impact Strength and Fracture Morphological of Nano-CaCO3 and Nanoclay Reinforced HDPE Nanocomposites
Authors: Harun Sepet, Necmettin Tarakcioglu
Abstract:
The present study investigated the impact strength and fracture mechanisms of nano-CaCO3 and nanoclay reinforced HDPE nanocomposites using the Charpy impact test. The nano-CaCO3 and nanoclay reinforced HDPE granules were prepared by the melt blending method using a compounder system consisting of an industrial Banbury mixer, a single screw extruder and granule cutting at industrial scale. The granules were molded into plates using an injection-molding machine, and impact samples were then cut from the nanocomposite plates using a punching die. The impact experiments showed that the nano-CaCO3 and nanoclay reinforced HDPE nanocomposites have lower impact energy levels than neat HDPE. Moreover, the impact strength of HDPE decreased further with the addition of nanoclay compared to nano-CaCO3. The fracture surfaces produced by the impact were examined by SEM. The fracture surface morphology was found to change as the nano-CaCO3 and nanoclay ratios increase. These fracture surface changes were examined to determine the fracture mechanisms of the nano-CaCO3 and nanoclay reinforced HDPE nanocomposites.
Keywords: Charpy, HDPE, industrial scale nano-CaCO3, nanoclay, nanocomposite
Procedia PDF Downloads 412
1079 Development of a Highly Flexible, Sensitive and Stretchable Polymer Nanocomposite for Strain Sensing
Authors: Shaghayegh Shajari, Mehdi Mahmoodi, Mahmood Rajabian, Uttandaraman Sundararaj, Les J. Sudak
Abstract:
Although several strain sensors based on carbon nanotubes (CNTs) have been reported, the stretchability and sensitivity of these sensors have remained a challenge. Highly stretchable and sensitive strain sensors are in great demand for human motion monitoring and human-machine interfaces. This paper reports the fabrication and characterization of a new type of strain sensor based on a stretchable fluoropolymer/CNT nanocomposite system made via a melt-mixing technique. Electrical and mechanical characterizations were performed. The results showed that this nanocomposite sensor has high stretchability, up to 280% strain, at an optimum level of filler concentration. The piezoresistive properties and the strain sensing mechanism of the sensor were investigated using Electrochemical Impedance Spectroscopy (EIS). High sensitivity was obtained (gauge factor as large as 12000 under 120% applied strain), in particular at concentrations above the percolation threshold. Due to the tunneling effect, a non-linear piezoresistivity was observed at high CNT loadings. With their good conductivity and light weight, these nanocomposites could be a promising candidate for strain sensing applications.
Keywords: carbon nanotubes, fluoropolymer, piezoresistive, strain sensor
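The sensitivity figure quoted above follows from the standard strain-sensor gauge-factor definition, GF = (ΔR/R₀)/ε. The numbers below simply illustrate that definition; the implied resistance change is back-calculated from the reported GF and strain, not a measurement from the paper:

```python
def gauge_factor(delta_r_over_r0, strain):
    """Standard strain-sensor sensitivity: GF = (delta_R / R0) / strain."""
    return delta_r_over_r0 / strain

# For the quoted GF of about 12000 at 120% applied strain (epsilon = 1.2),
# the implied relative resistance change is GF * epsilon = 14400,
# i.e. the resistance grows by a factor of ~14400 relative to R0.
gf = gauge_factor(14400.0, 1.2)
```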
Procedia PDF Downloads 296
1078 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture
Authors: Sajjad Akbar, Rabia Bashir
Abstract:
With the growth of the number of users, Internet usage has evolved. Due to its key design principles, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., content distribution, mobility, ubiquity, security and trust. Users are more interested in contents than in their communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not suitable for these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient, as it depends on physical location. For this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented rather than a sender-oriented approach, and introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, it has drawn much criticism, mainly concerning how ICN will manage the most relevant content. Here, Web Content Mining (WCM) approaches can help with appropriate data management in ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.
Keywords: agent based web content mining, content centric networking, information centric networking
Procedia PDF Downloads 475
1077 Embedded System of Signal Processing on FPGA: Underwater Application Architecture
Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad
Abstract:
The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (the Fast Fourier Transform (FFT), the Inverse Fast Fourier Transform (iFFT), and Bessel functions) is widely applied to obtain information with high accuracy. Signal processing is most commonly implemented on general-purpose processors; however, general-purpose processors are not efficient for signal processing. Our interest was therefore focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to reduce the computational burden of a single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy efficiency requirements. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than on the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms achieves an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in real time with high energy efficiency.
Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing
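The FFT/iFFT pair at the heart of the processing chain can be sketched in pure Python as a textbook radix-2 Cooley-Tukey transform. This is an illustrative software model of the transform, not the Altera FPGA implementation; note that the round-trip reconstruction error is far below the ~10⁻³ level reported for the hardware pipeline:

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]  # twiddle factor
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

def ifft(x):
    """Inverse FFT via conjugation: ifft(x) = conj(fft(conj(x))) / n."""
    n = len(x)
    return [v.conjugate() / n for v in fft([v.conjugate() for v in x])]

# Round trip on a toy 16-sample signal: FFT then iFFT recovers the input.
signal = [complex(i % 5, 0) for i in range(16)]
recovered = ifft(fft(signal))
err = max(abs(a - b) for a, b in zip(signal, recovered))
```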
Procedia PDF Downloads 79
1076 Feature Extraction and Impact Analysis for Solid Mechanics Using Supervised Finite Element Analysis
Authors: Edward Schwalb, Matthias Dehmer, Michael Schlenkrich, Farzaneh Taslimi, Ketron Mitchell-Wynne, Horen Kuecuekyan
Abstract:
We present a generalized feature extraction approach for supporting Machine Learning (ML) algorithms which perform tasks similar to Finite-Element Analysis (FEA). We report results for estimating the Head Injury Criterion (HIC) of vehicle engine compartments across various impact scenarios. Our experiments demonstrate that models learned using features derived with a simple discretization approach provide a reasonable approximation of a full simulation. We observe that Decision Trees can be as effective as Neural Networks for the HIC task. The simplicity and performance of the learned Decision Trees offer a multiple-order-of-magnitude improvement in speed and cost over full simulation, in exchange for a reasonable approximation. When used as a complement to full simulation, the approach enables rapid approximate feedback to engineering teams before submission for full analysis. The approach produces mesh-independent features and is further agnostic of the assembly structure.
Keywords: mechanical design validation, FEA, supervised decision tree, convolutional neural network
Procedia PDF Downloads 141
1075 Deformation Behavior of Virgin and Polypropylene Modified Bituminous Mixture
Authors: Noor Zainab Habib, Ibrahim Kamaruddin, Madzlan Napiah
Abstract:
This paper presents part of a research project conducted to investigate the creep behavior of well-graded bituminous concrete mixtures using the dynamic creep test. The samples were prepared from an unmodified control mix and a polypropylene modified bituminous mix. The unmodified, or control, mix was prepared with 80/100 grade bitumen, while the polypropylene modified mix was prepared using polypropylene (PP) polymer as a modifier, blended with 80/100 Pen bitumen. The concentration of polymer in the blend was kept at 1%, 2%, and 3% by weight of bitumen content. For the dynamic creep test, Marshall specimens were prepared at optimum bitumen content and then tested using an IPC Global Universal Testing Machine (UTM) in order to investigate the creep stiffness of both the modified and control mixes. From the results obtained, it was found that the 1% and 2% PP modified bituminous mixes offer better results in comparison to the control and the 3% PP modified mix samples. The results corroborate the empirical and viscosity test findings, which indicate that polymer modification induces a stiffening effect in the binder. The enhanced viscous component of the binder was considered responsible for this change, which ultimately enhances the mechanical strength of the modified bituminous mixes.
Keywords: polymer modified bitumen, stiffness, creep, viscosity
Procedia PDF Downloads 419
1074 Human Action Recognition Using Wavelets of Derived Beta Distributions
Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel
Abstract:
In the framework of enhancing human-machine interaction systems, this paper focuses on human behavior analysis and action recognition. Human behavior is characterized by a duality of actions and reactions (movements, psychological modifications, verbal and emotional expressions). It is worth noting that much information is hidden behind gestures and the trajectories and speeds of sudden motion points; many research works have approached this as an information retrieval issue. In our work, we focus on motion extraction, tracking and action recognition using wavelet network approaches. Our contribution uses background subtraction of the human subject with a Gaussian Mixture Model (GMM) and tracks body movement through trajectory models of motion constructed with a Kalman filter. These models allow the noise to be removed through the extraction of the main motion features and constitute a stable basis for identifying the evolution of human activity. Each modality is used to recognize a human action using a wavelets of derived beta distributions approach. The proposed approach has been validated successfully on a subset of the KTH and UCF Sports databases.
Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet
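The Kalman-filter step used above to build smooth trajectory models can be illustrated with a minimal one-dimensional sketch. The constant-position model, noise parameters and toy measurements are assumptions for illustration, not the paper's tracker:

```python
def kalman_1d(measurements, q=1e-3, r=0.25):
    """Minimal 1-D Kalman filter: constant-position model with process
    noise q and measurement noise r. Returns the filtered estimates."""
    x, p = measurements[0], 1.0   # initial state estimate and covariance
    estimates = [x]
    for z in measurements[1:]:
        p += q                     # predict: covariance grows with time
        k = p / (p + r)            # Kalman gain
        x += k * (z - x)           # update toward the new measurement
        p *= (1 - k)               # covariance shrinks after the update
        estimates.append(x)
    return estimates

# Noisy observations of a point sitting near 10: the filtered track
# hugs the true position more tightly than the raw measurements do.
track = kalman_1d([10.0, 10.4, 9.7, 10.2, 9.9, 10.1])
```

In the paper's pipeline, the same predict/update cycle runs in two dimensions on the motion points extracted after GMM background subtraction.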
Procedia PDF Downloads 411
1073 Hyperspectral Image Classification Using Tree Search Algorithm
Authors: Shreya Pare, Parvin Akhter
Abstract:
Remote sensing image classification is a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to take into account the spatial structure information of an image. Therefore, to improve the performance of classification, spatial information can be integrated into the classification process. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to perform the segmentation of hyperspectral images. The fuzzy parameters of the MFE function are optimized using a new meta-heuristic based on the tree search algorithm. The segmented image is classified by a large distribution machine (LDM) classifier. Experimental results are shown on a hyperspectral image dataset. The experimental outputs indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images when compared to state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, making it more suitable for classifying images with large spatial structures.
Keywords: classification, hyperspectral images, large distribution margin, modified fuzzy entropy function, multilevel thresholding, tree search algorithm
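Entropy-based thresholding of the kind the MFE segmentation builds on can be illustrated with the classic single-threshold Kapur criterion: pick the threshold that maximizes the sum of the Shannon entropies of the two resulting classes. The paper's modified fuzzy entropy with multilevel thresholds and tree-search optimization is more elaborate, so the following is only a hedged sketch of the underlying idea, on an assumed toy histogram:

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a probability distribution (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def kapur_threshold(hist):
    """Single-threshold Kapur's entropy method: choose t maximizing the
    sum of the entropies of the two classes hist[:t] and hist[t:].
    A multilevel/fuzzy variant searches several thresholds instead."""
    total = sum(hist)
    probs = [h / total for h in hist]
    best_t, best_h = 1, float("-inf")
    for t in range(1, len(hist)):
        w0 = sum(probs[:t])          # mass of the low class
        w1 = 1.0 - w0                # mass of the high class
        if w0 <= 0 or w1 <= 0:
            continue
        h = (shannon_entropy(p / w0 for p in probs[:t])
             + shannon_entropy(p / w1 for p in probs[t:]))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Bimodal 8-bin histogram: the entropy criterion splits between the modes.
hist = [30, 40, 5, 1, 1, 5, 40, 30]
t = kapur_threshold(hist)
```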
Procedia PDF Downloads 180
1072 Effect of Coffee Grounds on Physical and Heating Value Properties of Sugarcane Bagasse Pellets
Authors: K. Rattawan, W. Intagun, W. Kanoksilapatham
Abstract:
The objective of this research is to study the effect of coffee grounds on the physical and heating value properties of sugarcane bagasse pellets. The coffee grounds were tested as an additive for the pelletizing process of bagasse pellets. Pelletizing was performed using a flat-die pellet mill. The moisture content of the raw materials was controlled at 10-13%, and the die temperature during the process was in the range of 75-80 °C. Physical characteristics (bulk density and durability) of the plain bagasse pellets and of pellets with 1-5% coffee grounds were determined following the standard assigned by the Pellet Fuel Institute (PFI). The results revealed increasing values of 648 ± 3.4, 659 ± 3.1, 679 ± 3.3 and 685 ± 3.1 kg/m³ for pellet bulk density, and 98.7 ± 0.11, 99.2 ± 0.26, 99.3 ± 0.19 and 99.4 ± 0.07% for pellet durability, respectively. In addition, the heating values of the coffee ground supplemented pellets (15.9 ± 1.16, 17.0 ± 1.23 and 18.8 ± 1.34 MJ/kg) were improved compared to the non-supplemented control (14.9 ± 1.14 MJ/kg). The results indicated that both the bulk density and durability of the bagasse pellets increased with an increasing proportion of the coffee ground additive.
Keywords: bagasse, coffee grounds, pelletizing, heating value, sugar cane bagasse
Procedia PDF Downloads 168
1071 The Effect of Artificial Intelligence on Accounting and Finance
Authors: Evrime Fawzy Ishak Gadelsayed
Abstract:
This paper presents resource consumption accounting as an innovative approach to management accounting, which focuses on managers as the principal users of the information and provides more adequate information than conventional management accounting. This system emphasizes that the organization's resources drive costs; consequently, costing frameworks should place the emphasis on resources and their usage. Resource consumption accounting combines two costing methodologies: activity-based costing and the German cost accounting approach called GPK. Making this management accounting approach operational, however, is a challenge for managers. The purpose of this article is to clarify the concept of resource consumption accounting, its elements and features, and the use of this approach in organizations. In the first section, we present resource consumption accounting, its basis, the reasons for its development, and the issues faced by past costing frameworks. We then give the requirements and assumptions of this approach; finally, we describe the implementation of this approach in organizations and its advantages over other costing techniques.
Keywords: financial statement fraud, forensic accounting, fraud prevention and detection, auditing, audit expectation gap, corporate governance, resource consumption accounting, management accounting, activity based method, German cost accounting method
Procedia PDF Downloads 18
1070 Intelligent System of the Grinding Robot for Spiral Welded Pipe
Authors: Getachew Demeissie Ayalew, Yongtao Sun, Yang Yang
Abstract:
The spiral welded pipe manufacturing industry requires strict production standards for the automated grinding of welding seams. However, traditional grinding machines in this sector are insufficient due to a lack of quality control protocols and inconsistent performance. This research aims to improve the quality of spiral welded pipes by developing intelligent automated abrasive belt grinding equipment. The system is equipped with six-degree-of-freedom (6 DOF) KUKA KR360 industrial robots, enabling concurrent grinding operations on both internal and external welds. The grinding robot control system is designed around a PLC, and a human-machine interface (HMI) is employed for operations. The system includes an electric speed controller, data connection card, DC driver, analog amplifier, and HMI for input data. This control system enables the grinding of spiral welded pipe and ensures consistent production quality and cost-effectiveness by reducing the product life cycle and minimizing risks in the working environment.
Keywords: intelligent systems, spiral welded pipe, grinding, industrial robot, end-effector, PLC controller system, 3D laser sensor, HMI
Procedia PDF Downloads 297
1069 Experimental Investigation of Damaged Reinforced Concrete Beams Repaired with Carbon Fibre Reinforced Polymer (CFRP) Strip under Impact Loading
Authors: M. Al-Farttoosi, M. Y. Rafiq, J. Summerscales, C. Williams
Abstract:
Many buildings and bridges are damaged due to impact loading, explosions, terrorist attacks and wars. Most damaged structural members, such as beams, columns and slabs, have not totally failed and can be repaired. Nowadays, carbon fibre reinforced polymer (CFRP) is widely used in strengthening and retrofitting structural members. CFRP can restore the load-carrying capacity of damaged structural members to make them serviceable. An experimental investigation was conducted to investigate the impact behaviour of damaged beams repaired with CFRP. The tested beams had different degrees of damage, and the near-surface mounted (NSM) technique was used to install the CFRP. A heavy drop-weight impact test machine was used to conduct the experimental work. The study investigated the impact strength, stiffness, cracking and deflection of the CFRP-repaired beams. The results show that CFRP significantly increased the impact resistance of the damaged beams, increased their stiffness and reduced their deflection. The results also showed that the NSM technique is effective in repairing beams and preventing debonding of the CFRP.
Keywords: damaged, concrete, impact, repaired
Procedia PDF Downloads 345
1068 Bioethanol Production from Wild Sorghum (Sorghum arundinacieum) and Spear Grass (Heteropogon contortus)
Authors: Adeyinka Adesanya, Isaac Bamgboye
Abstract:
There is a growing need to develop processes to produce renewable fuels and chemicals due to the economic, political, and environmental concerns associated with fossil fuels. Lignocellulosic biomass is an excellent renewable feedstock because it is both abundant and inexpensive. This project aims at producing bioethanol from lignocellulosic plants (Sorghum arundinacieum and Heteropogon contortus) by biochemical means, computing the energy audit of the process, and determining the fuel properties of the produced ethanol. Acid pretreatment (0.5% H2SO4 solution) and enzymatic hydrolysis (using malted barley as the enzyme source) were employed. The ethanol yield of wild sorghum was found to be 20%, while that of spear grass was 15%. The fuel properties of the bioethanol from wild sorghum are 1.227 centipoise for viscosity, 1.10 g/cm³ for density, 0.90 for specific gravity, and 78 °C for boiling point; the cloud point was found to be below -30 °C. Those of spear grass are 1.206 centipoise for viscosity, 0.93 g/cm³ for density, 1.08 for specific gravity, and 78 °C for boiling point; the cloud point was also found to be below -30 °C. The energy audit shows that about 64% of the total energy was used during pretreatment, while product recovery, which was done manually, demanded about 31% of the total energy. The energy inputs for enzymatic hydrolysis, fermentation, and distillation were 1.95%, 1.49% and 1.04%, respectively. The alcoholometric strength of the bioethanol from wild sorghum was found to be 47%, and that from spear grass was 72%. The energy efficiency of bioethanol production for both grasses was 3.85%.
Keywords: lignocellulosic biomass, wild sorghum, spear grass, biochemical conversion
Procedia PDF Downloads 236
1067 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy
Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay
Abstract:
The paper intends to highlight the significance of a Digital Manufacturing (DM) strategy in supporting and achieving the business strategy and goals of any manufacturing organization. To this end, DM initiatives have been given a process perspective, without undermining their technological significance, with a view to linking their benefits directly with the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model has been proposed to categorize digitally enabled organizational processes into synergistic groups, which adopt and use digital tools having similar characteristics and functionalities. This opens future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort has been made to apply the "what" and "how" features of the Quality Function Deployment (QFD) framework to establish the relationship between customers' needs, both for external and internal customers, and the features of the various digital processes that support the achievement of these customer expectations. The paper concludes that in the present highly competitive environment, business organizations cannot sustain themselves unless they understand the significance of digital strategy and integrate it with their business strategy through a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders appreciate its value propositions and its direct link to the organization's competitiveness. Keywords: digital manufacturing, digital strategy, process model, quality function deployment
Procedia PDF Downloads 310
1066 Computerized Analysis of Phonological Structure of 10,400 Brazilian Sign Language Signs
Authors: Wanessa G. Oliveira, Fernando C. Capovilla
Abstract:
Capovilla and Raphael's Libras Dictionary documents a corpus of 4,200 Brazilian Sign Language (Libras) signs. Duduchi and Capovilla's software SignTracking permits users to retrieve signs even when ignoring the corresponding gloss, and to discover the meaning of all 4,200 signs simply by clicking on graphic menus of the sign characteristics (phonemes). Duduchi and Capovilla have discovered that the ease with which any given sign can be retrieved is an inverse function of the average popularity of its component phonemes. Thus, signs composed of rare (distinct) phonemes are easier to retrieve than those composed of common phonemes. SignTracking offers a means of computing the average popularity of the phonemes that make up each of the 4,200 signs. It provides a precise measure of the degree of ease with which signs can be retrieved and sign meanings discovered. Duduchi and Capovilla's logarithmic model proved valid: the degree to which any given sign can be retrieved is an inverse function of the arithmetic mean of the logarithm of the popularity of each component phoneme. Capovilla, Raphael and Mauricio's New Libras Dictionary documents a corpus of 10,400 Libras signs. The present analysis revealed the Libras DNA structure by mapping the incidence of 501 sign phonemes resulting from the layered distribution of five parameters: 163 handshape phonemes (CherEmes-ManusIculi); 34 finger shape phonemes (DactilEmes-DigitumIculi); 55 hand placement phonemes (ArtrotoToposEmes-ArticulatiLocusIculi); 173 movement dimension phonemes (CinesEmes-MotusIculi) pertaining to direction, frequency, and type; and 76 facial expression phonemes (MascarEmes-PersonalIculi). Keywords: Brazilian sign language, lexical retrieval, libras sign, sign phonology
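The logarithmic model described above can be sketched in a few lines; the popularity counts below are hypothetical illustrations, not values from the dictionary data:

```python
import math

def retrievability(phoneme_popularity):
    """Arithmetic mean of the log-popularity of a sign's component phonemes.

    Per the inverse relation in the model, a LOWER score (rarer phonemes)
    predicts EASIER retrieval of the sign.
    """
    logs = [math.log(p) for p in phoneme_popularity]
    return sum(logs) / len(logs)

# Hypothetical popularity counts (how many signs share each phoneme):
rare_sign = retrievability([3, 5, 2])           # rare phonemes -> low score
common_sign = retrievability([900, 450, 700])   # common phonemes -> high score
```

Under this sketch, `rare_sign` scores lower than `common_sign`, so the model predicts it is the easier of the two to retrieve through SignTracking's phoneme menus.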
Procedia PDF Downloads 346
1065 Blend of Polyamide 6 with Polybutylene Terephthalate Compatibilized with Epoxidized Natural Rubber (ENR-25) and N Butyl Acrylate Glycidyl Methacrylate Ethylene (EBa-GMA)
Authors: Ramita Vongrat, Pornsri Sapsrithong, Manit Nithitanakul
Abstract:
In this work, blends of polyamide 6 (PA6) and polybutylene terephthalate (PBT) were successfully prepared. The effect of epoxidized natural rubber (ENR-25) and n-butyl acrylate glycidyl methacrylate ethylene (EBa-GMA) as compatibilizers on the properties of PA6/PBT blends was investigated by varying the amount of ENR-25 and EBa-GMA, i.e., 0, 0.1, 0.5, 5 and 10 phr. All blends were prepared and shaped using a twin-screw extruder at 230 °C and an injection molding machine, respectively. All test specimens were characterized by phase morphology, impact strength, tensile and flexural properties, and hardness. The results exhibited that the phase morphology of the PA6/PBT blend without compatibilizer was incompatible, which could be attributed to poor interfacial adhesion between the two polymers. SEM micrographs showed that the addition of ENR-25 and EBa-GMA improved the compatibility of the PA6/PBT blends. With the addition of ENR-25 as a compatibilizer, uniformity and the maximum reduction of dispersed-phase size were observed. Additionally, the results indicate that as the amounts of ENR-25 and EBa-GMA increased, the mechanical properties, including stress at peak, tensile modulus, and Izod impact strength, also improved. Keywords: EBa-GMA, epoxidized natural rubber-25, polyamide 6, polybutylene terephthalate
Procedia PDF Downloads 170
1064 Synthetic Aperture Radar Remote Sensing Classification Using the Bag of Visual Words Model to Land Cover Studies
Authors: Reza Mohammadi, Mahmod R. Sahebi, Mehrnoosh Omati, Milad Vahidi
Abstract:
Classification of high-resolution polarimetric Synthetic Aperture Radar (PolSAR) images plays an important role in land cover and land use management. Recently, classification algorithms based on the Bag of Visual Words (BOVW) model have attracted significant interest among scholars and researchers in and out of the field of remote sensing. In this paper, the BOVW model with pixel-based low-level features has been implemented to classify a subset of a San Francisco Bay PolSAR image acquired by RADARSAT-2 in C-band. We used a segment-based decision-making strategy and compared the result with that of a traditional Support Vector Machine (SVM) classifier. The 90.95% overall classification accuracy achieved with the proposed algorithm shows that it is comparable with state-of-the-art methods. In addition to increasing the classification accuracy, the proposed method decreased the undesirable speckle effect of SAR images. Keywords: Bag of Visual Words (BOVW), classification, feature extraction, land cover management, Polarimetric Synthetic Aperture Radar (PolSAR)
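The core of any BOVW pipeline is quantizing local feature vectors against a learned codebook and describing each segment by its visual-word histogram. The sketch below illustrates only that quantization step with toy 2-D vectors; the codebook values are invented for illustration, and the paper's actual features are pixel-based PolSAR descriptors:

```python
def nearest_word(feature, codebook):
    """Index of the codebook entry (visual word) closest to one feature vector."""
    dists = [sum((f - c) ** 2 for f, c in zip(feature, word)) for word in codebook]
    return dists.index(min(dists))

def bovw_histogram(features, codebook):
    """Normalized visual-word histogram describing one image segment."""
    counts = [0] * len(codebook)
    for feat in features:
        counts[nearest_word(feat, codebook)] += 1
    total = sum(counts)
    return [c / total for c in counts]

# Toy 2-D "features" and a 3-word codebook (illustrative values only)
codebook = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
feats = [[0.1, 0.0], [0.9, 1.1], [5.2, 4.9], [0.2, 0.1]]
h = bovw_histogram(feats, codebook)  # -> [0.5, 0.25, 0.25]
```

The resulting histograms would then be fed to a classifier (an SVM in the traditional baseline) to assign each segment a land-cover label.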
Procedia PDF Downloads 213
1063 Application of Rapid Prototyping to Create Additive Prototype Using Computer System
Authors: Meftah O. Bashir, Fatma A. Karkory
Abstract:
Rapid prototyping is a group of manufacturing processes that allow the fabrication of physical parts of any complexity using a layer-by-layer deposition technique driven directly from a computer system. Rapid prototyping greatly reduces the time and cost necessary to bring a new product to market. The prototypes made by these systems are used in a range of industrial applications, including design evaluation, verification, testing, and as patterns for casting processes. These processes employ a variety of materials and mechanisms to build up the layers of the part. The aim of the present work was to build an FDM prototyping machine that could control X-Y motion and material deposition to generate two-dimensional and three-dimensional complex shapes. This study focused on the deposition of wax material and sought to determine the properties of the wax used in order to enable better control of the FDM process. The work examined the integration of a computer-controlled electro-mechanical system with the traditional FDM additive prototyping process. The characteristics of the wax, including its phase-change temperature, viscosity, and droplet shape during processing, were analysed to optimize the model production process. Keywords: rapid prototyping, wax, manufacturing processes, shape
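The X-Y motion control described above amounts to path planning for each deposited layer. As a hedged illustration (not the authors' controller), a minimal zig-zag raster for one rectangular layer can be generated like this; dimensions and spacing are arbitrary example values:

```python
def raster_layer(width, height, line_spacing):
    """Zig-zag X-Y deposition path for one rectangular layer.

    Returns a list of (x, y) waypoints: the deposition head sweeps along X,
    steps by line_spacing in Y, and reverses direction on each pass.
    """
    path = []
    y = 0.0
    left_to_right = True
    while y <= height:
        start_x, end_x = (0.0, width) if left_to_right else (width, 0.0)
        path.append((start_x, y))   # start of this sweep
        path.append((end_x, y))     # end of this sweep
        left_to_right = not left_to_right
        y += line_spacing
    return path

# 10 mm x 4 mm layer with 2 mm line spacing -> three passes, six waypoints
path = raster_layer(10.0, 4.0, 2.0)
```

A real FDM controller would additionally synchronize droplet deposition with these waypoints and repeat the routine for each Z layer.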
Procedia PDF Downloads 466
1062 Mechanism of Sinkhole Development on Water-Bearing Soft Ground Tunneling
Authors: H. J. Kim, K. H. Kim, N. H. Park, K. T. Nam, Y. H. Jung, T. H. Kim, J. H. Shin
Abstract:
Underground excavations in urban areas can cause various geotechnical problems such as ground loss and lowering of the groundwater level. When ground loss becomes uncontrollably large, sinkholes can develop at the ground surface. A sinkhole is commonly known as a natural phenomenon associated with limestone areas. However, sinkholes in urban areas caused by pressurized sewers and/or tunneling are also frequently reported. In this study, the mechanism of a sinkhole that developed at site 'A' during a tunneling work is investigated. The sinkhole occurred in sand strata with a high groundwater level while a tunnel with a diameter of 3.6 m was being excavated. The sinkhole progressed in two steps. The first step began with a local failure around the tunnel face followed by tons of groundwater inflow, and the second step was triggered by the opening of the TBM (Tunnel Boring Machine) chamber, which led to a progressive general failure. The possibility of the sinkhole was evaluated using the Limit Equilibrium Method (LEM), and the critical height was evaluated with an empirical stability chart. It is found that the lowering of the face pressure and the inflow of groundwater into the tunnel face were the main reasons for the sinkhole. Keywords: limit equilibrium method, sinkhole, stability chart, tunneling
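Critical-height checks of this kind are commonly expressed through a dimensionless stability number read from an empirical chart. As a hedged sketch only, the textbook Taylor-type form is shown below; the stability number and soil parameters are illustrative values, not the site 'A' data:

```python
def critical_height(stability_number, cohesion_kpa, unit_weight_kn_m3):
    """Critical height H_c = N_s * c_u / gamma (m).

    N_s comes from an empirical stability chart; c_u is undrained
    cohesion (kPa); gamma is soil unit weight (kN/m^3).
    """
    return stability_number * cohesion_kpa / unit_weight_kn_m3

# Illustrative parameters only (assumed, not from the paper):
Hc = critical_height(stability_number=5.53, cohesion_kpa=20.0,
                     unit_weight_kn_m3=18.0)  # about 6.14 m
```

In a case like the one studied, the computed critical height would be compared against the actual cover and face conditions; a drop in face pressure effectively reduces the available resistance in the equivalent limit-equilibrium balance.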
Procedia PDF Downloads 254
1061 Ensemble of Deep CNN Architecture for Classifying the Source and Quality of Teff Cereal
Authors: Belayneh Matebie, Michael Melese
Abstract:
The study focuses on addressing the challenges in classifying and ensuring the quality of Eragrostis tef (Teff), the smallest cereal grain, which is small and round. Employing a traditional classification method is challenging because of its small size and the similarity of its environmental characteristics. To overcome this, the study employs a machine learning approach to develop a source and quality classification system for Teff cereal. Data were collected from various production areas in the Amhara region, considering two quality levels of the cereal (high and low) across eight classes. A total of 5,920 images were collected, 740 for each class. Image enhancement techniques, including scaling, data augmentation, histogram equalization, and noise removal, were applied to preprocess the data. A Convolutional Neural Network (CNN) was then used to extract relevant features and reduce dimensionality. The dataset was split into 80% for training and 20% for testing. Different classifiers, including FVGG16, FINCV3, QSCTC, EMQSCTC, SVM, and RF, were employed for classification, achieving accuracy rates ranging from 86.91% to 97.72%. The ensemble of FVGG16, FINCV3, and QSCTC using the max-voting approach outperforms the individual algorithms. Keywords: Teff, ensemble learning, max-voting, CNN, SVM, RF
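The max-voting step that combines the three models can be sketched in a few lines; the labels below are hypothetical stand-ins for the per-image predictions of FVGG16, FINCV3, and QSCTC:

```python
from collections import Counter

def max_vote(predictions):
    """Majority (max-voting) ensemble.

    Each row of `predictions` is one classifier's predicted labels for the
    same samples; the ensemble label per sample is the most common vote.
    """
    n_samples = len(predictions[0])
    ensemble = []
    for i in range(n_samples):
        votes = [clf_preds[i] for clf_preds in predictions]
        ensemble.append(Counter(votes).most_common(1)[0][0])
    return ensemble

# Hypothetical quality labels from three models on four test images
p1 = ["high", "low", "high", "low"]
p2 = ["high", "high", "high", "low"]
p3 = ["low", "low", "high", "low"]
result = max_vote([p1, p2, p3])  # -> ["high", "low", "high", "low"]
```

With ties broken by first-seen order here; a production version might instead weight votes by each model's validation accuracy.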
Procedia PDF Downloads 57
1060 Deep Learning and Accurate Performance Measure Processes for Cyber Attack Detection among Web Logs
Authors: Noureddine Mohtaram, Jeremy Patrix, Jerome Verny
Abstract:
As an enormous number of online services have been developed into web applications, security problems in web applications are becoming more serious. Most intrusion detection systems rely on each request, rather than on user behavior, to find a cyber-attack, and these systems can only protect web applications against known vulnerabilities rather than zero-day attacks. In order to detect new attacks, we analyze the HTTP logs of web servers to divide requests into two categories: normal and malicious. On the other hand, the quality of the results obtained by deep learning (DL) in various areas of big data has given an important motivation to apply it to cybersecurity. Deep learning for attack detection in cybersecurity has the potential to be robust, from small transformations of known attacks to new attacks, due to its capability to extract high-level features. This research takes a new approach, applying deep learning to cybersecurity, to classify these two categories in order to eliminate attacks and protect the web servers of the defense sector, which encounter different web traffic compared to other sectors (such as e-commerce and web applications). The results show that with this machine learning method, a higher accuracy rate and a lower false-alarm rate can be achieved. Keywords: anomaly detection, HTTP protocol, logs, cyber attack, deep learning
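Before any model sees a web log, each HTTP request line has to be turned into a numeric or structured representation. As a hedged illustration of that preprocessing stage (a deep model would learn such signals automatically rather than rely on hand-crafted rules, and these particular features are assumptions, not the paper's pipeline):

```python
import re

def request_features(request_line):
    """Simple hand-crafted features for one HTTP request line.

    These mimic the kind of low-level signal a deep model would extract
    on its own: request length, injection-style characters, URL-encoding
    density, and SQL keywords.
    """
    return {
        "length": len(request_line),
        "special_chars": len(re.findall(r"[<>'\";]", request_line)),
        "encoded_chars": request_line.count("%"),
        "sql_keywords": len(re.findall(r"(?i)\b(union|select|drop)\b",
                                       request_line)),
    }

normal = request_features("GET /index.html HTTP/1.1")
attack = request_features(
    "GET /search?q=1' UNION SELECT password FROM users-- HTTP/1.1")
```

Batches of such per-request vectors (or raw tokenized requests) would then be fed to the classifier to separate normal from malicious traffic.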
Procedia PDF Downloads 213
1059 Application of Data Mining Techniques for Tourism Knowledge Discovery
Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee
Abstract:
Five implementations of three data mining classification techniques were experimented with for extracting important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery from data process was used as a process model, and the 10-fold cross-validation method was used for testing. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models of the selected algorithms were built under different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selection of the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improved the efficiency of all algorithms; the Artificial Neural Network (multilayer perceptron) showed the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge and insights that would otherwise not be discovered by simple statistical analysis. Keywords: classification algorithms, data mining, knowledge discovery, tourism
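The 10-fold cross-validation used for testing partitions the samples into ten folds, each serving once as the test set while the other nine train the model. A minimal index-splitting sketch (illustrative, not the authors' tooling):

```python
def k_fold_indices(n_samples, k=10):
    """Split sample indices into k roughly equal folds.

    Returns k (train_indices, test_indices) pairs; each fold serves
    exactly once as the test set.
    """
    indices = list(range(n_samples))
    # Distribute the remainder so fold sizes differ by at most one
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(indices[start:start + size])
        start += size
    return [(sorted(set(indices) - set(test_fold)), test_fold)
            for test_fold in folds]

# 25 samples, 10 folds: five folds of 3 samples and five of 2
splits = k_fold_indices(25, k=10)
```

The reported accuracy for each algorithm is then the average test accuracy across the ten folds, which is what makes the Random Forest vs. J48 comparison above meaningful.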
Procedia PDF Downloads 295
1058 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory
Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi
Abstract:
Feature selection is one of the global combinatorial optimization problems in machine learning. It is concerned with removing irrelevant, noisy, and redundant data while preserving the meaning of the original data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine a genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy controlled great deluge algorithm to find a good balance between local search and genetic search. To verify the proposed approaches, numerical experiments were carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches. Keywords: rough set theory, attribute reduction, fuzzy logic, memetic algorithms, record to record algorithm, great deluge algorithm
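In such memetic approaches, each candidate attribute subset is typically encoded as a bitmask whose fitness trades subset quality against subset size, and local search perturbs the mask between genetic operations. The sketch below assumes that common encoding (the weighting and the toy evaluator are illustrative, not the paper's exact formulation):

```python
import random

def fitness(mask, evaluate_subset, alpha=0.9):
    """Fitness of a candidate attribute subset encoded as a bitmask.

    Weighs the subset's classification quality (e.g. a rough-set
    dependency degree) against how few attributes it keeps.
    """
    kept = sum(mask)
    if kept == 0:
        return 0.0
    quality = evaluate_subset(mask)
    reduction = 1.0 - kept / len(mask)
    return alpha * quality + (1 - alpha) * reduction

def mutate(mask, rate=0.1, rng=random):
    """Flip each bit with probability `rate` -- a simple local perturbation."""
    return [1 - b if rng.random() < rate else b for b in mask]

# Toy evaluator: pretend attributes 0 and 2 carry all the information
toy_eval = lambda m: 1.0 if m[0] and m[2] else 0.4
best = fitness([1, 0, 1, 0, 0], toy_eval)  # keeps 2 of 5 attributes
```

The genetic layer would evolve a population of such masks, while the fuzzy record-to-record travel or great deluge component decides how aggressively to accept worse neighbours during local search.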
Procedia PDF Downloads 456
1057 Study of Heat Exchangers in Small Modular Reactors
Authors: Harish Aryal, Roger Hague, Daniel Sotelo, Felipe Astete Salinas
Abstract:
This paper presents a comparative study of different coolants, materials, and temperatures that can affect the effectiveness of heat exchangers used in small modular reactors. Corrugated plate heat exchangers were chosen from among the available plate options for testing because of their ease of access and better performance than other existing heat exchangers in recent years. SolidWorks simulations were used to compare water and helium coolants acting upon different conducting metals, which were selected from fluids and materials that satisfied accessibility requirements and were compatible with the software. Although not every element, material, fluid, or method was used in the testing phase, the purpose is to support further research, since nuclear power innovation is ongoing. The tests performed help to better understand the constant demands placed on heat exchangers and, through each adjustment, to identify the failure points and possible improvements in the machine. Depending on consumers and researchers, the results may provide further feedback on why different materials and fluids would be preferred and why it is necessary to learn from failures to improve future research. Keywords: heat exchangers, SolidWorks, coolants, small modular reactors, nuclear power, nanofluids, Nusselt number, friction factor, Reynolds number
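Heat-exchanger effectiveness of the kind compared here is classically summarized by the effectiveness-NTU relations. As a hedged, textbook-level illustration alongside the paper's SolidWorks simulations (the counter-flow form and the example NTU and capacity-ratio values below are assumptions, not the paper's data):

```python
import math

def counterflow_effectiveness(ntu, c_ratio):
    """Effectiveness of a counter-flow heat exchanger.

    Standard effectiveness-NTU relation, where c_ratio = C_min / C_max
    is the heat-capacity-rate ratio of the two streams.
    """
    if abs(c_ratio - 1.0) < 1e-12:
        # Balanced-flow limit of the general relation
        return ntu / (1.0 + ntu)
    e = math.exp(-ntu * (1.0 - c_ratio))
    return (1.0 - e) / (1.0 - c_ratio * e)

# Illustrative operating point: NTU = 2, C_min/C_max = 0.5
eff = counterflow_effectiveness(ntu=2.0, c_ratio=0.5)  # about 0.775
```

Comparing such analytical estimates against simulated outlet temperatures is one common sanity check when swapping coolants (e.g., water vs. helium) or plate materials in a model like the one described.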
Procedia PDF Downloads 76