Search results for: automated document processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4947

1257 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection

Authors: Muhammad Ali

Abstract:

Cyber-attacks and anomaly detection on Internet of Things (IoT) infrastructure are emerging concerns in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data-type probing, malicious operation, DDoS, scanning, spying, and wrong setup are attacks and anomalies that can cause an IoT system to fail. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber-security attack vectors in network traffic. Beyond IoT development, and mainly for smart and IoT applications, there is a need for intelligent processing and analysis of data; our approach therefore focuses on security. We train and compare several machine learning models for accurately predicting attacks and anomalies on IoT systems, using ANOVA-based feature selection to build leaner prediction models that evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, decision tree (DT), and random forest (RF), chosen for satisfactory test accuracy with fast detection. The ML evaluation metrics include precision, recall, F1 score, FPR, NPV, G-mean, MCC, and AUC-ROC. The random forest algorithm achieved the best results, with the shortest prediction time and an accuracy of 99.98%.
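The ANOVA-based feature-selection step named above can be sketched in a few lines. The following is a minimal, illustrative NumPy implementation of one-way ANOVA F-score ranking on synthetic two-class data; the data and the pure-NumPy route are assumptions, since the paper's own pipeline (e.g. scikit-learn's `f_classif`) is not specified:

```python
import numpy as np

def anova_f_scores(X, y):
    """One-way ANOVA F-statistic for each column of X, measuring how
    well that feature separates the classes in y (higher = better)."""
    classes = np.unique(y)
    n, k = len(y), len(classes)
    grand_mean = X.mean(axis=0)
    ss_between = np.zeros(X.shape[1])
    ss_within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        ss_between += len(Xc) * (Xc.mean(axis=0) - grand_mean) ** 2
        ss_within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    ms_between = ss_between / (k - 1)   # between-group mean square
    ms_within = ss_within / (n - k)     # within-group mean square
    return ms_between / ms_within

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 100)
X = rng.normal(size=(200, 3))
X[:, 0] += 3 * y                        # feature 0 strongly separates classes
scores = anova_f_scores(X, y)
top_feature = int(scores.argmax())
```

Features with the largest F-score separate the classes best and would be the ones kept for the downstream KNN/SVM/RF models.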

Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection

Procedia PDF Downloads 98
1256 Weighted-Distance Sliding Windows and Cooccurrence Graphs for Supporting Entity-Relationship Discovery in Unstructured Text

Authors: Paolo Fantozzi, Luigi Laura, Umberto Nanni

Abstract:

The problem of entity relation discovery, a well-covered topic in the literature, consists in searching within unstructured sources (typically, text) in order to find connections among entities. These entities can come from a whole dictionary or from a specific collection of named items. In many cases, machine learning and/or text mining techniques are used for this goal. These approaches might be unfeasible in computationally challenging problems, such as processing massive data streams. A faster approach consists in collecting the cooccurrences of any two words (entities) in order to create a graph of relations - a cooccurrence graph. Indeed, each cooccurrence highlights some degree of semantic correlation between the words, because it is more common to have related words close to each other than on opposite sides of the text. Some authors have used sliding windows for this problem: they count all the cooccurrences within a sliding window running over the whole text. In this paper we generalise this technique, arriving at a Weighted-Distance Sliding Window, where each occurrence of two named items within the window is accounted with a weight depending on the distance between the items: a closer distance implies stronger evidence of a relationship. We develop an experiment in order to support this intuition, applying this technique to a data set consisting of the text of the Bible, split into verses.
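The weighted-distance sliding window can be sketched as follows. This is a minimal stdlib-only illustration in which the weight of a cooccurrence is assumed to be the inverse of the distance between the two items; the paper leaves the exact weighting function open:

```python
from collections import defaultdict

def weighted_cooccurrence_graph(tokens, window=5):
    """Build a cooccurrence graph: each pair of distinct tokens within
    `window` positions of each other adds a weight that decays with
    distance (closer pairs give stronger evidence of a relation)."""
    graph = defaultdict(float)
    for i, a in enumerate(tokens):
        for j in range(i + 1, min(i + window, len(tokens))):
            b = tokens[j]
            if a == b:
                continue
            pair = tuple(sorted((a, b)))
            graph[pair] += 1.0 / (j - i)   # assumed weight: inverse distance
    return dict(graph)

g = weighted_cooccurrence_graph("in the beginning god created the heaven".split())
```

Running this over verse-sized windows yields edge weights in which adjacent words accumulate more evidence than words at opposite ends of the window.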

Keywords: cooccurrence graph, entity relation graph, unstructured text, weighted distance

Procedia PDF Downloads 129
1255 From Sound to Music: The Trajectory of Musical Semiotics in a Selected Soundscape Environment in South-Western Nigeria

Authors: Olatunbosun Samuel Adekogbe

Abstract:

This paper addresses the question of musical signification, revolving around nature and its natural divides; it examines the role of listeners' dispositional apparatus in reacting to sounding environments through music as coordinated sound, focusing on the tension between vibrational occurrences of sound and their potential to be structured. The paper sets out to examine music as a simple conventional design that does not allude to something beyond music, and sound as a vehicle for communication through production, perception, translation, and reaction with regard to the melodic and semiotic functions of sounds. It adopts questionnaire and evolutionary-approach methods to probe musical adaptation, reproduction, and natural selection as the basis for explaining specific human behavioural responses to musical sense-making beyond the above-sketched dichotomies, with a major focus on the transition from acoustic-emotional sensibilities to musical meaning in the selected soundscapes. It was observed that music has emancipated itself from the level of mere acoustic processing of sounds to a functional description in terms of allowing music users to share experiences and interact with the soundscaping environment. The paper therefore concludes that the audience, as music participants and listeners in the selected soundscapes, can be conceived as adaptive devices in the paradigm shift, building up new semiotic linkages with the sounding environments in south-western Nigeria.

Keywords: semiotics, sound, music, soundscape, environment

Procedia PDF Downloads 47
1254 Screening Deformed Red Blood Cells Irradiated by Ionizing Radiations Using Windowed Fourier Transform

Authors: Dahi Ghareab Abdelsalam Ibrahim, R. H. Bakr

Abstract:

Ionizing radiation, such as gamma radiation and X-rays, has many applications in medical diagnosis and cancer treatment. In this paper, we used the windowed Fourier transform to extract the complex image of deformed red blood cells; the real values of the complex image are used to extract the best fit of the deformed cell boundary. Male albino rats were irradiated with γ-rays from ⁶⁰Co. The rats were anesthetized with ether, and blood samples were then collected from the eye vein in heparinized capillary tubes for studying the radiation-damage effect in vivo with the proposed windowed Fourier transform. The peripheral blood films were prepared according to the Brown method and photographed using an Automatic Image Contour Analysis system (SAMICA) from ELBEK-Bildanalyse GmbH, Siegen, Germany. The SAMICA system is equipped with an electronic camera connected to a computer through a built-in interface card, and the image can be magnified up to 1200 times and displayed on the computer. The images of the peripheral blood films were then analyzed by the windowed Fourier transform method to extract the precise deformation from the best fit. Based on accurate deformation evaluation of the red blood cells, diseases can be diagnosed in their primary stages.
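The core windowed-Fourier step can be illustrated on a 1-D signal. The sketch below slides a Hanning window over a synthetic 50 Hz "fringe" signal and takes the FFT of each segment; the window type, sizes, and test signal are assumptions, and the paper's full 2-D phase-extraction pipeline is not reproduced here:

```python
import numpy as np

def windowed_fourier(signal, window_size, step):
    """Slide a Hanning window over the signal and take the FFT of each
    segment, yielding a complex time-frequency representation."""
    frames = []
    for start in range(0, len(signal) - window_size + 1, step):
        seg = signal[start:start + window_size] * np.hanning(window_size)
        frames.append(np.fft.rfft(seg))
    return np.array(frames)            # shape: (n_frames, window_size//2 + 1)

fs = 1000.0                            # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
sig = np.cos(2 * np.pi * 50 * t)       # synthetic 50 Hz fringe signal
tf = windowed_fourier(sig, window_size=200, step=50)
peak_bin = int(np.abs(tf[0]).argmax())
peak_hz = peak_bin * fs / 200          # FFT bin -> frequency
```

The magnitude peak of each windowed spectrum localises the dominant fringe frequency; the phase of the same complex coefficients is what a boundary-extraction step would build on.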

Keywords: windowed Fourier transform, red blood cells, phase wrapping, image processing

Procedia PDF Downloads 63
1253 Biotechonomy System Dynamics Modelling: Sustainability of Pellet Production

Authors: Andra Blumberga, Armands Gravelsins, Haralds Vigants, Dagnija Blumberga

Abstract:

The paper presents a biotechonomy development analysis using system dynamics modelling. The research concerns the application of biomass to the production of bioproducts with higher added value. The most popular bioresource is wood; therefore, the main question today concerns future development and the eco-design of products. The paper emphasizes and evaluates the energy sector, which is open to the use of wood logs, wood chips, wood pellets, and so on. The main aim of this research was to build a framework to analyse development perspectives for wood pellet production. To reach this goal, a system dynamics model of energy wood supply, processing, and consumption was built. Production capacity, energy consumption, changes in energy and technology efficiency, the required labour supply, and the prices of wood, energy, and labour are taken into account. Validation and verification tests with available data and information have been carried out and indicate that the model supports the dynamic hypothesis. It is found that the more is invested into pellet production, the higher the specific profit per production unit compared to wood logs and wood chips. As a result, wood chip production decreases dramatically and is replaced by wood pellets. The limiting factor for pellet industry growth is the availability of wood sources, governed by the felling limit set by the government based on sustainable forestry principles.

Keywords: bioenergy, biotechonomy, system dynamics modelling, wood pellets

Procedia PDF Downloads 378
1252 Use of Polymeric Materials in the Architectural Preservation

Authors: F. Z. Benabid, F. Zouai, A. Douibi, D. Benachour

Abstract:

Fluorinated polymers and polyacrylics have seen wide use in the field of historical monuments. PVDF offers ease of processing, good UV resistance, and good chemical inertness. Although PMMA has good physical characteristics and a lower price than PVDF, its deterioration under UV radiation limits its use as a protective agent for stone. A PVDF/PMMA blend, on the other hand, is a promising compromise for architectural restoration, since blending is the best method, in terms of quality and price, for making new polymeric materials with enhanced properties. Films of different compositions based on the two polymers in an adequate solvent (DMF) were prepared and subjected to artificial ageing and salt fog, spectroscopic analysis (FTIR and UV), and optical analysis (refractive index). Given its great interest for the building field, a set of standard tests was carried out for the first time at the central laboratory of ENAP (Souk-Ahras) in order to evaluate the blend's performance. The results obtained allowed observation of the behaviour of the different blend compositions under the various tests. Adding PVDF to PMMA enhances the latter's resistance to natural and artificial ageing and to salt fog, while PMMA enhances the optical properties of the blend. Finally, the 70/30 blend composition is in agreement with the results of previous works and is the adequate proportion for an eventual application.

Keywords: blend, PVDF, PMMA, preservation, historic monuments

Procedia PDF Downloads 289
1251 Microstructure and Tribological Properties of AlSi5Cu2/SiC Composite

Authors: Magdalena Suśniak, Joanna Karwan-Baczewska

Abstract:

The microstructure and tribological properties of an AlSi5Cu2 matrix composite reinforced with SiC have been studied by microscopic examination and basic tribological measurements. The composite material was produced by mechanical alloying and the spark plasma sintering (SPS) technique. Mixtures of AlSi5Cu2 chips with 0, 10, and 15 wt. % SiC powder were placed in a 250 ml mixing jar and milled for 40 hours. To prevent extreme cold welding, 1 wt. % of stearic acid was added to the powder mixture as a process control agent. Mechanical alloying yields composite powders with a uniform distribution of SiC in the matrix. The composite powders were poured into a graphite die, and a pulsed electric current was passed through the powder under vacuum to consolidate the material. The processing conditions were: sintering temperature 450°C, uniaxial pressure 32 MPa, sintering time 5 minutes. After the SPS process, the composite samples show higher hardness values, lower weight loss, and a lower coefficient of friction compared with the unreinforced alloy. Light microscope micrographs of the worn surfaces and wear debris revealed that in the unreinforced alloy the prominent wear mechanism was adhesive wear. In the AlSi5Cu2/SiC composites, with increasing SiC content the wear mechanism changed from adhesive and micro-cutting to abrasive and delamination for the composite with 20 wt. % SiC. In all the AlSi5Cu2/SiC composites, abrasive wear was the main wear mechanism.

Keywords: aluminum matrix composite, mechanical alloying, spark plasma sintering, AlSi5Cu2/SiC composite

Procedia PDF Downloads 369
1250 Altered Network Organization in Mild Alzheimer's Disease Compared to Mild Cognitive Impairment Using Resting-State EEG

Authors: Chia-Feng Lu, Yuh-Jen Wang, Shin Teng, Yu-Te Wu, Sui-Hing Yan

Abstract:

Brain functional networks based on resting-state EEG data were compared between patients with mild Alzheimer's disease (mAD) and matched patients with the amnestic subtype of mild cognitive impairment (aMCI). We integrated the time–frequency cross mutual information (TFCMI) method, to estimate the EEG functional connectivity between cortical regions, with network analysis based on graph theory to further investigate the alterations of functional networks in the mAD group compared with the aMCI group. We aimed to investigate the changes in network integrity, local clustering, information-processing efficiency, and fault tolerance in mAD brain networks for different frequency bands, based on several topological properties, including degree, strength, clustering coefficient, shortest path length, and efficiency. Results showed that disruptions of network integrity and reductions of network efficiency in mAD were evident, characterized by lower degree, decreased clustering coefficient, longer shortest path length, and reduced global and local efficiencies in the delta, theta, beta2, and gamma bands. These significant changes in network organization can assist in discriminating mAD from aMCI in clinical practice.
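Two of the topological properties named above, degree and clustering coefficient, can be computed directly from a binary adjacency matrix. The following is a small illustrative sketch; the toy four-node graph is an assumption, not the paper's EEG network:

```python
import numpy as np

def degree(adj):
    """Number of connections per node in a binary adjacency matrix."""
    return adj.sum(axis=1)

def clustering_coefficient(adj):
    """Fraction of each node's neighbour pairs that are themselves connected."""
    n = adj.shape[0]
    coeffs = np.zeros(n)
    for i in range(n):
        nbrs = np.flatnonzero(adj[i])
        k = len(nbrs)
        if k < 2:
            continue                                 # undefined for k < 2
        links = adj[np.ix_(nbrs, nbrs)].sum() / 2    # edges among neighbours
        coeffs[i] = links / (k * (k - 1) / 2)
    return coeffs

# triangle (nodes 0, 1, 2) plus a pendant node 3
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
deg = degree(adj)
cc = clustering_coefficient(adj)
```

In the study's setting the adjacency matrix would come from thresholded TFCMI connectivity per frequency band; shortest path length and efficiency follow analogously from the same matrix.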

Keywords: EEG, functional connectivity, graph theory, TFCMI

Procedia PDF Downloads 417
1249 Concentrations of Some Metallic Trace Elements in Twelve Sludge Incineration Ashes

Authors: Lotfi Khiari, Antoine Karam, Claude-Alla Joseph, Marc Hébert

Abstract:

The main objective of incineration of sludge generated from municipal or agri-food waste treatment plants is to reduce the volume of sludge to be disposed of as a solid or liquid waste, whilst concentrating or destroying potentially harmful volatile substances. In some cities in Canada and the United States of America (USA), a large amount of sludge is incinerated, which entails a loss of organic matter and water, leading to the accumulation of phosphorus, potassium and some metallic trace elements (MTE) in the ashes. The purpose of this study was to evaluate the concentrations of potentially hazardous MTE, such as cadmium (Cd), lead (Pb) and mercury (Hg), in twelve sludge incineration ash samples obtained from municipal wastewater and food-processing waste treatment plants in Canada and the USA. The average, maximum, and minimum values of MTE in the ashes were calculated for each city individually and for all cities together, and the trace metal concentration values were compared to values reported in the literature. The concentrations of MTE in the ashes vary widely depending on the sludge origin and treatment options, and were found in the range of 0.1-6.4 mg/kg for Cd, 13-286 mg/kg for Pb, and 0.1-0.5 mg/kg for Hg. On average, the following order of metal concentration in the ashes was observed: Pb > Cd > Hg. Results show that the metal contents of most ashes were similar to MTE levels in synthetic inorganic fertilizers and many fertilizing residual materials. Consequently, the environmental effects of the MTE content of these ashes would be low.

Keywords: biosolids, heavy metals, recycling, sewage sludge

Procedia PDF Downloads 357
1248 Investigations of the Crude Oil Distillation Preheat Section in Unit 100 of Abadan Refinery and Its Recommendation

Authors: Mahdi GoharRokhi, Mohammad H. Ruhipour, Mohammad R. ZamaniZadeh, Mohsen Maleki, Yusef Shamsayi, Mahdi FarhaniNejad, Farzad FarrokhZadeh

Abstract:

Possessing massive resources of natural gas and petroleum, Iran holds a special place among oil-producing countries, according to international energy institutions. In order to use these resources, the development and functional optimization of refineries and industrial units is mandatory. The heat exchanger is one of the most important and strategic pieces of equipment, whose key role in the production process is clear to everyone. For instance, if the temperature of a process fluid is not set as required by the heat exchangers, the specifications of the desired product can change profoundly. Crude oil enters a network of heat exchangers in the atmospheric distillation section before entering the distillation tower; hence, well-functioning heat exchangers can significantly affect the operation of the distillation tower. In this paper, different scenarios for the pre-heating of crude oil are studied using oil and gas simulation software, and the results are discussed. Having reviewed various scenarios, we propose adding a heat exchanger to the pre-heating network as the most efficient means of improving all governing parameters of the tower, i.e. temperature, pressure, and reflux rate. This exchanger is embedded in the crude oil's path: crude oil enters the exchanger after E-101 and exchanges heat with the kerosene pumparound discharged from E-136. As depicted in the results, this efficiently improves process operation and reduces operating expenses.

Keywords: atmospheric distillation unit, heat exchanger, preheat, simulation

Procedia PDF Downloads 638
1247 Geographic Information System Based Multi-Criteria Subsea Pipeline Route Optimisation

Authors: James Brown, Stella Kortekaas, Ian Finnie, George Zhang, Christine Devine, Neil Healy

Abstract:

The use of GIS as an analysis tool for engineering decision making is now best practice in the offshore industry. GIS enables multidisciplinary data integration, analysis and visualisation which allows the presentation of large and intricate datasets in a simple map-interface accessible to all project stakeholders. Presenting integrated geoscience and geotechnical data in GIS enables decision makers to be well-informed. This paper is a successful case study of how GIS spatial analysis techniques were applied to help select the most favourable pipeline route. Routing a pipeline through any natural environment has numerous obstacles, whether they be topographical, geological, engineering or financial. Where the pipeline is subjected to external hydrostatic water pressure and is carrying pressurised hydrocarbons, the requirement to safely route the pipeline through hazardous terrain becomes absolutely paramount. This study illustrates how the application of modern, GIS-based pipeline routing techniques enabled the identification of a single most-favourable pipeline route crossing of a challenging seabed terrain. Conventional approaches to pipeline route determination focus on manual avoidance of primary constraints whilst endeavouring to minimise route length. Such an approach is qualitative, subjective and is liable to bias towards the discipline and expertise that is involved in the routing process. For very short routes traversing benign seabed topography in shallow water this approach may be sufficient, but for deepwater geohazardous sites, the need for an automated, multi-criteria, and quantitative approach is essential. This study combined multiple routing constraints using modern least-cost-routing algorithms deployed in GIS, hitherto unachievable with conventional approaches. The least-cost-routing procedure begins with the assignment of geocost across the study area. 
Geocost is defined as a numerical penalty score representing the hazard posed to the pipeline by each routing constraint (e.g. slope angle, rugosity, vulnerability to debris flows). All geocosted routing constraints are combined to generate a composite geocost map that is used to compute the least-geocost route between two defined terminals. The analyses were applied to select the most favourable pipeline route for a potential gas development in deep water. The study area is geologically complex, with a series of incised, potentially active canyons carved into a steep escarpment and evidence of extensive debris flows. A similar debris flow in the future could cause significant damage to a poorly placed pipeline. Protruding inter-canyon spurs offer lower-gradient options for ascending an escarpment, but the vulnerability of these spurs to periodic failure is not well understood. Close collaboration between geoscientists, pipeline engineers, geotechnical engineers and, of course, the gas export pipeline operator guided the analyses and the assignment of geocosts. Shorter route length, less severe slope angles, and geohazard avoidance were the primary drivers in identifying the most favourable route.
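The least-cost-routing procedure can be sketched with a composite geocost grid and Dijkstra's algorithm; the toy grid below is an assumption standing in for the real geocosted constraint layers:

```python
import heapq

def least_geocost_path(geocost, start, end):
    """Dijkstra over a grid: entering a cell costs its geocost, so the
    returned route minimises the summed geocost between two terminals."""
    rows, cols = len(geocost), len(geocost[0])
    best = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        cost, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if cost > best[(r, c)]:
            continue                     # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                ncost = cost + geocost[nr][nc]
                if ncost < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ncost
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (ncost, (nr, nc)))
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], best[end]

# composite geocost: a high-cost "canyon" column the route should avoid
geocost = [[1, 9, 1, 1],
           [1, 9, 1, 1],
           [1, 1, 1, 1]]
route, total = least_geocost_path(geocost, (0, 0), (0, 3))
```

The returned route detours around the high-geocost column, which is exactly the behaviour the composite geocost map is meant to induce.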

Keywords: geocost, geohazard, pipeline route determination, pipeline route optimisation, spatial analysis

Procedia PDF Downloads 382
1246 Attention Deficit Disorders (ADD) among Stressed Pre-NCE Students in Federal College of Education, Kano-Nigeria

Authors: A. S. Haruna, M. L. Mayanchi

Abstract:

The Pre-Nigeria Certificate in Education course, otherwise called pre-NCE, is an intensive two-semester course designed to assist candidates who could not meet the requirements for admission into the NCE programme. The task of coping with the stressors of the course can interfere with students' ability to regulate attention skills and stay organized. The main objectives of the study were to find the prevalence of stress, determine the association between stress and ADD, and reveal any gender difference in the prevalence of ADD among stressed pre-NCE students. A cross-sectional correlational design was employed in which 333 (male = 65%; female = 35%) students were proportionately sampled and administered the Stress Assessment Scale [SAS, r = 0.74], and those identified with stress were thereafter rated with the Cognitive Processing Inventory [CPI]. The data collected were used to test the three null hypotheses through the one-sample Kolmogorov-Smirnov (K-S) Z-score, Pearson product-moment correlation coefficient (PPMCC), and t-test statistics, respectively, at the 0.05 confidence level. Results revealed a significant prevalence of stress [Z-calculated = 2.24; Z-critical = ±1.96] and a positive relationship between stress and ADD among pre-NCE students [r-calculated = 0.450; r-critical = 0.138]. However, there was no gender difference in the prevalence of ADD among stressed pre-NCE students in the college [t-calculated = 1.49; t-critical = 1.645]. The study concludes that while stress and ADD prevail among pre-NCE students, there is no gender difference in the prevalence of ADD. Recommendations offered suggest the use of Learner Assistance Programs (LAP) for stress management, and that a teacher-student ratio of 1:25 be adopted in order to cater for stressed pre-NCE students with ADD.
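The PPMCC statistic used to relate stress and ADD can be computed as follows. The sketch is stdlib-only, and the sample scores are invented for illustration; they are not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient (PPMCC):
    covariance of x and y divided by the product of their spreads."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical paired scores: SAS stress ratings vs CPI ADD ratings
stress = [12, 18, 25, 31, 40, 44]
add_score = [5, 9, 11, 15, 19, 22]
r = pearson_r(stress, add_score)
```

A computed r is then compared against the critical value for the sample size (as the study does with r-critical = 0.138) to decide whether the association is significant.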

Keywords: attention deficit disorder, pre-NCE students, stress, Pearson Product Moment Correlation Coefficients (PPMCC)

Procedia PDF Downloads 225
1245 Machine Learning and Deep Learning Approach for People Recognition and Tracking in Crowd for Safety Monitoring

Authors: A. Degale Desta, Cheng Jian

Abstract:

Deep learning application in computer vision is rapidly advancing, giving it the ability to monitor the public and quickly identify potentially anomalous behaviour in crowd scenes. The purpose of the current work is therefore to improve the safety of people in crowd events by detecting panic behaviour, introducing the innovative idea of Aggregation of Ensembles (AOE), which makes use of pre-trained ConvNets and a pool of classifiers to find anomalies in video data with packed scenes. Building on algorithms such as K-means, KNN, CNN, SVD, Faster R-CNN, and YOLOv5, whose architectures learn different levels of semantic representation from crowd videos, the proposed approach leverages an ensemble of various fine-tuned convolutional neural networks (CNN), allowing for the extraction of enriched feature sets. In addition, it uses a long short-term memory (LSTM) neural network to forecast future feature values, and a handcrafted feature that takes the peculiarities of the crowd into consideration to understand human behaviour. Experiments are run on well-known datasets of panic situations to assess the effectiveness and precision of the suggested method. Results reveal that, compared to state-of-the-art methodologies, the system produces better and more promising results in terms of accuracy and processing speed.

Keywords: action recognition, computer vision, crowd detecting and tracking, deep learning

Procedia PDF Downloads 139
1244 Data-Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The calculations done at the beginning of the life of a wind farm are rarely reliable, which makes it important to study the failure and repair rates of wind turbines under various conditions. This miscalculation happens because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data; the result is then converted into times-to-repair and times-to-failure time-series data. Several mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex-system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease the downtime of the turbine.
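The two-parameter Weibull fit via maximum likelihood can be sketched as below: the shape parameter solves the standard profile-likelihood equation (found here by bisection, since the equation is monotone in the shape), and the scale then follows in closed form. The synthetic times-to-failure sample is an assumption; the project's SCADA-derived data is not reproduced:

```python
import math
import random

def weibull_mle(x, lo=0.05, hi=50.0, iters=80):
    """Maximum-likelihood fit of a two-parameter Weibull distribution.
    The shape k solves sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0."""
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def g(k):
        xk = [v ** k for v in x]
        return sum(a * b for a, b in zip(xk, logs)) / sum(xk) - 1 / k - mean_log

    for _ in range(iters):              # g(k) is increasing in k -> bisection
        mid = (lo + hi) / 2
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = (lo + hi) / 2
    scale = (sum(v ** k for v in x) / len(x)) ** (1 / k)
    return k, scale

random.seed(1)
true_k, true_scale = 2.0, 100.0         # hypothetical times-to-failure, days
sample = [true_scale * (-math.log(random.random())) ** (1 / true_k)
          for _ in range(5000)]
k_hat, scale_hat = weibull_mle(sample)
```

A fitted shape near 1 would indicate an approximately constant failure rate (the exponential special case), which is how the two candidate models can be compared on real data.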

Keywords: reliability, Bayesian parameter inference, maximum likelihood estimation, Weibull function, SCADA data

Procedia PDF Downloads 56
1243 Artificial Neural Network to Predict the Optimum Performance of Air Conditioners under Environmental Conditions in Saudi Arabia

Authors: Amr Sadek, Abdelrahaman Al-Qahtany, Turkey Salem Al-Qahtany

Abstract:

In this study, a backpropagation artificial neural network (ANN) model has been used to predict the cooling and heating capacities of air conditioners (AC) under different conditions. Sufficiently large sets of measurement results were obtained from the national energy-efficiency laboratories in Saudi Arabia and were used for the learning process of the ANN model. The parameters affecting the performance of the AC, including temperature, humidity level, specific heat enthalpy indoors and outdoors, and the air volume flow rate of the indoor units, have been considered. These parameters were used as inputs for the ANN model, while the cooling and heating capacity values were set as the targets. A backpropagation ANN model with two hidden layers and one output layer could successfully correlate the input parameters with the targets. The characteristics of the ANN model, including the input-processing, transfer, neuron-distance, topology, and training functions, are discussed. The performance of the ANN model was monitored over the training epochs and assessed using the mean squared error function. The model was then used to predict the performance of the AC under conditions that were not included in the measurement results, and its optimum performance was predicted under the different environmental conditions in Saudi Arabia. The uncertainty of the ANN model predictions has been evaluated, taking into account the randomness of the data and incomplete learning.
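A backpropagation ANN of the kind described can be sketched from scratch in NumPy. The single-hidden-layer network and synthetic two-input target below are deliberate simplifications of the paper's two-hidden-layer capacity model; all data, sizes, and learning settings are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# toy inputs standing in for (temperature, humidity, ...) -> capacity
X = rng.uniform(-1, 1, size=(200, 2))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1]       # assumed synthetic target

# one hidden tanh layer, linear output, full-batch gradient descent
W1 = 0.5 * rng.standard_normal((2, 8))
b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal(8)
b2 = 0.0
lr = 0.2

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
initial_mse = float(((pred0 - y) ** 2).mean())
for _ in range(3000):
    h, pred = forward(X)
    err = pred - y                       # dMSE/dpred (up to a constant)
    grad_W2 = h.T @ err / len(y)
    grad_b2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)   # backprop through tanh
    grad_W1 = X.T @ dh / len(y)
    grad_b1 = dh.mean(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
_, pred = forward(X)
final_mse = float(((pred - y) ** 2).mean())
```

As in the study, the mean squared error monitored over the epochs is the training criterion; here it drops by orders of magnitude on the toy target.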

Keywords: artificial neural network, uncertainty of model predictions, efficiency of air conditioners, cooling and heating capacities

Procedia PDF Downloads 48
1242 Pineapple Waste Valorization through Biogas Production: Effect of Substrate Concentration and Microwave Pretreatment

Authors: Khamdan Cahyari, Pratikno Hidayat

Abstract:

Indonesia produced more than 1.8 million tons of pineapple fruit in 2013, much of which turned into waste due to industrial processing, deterioration, and low quality. It was estimated that this waste accounted for more than 40 percent of harvested fruits. In addition, pineapple leaves were one of the biomass wastes from pineapple farming land, contributing an even higher percentage. Most of the waste was simply dumped into landfill without proper pretreatment, causing severe environmental problems. This research was meant to valorize pineapple waste by producing the renewable energy source biogas through a mesophilic (30℃) anaerobic digestion process. In particular, it investigated the effect of the substrate concentration of pineapple fruit waste, i.e. peel and core, as well as the effect of microwave pretreatment of pineapple leaf waste. The substrate concentration was set at 12, 24, and 36 g VS/litre of culture, whereas 800-Watt microwave pretreatment was conducted for 2 and 5 minutes. Optimum biogas production was obtained at a concentration of 24 g VS/l with a biogas yield of 0.649 litre/g VS (45%v CH4), and the 2-minute microwave pretreatment performed better than the 5-minute one due to shorter exposure to microwave heat. These results suggest that valorization of pineapple waste can be carried out through biogas production under the aforementioned process conditions. Application of this method can both reduce the environmental problems of the waste and produce the renewable biogas needed to fulfil the local energy demand of pineapple farming areas.

Keywords: pineapple waste, substrate concentration, microwave pretreatment, biogas, anaerobic digestion

Procedia PDF Downloads 555
1241 Reconfigurable Device for 3D Visualization of Three Dimensional Surfaces

Authors: Robson da C. Santos, Carlos Henrique de A. S. P. Coutinho, Lucas Moreira Dias, Gerson Gomes Cunha

Abstract:

The article describes the development of an augmented-reality 3D display, based on the control of servo motors and the projection of images onto the model with the aid of a video projector. Augmented reality is a branch that explores multiple approaches to augmenting the real-world view with additional information overlaid on the real scene. The article presents the broad use of electrical, electronic, mechanical, and industrial automation for geospatial visualizations, with applications in mathematical models for the visualization of functions, 3D surface graphics, and volumetric rendering that are currently seen as 2D layers. One application is a 3D display for the representation and visualization of Digital Terrain Models (DTM) and Digital Surface Models (DSM), which can be applied to the identification of canyons in the marine area of the Campos Basin, Rio de Janeiro, Brazil. It can likewise visualize regions subject to landslides, as in the Serra do Mar - Angra dos Reis and other mountain ('serrana') cities, both in the State of Rio de Janeiro. Hence, loss of human life and leakage of oil from pipelines buried in these regions may be anticipated in advance. The physical design consists of a table with a 9 x 16 matrix of servo motors, totalling 144 servos; a mesh is laid over the servo motors for visualization of the models projected by the projector. Each model, after image pre-processing, is sent to a server to be converted and viewed using software developed in the C# programming language.

Keywords: visualization, 3D models, servo motors, C# programming language

Procedia PDF Downloads 322
1240 PathoPy2.0: Application of Fractal Geometry for Early Detection and Histopathological Analysis of Lung Cancer

Authors: Rhea Kapoor

Abstract:

Fractal dimension provides a way to characterize non-geometric shapes like those found in nature. The purpose of this research is to estimate the Minkowski fractal dimension of human lung images for early detection of lung cancer. Lung cancer is the leading cause of death among all types of cancer, and early histopathological analysis will help reduce deaths primarily due to late diagnosis. A Python application program, PathoPy2.0, was developed for analyzing medical images in pixelated format and estimating the Minkowski fractal dimension using a new box-counting algorithm that allows windowing of images for more accurate calculation in the suspected areas of cancerous growth. Benchmark geometric fractals were used to validate the accuracy of the program, and changes in the fractal dimension of lung images were used to indicate the presence of abnormalities in the lung. The accuracy of the program for the benchmark examples was within 93–99% of the known values of the fractal dimensions. Fractal dimension values were then calculated for lung images, from the National Cancer Institute, taken over time to detect the presence of cancerous growth. For example, the fractal dimension for a given lung increased from 1.19 to 1.27 due to cancerous growth, a significant change given that the fractal dimension of a 2-D image lies between 1 and 2. Based on the results obtained on many lung test cases, it was concluded that the fractal dimension of human lungs can be used to diagnose lung cancer early. The ideas behind PathoPy2.0 can also be applied to study patterns in the electrical activity of the human brain and DNA matching.
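The box-counting estimate at the heart of this approach can be sketched in a few lines. This is a generic illustration, not PathoPy2.0's actual algorithm (which adds windowing over suspected regions); the pixel-set representation and box sizes are assumptions:

```python
import math

def box_count(pixels, size):
    """Count the size x size boxes containing at least one foreground pixel."""
    return len({(x // size, y // size) for x, y in pixels})

def minkowski_dimension(pixels, sizes=(1, 2, 4, 8, 16)):
    """Estimate fractal dimension as the least-squares slope of
    log N(s) versus log(1/s) over the chosen box sizes."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(pixels, s)) for s in sizes]
    n = len(sizes)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

A filled 64 x 64 square yields a dimension of 2 and a straight line of pixels yields 1, matching the 1-to-2 range for 2-D images cited in the abstract.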

Keywords: fractals, histopathological analysis, image processing, lung cancer, Minkowski dimension

Procedia PDF Downloads 152
1239 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data

Authors: M. Mueller, M. Kuehn, M. Voelker

Abstract:

In small and medium-sized enterprises (SMEs), a lack of data makes it challenging to create a well-grounded and reliable basis for process analysis, optimization and planning. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology with the objective of developing an SME-appropriate approach for efficient, temporarily feasible data collection and evaluation in flexible production and logistics systems as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth Low Energy (BLE)-based transmitters, so-called beacons, and smart mobile devices (SMDs), e.g. smartphones, as receivers, between which distance data can be measured and motion profiles derived. The distance is determined using the Received Signal Strength Indicator (RSSI), a measure of signal field strength between transmitter and receiver. The focus is on the development of a software-based methodology for interpreting the relative movements of transmitters and receivers based on distance data. The main research effort lies in the selection and implementation of pattern recognition methods for automatic process recognition as well as methods for the visualization of relative distance data. Because the database is already categorized by process type, classification methods (e.g. Support Vector Machines) from the field of supervised learning are used.
Achieving the necessary data quality requires the selection of suitable methods and filters for smoothing the signal variations that occur in the RSSI, the integration of methods for determining correction factors that depend on possible signal interference sources (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, correction models and methods used for visualizing the position profiles. Studies have already shown that the accuracy of classification algorithms can be improved by up to 30% through selected parameter variation; similar potential can be observed with parameter variation of the methods and filters for signal smoothing. There is therefore increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including the methods for signal smoothing, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with database connections: one performs the automated assignment of defined process classes to distance data using selected classification algorithms; the other provides visualization and reporting through a graphical user interface (GUI).
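As a hedged illustration of the RSSI-to-distance step, a common choice (not necessarily the model used in this work) is the log-distance path-loss model, preceded by a simple smoothing filter; the `tx_power` and `n` values are calibration assumptions:

```python
def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: distance in metres from an RSSI reading.

    tx_power is the expected RSSI at 1 m; n is the path-loss exponent
    (~2 in free space, higher indoors). Both are calibration assumptions.
    """
    return 10 ** ((tx_power - rssi) / (10 * n))

def moving_average(samples, window=5):
    """Simple filter to smooth RSSI fluctuations before conversion."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1) : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

Smoothing before conversion matters because the exponential in the path-loss model amplifies raw RSSI jitter into large distance errors.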

Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing

Procedia PDF Downloads 110
1238 An AI-Based Dynamical Resource Allocation Calculation Algorithm for Unmanned Aerial Vehicle

Authors: Zhou Luchen, Wu Yubing, Burra Venkata Durga Kumar

Abstract:

As the scale of the network becomes larger and more complex than before, the density of user devices is also increasing. The development of Unmanned Aerial Vehicle (UAV) networks makes it possible to collect and transform data in an efficient way by using software-defined network (SDN) technology. This paper proposes a three-layer distributed and dynamic cluster architecture that manages UAVs by using an AI-based resource allocation calculation algorithm to address the network overloading problem. By separating the services of each UAV, the UAV hierarchical cluster system performs the main function of reducing the network load and transferring user requests, with three sub-tasks: data collection, communication channel organization, and data relaying. In this cluster, a head node and a vice head node UAV are selected considering each device's processor (CPU), working memory (RAM), permanent memory (ROM), battery charge, and capacity. The vice head node acts as a backup that stores all the data in the head node. The k-means clustering algorithm is used to detect high-load regions and form the layered UAV clusters. The whole process of detecting high-load areas, forming and selecting UAV clusters, and moving the selected UAV cluster to that area is proposed as the traffic offloading algorithm.
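The cluster-formation step relies on plain k-means over device positions. As a generic sketch (the 2-D coordinates and parameter values are illustrative assumptions, not the paper's implementation):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over 2-D points: returns centers and per-point labels."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center
        # by squared Euclidean distance.
        for i, (x, y) in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: (x - centers[c][0]) ** 2 + (y - centers[c][1]) ** 2,
            )
        # Update step: move each center to the mean of its members.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centers[c] = (
                    sum(p[0] for p in members) / len(members),
                    sum(p[1] for p in members) / len(members),
                )
    return centers, labels
```

Dense groups of user devices end up in the same cluster, and the resulting centers mark the high-load regions toward which a UAV cluster would be dispatched.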

Keywords: k-means, resource allocation, SDN, UAV network, unmanned aerial vehicles

Procedia PDF Downloads 87
1237 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising

Authors: Jianwei Ma, Diriba Gemechu

Abstract:

In seismic data processing, attenuation of random noise is the basic step to improve the quality of data for further application of seismic data in exploration and development in different gas and oil industries. The signal-to-noise ratio of the data also highly determines the quality of seismic data. This factor affects the reliability as well as the accuracy of the seismic signal during interpretation for different purposes in different companies. To use seismic data for further application and interpretation, we need to improve the signal-to-noise ratio while attenuating random noise effectively. To improve the signal-to-noise ratio and attenuate seismic random noise while preserving important features and information about seismic signals, we introduce the concept of an anisotropic total fractional order denoising algorithm. The anisotropic total fractional order variation model defined in fractional order bounded variation is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised result is compared with F-X deconvolution and the non-local means denoising algorithm.
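The split Bregman algorithm owes its efficiency to the fact that the L1 (variation) subproblem it isolates has a closed-form solution via the soft-shrinkage operator. As a generic sketch (this is the standard operator from the split Bregman literature, not code from the paper):

```python
def shrink(x, threshold):
    """Soft-shrinkage (soft-thresholding) operator: the closed-form solution
    of the scalar L1 subproblem min_d |d| * threshold + (d - x)**2 / 2,
    applied in every split Bregman iteration."""
    if x > threshold:
        return x - threshold
    if x < -threshold:
        return x + threshold
    return 0.0
```

Values below the threshold, which behave like noise in the (fractional) gradient domain, are set to zero, while larger values, corresponding to genuine signal features, are merely shrunk, which is how edges and events are preserved.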

Keywords: anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, split Bregman algorithm

Procedia PDF Downloads 192
1236 Anti-Nutritional Factors, In-Vitro Trypsin, Chymotrypsin and Peptidase Multi Enzyme Protein Digestibility of Some Melon (Egusi) Seeds and Their Protein Isolates

Authors: Joan O. Ogundele, Aladesanmi A. Oshodi, Adekunle I. Amoo

Abstract:

In-vitro multi-enzyme protein digestibility (IVMPD) and some anti-nutritional factors (ANF) of five melon (egusi) seed flours (MSF) and their protein isolates (PI) were determined. The PI have potential comparable to that of soya bean. It is important to know the IVMPD and ANF of these protein sources to ensure their safety when adopted as alternate protein sources to substitute for cow milk, which is relatively expensive in Nigeria. Standard methods were used to produce PI of Citrullus colocynthis, Citrullus vulgaris, African Wine Kettle gourd (Lagenaria siceraria I), Basket Ball gourd (Lagenaria siceraria II) and Bushel Giant gourd (Lagenaria siceraria III) seeds and to determine the ANF and IVMPD of the MSF and PI, unheated and at 37 °C. The multi-enzymes used were trypsin, chymotrypsin and peptidase. IVMPD of the MSF ranged from (70.67 ± 0.70)% (C. vulgaris) to (72.07 ± 1.79)% (L. siceraria I), while that of their PI ranged from 74.33% (C. vulgaris) to 77.55% (L. siceraria III). IVMPD of the PI were higher than those of the MSF. Heating increased the IVMPD of the MSF to an average value of 79.40% and that of the PI to an average of 84.14%. The average ANF in the MSF were tannin (0.11 mg/g) and phytate (0.23%). Differences in the IVMPD of the MSF and their PI at different temperatures may arise from processing conditions that alter the release of amino acids from proteins by enzymatic processes. The ANF in the MSF were relatively low and were found to be even lower in the PI, therefore making the PI safer for human consumption as an alternate source of protein.

Keywords: anti-nutrients, enzymatic protein digestibility, melon (egusi), protein isolates

Procedia PDF Downloads 91
1235 Treatment of Leather Industry Wastewater with Advance Treatment Methods

Authors: Seval Yilmaz, Filiz Bayrakci Karel, Ali Savas Koparal

Abstract:

Products made from leather have long been indispensable to consumers. Various chemicals are used to enhance the durability of end-products in the processing of leather products. The wastewaters from the leather industry, which contain these chemicals, exhibit toxic effects on the receiving environment and threaten the natural ecosystem. In this study, leather industry wastewater (LIW), which has high loads of contaminants, was treated using advanced treatment techniques instead of conventional methods. The performance of batch electrooxidation (EO) using boron-doped diamond (BDD) electrodes in a monopolar configuration for the removal of chemical oxygen demand (COD) from LIW was investigated. The influences of electrolysis time, current density (5, 10, 20, 30, and 50 mA/cm²) and initial pH (3.80, the natural pH of the LIW; 7; and 9) on removal efficiency were investigated in a batch stirred cell to determine the best treatment conditions. Since the current density applied to an electrochemical reactor is directly proportional to the consumption of electric energy, electrical energy consumption was monitored during the experiments. The best experimental conditions obtained in the electrochemical studies were as follows: electrolysis time = 60 min, current density = 30.0 mA/cm², pH 7. Using these parameters, a COD removal rate of 53.59% was achieved for LIW, and the total energy consumption was 13.03 kWh/m³. It is concluded that the electrooxidation process constitutes a plausible and developable method for the treatment of LIW.
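The specific energy figure reported above is conventionally computed from cell voltage, current, electrolysis time and treated volume. As a hedged sketch (the formula E = U·I·t/V is the standard one for electrochemical treatment; the parameter names are ours, not the paper's):

```python
def specific_energy_kwh_per_m3(cell_voltage_v, current_a, time_h, volume_l):
    """Specific energy consumption E = U * I * t / V, reported in kWh/m3.

    cell_voltage_v: average cell voltage in volts
    current_a:      applied current in amperes
    time_h:         electrolysis time in hours
    volume_l:       treated wastewater volume in litres
    """
    volume_m3 = volume_l / 1000.0          # litres -> cubic metres
    energy_kwh = cell_voltage_v * current_a * time_h / 1000.0  # Wh -> kWh
    return energy_kwh / volume_m3
```

Because energy scales with current at fixed voltage and time, this is why the higher current densities tested trade faster COD removal against higher kWh/m³.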

Keywords: BDD electrodes, COD removal, electrochemical treatment, leather industry wastewater

Procedia PDF Downloads 143
1234 Electrodeposition and Selenization of CuIn Alloys for the Synthesis of Photoactive CuIn1-xGaxSe2 (CIGS) Thin Films

Authors: Mohamed Benaicha, Mahdi Allam

Abstract:

A new two-stage electrochemical process is studied as a safe, large-area and low-processing-cost technique for the production of semiconducting CuInSe2 (CIS) thin films. CuIn precursors were first potentiostatically electrodeposited onto molybdenum substrates from an acidic thiocyanate electrolyte. In a second stage, the prepared metallic CuIn layers were used as substrates in the selenium electrochemical deposition system and subjected to a thermal treatment in a vacuum atmosphere to eliminate binary phases through the reaction of the Cu2-xSe and InxSey selenides, leading to the formation of the CuInSe2 thin film. Electrochemical selenization from an aqueous electrolyte is introduced as an alternative to the toxic and hazardous H2Se or Se vapor-phase selenization used in physical techniques. In this study, the influence of film deposition parameters such as bath composition, temperature and potential on film properties was examined. The electrochemical, morphological, structural and compositional properties of the electrodeposited thin films were characterized using various techniques. Results of cyclic and stripping-cyclic voltammetry (CV, SCV), scanning electron microscopy (SEM) and energy-dispersive X-ray microanalysis (EDX) investigations revealed good reproducibility and homogeneity of the film composition. Thereby, optimal technological parameters for the electrochemical production of CuIn and Se as precursors for CuInSe2 thin layers are determined.

Keywords: photovoltaic, CIGS, copper alloys, electrodeposition, thin films

Procedia PDF Downloads 443
1233 Information Pollution: Exploratory Analysis of Subs-Saharan African Media’s Capabilities to Combat Misinformation and Disinformation

Authors: Muhammed Jamiu Mustapha, Jamiu Folarin, Stephen Obiri Agyei, Rasheed Ademola Adebiyi, Mutiu Iyanda Lasisi

Abstract:

The role of information in societal development and growth cannot be over-emphasized. Managing the flow of information has remained an age-long strategy for building an egalitarian society; the same flow has also become a tool for throwing society into chaos and anarchy. Information has been adopted as a weapon of war and a veritable instrument of psychological warfare with a variety of uses. That is why some scholars posit that information can be deployed as a weapon to wreak "mass destruction" or promote "mass development". When used as a tool of destruction, its effect on society is like that of an atomic bomb which, once released, pollutes the air and suffocates the people. Technological advancement has further exposed the latent power of information, and many societies seem overwhelmed by its negative effects. While information remains one of the bedrocks of democracy, the information ecosystem across the world is currently facing a more difficult battle than ever before due to information pluralism and technological advancement. The more the agents involved try to combat its menace, the more difficult and complex it proves to curb. In a region like Africa, where fragile democracies are entangled with the complexities of multiple religions and cultures, inter-tribal tensions, and ongoing unresolved issues, it is important to pay critical attention to information disorder and find appropriate ways to curb or mitigate its effects. The media, as the middleman in the distribution of information, need to build capacities and capabilities to separate the whiff of misinformation and disinformation from the grains of truthful data. Initial observation suggests that efforts aimed at fighting information pollution have not considered the resilience of media organisations against this disorder.
Apparently, the efforts, resources and technologies adopted for the conception, production and spread of information pollution are much more sophisticated than the approaches used to suppress or even reduce its effects on society. Thus, this study interrogates the phenomenon of information pollution and the capabilities of select media organisations in Sub-Saharan Africa. In doing this, the following questions are probed: What actions are the media taking to curb the menace of information pollution? Which of these actions are working, and how effective are they? And which of the actions are not working, and why? Adopting quantitative and qualitative approaches and anchored on Dynamic Capability Theory, the study aims to dig up insights to further understand the complexities of information pollution and the media's capabilities and strategic resources for managing misinformation and disinformation in the region. The quantitative approach involves surveys, using questionnaires to gather data from journalists on their understanding of misinformation/disinformation and their capabilities to gatekeep. Case analysis of select media and content analysis of their strategic resources for managing misinformation and disinformation are adopted, while the qualitative approach involves in-depth interviews for a more robust analysis. The study is critical to the fight against information pollution for a number of reasons. First, it is a novel attempt to document the level of media capabilities to fight the phenomenon of information disorder. Second, the study will give the region a clear understanding of the capabilities of existing media organisations to combat misinformation and disinformation in the countries that make up the region. Recommendations emanating from the study could be used to initiate, intensify or review existing approaches to combat the menace of information pollution in the region.

Keywords: disinformation, information pollution, misinformation, media capabilities, sub-Saharan Africa

Procedia PDF Downloads 147
1232 Ferulic Acid-Grafted Chitosan: Thermal Stability and Feasibility as an Antioxidant for Active Biodegradable Packaging Film

Authors: Sarekha Woranuch, Rangrong Yoksan

Abstract:

Active packaging has been developed based on the incorporation of certain additives, in particular antimicrobial and antioxidant agents, into packaging systems to maintain or extend product quality and shelf-life. Ferulic acid is one of the most effective natural phenolic antioxidants and has been used in food, pharmaceutical and active packaging film applications. However, most phenolic compounds are sensitive to oxygen, light and heat; their activities are thus lost during product formulation and processing. Grafting ferulic acid onto a polymer is an alternative that reduces its loss under thermal processes. Therefore, the objectives of the present research were to study the thermal stability of ferulic acid after grafting onto chitosan and to investigate the possibility of using ferulic acid-grafted chitosan (FA-g-CTS) as an antioxidant for active biodegradable packaging film. FA-g-CTS was incorporated into biodegradable film via a two-step process, i.e. compounding extrusion at temperatures up to 150 °C followed by blown film extrusion at temperatures up to 175 °C. Although incorporating FA-g-CTS at 0.02–0.16% (w/w) decreased the water vapor barrier property and reduced extensibility, the films showed improved oxygen barrier property and antioxidant activity. The radical scavenging activity and reducing power of the film containing 0.04% (w/w) FA-g-CTS were higher than those of the neat film by about 254% and 94%, respectively. Tensile strength and rigidity of the films were not significantly affected by adding FA-g-CTS at 0.02–0.08% (w/w). The results indicated that FA-g-CTS could potentially be used as an antioxidant for active packaging film.

Keywords: active packaging film, antioxidant activity, chitosan, ferulic acid

Procedia PDF Downloads 486
1231 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification

Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro

Abstract:

Voice recognition algorithms such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the digital divide of artificial intelligence in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a deep learning model that classifies user responses as inputs for an interactive voice response system. A dataset of the Wolof language words 'yes' and 'no' was collected as audio recordings. A two-stage data augmentation approach is adopted to enlarge the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-Frequency Cepstral Coefficients (MFCCs) are implemented. Convolutional Neural Networks (CNNs) have proven to be very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. For voice response classification, the recordings are transformed into sound-frequency feature spectra, and an image classification methodology is then applied using a deep CNN model. The inference model of this trained and reusable Wolof voice response recognition system can be integrated with many applications on both web and mobile platforms.
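MFCC feature engineering rests on the mel scale, which warps frequency in hertz into a perceptually even spacing before the cepstral coefficients are taken. A minimal sketch of the standard mapping (the usual 2595/700 constants from the common MFCC formulation, not values taken from this paper):

```python
import math

def hz_to_mel(f_hz):
    """Standard mel-scale mapping used when building MFCC filterbanks."""
    return 2595.0 * math.log10(1.0 + f_hz / 700.0)

def mel_to_hz(m):
    """Inverse mapping, used to place mel filterbank edges back in hertz."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
```

Filter centers spaced evenly in mel land close together at low frequencies and far apart at high frequencies, mirroring how the ear resolves the spectrum of a spoken 'yes' or 'no'.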

Keywords: automatic speech recognition, interactive voice response, voice response recognition, Wolof word classification

Procedia PDF Downloads 93
1230 Spatial Information and Urbanizing Futures

Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini

Abstract:

Today, municipalities are searching for new tools to increase public participation at different levels of urban planning. This approach to urban planning involves the community in the planning process through participatory approaches instead of the long-established top-down planning methods. These tools can be used to capture the particular problems of urban furniture from the residents' point of view. One tool designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up their observations and spatial knowledge regarding the main problems of the city, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research, we develop a PPGIS using Web 2.0 to collect voluntary geodata and to perform spatial analysis based on Spatial OnLine Analytical Processing (SOLAP) and Spatial Data Mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution and clusters of reported problems. The system is implemented in a case study area in Tehran, Iran, and the challenges to making it applicable, as well as its potential for real urban planning, have been evaluated. It helps decision makers to better understand, plan and allocate scarce resources for providing the most requested urban furniture.

Keywords: PPGIS, spatial information, urbanizing futures, urban planning

Procedia PDF Downloads 707
1229 Reliability-Centered Maintenance Application for the Development of Maintenance Strategy for a Cement Plant

Authors: Nabil Hameed Al-Farsi

Abstract:

This study’s main goal is to develop a model and a maintenance strategy for a cement factory, the Arabian Cement Company's Rabigh Plant. The proposed work depends on a reliability-centred maintenance (RCM) approach to develop a strategy and maintenance schedule that increases the reliability of the production system components, thus ensuring continuous productivity. Cost-effective maintenance of the plant's dependability performance is the key goal of reliability-based maintenance. The cement plant consists of 7 important process steps, so developing a maintenance plan based on the RCM method comprises 10 corresponding steps, from selecting units and data through performing and updating the model. The processing unit chosen for this case analysis is the calciner unit; for the model's validation, failure-data history acquired from the maintenance department of Travancore Titanium Products Ltd (TTP) was used. After applying the proposed model, the results of the maintenance simulation justified reconsidering the plant's existing scheduled maintenance policy. The results indicate the need for preventive maintenance for all Class A criticality equipment instead of planned maintenance, with breakdown maintenance for all other equipment depending on its criticality and an FMEA report. Consequently, the additional cost of preventive maintenance would be offset by the cost savings from breakdown maintenance for the remaining equipment.

Keywords: engineering, reliability, strategy, maintenance, failure modes, effects and criticality analysis (FMEA)

Procedia PDF Downloads 147
1228 A BERT-Based Model for Financial Social Media Sentiment Analysis

Authors: Josiel Delgadillo, Johnson Kinyua, Charles Mutigwe

Abstract:

The purpose of sentiment analysis is to determine the sentiment strength (e.g., positive, negative, neutral) of a textual source for good decision-making. Natural language processing in domains such as financial markets requires knowledge of domain ontology, and pre-trained language models, such as BERT, have made significant breakthroughs in various NLP tasks by training on large-scale unlabeled generic corpora such as Wikipedia. However, sentiment analysis is a strongly domain-dependent task. The rapid growth of social media has given users a platform to share their experiences and views about products, services, and processes, including financial markets. StockTwits and Twitter are social networks that allow the public to express their sentiments in real time. Hence, leveraging the success of unsupervised pre-training and the large amount of financial text available on social media platforms could potentially benefit a wide range of financial applications. This work focuses on sentiment analysis using social media text from platforms such as StockTwits and Twitter. To meet this need, SkyBERT, a domain-specific language model pre-trained and fine-tuned on financial corpora, has been developed. The results show that SkyBERT outperforms current state-of-the-art models in financial sentiment analysis. Extensive experimental results demonstrate the effectiveness and robustness of SkyBERT.
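At inference time, a fine-tuned classifier of this kind emits one logit per sentiment class, and a softmax turns them into the sentiment strengths the abstract mentions. A minimal, model-agnostic sketch (the three-class label order is an assumption for illustration, not SkyBERT's actual head):

```python
import math

def softmax(logits):
    """Convert classifier-head logits into a probability distribution
    over sentiment classes (stabilised with the subtract-the-max trick)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

CLASSES = ("negative", "neutral", "positive")  # assumed label order

def predict_sentiment(logits):
    """Return the most probable sentiment label and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return CLASSES[best], probs[best]
```

The returned probability doubles as the sentiment strength, so downstream trading or decision logic can threshold on confidence rather than acting on every prediction.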

Keywords: BERT, financial markets, Twitter, sentiment analysis

Procedia PDF Downloads 132