Search results for: real time data processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 40599

39879 Crop Classification using Unmanned Aerial Vehicle Images

Authors: Iqra Yaseen

Abstract:

Image processing in the context of computer vision, one of the well-known areas of computer science and engineering, has been essential to automation. In remote sensing, medical science, and many other fields, it has made it easier to uncover previously undiscovered facts. Grading of diverse items is now possible because of neural network algorithms, categorization, and digital image processing. Its use in the classification of agricultural products, particularly in the grading of seeds or grains and their cultivars, is widely recognized. A grading and sorting system enables the preservation of time, consistency, and uniformity. Global population growth has led to an increase in demand for food staples, biofuel, and other agricultural products. To meet this demand, available resources must be used and managed more effectively. Image processing is growing rapidly in the field of agriculture. Many applications have been developed using this approach for crop identification and classification, land and disease detection, and for measuring other crop parameters. Vegetation localization is the basis for performing these tasks, since it identifies the areas where a crop is present. The productivity of the agriculture industry can be increased via image processing based on Unmanned Aerial Vehicle (UAV) and satellite photography. In this paper we apply machine learning techniques, including Convolutional Neural Networks (CNN), deep learning, image processing, classification, and You Only Look Once (YOLO), to a UAV imaging dataset to divide crops into distinct groups and choose the best way to use them.
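
For illustration, the following is a minimal sketch of a convolutional classifier for UAV crop patches of the kind discussed above; it is not the authors' implementation, and the layer sizes, patch size, and class count are placeholder assumptions.

```python
# Minimal sketch of a CNN crop-patch classifier (illustrative only; the
# architecture and number of crop classes are assumptions).
import torch
import torch.nn as nn

class CropPatchCNN(nn.Module):
    def __init__(self, num_classes: int = 4):  # assumed number of crop classes
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of UAV image patches, shape (N, 3, H, W)
        return self.classifier(self.features(x).flatten(1))

model = CropPatchCNN()
dummy_batch = torch.randn(8, 3, 64, 64)   # stand-in for real UAV patches
logits = model(dummy_batch)
print(logits.shape)  # (8, 4): one score per crop class
```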

Keywords: image processing, UAV, YOLO, CNN, deep learning, classification

Procedia PDF Downloads 107
39878 A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment

Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay

Abstract:

Machine learning (ML) and deep learning (DL) are predominantly used in image/video processing, natural language processing (NLP), and audio and speech recognition, but far less in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer constructed using ML/DL to detect system failure. The proposed system detects system failure by evaluating the performance metrics of an IoT service deployment under a constrained infrastructure environment. The system has been tested on a manually annotated data set containing different metrics of the system, such as number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments such as the edge (an Atom-based gateway) and the cloud (AWS EC2). The main challenge in developing such a system is that the classification accuracy should be 100%, as an error in the system degrades the service performance and consequently affects the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers work with 100% accuracy on the data set of nearly 4,000 samples captured within the organization.
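
A minimal sketch of a classifier trained on tabular performance metrics of the kind listed above follows; the column layout, labels, and random-forest choice are assumptions, not the authors' annotated data or model.

```python
# Sketch of a failure classifier on system performance metrics
# (synthetic data; column meanings and the model choice are assumptions).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# columns: threads, throughput, avg response time, CPU %, memory %, net I/O
X = rng.random((4000, 6))
y = (X[:, 3] > 0.9).astype(int)  # toy rule standing in for manual annotation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```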

Keywords: machine learning, system performance, performance metrics, IoT, edge

Procedia PDF Downloads 195
39877 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture created using AWS services that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 64
39876 Topographic Mapping of Farmland by Integration of Multiple Sensors on Board Low-Altitude Unmanned Aerial System

Authors: Mengmeng Du, Noboru Noguchi, Hiroshi Okamoto, Noriko Kobayashi

Abstract:

This paper introduces a topographic mapping system with time-saving and simplicity advantages, based on the integration of Light Detection and Ranging (LiDAR) data and Post Processing Kinematic Global Positioning System (PPK GPS) data. The topographic mapping system uses a low-altitude Unmanned Aerial Vehicle (UAV) as a platform to conduct land surveys in a low-cost, efficient, and fully autonomous manner. An experiment in a small-scale sugarcane farmland was conducted in Queensland, Australia. We then synchronized LiDAR distance measurements, corrected using attitude information from a gyroscope, with PPK GPS coordinates to generate precision topographic maps, which can be further utilized for applications such as precise land leveling and drainage management. The results indicated that the LiDAR distance measurements and PPK GPS altitude reached an accuracy of better than 0.015 m.
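
A simplified sketch of the fusion step described above is shown below: projecting the attitude-corrected LiDAR range onto the vertical and subtracting it from the PPK GPS altitude. It assumes time-synchronized samples and a nadir-pointing LiDAR, which is a simplification of the authors' processing chain, and all numbers are invented.

```python
# Simplified sketch of fusing attitude-corrected LiDAR ranges with PPK GPS
# altitudes to estimate ground elevation (assumes synchronized samples and a
# nadir-pointing LiDAR; not the authors' exact processing chain).
import numpy as np

def ground_elevation(gps_alt, lidar_range, roll, pitch):
    """gps_alt and lidar_range in metres; roll and pitch in radians."""
    # Project the slant range onto the vertical using the gyroscope attitude.
    vertical_range = lidar_range * np.cos(roll) * np.cos(pitch)
    return gps_alt - vertical_range

gps_alt = np.array([52.310, 52.305, 52.298])      # PPK GPS altitudes (m)
lidar_range = np.array([50.12, 50.10, 50.09])     # LiDAR distances (m)
roll = np.radians([1.5, 1.2, 0.9])
pitch = np.radians([0.8, 1.0, 1.1])
print(ground_elevation(gps_alt, lidar_range, roll, pitch))
```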

Keywords: land survey, light detection and ranging, post processing kinematic global positioning system, precision agriculture, topographic map, unmanned aerial vehicle

Procedia PDF Downloads 236
39875 Multi-Class Text Classification Using Ensembles of Classifiers

Authors: Syed Basit Ali Shah Bukhari, Yan Qiang, Saad Abdul Rauf, Syed Saqlaina Bukhari

Abstract:

Text classification is the methodology of classifying a given text into the appropriate category from a given set of categories. It is vital to use a proper set of pre-processing, feature selection, and classification techniques to achieve this purpose. In this paper we use different ensemble techniques, along with variations in the feature selection parameters, to observe the change in overall accuracy and in individual class-based measures, including the precision of each text category. After subjecting our data to pre-processing and feature selection techniques, different individual classifiers were tested first, and the classifiers were then combined into ensembles to increase their accuracy. We also studied the impact of decreasing the number of classification categories on the overall accuracy. Text classification is widely used in sentiment analysis on social media sites such as Twitter for gauging people’s opinions about a cause, and for analyzing customers’ reviews of products or services. Opinion mining is a vital task in data mining, and text categorization is the backbone of opinion mining.
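
As an illustration of the ensemble setup described above, here is a minimal sketch combining TF-IDF features with bagging and boosting ensembles; the toy corpus, categories, and parameters are assumptions and do not reflect the paper's dataset.

```python
# Sketch of multi-class text classification with bagging and boosting ensembles
# (toy corpus; the paper's dataset, categories, and parameters differ).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.pipeline import make_pipeline

texts = ["great phone battery", "terrible customer service",
         "stock prices fell today", "the team won the match"]
labels = ["product", "service", "finance", "sports"]

for ensemble in (BaggingClassifier(n_estimators=50),   # bagged decision trees
                 AdaBoostClassifier(n_estimators=50)):  # boosted decision stumps
    clf = make_pipeline(TfidfVectorizer(max_features=5000), ensemble)
    clf.fit(texts, labels)
    print(type(ensemble).__name__, clf.predict(["refund for broken phone"]))
```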

Keywords: Natural Language Processing, Ensemble Classifier, Bagging Classifier, AdaBoost

Procedia PDF Downloads 232
39874 A Review on Big Data Movement with Different Approaches

Authors: Nay Myo Sandar

Abstract:

With the growth of technologies and applications, a large amount of data is being produced at an increasing rate from various resources such as social media networks, sensor devices, and other information-serving devices. This massive, complex, and exponentially growing collection of datasets is called big data. Traditional database systems cannot store and process such data due to its size and complexity. Consequently, cloud computing is a potential solution for data storage and processing, since it can provide a pool of server and storage resources. However, moving large amounts of data to and from the cloud is a challenging issue, since it can incur high latency due to the large data size. With respect to the big data movement problem, this paper reviews the literature of previous works, discusses research issues, and identifies approaches for dealing with the big data movement problem.

Keywords: Big Data, Cloud Computing, Big Data Movement, Network Techniques

Procedia PDF Downloads 87
39875 Discovering Event Outliers for Drugs as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period, which is too long. A critical issue is therefore that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest using a prediction of average sales per sales day, which helps protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs in a one-month time frame. The authors detected that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) high demand variability drugs have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) mid demand variability drugs have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though there are some outlier detection methods for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs’ test, a real-life data cleaning method, for outlier adjustment. In the paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, Grubbs’ test has been used to detect outliers for cancer drugs and reported positive results. Grubbs’ test detects outliers which exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test represents the difference between the sample mean and the most extreme data point in units of the standard deviation, and it detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs’ test method. The suggested framework is applicable during shortage events and recovery periods, has practical value, and can be used to minimize the recovery period required after a shortage event occurs.
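
The following is a minimal sketch of a two-sided Grubbs' test for a single outlier at the 99% confidence level mentioned above; the daily sales figures are invented, and the paper applies the test in its own framework rather than in this standalone form.

```python
# Sketch of a two-sided Grubbs' test for a single outlier at 99% confidence
# (illustrative; sales figures below are invented).
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    x = np.asarray(x, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    idx = int(np.argmax(np.abs(x - mean)))     # most extreme observation
    g = abs(x[idx] - mean) / sd                # Grubbs' statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return idx, g, g_crit, g > g_crit

daily_sales = [12, 14, 13, 15, 11, 13, 14, 55, 12, 13]   # toy data with a spike
print(grubbs_test(daily_sales))
```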

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 134
39872 Analysis of a High Voltage Direct Current (HVDC) Connection Using a Real-Time Simulator Under Various Disturbances

Authors: Mankour Mohamed, Miloudi Mohamed

Abstract:

A thorough and accurate simulation is necessary for the study of a High Voltage Direct Current (HVDC) link system during various types of disturbances, including internal faults on either converter, the rectifier or the inverter, as well as external faults, such as AC or DC faults on both converter sides of the DC link. In this study, we examine how an HVDC inverter responds to three different types of failures: faults at the inverter valve, system control faults, and single-phase-to-ground AC faults at the sending end of the inverter side. Because commutation failure is the most frequent problem that may affect inverter valves, particularly thyristor-based valves in line-commutated converters (LCC), it is important to explore which circumstances generate and aggravate commutation failure in the inverter valves. Owing to the techniques used to accelerate the simulation, digital real-time simulators are now the most powerful tools for producing simulation results. The RT-LAB real-time platform HYPERSIM OP-5600 is used to implement the Simulation-in-the-Loop (SIL) technique, which is used to validate the results. It is demonstrated how the system recovers from both the internal faults and the AC fault. The simulation findings show the crucial role the control system plays in fault recovery.

Keywords: hypersim simulator, HVDC systems, mono-polar link, AC faults, misfiring faults

Procedia PDF Downloads 94
39871 The Study of Internship Performances: Comparison of Information Technology Interns towards Students’ Types and Background Profiles

Authors: Shutchapol Chopvitayakun

Abstract:

Internship programs are a compulsory course in many undergraduate programs in Thailand. They give many senior students the opportunity, as interns, to practice their working skills in real organizations and to face real-world working problems, which the interns learn to solve through direct and indirect experience. In many schools this program is a well-structured course with a contract or agreement made with real business organizations. Moreover, the program also offers interns opportunities to get jobs, after completing it, at the organizations where the internship took place. Interns also learn how to work as a team and how to interact with colleagues, trainers, and superiors of each organization in terms of social hierarchy, self-responsibility, and self-discipline. This research focuses on senior students of Suan Sunandha Rajabhat University, Thailand, majoring in the information technology program, who practiced their working skills or took internship programs in real business or operating organizations in 2015-2016. Interns are categorized into two types: the normal program and the special program. In the special program, students study on weekday evenings from Monday to Friday or on weekends, and most of them work a full-time or part-time job. In the normal program, students study during weekday working hours, and most of them do not work. The differences between these characteristics and the outcomes of internship performance were studied and analyzed in this research. This work applied statistical analysis to find out whether the internship performance of the two intern types differs statistically or not.
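
One common way to test whether two groups differ statistically, as described above, is an independent-samples t-test; the sketch below uses invented performance scores purely for illustration, not the study's data or its exact statistical procedure.

```python
# Sketch of comparing internship performance of the two student types with an
# independent-samples t-test (scores below are illustrative only).
from scipy import stats

normal_program = [78, 82, 75, 90, 85, 88, 79, 84]
special_program = [81, 77, 85, 80, 83, 79, 86, 82]

t_stat, p_value = stats.ttest_ind(normal_program, special_program, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# p < 0.05 would suggest the two intern types perform differently on average.
```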

Keywords: internship, intern, senior student, information technology program

Procedia PDF Downloads 263
39870 A Case Study on Quantitatively and Qualitatively Increasing Student Output by Using Available Word Processing Applications to Teach Reluctant Elementary School-Age Writers

Authors: Vivienne Cameron

Abstract:

Background: Between 2010 and 2017, teachers in a suburban public school district struggled to get students to consistently produce adequate writing samples as measured by the Pennsylvania state writing rubric for focus, content, organization, style, and conventions. A common thread in all of the data was the need to develop stamina in the student writers. Method: All of the teachers used the traditional writing process model (prewrite, draft, revise, edit, final copy) during writing instruction. One teacher taught the writing process using word processing and incentivized it with publication instead of the traditional pencil/paper/grading method. Students did not have instruction in typing/keyboarding. The teacher submitted the resulting student work to real-life contests, magazines, and publishers. Results: Students in the test group increased both the quantity and quality of their writing over a seven-month period as measured by the Pennsylvania state writing rubric. Reluctant writers, as well as students with autism spectrum disorder, benefited from this approach. This outcome was repeated consistently over a five-year period. Interpretation: Removing the burden of pencil and paper allowed students to participate in the writing process more fully. Writing with pencil and paper is physically tiring. Students are discouraged when they submit a draft and are instructed to use the Add, Remove, Move, Substitute (ARMS) method to revise their papers; each successive version becomes shorter. Allowing students to type their papers frees them to make changes quickly and easily. The result is longer writing pieces in shorter time frames, allowing the teacher to spend more time working on individual needs. With this additional time, the teacher can concentrate on teaching focus, content, organization, style, conventions, and audience, and has a larger body of work from which to draw for whole-group instruction, such as developing effective leads. The teacher submitted the resulting student work to contests, magazines, and publishers. Although time-consuming, the submission process was an invaluable lesson for teaching about audience and tone. All students in the test sample had work accepted for publication, and students became highly motivated to succeed when their work was accepted. This motivation applied to special needs students, regular education students, and gifted students.

Keywords: elementary-age students, reluctant writers, teaching strategies, writing process

Procedia PDF Downloads 175
39869 Frequent Item Set Mining for Big Data Using MapReduce Framework

Authors: Tamanna Jethava, Rahul Joshi

Abstract:

Frequent item sets play an essential role in many data mining tasks that try to find interesting patterns in a database. Typically, a frequent item set refers to a set of items that frequently appear together in a transaction dataset. Several mining algorithms are used for frequent item set mining, yet most do not scale to the type of data we are presented with today, so-called big data. Big data is a collection of very large data sets. Our approach is to perform frequent item set mining over large datasets in a scalable and speedy way. MapReduce, together with HDFS, is used to find frequent item sets from big data on a large cluster. This paper focuses on using pre-processing and a mining algorithm as a hybrid approach for big data over the Hadoop platform.
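
The sketch below shows the map and reduce steps of candidate item-set counting in plain Python as a stand-in for the Hadoop/MapReduce job described above; the transactions and support threshold are invented, and a real deployment would run these functions on a cluster.

```python
# Sketch of the map and reduce steps for counting candidate item sets
# (pure Python stand-in for a Hadoop MapReduce job; toy data).
from itertools import combinations
from collections import defaultdict

def map_phase(transaction, k=2):
    # Emit (itemset, 1) for every k-item combination in one transaction.
    for itemset in combinations(sorted(set(transaction)), k):
        yield itemset, 1

def reduce_phase(mapped_pairs, min_support=2):
    counts = defaultdict(int)
    for itemset, count in mapped_pairs:
        counts[itemset] += count
    return {s: c for s, c in counts.items() if c >= min_support}

transactions = [["bread", "milk"], ["bread", "milk", "eggs"], ["milk", "eggs"]]
pairs = (pair for t in transactions for pair in map_phase(t))
print(reduce_phase(pairs))   # {('bread', 'milk'): 2, ('eggs', 'milk'): 2}
```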

Keywords: frequent item set mining, big data, Hadoop, MapReduce

Procedia PDF Downloads 436
39868 Effect of the Deposition Time of Hydrogenated Nanocrystalline Si Grown on Porous Alumina Film on Glass Substrate by Plasma Processing Chemical Vapor Deposition

Authors: F. Laatar, S. Ktifa, H. Ezzaouia

Abstract:

The Plasma Enhanced Chemical Vapor Deposition (PECVD) method is used to deposit hydrogenated nanocrystalline silicon films (nc-Si:H) on Porous Anodic Alumina Films (PAF) on a glass substrate at different deposition durations. The influence of the deposition time (DT) on the physical properties of nc-Si:H grown on PAF was investigated through an extensive correlation between the micro-structural and optical properties of these films. In this paper, we present an extensive study of the morphological, structural, and optical properties of these films by Atomic Force Microscopy (AFM), X-Ray Diffraction (XRD), and UV-Vis-NIR spectrometry. It was found that changes in DT can modify the film thickness and surface roughness and eventually improve the optical properties of the composite. The optical properties (optical thicknesses, refractive indexes (n), absorption coefficients (α), extinction coefficients (k), and the values of the optical transitions EG) of these samples were obtained from the transmittance T and reflectance R spectra recorded by the UV-Vis-NIR spectrometer. We used the Cauchy and Wemple–DiDomenico models for the analysis of the dispersion of the refractive index and the determination of the optical properties of these films.
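
For reference, the two dispersion models named above are the Cauchy relation n(λ) = A + B/λ² and the Wemple–DiDomenico single-oscillator model n²(E) − 1 = E_d·E_0 / (E_0² − E²). A small sketch evaluating them follows; the coefficients are placeholders, not values fitted to the authors' films.

```python
# Sketch of the Cauchy and Wemple–DiDomenico dispersion models
# (coefficient values are placeholders, not fitted to the measured films).
import numpy as np

def cauchy_n(wavelength_nm, A=2.0, B=5.0e4):
    # n(lambda) = A + B / lambda^2
    return A + B / wavelength_nm**2

def wemple_didomenico_n(photon_energy_eV, E0=3.5, Ed=20.0):
    # n^2(E) - 1 = Ed * E0 / (E0^2 - E^2)
    return np.sqrt(1.0 + Ed * E0 / (E0**2 - photon_energy_eV**2))

wl = np.array([500.0, 800.0, 1200.0])            # wavelengths in nm
E = 1239.84 / wl                                 # photon energy in eV
print(cauchy_n(wl))
print(wemple_didomenico_n(E))
```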

Keywords: hydrogenated nanocrystalline silicon, plasma processing chemical vapor deposition, X-ray diffraction, optical properties

Procedia PDF Downloads 377
39867 Manufacturing Process and Cost Estimation through Process Detection by Applying Image Processing Technique

Authors: Chalakorn Chitsaart, Suchada Rianmora, Noppawat Vongpiyasatit

Abstract:

In order to reduce the transportation time and cost for a direct interface between customer and manufacturer, an image processing technique has been introduced in this research, with which designing a part and defining its manufacturing process can be performed quickly. A 3D virtual model is generated directly from a series of multi-view images of an object, and it can be modified and analyzed, and its structure or function improved, for further implementations such as computer-aided manufacturing (CAM). To estimate and quote the production cost, a user-friendly platform has been developed in this research, in which the appropriate manufacturing parameters and process detections have been identified and planned by CAM simulation.

Keywords: image processing technique, feature detection, surface registration, multi-view image capture, production costs, manufacturing processes

Procedia PDF Downloads 251
39866 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria

Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe

Abstract:

Data has gone from just rows and columns to being an infrastructure itself. The traditional medium of data infrastructure has been managed by individuals in different industries and saved on personal work tools, one of which is the laptop. This hinders data sharing and Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. However, there has been a constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. It employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal which is a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to access datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal encompasses the use of open-source technologies such as the Postgres database, GeoServer, GeoNetwork, and CKAN. These tools made the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, a wider cross-section of 8,192 users had been created, 2,262 datasets had been downloaded, and 817 maps had been created from the platform. This paper shows the use of rapid development and adoption of technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit on new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, it reveals the importance of cross-sectional data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
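
Because CKAN exposes a standard action API, datasets on a portal like the one described above can be queried programmatically. The sketch below uses a hypothetical portal URL and query; it is not the eHealth Africa portal's actual address.

```python
# Sketch of querying a CKAN-based data portal through its action API
# (portal URL and search query below are placeholders).
import requests

portal = "https://demo.ckan.org"                 # hypothetical portal URL
resp = requests.get(f"{portal}/api/3/action/package_search",
                    params={"q": "settlements", "rows": 5}, timeout=30)
resp.raise_for_status()
result = resp.json()["result"]
print("datasets found:", result["count"])
for dataset in result["results"]:
    print("-", dataset["name"])
```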

Keywords: data portal, data infrastructure, open source, sustainability

Procedia PDF Downloads 98
39865 A Method of Detecting the Difference in Two States of Brain Using Statistical Analysis of EEG Raw Data

Authors: Digvijaysingh S. Bana, Kiran R. Trivedi

Abstract:

This paper introduces various methods based on the alpha wave to detect the difference between two states of the brain. One healthy subject participated in the experiment. EEG was measured on the forehead above the eye (FP1 position), with the reference and ground electrodes on an ear clip. The data samples are obtained in the form of EEG raw data, with a reading duration of one minute. Various tests are performed on the alpha-band EEG raw data. The readings are taken at different times throughout the day, and the statistical analysis is carried out on the EEG sample data in the form of various tests.
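
A minimal sketch of isolating the alpha band from a raw EEG trace and computing simple descriptive statistics is given below; the signal is synthetic, and the sampling rate and band edges (8-13 Hz) are assumptions rather than the authors' acquisition settings.

```python
# Sketch of isolating the alpha band from raw EEG and computing basic statistics
# (synthetic signal; sampling rate and band edges are assumptions).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                                   # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)               # one minute of data, as in the study
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # toy FP1 signal

b, a = butter(4, [8, 13], btype="bandpass", fs=fs)
alpha = filtfilt(b, a, raw)                # alpha-band component of the recording

print("mean:", alpha.mean(), "std:", alpha.std(), "power:", np.mean(alpha**2))
```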

Keywords: electroencephalogram(EEG), biometrics, authentication, EEG raw data

Procedia PDF Downloads 464
39864 Analysis of Relative Gene Expression Data of GATA3-AS1 Associated with Resistance to Neoadjuvant Chemotherapy in Locally Advanced Breast Cancer Patients of Luminal B Subtype

Authors: X. Cervantes-López, C. Arriaga-Canon, L. Contreras Espinosa

Abstract:

The goal of this study is to validate the overexpression of the lncRNA GATA3-AS1 associated with resistance to neoadjuvant chemotherapy in female patients with locally advanced mammary adenocarcinoma of the luminal B subtype. This study involved a cohort of one hundred thirty-seven samples for which total RNA was isolated from formalin-fixed paraffin-embedded (FFPE) tissue. Samples were cut using a Microtome Hyrax M25 Zeiss, and RNA was isolated using the RNeasy FFPE kit and a deparaffinization solution. The next step consisted of the analysis of RNA concentration and quality; then 18 µg of RNA was treated with DNase I, and cDNA was synthesized from 50 ng of total RNA. Finally, real-time PCR was performed with SYBR Green/ROX qPCR Master Mix in order to determine relative gene expression, using RPS28 as a housekeeping gene for normalization in a ΔCt fold-change calculation. As a result, we validated by real-time PCR that the overexpression of the lncRNA GATA3-AS1 is associated with resistance to neoadjuvant chemotherapy in locally advanced breast cancer patients of the luminal B subtype.
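
The ΔCt-based relative expression calculation referred to above is commonly carried out as a 2^-ΔΔCt fold change against the housekeeping gene. The sketch below uses invented Ct values purely for illustration; it is not the study's data or its exact normalization procedure.

```python
# Sketch of a 2^-ΔΔCt relative-expression calculation with RPS28 as the
# normalizer (Ct values below are invented for illustration only).
ct_gata3as1_resistant, ct_rps28_resistant = 24.1, 18.0   # hypothetical Ct values
ct_gata3as1_sensitive, ct_rps28_sensitive = 27.6, 18.2

delta_ct_resistant = ct_gata3as1_resistant - ct_rps28_resistant
delta_ct_sensitive = ct_gata3as1_sensitive - ct_rps28_sensitive
delta_delta_ct = delta_ct_resistant - delta_ct_sensitive
fold_change = 2 ** (-delta_delta_ct)
print(f"GATA3-AS1 fold change (resistant vs. sensitive): {fold_change:.2f}")
```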

Keywords: breast cancer, biomarkers, genomics, neoadjuvant chemotherapy, lncRNAs

Procedia PDF Downloads 55
39863 Analyzing the Empirical Link between Islamic Finance and Growth of Real Output: A Time Series Application to Pakistan

Authors: Nazima Ellahi, Danish Ramzan

Abstract:

There is a growing recognition among development economists of the importance of the financial sector for economic development and growth. The development thus introduced helps to promote welfare effects and poverty alleviation. This study is an attempt to find the nature of the link between Islamic banking finance and the growth of real output in Pakistan. A time series data set covering the period from 1990 to 2010 has been utilized. Following the Phillips-Perron (PP) and Augmented Dickey-Fuller (ADF) unit root tests, this study applied the Ordinary Least Squares (OLS) method of estimation and found encouraging results in favor of promoting Islamic banking practices in Pakistan.
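
The unit-root and OLS steps mentioned above can be reproduced in outline as follows; the series are synthetic stand-ins, not the study's Islamic financing and real output data.

```python
# Sketch of an ADF unit-root test followed by an OLS regression
# (synthetic series standing in for the study's 1990-2010 data).
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
n_years = 21                                            # 1990-2010, as in the study
financing = np.cumsum(rng.normal(5.0, 1.0, n_years))    # toy financing series
output = 0.8 * financing + rng.normal(0.0, 2.0, n_years)  # toy real output series

adf_stat, p_value = adfuller(financing)[:2]             # Augmented Dickey-Fuller test
print(f"ADF statistic = {adf_stat:.3f}, p = {p_value:.3f}")

ols = sm.OLS(output, sm.add_constant(financing)).fit()  # output regressed on financing
print(ols.params)
```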

Keywords: Islamic finance, poverty alleviation, economic growth, finance, commerce

Procedia PDF Downloads 345
39862 Altered Gene Expression: Induction/Suppression of some Pathogenesis Related Protein Genes in an Egyptian Isolate of Potato Leafroll Virus (PLRV)

Authors: Dalia G. Aseel

Abstract:

The potato (Solanum tuberosum L.) has become one of the major vegetable crops in Egypt and all over the world. Potato leafroll virus (PLRV) was observed on potato plants collected from different governorates in Egypt. Three cultivars, Spunta, Diamont, and Cara, infected with PLRV were collected; RNA was extracted and subjected to real-time PCR using coat protein gene primers. The results showed that the expression of the coat protein was 39.6-fold, 12.45-fold, and 47.43-fold for the Spunta, Diamont, and Cara cultivars, respectively. Differential Display Polymerase Chain Reaction (DD-PCR) was performed using forward primers for the pathogenesis-related (PR) proteins pathogenesis-related protein 1 (PR-1), β-1,3-glucanase (PR-2), chitinase (PR-3), peroxidase (POD), and polyphenol oxidase (PPO). The obtained data revealed different banding patterns depending on the viral type and the region of infection. Regarding PLRV, 58 up-regulated and 19 down-regulated genes were detected, with differences observed between infected plants and the healthy control. Sequence analysis of the up-regulated genes showed that they include an induced stolon tip protein, while two down-regulated genes were identified: a disease resistance RPP-like protein and a non-specific lipid-transfer protein. In this study, the expression of the PR-1, PR-2, PR-3, POD, and PPO genes in the infected leaves of the three potato cultivars was estimated by quantitative real-time PCR. We can conclude that PLRV infection of potato plants inhibited the expression of the five PR genes while elevating the expression of some other defense genes. This interaction may thus induce and/or suppress the expression of some genes responsible for the plant's defense mechanisms.

Keywords: PLRV, pathogenesis-related proteins (PRs), DD-PCR, sequence, real-time PCR

Procedia PDF Downloads 142
39861 DMBR-Net: Deep Multiple-Resolution Bilateral Networks for Real-Time and Accurate Semantic Segmentation

Authors: Pengfei Meng, Shuangcheng Jia, Qian Li

Abstract:

We propose a real-time, high-precision semantic segmentation network based on a multi-resolution feature fusion module, an auxiliary feature extraction module, an upsampling module, and an atrous spatial pyramid pooling (ASPP) module. We designed a feature fusion structure that integrates sufficient features of different resolutions. We also studied the effect of the side-branch structure on the network and, based on these findings, used a side-branch auxiliary feature extraction layer to improve the effectiveness of the network. We also designed an upsampling module that yields better results than the original upsampling module. In addition, we reconsidered the locations and number of atrous spatial pyramid pooling (ASPP) modules and modified the network structure according to the experimental results to further improve its effectiveness. The network presented in this paper takes the backbone of BiSeNetV2 as its base network, on top of which we constructed and improved the network structure. We name this network the Deep Multiple-Resolution Bilateral Network for real-time segmentation, referred to as DMBR-Net. After experimental testing, our proposed DMBR-Net achieved 81.2% mIoU at 119 FPS on the Cityscapes validation dataset, 80.7% mIoU at 109 FPS on the CamVid test dataset, and 29.9% mIoU at 78 FPS on the COCO-Stuff test dataset. Compared with all lightweight real-time semantic segmentation networks, our network achieves the highest accuracy at an appropriate speed.
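
For orientation, the following is a generic sketch of an ASPP block of the kind reused in the network above; the channel counts and dilation rates are assumptions, not the DMBR-Net configuration.

```python
# Sketch of an atrous spatial pyramid pooling (ASPP) block
# (channel counts and dilation rates are assumptions).
import torch
import torch.nn as nn

class ASPP(nn.Module):
    def __init__(self, in_ch=128, out_ch=128, dilations=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))
            for d in dilations
        ])
        self.project = nn.Conv2d(out_ch * len(dilations), out_ch, 1)

    def forward(self, x):
        # Parallel atrous convolutions capture context at several scales; the
        # concatenated responses are fused by a 1x1 projection.
        return self.project(torch.cat([b(x) for b in self.branches], dim=1))

feat = torch.randn(1, 128, 64, 128)   # backbone feature map (toy shape)
print(ASPP()(feat).shape)             # torch.Size([1, 128, 64, 128])
```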

Keywords: multi-resolution feature fusion, atrous convolutional, bilateral networks, pyramid pooling

Procedia PDF Downloads 150
39860 Detection of Clipped Fragments in Speech Signals

Authors: Sergei Aleinik, Yuri Matveev

Abstract:

In this paper a novel method for the detection of clipping in speech signals is described. It is shown that the new method has better performance than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, size of data, etc. Statistical simulation results are presented.
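
As a point of comparison for the task described above, a naive amplitude-threshold baseline is sketched below: it flags frames in which many samples sit at or near full scale. This is not the paper's novel method; the thresholds and frame length are arbitrary assumptions.

```python
# Sketch of a simple clipping detector: flag frames where many samples sit
# at (or very near) the full-scale amplitude (thresholds are illustrative).
import numpy as np

def clipped_frames(signal, frame_len=512, level=0.99, min_fraction=0.02):
    peak = np.max(np.abs(signal))
    flags = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[start:start + frame_len]
        near_limit = np.mean(np.abs(frame) >= level * peak)
        flags.append(near_limit >= min_fraction)
    return np.array(flags)

t = np.linspace(0, 1, 16000)
speech = np.clip(1.5 * np.sin(2 * np.pi * 220 * t), -1.0, 1.0)  # artificially clipped
print(clipped_frames(speech).sum(), "clipped frames detected")
```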

Keywords: clipping, clipped signal, speech signal processing, digital signal processing

Procedia PDF Downloads 393
39859 Internet of Things, Edge and Cloud Computing in Rock Mechanical Investigation for Underground Surveys

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Rock mechanical investigation is one of the most crucial activities in underground operations, especially in surveys related to hydrocarbon exploration and production, geothermal reservoirs, energy storage, mining, and geotechnics. There is a wide range of traditional methods for driving, collecting, and analyzing rock mechanics data. However, these approaches may not be suitable or work perfectly in some situations, such as fractured zones. Cutting-edge technologies have been provided to solve and optimize the mentioned issues. Internet of Things (IoT), Edge, and Cloud Computing technologies (ECt & CCt, respectively) are among the most widely used and new artificial intelligence methods employed for geomechanical studies. IoT devices act as sensors and cameras for real-time monitoring and mechanical-geological data collection of rocks, such as temperature, movement, pressure, or stress levels. Structural integrity, especially for cap rocks within hydrocarbon systems, and rock mass behavior assessment, to further activities such as enhanced oil recovery (EOR) and underground gas storage (UGS), or to improve safety risk management (SRM) and potential hazards identification (P.H.I), are other benefits from IoT technologies. EC techniques can process, aggregate, and analyze data immediately collected by IoT on a real-time scale, providing detailed insights into the behavior of rocks in various situations (e.g., stress, temperature, and pressure), establishing patterns quickly, and detecting trends. Therefore, this state-of-the-art and useful technology can adopt autonomous systems in rock mechanical surveys, such as drilling and production (in hydrocarbon wells) or excavation (in mining and geotechnics industries). Besides, ECt allows all rock-related operations to be controlled remotely and enables operators to apply changes or make adjustments. It must be mentioned that this feature is very important in environmental goals. More often than not, rock mechanical studies consist of different data, such as laboratory tests, field operations, and indirect information like seismic or well-logging data. CCt provides a useful platform for storing and managing a great deal of volume and different information, which can be very useful in fractured zones. Additionally, CCt supplies powerful tools for predicting, modeling, and simulating rock mechanical information, especially in fractured zones within vast areas. Also, it is a suitable source for sharing extensive information on rock mechanics, such as the direction and size of fractures in a large oil field or mine. The comprehensive review findings demonstrate that digital transformation through integrated IoT, Edge, and Cloud solutions is revolutionizing traditional rock mechanical investigation. These advanced technologies have empowered real-time monitoring, predictive analysis, and data-driven decision-making, culminating in noteworthy enhancements in safety, efficiency, and sustainability. Therefore, by employing IoT, CCt, and ECt, underground operations have experienced a significant boost, allowing for timely and informed actions using real-time data insights. The successful implementation of IoT, CCt, and ECt has led to optimized and safer operations, optimized processes, and environmentally conscious approaches in underground geological endeavors.

Keywords: rock mechanical studies, internet of things, edge computing, cloud computing, underground surveys, geological operations

Procedia PDF Downloads 63
39858 Giant Achievements in Food Processing

Authors: Farnaz Amidi Fazli

Abstract:

After a long period of human experience with food processing, from eating raw food to the canning of food in the last century, it is now time to use novel technologies that are sometimes completely different from common technologies. It is possible to decontaminate food without using heat, or to store foods without using a cold chain. Pulsed electric field (PEF) processing is a non-thermal method of food preservation that uses short bursts of electricity; PEF can be used for processing liquid and semi-liquid food products. PEF processing offers high-quality, fresh-like liquid foods with excellent flavor, nutritional value, and shelf-life. High pressure processing (HPP) technology has the potential to fulfill both consumer and scientific requirements. HPP has been used for over 50 years in non-food industries; for food applications, 'high pressure' can generally be considered to be up to 600 MPa for most food products. Freezing also retains high potential for food preservation thanks to new, quick freezing methods. Foods prepared by this technology have greater acceptability and higher quality compared with old-fashioned slow freezing. Thus, quick freezing has been adopted as a widespread commercial method for the long-term preservation of perishable foods, which has improved both the health and convenience of consumers in industrialised countries. These results are achieved by fluidised-bed freezing systems, freezing by immersion, and hydrofluidisation; on the other hand, new thawing methods such as high-pressure, microwave, ohmic, and acoustic thawing play a key role in the quality and adaptability of the final product.

Keywords: quick freezing, thawing, high pressure, pulse electric, hydrofluidisation

Procedia PDF Downloads 321
39857 Part of Speech Tagging Using Statistical Approach for Nepali Text

Authors: Archit Yajnik

Abstract:

Part-of-speech tagging has always been a challenging task in natural language processing. This article presents POS tagging for Nepali text using a Hidden Markov Model and the Viterbi algorithm. From the annotated Nepali corpus, training and testing data sets are randomly separated, and both methods are employed on the data sets. The Viterbi algorithm is found to be computationally faster and more accurate compared to the HMM, and an accuracy of 95.43% is achieved using the Viterbi algorithm. An error analysis of where the mismatches took place is discussed in detail.
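
The core Viterbi decoding step is sketched below on a tiny hand-made model; the tag set, probabilities, and example words are invented for illustration, whereas the paper estimates its probabilities from an annotated Nepali corpus.

```python
# Sketch of Viterbi decoding for an HMM POS tagger (tiny hand-made model;
# all probabilities and example words are invented).
import numpy as np

tags = ["NOUN", "VERB"]
start_p = np.log([0.7, 0.3])
trans_p = np.log([[0.6, 0.4],       # P(next tag | NOUN)
                  [0.7, 0.3]])      # P(next tag | VERB)
emit_p = {"NOUN": {"ram": 0.5, "khana": 0.4, "khancha": 0.1},
          "VERB": {"ram": 0.1, "khana": 0.2, "khancha": 0.7}}

def viterbi(words):
    n, k = len(words), len(tags)
    score = np.full((n, k), -np.inf)
    back = np.zeros((n, k), dtype=int)
    for j, tag in enumerate(tags):
        score[0, j] = start_p[j] + np.log(emit_p[tag].get(words[0], 1e-6))
    for i in range(1, n):
        for j, tag in enumerate(tags):
            cand = score[i - 1] + trans_p[:, j]
            back[i, j] = int(np.argmax(cand))
            score[i, j] = cand[back[i, j]] + np.log(emit_p[tag].get(words[i], 1e-6))
    path = [int(np.argmax(score[-1]))]
    for i in range(n - 1, 0, -1):          # follow back-pointers to recover the path
        path.append(back[i, path[-1]])
    return [tags[j] for j in reversed(path)]

print(viterbi(["ram", "khana", "khancha"]))   # e.g. ['NOUN', 'NOUN', 'VERB']
```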

Keywords: hidden markov model, natural language processing, POS tagging, viterbi algorithm

Procedia PDF Downloads 329
39856 Price Heterogeneity in Establishing Real Estate Composite Price Index as Underlying Asset for Property Derivatives in Russia

Authors: Andrey Matyukhin

Abstract:

Russian official statistics have shown a steady decline in residential real estate prices for several consecutive years. Price risk in real estate markets thus affects various groups of economic agents, namely individuals, construction companies, and financial institutions. Potential use of property derivatives might help mitigate the adverse consequences of negative price dynamics. Unless a sustainable price indicator is developed, however, settlement of such instruments imposes constraints on the counterparties involved and restricts the development of the real estate market. The study addresses geographical and classification heterogeneity in real estate prices by means of variance analysis across various groups of real estate properties. In conclusion, we determine an optimal sample structure of representative real estate assets with a sufficient level of price homogeneity. A composite price indicator based on this sample would have a higher level of robustness and reliability, hence improving liquidity in the market for property derivatives through underlying standardization. Unlike the majority of existing real estate price indices, which are calculated on a country-wide basis, the optimal indices for the Russian market should be constructed at the city level.
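
One simple form of the variance analysis mentioned above is a one-way ANOVA across property groups; the sketch below uses invented price-per-square-metre figures purely to show the mechanics, not the study's data or grouping.

```python
# Sketch of checking price homogeneity across property groups with one-way
# ANOVA (price figures below are invented for illustration).
from scipy import stats

moscow_economy = [155, 160, 148, 152, 158]
moscow_comfort = [210, 205, 220, 215, 212]
regional_economy = [95, 102, 98, 100, 97]

f_stat, p_value = stats.f_oneway(moscow_economy, moscow_comfort, regional_economy)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A large F (small p) indicates heterogeneous groups that should not be pooled
# into a single composite index without stratification.
```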

Keywords: price homogeneity, property derivatives, real estate price index, real estate price risk

Procedia PDF Downloads 307
39855 Challenges with Synchrophasor Technology Deployments in Electric Power Grids

Authors: Emmanuel U. Oleka, Anil Khanal, Gary L. Lebby, Ali R. Osareh

Abstract:

Synchrophasor technology is being rapidly deployed in electric power grids all over the world and is quickly changing the way the grids are managed. This trend will continue until entire power grids are fully connected so that they can be monitored and controlled in real time. Much has been achieved in synchrophasor technology development and deployment, and much more is yet to be achieved. The real-time power grid control and protection potential of synchrophasors is yet to be fully explored. It is therefore necessary that researchers keep in view the various challenges that still need to be overcome in expanding the frontiers of synchrophasor technology. This paper outlines the major challenges that should be dealt with in order to achieve the goal of total power grid visualization, monitoring, and control using synchrophasor technology.

Keywords: electric power grid, grid visualization, phasor measurement unit, synchrophasor

Procedia PDF Downloads 556
39854 Spatially Random Sampling for Retail Food Risk Factors Study

Authors: Guilan Huang

Abstract:

In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full service restaurants to track changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized a spatially random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location information. Location information on restaurants provides an opportunity for quantitatively determining random samples within non-governmental units (e.g., 240 kilometers around each data collector). Spatial analysis can also optimize data collectors' work plans and resource allocation. A spatial analytics and processing platform helped us handle the spatial random sampling challenges. Our method fits the FDA's ability to pinpoint features of foodservice establishments and reduced both the time and expense of data collection.
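
A minimal sketch of restricting random sampling to establishments within 240 km of a data collector follows; the coordinates and sampling frame are invented, and the FDA's actual procedure and platform are not reproduced here.

```python
# Sketch of distance-constrained random sampling of restaurants
# (coordinates are made up; 240 km matches the radius mentioned above).
import math
import random

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0                                     # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

collector = (38.90, -77.04)                        # hypothetical collector location
restaurants = [("A", 39.29, -76.61), ("B", 40.71, -74.01), ("C", 38.03, -78.48)]

eligible = [r for r in restaurants
            if haversine_km(collector[0], collector[1], r[1], r[2]) <= 240]
sample = random.sample(eligible, k=min(2, len(eligible)))
print(sample)
```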

Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling

Procedia PDF Downloads 350
39853 A Genetic Algorithm Approach to Solve a Weaving Job Scheduling Problem, Aiming Tardiness Minimization

Authors: Carolina Silva, João Nuno Oliveira, Rui Sousa, João Paulo Silva

Abstract:

This study uses genetic algorithms to solve a job scheduling problem in a weaving factory. The underlying problem is an NP-hard scheduling problem on unrelated parallel machines with sequence-dependent setup times. This research uses real data from a weaving company located in the north of Portugal, with a capacity of 96 looms and a production, on average, of 440,000 meters of fabric per month. The study also involves a high level of complexity, since most of the real production constraints are applied and several real data instances are tested. Topics such as data analysis and algorithm performance are addressed and tested, in order to offer a solution that generates reliable results with respect to due dates. All approaches will be tested in the operational environment and the KPIs monitored, to understand the solution's impact on production, with a particular focus on the total number of weeks of late deliveries to clients. Thus, the main goal of this research is to develop a solution that allows the production of automatically optimized production plans, aiming at tardiness minimization.
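
The following is a deliberately simplified genetic-algorithm sketch for sequencing jobs to reduce total tardiness: it assumes a single machine with no setup times, whereas the paper handles unrelated parallel machines with sequence-dependent setups, and all job data are invented.

```python
# Minimal GA sketch for job sequencing with a tardiness fitness function
# (single machine, no setups: a simplification of the paper's problem).
import random

jobs = [(4, 10), (2, 6), (7, 14), (3, 7), (5, 12)]     # (processing time, due date)

def tardiness(order):
    t, total = 0, 0
    for j in order:
        t += jobs[j][0]
        total += max(0, t - jobs[j][1])
    return total

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + [j for j in b if j not in a[:cut]]   # keep a valid permutation

def mutate(order):
    i, j = random.sample(range(len(order)), 2)
    order[i], order[j] = order[j], order[i]               # swap two positions

population = [random.sample(range(len(jobs)), len(jobs)) for _ in range(30)]
for _ in range(100):
    population.sort(key=tardiness)
    parents = population[:10]
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(20)]
    for c in children:
        if random.random() < 0.2:
            mutate(c)
    population = parents + children

best = min(population, key=tardiness)
print("best order:", best, "total tardiness:", tardiness(best))
```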

Keywords: genetic algorithms, textile industry, job scheduling, optimization

Procedia PDF Downloads 157
39852 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network

Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang

Abstract:

As a branch of artificial neural networks, deep learning is widely used in the field of image recognition, but the lack of suitable datasets leads to imperfect model learning. By analysing the data-scale requirements of deep learning, and aiming at the application of GUI generation, it is found that collecting a GUI dataset is a time-consuming and labor-intensive project, making it difficult to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on the original small-scale datasets to produce a large number of reliable data sets. By combining a recurrent neural network with a generative adversarial network, the recurrent neural network can learn the sequence relationships and characteristics of the data, allowing the generative adversarial network to generate reasonable data, which is then used to expand the Rico dataset. Relying on this network structure, the characteristics of the collected data can be analysed well, and a large amount of reasonable data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.

Keywords: GUI, deep learning, GAN, data augmentation

Procedia PDF Downloads 184
39851 Normalized P-Laplacian: From Stochastic Game to Image Processing

Authors: Abderrahim Elmoataz

Abstract:

More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs, where each vertex represents measured data and each edge represents a relationship (connectivity or a certain affinity or interaction) between two vertices. Processing and analyzing these types of data is a major challenge for both the image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools which were initially developed on usual Euclidean spaces and proven to be efficient for many inverse problems and applications dealing with usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorics and graph theory. In recent years there has been an increasing interest in the investigation of one of the major mathematical tools for signal and image analysis: Partial Differential Equations (PDEs) and variational methods on graphs. The normalized p-Laplacian operator has been recently introduced to model a stochastic game called tug-of-war with noise. Part of the interest in this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian operator, which have been extensively used to model and solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as a discrete approximation for both the infinity Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
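
To make the p-harmonious idea concrete, the sketch below iterates the averaging rule u(x) ← (α/2)(max over neighbours + min over neighbours) + (1 − α)(mean over neighbours) on a toy graph; it interpolates between the infinity Laplacian (α = 1) and the usual graph Laplacian (α = 0). How α depends on p follows the paper's normalization; the value used here is a placeholder, as is the graph.

```python
# Sketch of the p-harmonious averaging iteration on a toy graph
# (the alpha value and the graph are placeholders; boundary values are fixed).
import numpy as np

edges = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}   # toy graph adjacency
u = np.array([1.0, 0.0, 0.0, 0.0])        # initial values
boundary = {0: 1.0, 3: 0.0}               # nodes with prescribed values
alpha = 0.5                                # placeholder; would be derived from p

for _ in range(200):
    new_u = u.copy()
    for x, nbrs in edges.items():
        if x in boundary:
            continue
        vals = u[nbrs]
        new_u[x] = alpha / 2 * (vals.max() + vals.min()) + (1 - alpha) * vals.mean()
    u = new_u

print(u)   # interior nodes converge to p-harmonious values between 0 and 1
```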

Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems

Procedia PDF Downloads 512
39850 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot

Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan

Abstract:

With the aging of the world population and the continuous growth in technology, service robots are increasingly being explored as alternatives to healthcare providers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with its human companion, receiving commands, navigating through the environment, whether known or unknown, and recognizing objects. This paper proposes an approach to object recognition based on the use of depth information and color images for a service robot. We present a study of two of the most used methods for object detection, in which 3D data is used to detect the position of the objects to be classified that are found on horizontal surfaces. Since most of the objects of interest accessible to service robots lie on these surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach to object recognition is based on color histograms, while the second is based on the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
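
A minimal sketch of descriptor-based matching of the kind used in the second approach follows; the image file names are placeholders, and only SIFT is shown (SURF typically requires the opencv-contrib build), so this is an illustration rather than the authors' pipeline.

```python
# Sketch of SIFT keypoint matching between a model image of an object and a
# scene image (file names are placeholders; SURF omitted for licensing reasons).
import cv2

model = cv2.imread("object_model.png", cv2.IMREAD_GRAYSCALE)   # hypothetical files
scene = cv2.imread("table_scene.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(model, None)
kp2, des2 = sift.detectAndCompute(scene, None)

matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe's ratio test
print(f"{len(good)} good matches out of {len(matches)}")
```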

Keywords: object detection, feature, descriptors, SIFT, SURF, depth images, service robots

Procedia PDF Downloads 546