Search results for: data utilization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25973

24983 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Instance selection (IS) techniques reduce data size to improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets built on the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second, and most importantly, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
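
The partition-then-aggregate idea can be sketched in a few lines. The following is a minimal single-machine illustration, assuming a basic condensed-nearest-neighbor pass in place of the full FCNN rule; the function names and the final pass over the union of selected instances are illustrative, not the authors' implementation.

```python
import numpy as np

def cnn_select(X, y):
    """One condensed-nearest-neighbor pass: keep instances that the
    current prototype set misclassifies (illustrative, not full FCNN)."""
    keep = [0]  # seed with the first instance
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            # 1-NN prediction from the current prototypes
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            if y[keep][np.argmin(d)] != y[i]:
                keep.append(i)
                changed = True
    return np.array(keep)

def map_reduce_is(X, y, n_partitions=4):
    """'Map': select independently on disjoint subsets;
    'Reduce': one more pass over the union of selected instances."""
    parts = np.array_split(np.random.permutation(len(X)), n_partitions)
    selected = np.concatenate([p[cnn_select(X[p], y[p])] for p in parts])
    return selected[cnn_select(X[selected], y[selected])]

X = np.random.rand(400, 2)
y = (X[:, 0] + X[:, 1] > 1).astype(int)
idx = map_reduce_is(X, y)
print(f"kept {len(idx)} of {len(X)} instances")
```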

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 250
24982 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world’s data is generated by the financial industry, with an estimated 708.5 billion global non-cash transactions. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately, and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework for a secure, decentralised marketplace that allows (1) data providers to list their transactional data, (2) data consumers to find and access that data, and (3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also integrates downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component is built on a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features of user interactions on the platform. One of the platform’s key features is enabling individuals to participate in and manage the personal data being generated about them. A proof-of-concept was developed on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 68
24981 Role of Biotechnology to Reduce Climate-Induced Impacts

Authors: Sandani Muthukumarana, Pavithra Rathnasiri

Abstract:

Climate change is one of the greatest challenges our generation faces, but by embracing biotechnology, we can turn this challenge into an opportunity to grow the economy. Biotechnology provides a range of solutions that help mitigate the effects of global warming. However, research investigating the potential of, and challenges to, further utilization of biotechnology to mitigate climate change impacts is still lacking. To address this issue, the existing context of biotechnology use for climate change mitigation, its potential applications, the practices in use, and the challenges that exist need to be investigated to provide a broader understanding for future researchers and practitioners. This paper therefore reviews the existing literature addressing these perspectives to facilitate the application of biotechnology in mitigating hazards arising from climate change.

Keywords: climate change, impacts, biotechnology, solutions

Procedia PDF Downloads 83
24980 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. In these cases, a more advanced representation of the data structure is essential. In this paper, we present the design of a compact version of the ternary tree data structure and report the results of an experimental evaluation using the static dictionary problem. We compare these results with results for binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree and, in certain configurations, shows performance comparable to that of regular ternary trees. We evaluated the performance of the algorithms on both 32-bit and 64-bit operating systems.
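
To make the memory argument concrete, here is a minimal sketch of a ternary search tree packed into parallel arrays rather than pointer-linked node objects. This is an illustrative compact representation for the static dictionary problem, not the paper's succinct encoding.

```python
class ArrayTST:
    """Ternary search tree in parallel arrays instead of node objects,
    reducing per-node pointer overhead (compact, not fully succinct)."""
    def __init__(self):
        self.ch, self.lo, self.eq, self.hi, self.end = [], [], [], [], []

    def _new(self, c):
        self.ch.append(c)
        self.lo.append(-1); self.eq.append(-1); self.hi.append(-1)
        self.end.append(False)
        return len(self.ch) - 1

    def insert(self, word):
        if not self.ch:
            self._new(word[0])
        node, i = 0, 0
        while True:
            c = word[i]
            if c < self.ch[node]:
                if self.lo[node] < 0: self.lo[node] = self._new(c)
                node = self.lo[node]
            elif c > self.ch[node]:
                if self.hi[node] < 0: self.hi[node] = self._new(c)
                node = self.hi[node]
            elif i + 1 < len(word):
                if self.eq[node] < 0: self.eq[node] = self._new(word[i + 1])
                node, i = self.eq[node], i + 1
            else:
                self.end[node] = True
                return

    def contains(self, word):
        node, i = 0, 0
        while node >= 0 and self.ch:
            c = word[i]
            if c < self.ch[node]: node = self.lo[node]
            elif c > self.ch[node]: node = self.hi[node]
            elif i + 1 == len(word): return self.end[node]
            else: node, i = self.eq[node], i + 1
        return False

tst = ArrayTST()
for w in ["cat", "cap", "car", "dog"]:
    tst.insert(w)
print(tst.contains("car"), tst.contains("cow"))  # True False
```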

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 156
24979 Determination of the Needs for Development of Infertility Psycho-Educational Program and the Design of a Website about Infertility for University Students

Authors: Bahar Baran, Şirin Nur Kaptan, D. Yelda Kağnıcı, Erol Esen, Barışcan Öztürk, Ender Siyez, Diğdem M. Siyez

Abstract:

It is known that some factors associated with infertility are preventable and that young people's knowledge in this regard is inadequate, yet very few studies focus on effective infertility prevention. Psycho-educational programs have an important place in infertility prevention efforts. Nowadays, given households' rates of technology and Internet use, young people appear to turn to websites as a primary source of information about health problems they encounter. However, one of the prerequisites for the effectiveness of websites or face-to-face psycho-education programs is to consider the needs of participants. In particular, these programs are expected to fit the cultural infrastructure and the diversity of beliefs and values in society. The aim of this research is to determine what university students want to learn about infertility and fertility and to examine their views on the structure of the website. The sample of the research consisted of 9693 university students studying in 21 public higher education programs in Turkey; 51.6% (n = 5002) were female and 48.4% (n = 4691) were male. The Needs Analysis Questionnaire developed by the researchers was used as the data collection tool. In the analysis of the data, descriptive analysis was conducted in SPSS software. According to the findings, the topics that university students most wanted to study about infertility and fertility were 'misconceptions about infertility' (94.9%), 'misconceptions about sexual behaviors' (94.6%), 'factors affecting infertility' (92.8%), 'sexually transmitted diseases' (92.7%), 'sexual health and reproductive health' (92.5%), 'sexuality and society' (90.9%), and 'healthy life (help centers)' (90.4%). In addition, the questions about how the content of the website should be designed for university students were analyzed descriptively. According to the results, 91.5% (n = 8871) of the university students proposed using frequently asked questions and their answers, 89.2% (n = 8648) stated that expert videos should be included, 82.6% (n = 8008) requested animations and simulations, 76.1% (n = 7380) proposed different content according to sex, and 66% (n = 6460) proposed different designs according to sex. The results indicated that the findings are similar to the contents of programs carried out in other countries in terms of the topics to be covered. It is suggested that the opinions of the participants be taken into account during the design of the website.

Keywords: infertility, prevention, psycho-education, web based education

Procedia PDF Downloads 209
24978 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted with little thought is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
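
The SLA angle is what makes quantile regression a natural fit: rather than predicting average usage, one predicts an upper quantile and provisions to that level. A minimal sketch with scikit-learn's quantile-loss gradient boosting is shown below; the synthetic hourly-demand data and the 95th-percentile target are illustrative assumptions, not the paper's models or dataset.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic stand-in for historical utilisation: hour of day -> CPU demand.
rng = np.random.default_rng(0)
hours = rng.integers(0, 24, 2000).reshape(-1, 1)
demand = 40 + 25 * np.sin(hours[:, 0] * np.pi / 12) + rng.normal(0, 8, 2000)

# Predicting the 95th percentile of demand instead of the mean lets the
# operator provision just enough capacity to meet the SLA most of the time.
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(hours, demand)
mean = GradientBoostingRegressor(loss="squared_error").fit(hours, demand)

peak_hour = np.array([[12]])
print(f"mean estimate: {mean.predict(peak_hour)[0]:.1f}%")
print(f"95th-percentile provisioning target: {q95.predict(peak_hour)[0]:.1f}%")
```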

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 103
24977 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information known as a watermark into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing watermarks in images. The main consideration for any watermarking scheme is its robustness to various attacks.
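
A hybrid DCT-DWT embedding can be sketched briefly: transform the image with a one-level DWT, apply a 2-D DCT to the approximation subband, and perturb selected coefficients according to the watermark bits. The sketch below, using PyWavelets and SciPy, is a minimal non-blind illustration with arbitrary coefficient positions and strength, not the scheme evaluated in the paper.

```python
import numpy as np
import pywt
from scipy.fftpack import dct, idct

def embed(image, bits, strength=10.0):
    """DWT-DCT embedding sketch: DCT the LL subband of a 1-level DWT and
    nudge selected coefficients up or down according to the bits."""
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), "haar")
    C = dct(dct(LL, axis=0, norm="ortho"), axis=1, norm="ortho")
    for k, b in enumerate(bits):        # one coefficient per bit
        C[5 + k, 5] += strength if b else -strength  # illustrative positions
    LL_w = idct(idct(C, axis=1, norm="ortho"), axis=0, norm="ortho")
    return pywt.idwt2((LL_w, (LH, HL, HH)), "haar")

def extract(original, marked, n_bits):
    """Non-blind extraction: compare coefficients against the original."""
    def coeffs(img):
        LL, _ = pywt.dwt2(img.astype(float), "haar")
        return dct(dct(LL, axis=0, norm="ortho"), axis=1, norm="ortho")
    C0, C1 = coeffs(original), coeffs(marked)
    return [int(C1[5 + k, 5] - C0[5 + k, 5] > 0) for k in range(n_bits)]

img = np.random.randint(0, 256, (64, 64))
bits = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed(img, bits)
print(extract(img, marked, len(bits)) == bits)  # True
```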

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 416
24976 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture created using AWS services that brings industry best practices to workflow management and simplifies model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
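
The separation of stages the abstract lists can be illustrated with plain functions, each independently ownable by a scientist or data engineer. The names and the scikit-learn model below are illustrative assumptions, not the authors' AWS implementation.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

def source_data() -> pd.DataFrame:                      # data sourcing
    return pd.DataFrame({"x1": [0, 1, 2, 3], "y": [0, 0, 1, 1]})

def build_features(df: pd.DataFrame) -> pd.DataFrame:   # feature engineering
    df["x1_sq"] = df["x1"] ** 2
    return df

def train(df: pd.DataFrame):                            # model training
    return LogisticRegression().fit(df[["x1", "x1_sq"]], df["y"])

def vend(model, df: pd.DataFrame) -> pd.DataFrame:      # output vending
    df["score"] = model.predict_proba(df[["x1", "x1_sq"]])[:, 1]
    return df  # in production this would be written to a data store

df = build_features(source_data())
print(vend(train(df), df))
```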

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 58
24975 Utilization of Bauxite Residue in Construction Materials: An Experimental Study

Authors: Ryan Masoodi, Hossein Rostami

Abstract:

Aluminum has been credited with the massive advancement of many industrial products, from aerospace and automotive to electronics and even household appliances. These developments have come at a cost: a toxic by-product. The rise of aluminum production has been accompanied by the rise of a waste material called bauxite residue, or red mud. This toxic material has been proven harmful to the environment, yet there is no proper way to dispose of or recycle it. Here, a new experimental method to utilize this waste in building materials is proposed. A method of mixing red mud, fly ash, and some other ingredients is explored to create a new construction material that can satisfy the minimum required strength for bricks. The study concludes that it is possible to produce bricks with sufficient strength for construction in environments with low to moderate weather conditions.

Keywords: bauxite residue, brick, red mud, recycling

Procedia PDF Downloads 161
24974 Comparison between Two Software Packages GSTARS4 and HEC-6 about Prediction of the Sedimentation Amount in Dam Reservoirs and to Estimate Its Efficient Life Time in the South of Iran

Authors: Fatemeh Faramarzi, Hosein Mahjoob

Abstract:

Building dams on rivers for the utilization of water resources disturbs the hydrodynamic equilibrium and causes all or part of the sediments carried by the water to settle in the dam reservoir. This phenomenon also has significant impacts on the water and sediment flow regime and, in the long term, can cause morphological changes in the environment surrounding the river, reducing the useful life of the reservoir, which threatens sustainable development through inefficient management of water resources. In the past, empirical methods were used to predict the amount of sedimentation in dam reservoirs and to estimate their useful lifetime; recently, mathematical and computational models have become widely used as suitable tools in reservoir sedimentation studies. These models usually solve the equations using the finite element method. This study compares the results from two software packages, GSTARS4 and HEC-6, in predicting the amount of sedimentation in the Dez Dam, southern Iran. Each model provides a one-dimensional, steady-state simulation of sediment deposition and erosion by solving the equations of momentum, flow and sediment continuity, and sediment transport. GSTARS4 (Generalized Sediment Transport Model for Alluvial River Simulation) is based on a one-dimensional mathematical model that simulates bed changes in both longitudinal and transverse directions by using flow tubes in a quasi-two-dimensional scheme; it was used to calibrate a 47-year period and forecast the next 47 years of sedimentation in the Dez Dam. This dam, at 203 m, is among the highest dams in the world; it irrigates more than 125,000 hectares of downstream land and plays a major role in flood control in the region. The input data, including geometry, hydraulic, and sedimentary data, cover 1955 to 2003 on a daily basis. To predict future river discharge, the time series data were assumed to repeat after 47 years. The result obtained was very satisfactory in the delta region: the output from GSTARS4 was almost identical to the 2003 hydrographic profile. In the Dez reservoir, however, because of its length (65 km) and large volume, vertical currents are dominant, making the calculations by the above-mentioned method inaccurate. To solve this problem, we used the empirical reduction method to calculate sedimentation in the downstream area, which led to very good results. Thus, by combining these two methods, a very suitable sedimentation model for the Dez Dam over the study period can be obtained; the present study demonstrated that the outputs of both methods agree.

Keywords: Dez Dam, prediction, sedimentation, water resources, computational models, finite element method, GSTARS4, HEC-6

Procedia PDF Downloads 309
24973 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effective these representations are at reducing the cost for the correct correspondence relative to other possible matches.
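
A local matching cost of the kind discussed can be written in a few lines. The sketch below computes a sum-of-absolute-differences (SAD) cost over a window for a range of candidate disparities; feeding in images encoded in different colour spaces (grayscale, RGB, etc.) is how the representations would be compared. The window size, test coordinates, and synthetic disparity are illustrative assumptions.

```python
import numpy as np

def sad_cost(left, right, x, y, d, w=3):
    """Sum-of-absolute-differences matching cost for candidate disparity d,
    summed over all channels of whatever colour representation is used."""
    lw = left[y - w:y + w + 1, x - w:x + w + 1]
    rw = right[y - w:y + w + 1, x - d - w:x - d + w + 1]
    return np.abs(lw.astype(float) - rw.astype(float)).sum()

rng = np.random.default_rng(1)
right = rng.integers(0, 256, (50, 80, 3))
left = np.roll(right, 4, axis=1)        # synthetic 4-pixel disparity

costs = [sad_cost(left, right, x=40, y=25, d=d) for d in range(8)]
print("best disparity:", int(np.argmin(costs)))  # expect 4
```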

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 365
24972 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high volumes and a broad variety of data through the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. The recently decentralized data management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
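
The core distributed data-mining pattern, where local miners summarise their own partition and a coordinator merges the summaries, can be sketched compactly. This is an illustrative toy (item counting with a thread pool standing in for agents), not the agent-oriented-modeling artifact the paper develops.

```python
from collections import Counter
from multiprocessing.dummy import Pool  # thread pool stand-in for agents

# Each "mining agent" works on its local partition and ships only a small
# summary (here, item counts) to the coordinator, which merges them.
partitions = [
    ["bread", "milk", "bread"],
    ["milk", "eggs", "milk"],
    ["bread", "eggs", "bread", "milk"],
]

def local_mine(transactions):
    return Counter(transactions)

with Pool(3) as agents:
    summaries = agents.map(local_mine, partitions)

global_counts = sum(summaries, Counter())
print(global_counts.most_common(2))
```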

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 424
24971 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates on one chip, following Moore’s law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing and noise data. Running speed is important when applying this data-mining tool in real applications; the software employs table look-up techniques to achieve reasonable running speed, based on performance testing results. We added several advanced features for its application to one industrial chip design.
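
The extract-from-HTML-then-look-up pattern can be illustrated with pandas. The table fragment and column names below are hypothetical, not those of any particular EDA tool's report format.

```python
import pandas as pd

# Hypothetical fragment of a timing-report HTML table.
html = """
<table>
  <tr><th>path</th><th>slack_ns</th><th>noise_mv</th></tr>
  <tr><td>clk->reg_a</td><td>-0.12</td><td>35</td></tr>
  <tr><td>clk->reg_b</td><td>0.40</td><td>12</td></tr>
  <tr><td>clk->reg_c</td><td>-0.03</td><td>48</td></tr>
</table>
"""

report = pd.read_html(html)[0]
# Table look-up style access: index by path once, then query repeatedly.
by_path = report.set_index("path")
critical = report[report["slack_ns"] < 0]  # paths violating timing
print(critical)
print(by_path.loc["clk->reg_b", "noise_mv"])
```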

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 249
24970 Possibilities of Utilization Zeolite in Concrete

Authors: M. Sedlmajer, J. Zach, J. Hroudova, P. Rovnaníkova

Abstract:

There are several ways of reducing the required amount of cement in concrete production. Natural zeolite is one of the raw materials that can partly substitute for Portland cement. The effort to reduce the amount of Portland cement used in concrete production brings both economic and ecological benefits. The paper presents the properties of concrete containing natural zeolite as an active admixture that partly substitutes for Portland cement. The properties discussed here cover the basic mechanical properties and frost resistance of concrete containing zeolite. The properties of concretes with the zeolite admixture are compared with those of a reference concrete containing no zeolite. The properties of the individual concretes were observed for 360 days.

Keywords: concrete, zeolite, compressive strength, modulus of elasticity, durability

Procedia PDF Downloads 361
24969 Green-Y Model for Preliminary Sustainable Economical Concept of Renewable Energy Sources Deployment in ASEAN Countries

Authors: H. H. Goh, K. C. Goh, W. N. Z. S. Wan Sukri, Q. S. Chua, S. W. Lee, B. C. Kok

Abstract:

ASEAN countries are well endowed with renewable energy sources (RES), yet they use only a small amount of RES to generate electricity because their primary energy sources are fossil fuels and coal. Fossil fuels and coal are cheap to purchase now, but they may become expensive soon, as these resources will eventually be depleted. Experience in ASEAN shows that RES are practical to implement, and some ASEAN countries already have huge RES potential and usage. The primary aim of this project is to assist ASEAN countries in preparing for renewable energy and to guide RES policies in a sounder direction. The Green-Y model will help ASEAN governments study and forecast the economics of RES deployment, including feed-in tariffs.
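
As a toy illustration of the kind of economic screening a feed-in tariff enables, the following computes a simple payback period. Every figure is a placeholder assumption, not an output of the Green-Y model.

```python
# Illustrative payback calculation for a RES project under a feed-in tariff.
capex = 1_200_000.0          # installation cost, USD (placeholder)
annual_output_kwh = 1_750_000
tariff_per_kwh = 0.09        # guaranteed feed-in tariff, USD/kWh (placeholder)
opex_per_year = 30_000.0

annual_net_revenue = annual_output_kwh * tariff_per_kwh - opex_per_year
payback_years = capex / annual_net_revenue
print(f"simple payback: {payback_years:.1f} years")
```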

Keywords: ASEAN RES, renewable energy, RES policies, RES potential, RES utilization

Procedia PDF Downloads 494
24968 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use'; it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients and, at the same time, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases, and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL) by introducing an e-health solution. The study consisted of three components. A baseline study assessed the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility, and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). A post-intervention assessment confirmed that all five dimensions of data quality had improved; the most significant improvements were noticed in the accuracy and timeliness dimensions.

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 266
24967 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (highD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the highD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R for processing these data. We analyze the involvement and relationships of the variables characterising the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
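
Extracting lane-change events from trajectory data reduces to finding frames where a vehicle's lane assignment changes. The minimal sketch below uses pandas with a toy frame; 'id' and 'laneId' follow the highD tracks-file naming, but treat this as an illustration rather than the authors' R pipeline.

```python
import pandas as pd

# Toy stand-in for a highD tracks file: per-frame lane assignment.
tracks = pd.DataFrame({
    "id":     [1, 1, 1, 1, 2, 2, 2],
    "frame":  [1, 2, 3, 4, 1, 2, 3],
    "laneId": [2, 2, 3, 3, 4, 4, 4],
})

tracks = tracks.sort_values(["id", "frame"])
# A lane change is any frame where laneId differs from the previous frame
# of the same vehicle (per-group diff; the first frame never counts).
changed = tracks.groupby("id")["laneId"].diff().fillna(0) != 0
events = tracks[changed]
print(events)  # vehicle 1 changes lane at frame 3; vehicle 2 never does
```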

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 250
24966 Utilization of “Adlai” (Coix lacryma-jobi L) Flour as Wheat Flour Extender in Selected Baked Products in the Philippines

Authors: Rolando B. Llave Jr.

Abstract:

In many countries, wheat flour is used as an essential component in the production of bread and other baked products, which are considered to have a significant role in the human diet. Partial replacement of wheat flour with other flours (composite flour) in the preparation of these products is seen as a solution to the scarcity of wheat flour in non-wheat-producing countries and as a route to improved nourishment. In composite flour, the other flours may come from cereals, legumes, root crops, and other starch-rich sources; many countries utilize whatever is locally available. “Adlai”, or Job’s tears, is a tall cereal plant that belongs to the same grass family as wheat, rice, and corn. In some countries, it is used as an ingredient in many dishes and in alcoholic and non-alcoholic beverages. As part of the Food Staple Self-Sufficiency Program (FSSP) of the Department of Agriculture (DA) in the Philippines, “adlai” is being promoted as an alternative food source for Filipinos. In this study, grits from “adlai” seeds were milled into flour. The resulting flour was then used as a partial replacement for wheat flour in selected baked products, namely “pan de sal” (salt bread), cupcakes, and cookies. The supplementation of “adlai” flour ranged from 20% to 45%: 20%-35% for “pan de sal”, 30%-45% for cupcakes, and 25%-40% for cookies. The study was composed of four phases. Phase I covered product formulation studies. Phase II included the acceptability test/sensory evaluation of the baked products, followed by statistical analysis of the gathered data. Phase III was the computation of the theoretical protein content of the most acceptable “pan de sal”, cupcake, and cookie. Lastly, in Phase IV, cost-benefit was analyzed, specifically in terms of direct material cost.

Keywords: “adlai”, composite flour, supplementation, sensory evaluation

Procedia PDF Downloads 859
24965 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth dose (PDD) curves and in-plane and cross-plane profiles of the Varian golden beam data with measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate MUs and dose distributions. Varian offers a generic set of beam data as reference data but does not recommend it for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic data are reliable enough for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields were measured for different field sizes and at different depths, as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ionization chamber and MEPHYSTO software. The online-available Varian golden beam data were compared with the measured data to evaluate whether the golden beam data are accurate enough to be used for commissioning the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. For PDDs, the deviation increases at deeper depths; similarly, profiles show increasing deviation at larger field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance and the golden beam data can therefore be used in the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
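
The comparison itself is a point-by-point percent deviation between curves. The sketch below illustrates the check against a 2% tolerance; the depth and dose values are made-up examples, not the study's measurements.

```python
import numpy as np

# Percent deviation between a measured PDD curve and the vendor
# reference ("golden") curve at matching depths.
depth_cm = np.array([1.5, 5.0, 10.0, 20.0])
measured = np.array([100.0, 86.2, 67.1, 38.9])
golden   = np.array([100.0, 86.9, 66.4, 38.2])

deviation = 100.0 * (measured - golden) / golden
print(np.round(deviation, 2))
print("within 2% tolerance:", bool(np.all(np.abs(deviation) <= 2.0)))
```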

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 483
24964 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with a different computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
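
The recursive coupling idea can be sketched with two plain Gaussian-process surrogates: fit one to plentiful low-fidelity data, then fit a second to the discrepancy at the scarce high-fidelity points. This Kennedy-O'Hagan-style sketch omits the gradient enhancement and fixes the scaling factor at one, so treat it as the coupling concept only, not the paper's formulation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f_lo = lambda x: np.sin(8 * x)                 # cheap, biased model
f_hi = lambda x: np.sin(8 * x) + 0.3 * x       # expensive model

X_lo = np.linspace(0, 1, 21).reshape(-1, 1)    # plentiful cheap data
X_hi = np.linspace(0, 1, 5).reshape(-1, 1)     # scarce expensive data

gp_lo = GaussianProcessRegressor(RBF(0.2)).fit(X_lo, f_lo(X_lo).ravel())
# The correction GP models the discrepancy between the fidelities.
resid = f_hi(X_hi).ravel() - gp_lo.predict(X_hi)
gp_delta = GaussianProcessRegressor(RBF(0.2)).fit(X_hi, resid)

X_test = np.array([[0.37]])
pred = gp_lo.predict(X_test) + gp_delta.predict(X_test)
print(pred[0], float(f_hi(X_test)[0, 0]))
```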

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 549
24963 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes to generate a large variety of data close to actual shooting environments. Comparisons with previous works and evaluations on our own data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for barcode scanning and is applicable to real-time applications.
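
One step of such an augmentation chain, taking a clean synthetic barcode toward realistic shooting conditions, can be sketched with OpenCV. The warp range, blur kernel, and noise level are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
import cv2

def augment(barcode_img, background):
    """Synthetic-to-real style augmentation: random perspective warp,
    paste onto a background, then defocus blur and sensor noise."""
    h, w = barcode_img.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    jitter = np.random.uniform(-0.08, 0.08, (4, 2)) * [w, h]
    M = cv2.getPerspectiveTransform(src, (src + jitter).astype(np.float32))
    warped = cv2.warpPerspective(barcode_img, M, (w, h))

    out = background.copy()
    out[:h, :w] = warped                       # paste onto a background
    out = cv2.GaussianBlur(out, (5, 5), 1.0)   # defocus
    noise = np.random.normal(0, 6, out.shape)  # sensor noise
    return np.clip(out + noise, 0, 255).astype(np.uint8)

barcode = (np.indices((60, 120))[1] // 4 % 2 * 255).astype(np.uint8)
scene = np.full((200, 200), 180, np.uint8)
sample = augment(barcode, scene)
print(sample.shape, sample.dtype)
```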

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 159
24962 The Design of a Die for the Processing of Aluminum through Equal Channel Angular Pressing

Authors: P. G. F. Siqueira, N. G. S. Almeida, P. M. A. Stemler, P. R. Cetlin, M. T. P. Aguilar

Abstract:

The processing of metals through Equal Channel Angular Pressing (ECAP) leads to their remarkable strengthening. ECAP dies control the amount of strain imposed on the material through their geometry, especially through the angle between the die channels, and thus the evolution of the material's microstructural and mechanical properties. The present study describes the design of an ECAP die that is easy to use and maintain and that also controls any undesired flow of the material during processing. The proposed design was validated through numerical simulation procedures using commercial software. The die was manufactured according to this design and tested. Tests with aluminum alloys indicated that the die is also suitable for processing higher-strength alloys.

Keywords: ECAP, mechanical design, numerical methods, SPD

Procedia PDF Downloads 135
24961 In-Vitro Dextran Synthesis and Characterization of an Intracellular Glucosyltransferase from Leuconostoc Mesenteroides AA1

Authors: Afsheen Aman, Shah Ali Ul Qader

Abstract:

Dextransucrase [EC 2.4.1.5] is a glucosyltransferase that catalyzes the biosynthesis of a natural biopolymer called dextran, transferring D-glucopyranosyl residues from sucrose to the main chain of dextran. This unique biopolymer has multiple applications in several industries, and the key to its utilization lies in its molecular weight and type of branching. Extracellular dextransucrase from Leuconostoc mesenteroides is the most extensively studied and characterized; limited data are available on cell-bound or intracellular dextransucrase and on the characterization of dextran produced by in-vitro reaction of the intracellular enzyme. L. mesenteroides AA1 is reported to produce an extracellular dextransucrase that catalyzes the biosynthesis of a high-molecular-weight dextran with only α-(1→6) linkages. The current study deals with the characterization of an intracellular dextransucrase and the in-vitro biosynthesis of low-molecular-weight dextran from L. mesenteroides AA1. Intracellular dextransucrase was extracted from the cytoplasm and purified to homogeneity for characterization. The kinetic constants, molecular weight, and N-terminal sequence of the intracellular dextransucrase reveal unique variations from the previously reported extracellular dextransucrase of the same strain, and the in-vitro synthesized biopolymer was characterized using NMR spectroscopic techniques. The intracellular dextransucrase exhibited Vmax and Km values of 130.8 DSU ml⁻¹ hr⁻¹ and 221.3 mM, respectively. Optimum catalytic activity was detected at 35°C in 0.15 M citrate phosphate buffer (pH 5.5) in 5 minutes. The molecular mass of the purified intracellular dextransucrase is approximately 220.0 kDa on SDS-PAGE. The N-terminal sequence of the intracellular enzyme is GLPGYFGVN, which shows no homology with the previously reported sequence for the extracellular dextransucrase. This intracellular dextransucrase is capable of in-vitro synthesis of dextran under specific conditions, and the biopolymer can be hydrolyzed into different molecular weight fractions for various applications.

Keywords: characterization, dextran, dextransucrase, leuconostoc mesenteroides

Procedia PDF Downloads 391
24960 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network (PON). This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to an increase in the bit error rate.

Keywords: FTTH, quad play, triple play service, access networks, data rate

Procedia PDF Downloads 401
24959 Effects of Probiotic Pseudomonas fluorescens on the Growth Performance, Immune Modulation, and Histopathology of African Catfish (Clarias gariepinus)

Authors: Nelson R. Osungbemiro, O. A. Bello-Olusoji, M. Oladipupo

Abstract:

This study was carried out to determine the effects of the probiotic Pseudomonas fluorescens on the growth performance, histology, and immune modulation of African catfish (Clarias gariepinus) challenged with Clostridium botulinum. P. fluorescens and C. botulinum were isolated from the gut, gill, and skin of adult samples of Clarias gariepinus procured from commercial fish farms in Akure, Ondo State, Nigeria. Physical and biochemical tests were performed on the bacterial isolates using standard microbiological techniques for their identification. Antibacterial activity tests on P. fluorescens showed an inhibition zone with a mean value of 3.7 mm, which indicates a high level of antagonism. The experimental diets were prepared at different probiotic bacterial concentrations, comprising five treatments: the control (T1), T2 (10³), T3 (10⁵), T4 (10⁷), and T5 (10⁹), with three replicates for each treatment. Growth performance and nutrient utilization indices were calculated, and the proximate analysis of fish carcass and experimental diet was carried out using standard methods. After feeding for 70 days, haematological values and histological tests were obtained following standard methods; a subgroup from each experimental treatment was also challenged by intraperitoneal (I/P) inoculation with different concentrations of pathogenic C. botulinum. Statistically, there were significant differences (P < 0.05) in the growth performance and nutrient utilization of C. gariepinus. The best weight gain and feed conversion ratio were recorded in fish fed T4 (10⁷), and the poorest values were obtained in the control. Haematological analyses indicated that all fish fed diets with P. fluorescens had significantly (p < 0.05) higher white blood cell counts than those fed the control diet. The results of the challenge test showed that fish fed the control diet had the highest mortality rate. Histological examination of the gill, intestine, and liver showed several histopathological alterations in fish fed the control diet compared with those fed the P. fluorescens diets. The study indicated that the optimum level of P. fluorescens required for C. gariepinus growth and white blood cell formation is 10⁷ CFU g⁻¹, while carcass protein deposition required a concentration of 10⁵ CFU g⁻¹. The study also confirmed P. fluorescens as an efficient probiotic capable of improving the immune response of C. gariepinus against attack by the virulent fish pathogen C. botulinum.

Keywords: Clarias gariepinus, Clostridium botulinum, probiotics, Pseudomonas fluorescens

Procedia PDF Downloads 153
24958 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods for analysing the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come at a cost, and little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at the various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and a classification for taking manufacturing digital data from various sources and determining the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision-making on which datasets should be processed at the 'edge' and which should be sent to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
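
A fast-and-frugal heuristic is a short ordered list of cues where the first cue that fires makes the decision, with no weighting or combination. The sketch below shows the pattern for edge/cloud placement; the cue names and thresholds are illustrative assumptions, not the framework's attributes.

```python
# Cues are checked one at a time in order of importance; the first one
# that fires decides, which keeps the routing decision cheap and fast.
def place(dataset):
    if dataset["latency_critical"]:          # cue 1: must act immediately
        return "edge"
    if dataset["size_mb"] > 500:             # cue 2: too costly to transmit
        return "edge"
    if dataset["needs_history"]:             # cue 3: requires archived data
        return "cloud"
    return "cloud"                           # default: cheap to centralise

spindle_vibration = {"latency_critical": True, "size_mb": 20, "needs_history": False}
monthly_energy_log = {"latency_critical": False, "size_mb": 3, "needs_history": True}
print(place(spindle_vibration), place(monthly_energy_log))  # edge cloud
```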

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 173
24957 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio, and additionally apply a wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
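
The two-step recipe, average repeated transients and then wavelet-threshold the result, can be sketched with PyWavelets. The decay model, noise level, wavelet choice, and universal threshold below are generic assumptions, not the FASTSNAP processing chain itself.

```python
import numpy as np
import pywt

def denoise(stacked_decays, wavelet="db4"):
    """Signal averaging across repeated transients, then soft wavelet
    thresholding with a MAD noise estimate and universal threshold."""
    avg = stacked_decays.mean(axis=0)                # raise SNR by averaging
    coeffs = pywt.wavedec(avg, wavelet)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(avg)))      # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(avg)]

t = np.linspace(1e-5, 1e-2, 1024)
clean = t ** -1.5 * 1e-6                             # idealised decay curve
repeats = clean + np.random.normal(0, 2e-4, (32, t.size))
print("denoised error:", np.std(denoise(repeats) - clean))
print("raw error:     ", np.std(repeats[0] - clean))
```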

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 80
24956 Awareness on Department of Education’s Disaster Risk Reduction Management Program at Oriental Mindoro National High School: Basis for Support School DRRM Program

Authors: Nimrod Bantigue

Abstract:

The Department of Education continuously works to provide safe teaching-learning facilities and hazard-free environments for learners. To achieve this goal, teachers' awareness of DepEd's DRRM programs and activities is extremely important; thus, this descriptive correlational quantitative study was conceptualized. The research answered four questions on the profile and level of awareness of the 153 teacher respondents of Oriental Mindoro National High School for the academic year 2018-2019. Stratified proportional sampling was employed, and both descriptive and inferential statistics were used to treat the data. The findings revealed that the majority of the teachers at OMNHS are female and in the 20-40 age bracket; most are married and pursue graduate studies. They have moderate awareness of the Department of Education's DRRM programs and activities in terms of risk assessment activities, planning activities, implementation activities during disasters, and evaluation and monitoring activities, with computed means of 3.32, 3.12, 3.40, and 3.31, respectively. Further, the results showed a significant relationship between the level of awareness and respondent profile variables such as age, civil status, and educational attainment; in contrast, sex had no significant relationship with the level of awareness. The Support School DRRM program, with a utilization guide on the School DRRM Manual, was proposed to strengthen the weakest-rated areas of awareness in each DRRM activity: risk assessment, planning, implementation during disasters, and monitoring and evaluation.

Keywords: awareness, management, monitoring, risk reduction

Procedia PDF Downloads 210
24955 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning

Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah

Abstract:

Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.
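
The core pipeline, a convolutional autoencoder for dimensionality reduction followed by K-means on the latent codes, can be sketched compactly. The toy below uses PyTorch and scikit-learn with illustrative layer sizes and random stand-in frames; the paper's discriminative-patch extraction and trainable inverse average layer are not reproduced.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class ConvAE(nn.Module):
    """Small convolutional autoencoder: 64x64 frame -> 32-D latent code."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(), nn.Linear(16 * 16 * 16, 32),
        )
        self.dec = nn.Sequential(
            nn.Linear(32, 16 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (16, 16, 16)),
            nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(8, 1, 2, stride=2), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), z

frames = torch.rand(64, 1, 64, 64)            # stand-in echo frames
model = ConvAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                            # short demo training loop
    recon, _ = model(frames)
    loss = nn.functional.mse_loss(recon, frames)
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    _, latents = model(frames)
views = KMeans(n_clusters=4, n_init=10).fit_predict(latents.numpy())
print(views[:10])                             # cluster label per frame
```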

Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning

Procedia PDF Downloads 20
24954 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Recently, quick response (QR) code payment systems have become popular. Many companies are introducing new QR code payment services, and the services compete with each other to increase their numbers of users. To increase the number of users, one should grasp the differences between services in the demographic information, usage information, and value of their users. In this study, we analyse real-world data provided by the Nomura Research Institute, including demographic data and usage information for users of two services, LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used for analysing and interpreting such matrix-shaped data; however, the target data suffer from missing values. We therefore use EM-algorithm NMF (EMNMF), which completes unknown values, to understand the features of the given data. Moreover, to compare the results of NMF analyses of two matrices, Discriminant NMF (DNMF) reveals the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF and analyse the target data. As the interpretation, we show the differences in user features between LINE Pay and PayPay.
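
The missing-data aspect can be illustrated with mask-weighted multiplicative updates, where unobserved cells simply do not contribute to the factorisation. This is a plain sketch of the EMNMF idea, not the authors' exact algorithm, and the toy matrix is invented.

```python
import numpy as np

def em_nmf(V, mask, rank=3, iters=200, eps=1e-9):
    """NMF with missing entries: multiplicative updates weighted by a
    0/1 mask so unobserved cells do not drive the factorisation."""
    n, m = V.shape
    rng = np.random.default_rng(0)
    W, H = rng.random((n, rank)), rng.random((rank, m))
    Vf = np.nan_to_num(V)                   # zeros where data are missing
    for _ in range(iters):
        WH = W @ H
        W *= ((mask * Vf) @ H.T) / ((mask * WH) @ H.T + eps)
        WH = W @ H
        H *= (W.T @ (mask * Vf)) / (W.T @ (mask * WH) + eps)
    return W, H

# Toy users-by-attributes matrix with missing survey answers as NaN.
V = np.array([[5.0, 3.0, np.nan], [4.0, np.nan, 1.0], [1.0, 1.0, 5.0]])
mask = ~np.isnan(V)
W, H = em_nmf(V, mask.astype(float), rank=2)
print(np.round(W @ H, 1))  # completed matrix estimate
```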

Keywords: data science, non-negative matrix factorization, missing data, quality of services

Procedia PDF Downloads 124