Search results for: automated system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18130

17800 Fully Autonomous Vertical Farm to Increase Crop Production

Authors: Simone Cinquemani, Lorenzo Mantovani, Aleksander Dabek

Abstract:

New technologies in agriculture are opening new challenges and new opportunities. Among these, robotics, vision, and artificial intelligence are certainly the ones that will make possible a significant leap compared to traditional agricultural techniques. In particular, the indoor farming sector will be the one that benefits the most from these solutions. Vertical farming is a new field of research where mechanical engineering can bring knowledge and know-how to transform a highly labor-based business into a fully autonomous system. The aim of the research is to develop a multi-purpose, modular, and perfectly integrated platform for crop production in indoor vertical farming. Activities will be based both on hardware development, such as automatic tools to perform different activities on soil and plants, and on research to introduce extensive use of monitoring techniques based on machine learning algorithms. This paper presents the preliminary results of a research project on a vertical farm living lab designed to (i) develop and test vertical farming cultivation practices, (ii) introduce a very high degree of mechanization and automation that makes all processes replicable, fully measurable, standardized, and automated, (iii) develop a coordinated control and management environment for autonomous or tele-operated multiplatform robots carrying out complex tasks in the presence of environmental and cultivation constraints, and (iv) integrate AI-based algorithms as a decision support system to improve production quality. The coordinated management of multiplatform systems still presents innumerable challenges that require a strongly multidisciplinary approach right from the design, development, and implementation phases. The methodology is based on (i) the development of models capable of describing the dynamics of the various platforms and their interactions, (ii) the integrated design of mechatronic systems able to respond to the needs of the context and to exploit the strengths highlighted by the models, and (iii) implementation and experimental tests performed to verify the real effectiveness of the systems created and to evaluate any weaknesses so as to proceed with targeted development. To these ends, a fully automated laboratory for growing plants in vertical farming has been developed and tested. The living lab makes extensive use of sensors to determine the overall state of the structure, crops, and systems used. The possibility of having specific measurements for each element involved in the cultivation process makes it possible to evaluate the effects of each variable of interest and allows for the creation of a robust model of the system as a whole. The automation of the laboratory is completed with the use of robots to carry out all the necessary operations, from sowing to handling to harvesting. These systems work synergistically thanks to detailed models developed from the information collected, which deepen the knowledge of these types of crops and guarantee the possibility of tracing every action performed on each single plant. To this end, artificial intelligence algorithms have been developed to allow the synergistic operation of all systems.

Keywords: automation, vertical farming, robot, artificial intelligence, vision, control

Procedia PDF Downloads 48
17799 Automated End-to-End Pipeline Processing Solution for Autonomous Driving

Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi

Abstract:

Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research into making robust, reliable, and intelligent programs that can perceive and understand their environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the amount of data directly affects the performance of the system. Researchers have to design a preprocessing pipeline for each dataset, with its particular sensor orientations and alignments, before the dataset can be fed to the model. This paper proposes a solution that unifies data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensors used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also enables easy adoption of new or in-house generated datasets. The solution further automates the complete deep learning pipeline, from preprocessing to post-processing, for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of input and output data handling, saving the time and effort spent on it and leaving more time for model improvement.
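
As a rough illustration of the data-unification idea (not the authors' actual code), the sketch below projects lidar points into a camera image using a hypothetical 4x4 extrinsic transform and 3x3 intrinsic matrix:

```python
# Illustrative sketch: unifying lidar points into a camera frame using the
# sensor's extrinsic and intrinsic parameters (calibration values are made up).
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_from_lidar, K):
    """Project Nx3 lidar points into pixel coordinates of a camera."""
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])       # Nx4 homogeneous points
    cam = (T_cam_from_lidar @ homo.T).T[:, :3]               # Nx3 in camera frame
    cam = cam[cam[:, 2] > 0]                                 # keep points in front of the camera
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3]                          # perspective division

# Hypothetical calibration: identity extrinsics and a generic pinhole intrinsic matrix
T = np.eye(4)
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
points = np.random.rand(100, 3) * 10.0
print(project_lidar_to_image(points, T, K).shape)
```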

Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing

Procedia PDF Downloads 130
17798 Automatic Seizure Detection Using Weighted Permutation Entropy and Support Vector Machine

Authors: Noha Seddik, Sherine Youssef, Mohamed Kholeif

Abstract:

The automated epileptic seizure detection research field has emerged in recent years; it involves analyzing electroencephalogram (EEG) signals instead of relying on the traditional visual inspection performed by expert neurologists. In this study, a Support Vector Machine (SVM) that uses Weighted Permutation Entropy (WPE) as the input feature is proposed for classifying normal and seizure EEG records. WPE is a modified form of permutation entropy (PE) that measures the complexity and irregularity of a time series. It incorporates both the mapped ordinal pattern of the time series and the information contained in the amplitude of its sample points. The proposed system exploits the fact that entropy-based measures for EEG segments during an epileptic seizure are lower than those for normal EEG.
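
A minimal sketch of the WPE feature feeding an SVM, assuming single-channel EEG segments stored as 1-D NumPy arrays and toy random data in place of real recordings, might look like this:

```python
# Weighted permutation entropy (WPE) as a single SVM feature -- illustrative only.
import numpy as np
from itertools import permutations
from sklearn.svm import SVC

def weighted_permutation_entropy(x, m=3, tau=1):
    patterns = {p: 0.0 for p in permutations(range(m))}
    for i in range(len(x) - (m - 1) * tau):
        window = x[i:i + m * tau:tau]
        weight = np.var(window)                       # amplitude information
        patterns[tuple(np.argsort(window))] += weight # ordinal pattern of the window
    total = sum(patterns.values())
    probs = np.array([v / total for v in patterns.values() if v > 0])
    return -np.sum(probs * np.log2(probs))

# Stand-in data: rows are EEG segments, labels 0 = normal, 1 = seizure
segments = np.random.randn(20, 1024)
labels = np.random.randint(0, 2, 20)
features = np.array([[weighted_permutation_entropy(s)] for s in segments])
clf = SVC(kernel="rbf").fit(features, labels)
print(clf.predict(features[:5]))
```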

Keywords: electroencephalogram (EEG), epileptic seizure detection, weighted permutation entropy (WPE), support vector machine (SVM)

Procedia PDF Downloads 375
17797 A System Functions Set-Up through Near Field Communication of a Smartphone

Authors: Jaemyoung Lee

Abstract:

We present a method to set up system functions through near field communication (NFC) of a smartphone. The short communication distance of NFC, usually less than 4 cm, prevents interference from other devices and establishes a secure communication channel between a system and the smartphone. The proposed set-up method for system function values is demonstrated on a blackbox system in a car. In the demonstration, the blackbox functions manipulated through the smartphone's NFC include image quality, sound level, and the shock-sensing level at which images are stored. The proposed set-up method for system function values can be used for any device with NFC.

Keywords: system set-up, near field communication, smartphone, android

Procedia PDF Downloads 339
17796 Radical Web Text Classification Using a Composite-Based Approach

Authors: Kolade Olawande Owoeye, George R. S. Weir

Abstract:

The widespread presence of terrorist and extremist activities on the internet has become a major threat to governments and national security because of their potential dangers, which has necessitated intelligence gathering via the web and real-time monitoring of potential websites for extremist activities. However, manual classification of such content is impractical and time-consuming. In response to this challenge, an automated classification system called the composite technique was developed. This is a computational framework that explores the combination of both semantic and syntactic features of the textual content of a web page. We implemented the framework on a set of extremist webpages that had been subjected to manual classification. We then built a classification model on the data using the J48 decision tree algorithm to measure how well each page can be classified into its appropriate class. The classification results obtained from our method, when compared with other state-of-the-art approaches, indicated a 96% success rate in classifying webpages against the manual classification.
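
To make the classification step concrete, the sketch below trains a decision tree on simple word and bigram features; scikit-learn's DecisionTreeClassifier stands in for Weka's J48, and the tiny corpus and labels are invented:

```python
# Rough sketch of training a tree classifier on textual features of web pages.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

pages = ["call to violent action against the state",
         "community charity event this weekend",
         "join the armed struggle now",
         "local bake sale raises funds for school"]
labels = [1, 0, 1, 0]   # 1 = extremist, 0 = benign (toy labels)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # word and bigram features
    DecisionTreeClassifier(random_state=0)
)
model.fit(pages, labels)
print(model.predict(["violent action this weekend"]))
```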

Keywords: extremist, web pages, classification, semantics, posit

Procedia PDF Downloads 149
17795 AutoML: Comprehensive Review and Application to Engineering Datasets

Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili

Abstract:

The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely-used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
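
As an illustration only (not tied to the paper's engineering datasets), the sketch below runs an off-the-shelf AutoML search with TPOT, one of several available AutoML libraries and assumed to be installed, on a public regression dataset:

```python
# Minimal AutoML sketch: let the library search pipelines and hyperparameters.
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from tpot import TPOTRegressor

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

automl = TPOTRegressor(generations=3, population_size=20,
                       random_state=0, verbosity=2)   # deliberately small search budget
automl.fit(X_train, y_train)
print("held-out R^2:", automl.score(X_test, y_test))
```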

Keywords: automated machine learning, uncertainty, engineering dataset, regression

Procedia PDF Downloads 65
17794 Using Autoencoder as Feature Extractor for Malware Detection

Authors: Umm-E-Hani, Faiza Babar, Hanif Durad

Abstract:

Malware-detection approaches suffer from many limitations, due to which no anti-malware solution has proven reliable enough for detecting zero-day malware. Signature-based solutions depend upon signatures that can be generated only after the malware has surfaced at least once in the cyber world. Another approach, which works by detecting the anomalies caused in the environment, can easily be defeated by diligently and intelligently written malware. Solutions that are trained to observe behavior in order to detect malicious files fail against malware capable of detecting a sandboxed or protected environment. Machine learning and deep learning-based approaches suffer greatly when their models are trained with either an imbalanced dataset or an inadequate number of samples. AI-based anti-malware solutions that have been trained with enough samples often target only a selected feature vector, ignoring the contribution of the remaining features to maliciousness, simply to cope with limited hardware processing power. Our research focuses on producing an anti-malware solution for detecting malicious PE files that circumvents the shortcomings mentioned above. Our proposed framework, which is based on automated feature engineering through autoencoders, trains the model over a fairly large dataset. It focuses on the visual patterns of malware samples and automatically extracts the meaningful part of the visual pattern. Our experiment produced a state-of-the-art accuracy of 99.54% on test data.
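
A minimal sketch of the autoencoder-as-feature-extractor idea, with invented shapes and random stand-in data rather than real PE-file images, could look like this:

```python
# An autoencoder learns a compact representation of flattened byte "images";
# the encoder output then serves as automatically engineered features.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

input_dim = 64 * 64                      # flattened grayscale byte-image (assumed size)
inputs = keras.Input(shape=(input_dim,))
encoded = layers.Dense(256, activation="relu")(inputs)
bottleneck = layers.Dense(64, activation="relu")(encoded)   # learned feature vector
decoded = layers.Dense(256, activation="relu")(bottleneck)
outputs = layers.Dense(input_dim, activation="sigmoid")(decoded)

autoencoder = keras.Model(inputs, outputs)
encoder = keras.Model(inputs, bottleneck)
autoencoder.compile(optimizer="adam", loss="mse")

X = np.random.rand(512, input_dim).astype("float32")        # stand-in samples
autoencoder.fit(X, X, epochs=2, batch_size=64, verbose=0)    # reconstruct the input

features = encoder.predict(X)            # features to feed a downstream classifier
print(features.shape)                    # (512, 64)
```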

Keywords: malware, auto encoders, automated feature engineering, classification

Procedia PDF Downloads 75
17793 BIM-Based Tool for Sustainability Assessment and Certification Documents Provision

Authors: Taki Eddine Seghier, Mohd Hamdan Ahmad, Yaik-Wah Lim, Samuel Opeyemi Williams

Abstract:

The assessment of building sustainability against a specific green benchmark and the preparation of the documents required to receive a green building certification are both considered major challenges for a green building design team. However, this labor-intensive and time-consuming process can take advantage of available Building Information Modeling (BIM) features such as material take-off and scheduling. Furthermore, the workflow can be automated in order to track potentially achievable credit points and provide rating feedback for several design options by using integrated Visual Programming (VP) to handle the parameters stored within the BIM model. Hence, this study proposes a BIM-based tool that uses the Green Building Index (GBI) rating system requirements as a unique input case to evaluate building sustainability in the design stage of the building project life cycle. The tool comprises two key models: first, a model for data extraction, calculation, and classification of achievable credit points in a green template; second, a model for generating the documents required for green building certification. The tool was validated on a BIM model of a residential building, and it serves as proof of concept that building sustainability assessment for GBI certification can be automatically evaluated and documented through BIM.
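
As a purely illustrative sketch of the credit-point classification step, the snippet below maps quantities extracted from a BIM model to toy thresholds; the criterion, thresholds, and point values are invented, not taken from GBI:

```python
# Map an extracted BIM quantity to a (made-up) credit-point scale.
def daylighting_points(window_area: float, floor_area: float) -> int:
    ratio = window_area / floor_area
    if ratio >= 0.30:
        return 2
    if ratio >= 0.20:
        return 1
    return 0

extracted = {"window_area": 180.0, "floor_area": 750.0}   # e.g. from a material take-off
print("daylighting credit points:", daylighting_points(**extracted))
```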

Keywords: green building rating system, GBRS, building information modeling, BIM, visual programming, VP, sustainability assessment

Procedia PDF Downloads 329
17792 Automated Feature Detection and Matching Algorithms for Breast IR Sequence Images

Authors: Chia-Yen Lee, Hao-Jen Wang, Jhih-Hao Lai

Abstract:

In recent years, infrared (IR) imaging has been considered a potential tool to assess the efficacy of chemotherapy and to detect breast cancer early. Regions of tumor growth, with their high metabolic rate and angiogenesis, lead to elevated temperatures. Observing differences between heat maps over the long term helps assess the growth of breast cancer cells and detect breast cancer earlier, and multi-time infrared image alignment is a necessary step in this process. Detecting and matching representative feature points are essential for good image registration and quantitative analysis. However, there are no clear boundaries in infrared images, and the subject's posture differs for each acquisition. Adhesive markers cannot remain on the body surface for a very long period, and it is hard to find anatomical fiducial markers on the body surface. In other words, it is difficult to detect and match features in an IR image sequence. In this study, automated feature detection and matching algorithms based on two types of automatic feature points (i.e., vascular branch points and modified Harris corners) are developed. The preliminary results show that the proposed method can identify representative feature points on IR breast images with 98% accuracy and match them with 93% accuracy.
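
A minimal sketch of the Harris-corner feature-detection step using plain OpenCV is shown below; the file name and response threshold are placeholders, and this is not the authors' modified detector:

```python
# Detect candidate corner feature points on a grayscale IR frame.
import cv2
import numpy as np

gray = cv2.imread("breast_ir_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
if gray is None:
    raise FileNotFoundError("IR frame not found")

response = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
corners = np.argwhere(response > 0.01 * response.max())   # keep strong responses only
print(f"detected {len(corners)} candidate corners")
```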

Keywords: Harris corner, infrared image, feature detection, registration, matching

Procedia PDF Downloads 306
17791 New Technologies in Corporate Finance Management in the Digital Economy: Case of Kyrgyzstan

Authors: Marat Kozhomberdiev

Abstract:

The research will investigate the modern corporate finance management technologies currently used in the era of digitalization of the global economy and the degree to which financial institutions in Kyrgyzstan are utilizing these new technologies in corporate finance management. The main purpose of the research is to reveal the role of financial management technologies such as joint service centers, intercompany banks, and specialized payment centers in a developing country. In particular, the applicability of automated corporate finance management systems such as enterprise resource planning (ERP) and treasury management systems (TMS) will be analyzed. Moreover, the research will investigate the role of cloud accounting systems in corporate finance management in Kyrgyz banks and whether they have any impact on improving corporate finance management. The study will collect data by surveying three banks in Kyrgyzstan, namely Mol-Bulak, RSK, and KICB. The banks were chosen based on their ownership: a state bank, a private bank with local authorized capital, and a private bank with international capital. Regression analysis will be used to reveal the correlation between bank ownership and the use of new financial management technologies. The research will provide policy recommendations to both private and state banks on developing strategies for switching to and utilizing modern corporate finance management technologies in their daily operations.

Keywords: digital economy, corporate finance, digital environment, digital technologies, cloud technologies, financial management

Procedia PDF Downloads 74
17790 A Large Language Model-Driven Method for Automated Building Energy Model Generation

Authors: Yake Zhang, Peng Xu

Abstract:

The development of building energy models (BEM) required for architectural design and analysis is a time-consuming and complex process, demanding a deep understanding and proficient use of simulation software. To streamline the generation of complex building energy models, this study proposes an automated method for generating building energy models using a large language model and the BEM library aimed at improving the efficiency of model generation. This method leverages a large language model to parse user-specified requirements for target building models, extracting key features such as building location, window-to-wall ratio, and thermal performance of the building envelope. The BEM library is utilized to retrieve energy models that match the target building’s characteristics, serving as reference information for the large language model to enhance the accuracy and relevance of the generated model, allowing for the creation of a building energy model that adapts to the user’s modeling requirements. This study enables the automatic creation of building energy models based on natural language inputs, reducing the professional expertise required for model development while significantly decreasing the time and complexity of manual configuration. In summary, this study provides an efficient and intelligent solution for building energy analysis and simulation, demonstrating the potential of a large language model in the field of building simulation and performance modeling.
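
A conceptual sketch of the pipeline is given below; call_llm is a hypothetical stub standing in for a real LLM API call, and the toy BEM library entries are invented:

```python
# LLM-parsed requirements matched against a small in-memory BEM library.
import json

def call_llm(prompt: str) -> str:
    # Placeholder: a real implementation would send the prompt to an LLM API.
    return json.dumps({"location": "Shanghai", "window_to_wall_ratio": 0.4})

def parse_requirements(user_request: str) -> dict:
    prompt = ("Extract location and window_to_wall_ratio as JSON from: "
              + user_request)
    return json.loads(call_llm(prompt))

# Toy BEM library: each entry stores key features of a reference energy model
bem_library = [
    {"id": "office_shanghai_A", "location": "Shanghai", "window_to_wall_ratio": 0.35},
    {"id": "office_beijing_B", "location": "Beijing", "window_to_wall_ratio": 0.50},
]

def closest_reference(features: dict) -> dict:
    same_city = [m for m in bem_library if m["location"] == features["location"]]
    pool = same_city or bem_library
    return min(pool, key=lambda m: abs(m["window_to_wall_ratio"]
                                       - features["window_to_wall_ratio"]))

req = parse_requirements("An office tower in Shanghai with 40% glazing.")
print(closest_reference(req)["id"])
```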

Keywords: artificial intelligence, building energy modelling, building simulation, large language model

Procedia PDF Downloads 34
17789 Steady State Analysis of Distribution System with Wind Generation Uncertainty

Authors: Zakir Husain, Neem Sagar, Neeraj Gupta

Abstract:

Due to the increased penetration of renewable energy resources in the distribution system, the system is no longer passive in nature. In this paper, a steady state analysis of the distribution system has been carried out with the inclusion of wind generation. The wind turbine generating system and wind generator have been modeled to obtain the average active and reactive power injections into the system. The study has been conducted on an IEEE 33-bus system with two wind generators. The present research work is useful not only to utilities but also to customers.
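
For illustration only (not the authors' model), the average active power injection of a wind generator can be estimated from a simple piecewise turbine power curve; the cut-in/rated/cut-out speeds and the Weibull wind-speed samples below are assumed values:

```python
# Average active power injection from a piecewise wind-turbine power curve.
import numpy as np

def turbine_power(v, p_rated=2.0, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0):
    """Power output in MW for wind speed v (m/s), cubic between cut-in and rated."""
    if v < v_cut_in or v >= v_cut_out:
        return 0.0
    if v < v_rated:
        return p_rated * (v**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3)
    return p_rated

wind_speeds = np.random.weibull(2.0, 8760) * 8.0     # one year of hourly samples (assumed)
p_avg = np.mean([turbine_power(v) for v in wind_speeds])
print(f"average active power injection ~ {p_avg:.2f} MW")
```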

Keywords: distributed generation, distribution network, radial network, wind turbine generating system

Procedia PDF Downloads 409
17788 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers. Evaluating a student's output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but they have received negative feedback from teachers and students alike due to unreliable scoring, lack of feedback, and other shortcomings. This study aims to create an application that can check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system extracts keywords or phrases from an individual's answer and matches them against a corpus of words defined by the instructor, which serves as the basis for evaluating the answer. The proposed system also enables the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the time teachers spend checking and evaluating student output is reduced, making the teacher more productive and the task easier.
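
A toy sketch of the weighted-keyword scoring idea follows; the rubric keywords and weights are invented for illustration:

```python
# Score an answer as the weighted fraction of instructor-defined keywords it contains.
def score_answer(answer, keyword_weights):
    answer_lower = answer.lower()
    earned = sum(w for kw, w in keyword_weights.items() if kw in answer_lower)
    return earned / sum(keyword_weights.values())

rubric = {"photosynthesis": 0.4, "chlorophyll": 0.3, "sunlight": 0.3}
student = "Plants use sunlight and chlorophyll to make food."
print(f"score: {score_answer(student, rubric):.0%}")   # 60%
```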

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 430
17787 Automated, Objective Assessment of Pilot Performance in Simulated Environment

Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt

Abstract:

Nowadays, flight simulators offer tremendous possibilities for safe and cost-effective pilot training through the utilization of powerful computational tools. Because technology is outpacing methodology, the vast majority of training-related work is still done by human instructors. This makes assessment inefficient and vulnerable to instructors' subjectivity. This research presents an Objective Assessment Tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres, defined and integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission-Task-Elements (MTEs). The core element of the gOAT is an enhanced algorithm that provides the instructor with a new set of information: in detail, a set of objective flight parameters fused with a report on the psychophysical state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, the data registered by the simulator software (position, orientation, velocity, etc.), as well as measurements of changes in the pilot's psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor's station and shown in a dedicated graphical interface. The presented tool is based on open-source solutions and is flexible to edit. Additional manoeuvres can easily be added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithm and measurements used allow not only basic stress-level measurement but also a significant reduction in the instructor's workload. The tool can be used for training purposes as well as for periodic checks of aircrew. Its flexibility and ease of modification allow wide-ranging further development and customization. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).
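
A simplified sketch of automatic MTE scoring against desired/adequate tolerance bands is shown below; the parameter, target, and tolerance values are invented, not taken from ADS-33:

```python
# Grade how well a logged flight parameter (e.g. hover altitude) stays in tolerance.
import numpy as np

def grade_parameter(samples, target, desired_tol, adequate_tol):
    error = np.abs(np.asarray(samples) - target)
    if np.all(error <= desired_tol):
        return "desired"
    if np.all(error <= adequate_tol):
        return "adequate"
    return "inadequate"

altitude_log = 50 + np.random.randn(600) * 0.8     # simulated hover at 50 ft
print(grade_parameter(altitude_log, target=50, desired_tol=3, adequate_tol=6))
```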

Keywords: automated assessment, flight simulator, human factors, pilot training

Procedia PDF Downloads 153
17786 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM, to an increased reliance on offline processing to analyze and report the data. In response to this trend, TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data such as a secondary electron or cathodoluminescence images can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage and all information can be recovered on-demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with the support of remote control for the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 115
17785 Way to Successful Enterprise Resource Planning System Implementation in Developing Countries: Case of Public Sector Unit

Authors: Suraj Kumar Mukti

Abstract:

An Enterprise Resource Planning (ERP) system is a management tool that integrates all departments in an organization. It integrates business processes, manages resources efficiently, and provides an appropriate decision support system to management. ERP system implementation is a complex, time-consuming, and costly process. Articles on the key success factors of ERP system implementation are available in the literature, but few authors have focused on a roadmap for successful ERP system implementation. Postponement is better if the organization is not ready to implement an ERP system properly; hence, checking an organization's readiness to adopt the new system is an important prerequisite for ensuring the success of ERP system implementation. Then comes the question of what constitutes success of ERP system implementation. The benefits achieved by an ERP system may be placed in two categories, viz. tangible and intangible benefits. This research article presents a roadmap to ensure the success of ERP system implementation, with the benefits achieved through the new system used as a success indicator. A case study is presented to evaluate the success and benefits achieved through the new system. The article gives academicians a comprehensive approach and provides a roadmap to organizations seeking to implement an ERP system.

Keywords: ERP system, decision support system, tangible, intangible

Procedia PDF Downloads 337
17784 Experimental Study and Evaluation of Farm Environmental Monitoring System Based on the Internet of Things, Sudan

Authors: Farid Eltom A. E., Mustafa Abdul-Halim, Abdalla Markaz, Sami Atta, Mohamed Azhari, Ahmed Rashed

Abstract:

Smart environment sensors integrated with Internet of Things (IoT) technology provide a new approach to tracking, sensing, and monitoring objects in the environment. The aim of the study is to evaluate a farm environmental monitoring system based on the IoT and to realize the automated management of agriculture and the implementation of precision production. Until now, irrigation monitoring operations in Sudan have been carried out using traditional methods, which are costly and unreliable. By utilizing soil moisture sensors, however, irrigation can be conducted only when needed, without fear of plant water stress. The results showed that the software application allows farmers to display current and historical data on soil moisture and nutrients in the form of line charts. The system measures the soil factors moisture, electrical conductivity, humidity, temperature, pH, phosphorus, and potassium; these factors, together with a timestamp, are sent to the data server over the LoRaWAN interface. It is also widely accepted in the modern era that artificial intelligence can arrange the procedures necessary to take care of the terrain, predict the quality and quantity of production through deep analysis of the various operations in agricultural fields, and support the monitoring of weather conditions.
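
A toy sketch of the moisture-triggered irrigation logic described above; the threshold value and the sensor-reading stub are placeholders, not values from the study:

```python
# Irrigate only when the soil-moisture reading falls below a threshold.
def read_soil_moisture() -> float:
    # Placeholder for a LoRaWAN/IoT sensor reading (volumetric water content, %)
    return 18.5

MOISTURE_THRESHOLD = 22.0   # assumed trigger level

moisture = read_soil_moisture()
if moisture < MOISTURE_THRESHOLD:
    print(f"moisture {moisture:.1f}% below threshold -> start irrigation")
else:
    print(f"moisture {moisture:.1f}% adequate -> skip irrigation")
```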

Keywords: smart environment, monitoring systems, IoT, LoRa Gateway, center pivot

Procedia PDF Downloads 50
17783 Retrospective Study of Bronchial Secretions Cultures Carried out in the Microbiology Department of General Hospital of Ioannina in 2017

Authors: S. Mantzoukis, M. Gerasimou, P. Christodoulou, N. Varsamis, G. Kolliopoulou, N. Zotos

Abstract:

Purpose: Patients in Intensive Care Units (ICU) are exposed to a different spectrum of microorganisms than the rest of the hospital. Because the majority of these patients are intubated, their bronchial secretions should be examined. Material and Method: Bronchial secretions should be collected with care so that they are not mixed with sputum or saliva. The bronchial secretions are placed in a sterile container and then inoculated onto blood, MacConkey No. 2, chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. After incubation, if any microbial colonies are detected, Gram staining is performed, and the isolated organisms are identified by biochemical techniques in the automated MicroScan system (Siemens), followed by a sensitivity test in the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test is verified by a Kirby-Bauer test. Results: In 2017, the Laboratory of Microbiology received 365 samples of bronchial secretions from the Intensive Care Unit, of which 237 were found positive. S. epidermidis was identified in 1 specimen, A. baumannii in 60, K. pneumoniae in 42, P. aeruginosa in 50, C. albicans in 40, P. mirabilis in 4, E. coli in 4, S. maltophilia in 6, S. marcescens in 6, S. aureus in 12, S. pneumoniae in 1, S. haemolyticus in 4, P. fluorescens in 1, E. aerogenes in 1, and E. cloacae in 5. Conclusions: The majority of ICU patients appear to be fertile ground for the development of infections. The nature of the findings suggests that a significant proportion of the bacteria found originate within the unit (nosocomial infection).

Keywords: bronchial secretions, cultures, infections, intensive care units

Procedia PDF Downloads 188
17782 Defining a Reference Architecture for Predictive Maintenance Systems: A Case Study Using the Microsoft Azure IoT-Cloud Components

Authors: Walter Bernhofer, Peter Haber, Tobias Mayer, Manfred Mayr, Markus Ziegler

Abstract:

Current preventive maintenance measures are cost-intensive and not efficient. With the sensor data available from state-of-the-art Internet of Things devices, new possibilities for automated data processing emerge. Current advances in data science and machine learning enable new, so-called predictive maintenance technologies, which empower data scientists to forecast possible system failures. The goal of this approach is to cut expenses in preventive maintenance by automating the detection of possible failures and to improve the efficiency and quality of maintenance measures. Additionally, centralized monitoring of the sensor data can be achieved with this approach. This paper describes the approach of three students to defining a reference architecture for a predictive maintenance solution in the Internet of Things domain, with a connected smartphone app for service technicians. The reference architecture is validated by a case study implemented with current Microsoft Azure cloud technologies. The results of the case study show that the reference architecture is valid and can be used to build a system for predictive maintenance execution with the cloud components of Microsoft Azure. The concepts used are technology-platform agnostic and can be reused on many different cloud platforms. The reference architecture is valid and can be applied to many use cases, such as gas station maintenance, elevator maintenance, and many more.
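
As a sketch of the data-science core of such an architecture (not the Azure IoT components themselves), an anomaly detector trained on healthy sensor readings can flag likely failures; the sensor values below are synthetic:

```python
# Train on healthy readings, then flag outliers as potential failures.
import numpy as np
from sklearn.ensemble import IsolationForest

healthy = np.random.normal(loc=[50.0, 0.2], scale=[2.0, 0.02], size=(500, 2))  # temp, vibration
detector = IsolationForest(random_state=0).fit(healthy)

new_readings = np.array([[51.0, 0.21],    # looks normal
                         [78.0, 0.55]])   # suspicious reading
print(detector.predict(new_readings))     # +1 = normal, -1 = potential failure
```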

Keywords: case study, internet of things, predictive maintenance, reference architecture

Procedia PDF Downloads 256
17781 Comparison of Growth Medium Efficiency into Stevia (Stevia rebaudiana Bertoni) Shoot Biomass and Stevioside Content in Thin-Layer System, TIS RITA® Bioreactor, and Bubble Column Bioreactor

Authors: Nurhayati Br Tarigan, Rizkita Rachmi Esyanti

Abstract:

Stevia (Stevia rebaudiana Bertoni) has great potential as a natural sweetener because it contains steviol glycosides, which are approximately 100 - 300 times sweeter than sucrose yet low in calories. Vegetative and generative propagation of S. rebaudiana is inefficient for producing stevia biomass and stevioside. One alternative for stevia propagation is in vitro shoot culture. This research was conducted to optimize the best medium for shoot growth and to compare the bioconversion efficiency and stevioside production of S. rebaudiana shoot cultures cultivated in thin-layer culture (TLC), a recipient for automated temporary immersion system (TIS RITA®) bioreactor, and a bubble column bioreactor. The results showed that 1 ppm of kinetin produced healthy shoots and the highest number of leaves compared to BAP. Shoots were then cultivated in TLC, the TIS RITA® bioreactor, and the bubble column bioreactor. Growth medium efficiency was determined by yield and productivity. TLC produced the highest growth medium efficiency of S. rebaudiana: the yield was 0.471 ± 0.117 g biomass per g substrate, and the productivity was 0.599 ± 0.122 g biomass per L medium per day. The TIS RITA® bioreactor produced the lowest yield and productivity, 0.182 ± 0.024 g biomass per g substrate and 0.041 ± 0.0002 g biomass per L medium per day, respectively. The yield of the bubble column bioreactor was 0.354 ± 0.204 g biomass per g substrate, and the productivity was 0.099 ± 0.009 g biomass per L medium per day. The stevioside content, from highest to lowest, was obtained from stevia shoots cultivated in TLC, the TIS RITA® bioreactor, and the bubble column bioreactor, at 93.44 μg/g, 42.57 μg/g, and 23.03 μg/g, respectively. All three systems can be used to produce stevia shoot biomass, but optimization of nutrient supply and oxygen intake is required in each system.
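
A back-of-the-envelope sketch of how the two reported efficiency measures are defined; the numbers below are illustrative, not the paper's raw data:

```python
# Yield and productivity definitions used to compare the three culture systems.
biomass_g = 1.2          # shoot biomass produced (g)
substrate_g = 2.5        # substrate consumed (g)
medium_l = 0.25          # culture medium volume (L)
days = 14                # culture duration

yield_ = biomass_g / substrate_g                 # g biomass per g substrate
productivity = biomass_g / (medium_l * days)     # g biomass per L medium per day
print(f"yield = {yield_:.3f} g/g, productivity = {productivity:.3f} g/(L*day)")
```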

Keywords: bubble column, growth medium efficiency, Stevia rebaudiana, stevioside, TIS RITA®, TLC

Procedia PDF Downloads 271
17780 Representation of the Solution of One Dynamical System on the Plane

Authors: Kushakov Kholmurodjon, Muhammadjonov Akbarshox

Abstract:

This paper is devoted to a system of second-order nonlinear differential equations with a special right-hand side, namely, a linear part plus a third-order polynomial of a special form. It is shown that, for certain relations between the parameters, there is a second-order curve such that trajectories starting from points of this curve remain on it. Thus, the curve is invariant with respect to the given system. Moreover, the system is invariant under a non-degenerate linear transformation of variables. The form of this curve, depending on the relations between the parameters and the eigenvalues of the matrix, is derived. All solutions of this system of differential equations are given analytically.

Keywords: dynamic system, ellipse, hyperbola, Hess system, polar coordinate system

Procedia PDF Downloads 197
17779 Fake News Detection for Korean News Using Machine Learning Techniques

Authors: Tae-Uk Yun, Pullip Chung, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Fake news is defined as news articles that are intentionally and verifiably false and could mislead readers. The spread of fake news may provoke anxiety, chaos, fear, or irrational decisions among the public. Thus, detecting fake news and preventing its spread has become a very important issue in our society. However, due to the huge amount of fake news produced every day, it is almost impossible for a human to identify it all. In this context, researchers have tried over the past years to develop automated fake news detection using machine learning techniques. However, to the best of our knowledge, no prior study has proposed an automated fake news detection method for Korean news. In this study, we aim to detect Korean fake news using text mining and machine learning techniques. Our proposed method consists of two steps. In the first step, the news content to be analyzed is converted to quantified values using various text mining techniques (topic modeling, TF-IDF, and so on). Then, in step 2, classifiers are trained using the values produced in step 1. As classifiers, machine learning techniques such as logistic regression, backpropagation networks, support vector machines, and deep neural networks can be applied. To validate the effectiveness of the proposed method, we collected about 200 short Korean news articles from Seoul National University's FactCheck, which provides detailed analysis reports from 20 media outlets and links to source documents for each case. Using this dataset, we will identify which text features are important as well as which classifiers are effective in detecting Korean fake news.
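
A minimal sketch of step 1 (converting text to quantified values via TF-IDF) and step 2 (training a classifier); the toy articles and labels are placeholders, not the FactCheck data:

```python
# TF-IDF on character n-grams (tokenizer-free for Korean) plus logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

articles = ["정부가 새로운 정책을 발표했다",
            "외계인이 서울에 착륙했다고 주장",
            "중앙은행이 금리를 동결했다",
            "백신에 마이크로칩이 들어있다는 소문"]
labels = [0, 1, 0, 1]    # 0 = real, 1 = fake (toy labels)

pipeline = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                         LogisticRegression())
pipeline.fit(articles, labels)
print(pipeline.predict(["금리 동결 소식"]))
```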

Keywords: fake news detection, Korean news, machine learning, text mining

Procedia PDF Downloads 277
17778 Investor Sentiment and Satisfaction in Automated Investment: A Sentimental Analysis of Robo-Advisor Platforms

Authors: Vertika Goswami, Gargi Sharma

Abstract:

The rapid evolution of fintech has led to the rise of robo-advisor platforms that utilize artificial intelligence (AI) and machine learning to offer personalized investment solutions efficiently and cost-effectively. This research paper conducts a comprehensive sentiment analysis of investor experiences with these platforms, employing natural language processing (NLP) and sentiment classification techniques. The study investigates investor perceptions, engagement, and satisfaction, identifying key drivers of positive sentiment such as clear communication, low fees, consistent returns, and robust security. Conversely, negative sentiment is linked to issues like inconsistent performance, hidden fees, poor customer support, and a lack of transparency. The analysis reveals that addressing these pain points—through improved transparency, enhanced customer service, and ongoing technological advancements—can significantly boost investor trust and satisfaction. This paper contributes valuable insights into the fields of behavioral finance and fintech innovation, offering actionable recommendations for stakeholders, practitioners, and policymakers. Future research should explore the long-term impact of these factors on investor loyalty, the role of emerging technologies, and the effects of ethical investment choices and regulatory compliance on investor sentiment.
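
A sketch of the sentiment-classification step on platform reviews is shown below; the reviews are invented examples, and NLTK's VADER stands in for whatever NLP pipeline the study actually used:

```python
# Score invented robo-advisor reviews with a lexicon-based sentiment analyzer.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
analyzer = SentimentIntensityAnalyzer()

reviews = [
    "Low fees and very clear communication, returns have been consistent.",
    "Hidden charges and terrible customer support, I lost trust completely.",
]
for review in reviews:
    score = analyzer.polarity_scores(review)["compound"]   # -1 (negative) .. +1 (positive)
    label = "positive" if score >= 0 else "negative"
    print(f"{label:8s} {score:+.2f}  {review}")
```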

Keywords: artificial intelligence in finance, automated investment, financial technology, investor satisfaction, investor sentiment, robo-advisors, sentimental analysis

Procedia PDF Downloads 24
17777 From Linear to Nonlinear Deterrence: Deterrence for Rising Power

Authors: Farhad Ghasemi

Abstract:

Along with the transformation of the international system into a complex and chaotic system, a fundamental question arises: how can deterrence be reconstructed conceptually and theoretically in this system model? The deterrence system is much more complex today than it was seven decades ago. This article argues that the perception of deterrence as a linear system is a fundamental mistake because it does not take into account the new dynamics of the international system, including network power dynamics. The author aims to develop this point by focusing on complexity and chaos theories, especially their nonlinearity and cascading failure principles. The author recognizes deterrence as a nonlinear system and introduces it as a concept in strategic studies.

Keywords: complexity, international system, deterrence, linear deterrence, nonlinear deterrence

Procedia PDF Downloads 144
17776 A Tool for Rational Assessment of Dynamic Trust in Networked Organizations

Authors: Simon Samwel Msanjila

Abstract:

Networked environments, which provide platforms for business organizations, are configured in different forms depending on many factors, including lifetime, member characteristics, communication structure, and business objectives, among others. With continuing advances in digital technologies, distance has become less of a barrier to business-minded collaboration among organizations. Given the need for, and ease of, business collaboration nowadays, organizations are sometimes forced to work with others that are unknown or little known to them in terms of history and performance. A promising approach for sustaining established collaboration has been the establishment of trust relationships among organizations based on the assessed trustworthiness of each participating organization. Research has shown that trust in organizations is dynamic, and thus the assessment of trust level must address this dynamic nature. This paper assesses the relevant aspects of trust and applies these concepts to propose a semi-automated system for assessing the sustainability and evolution of trust in organizations participating in a specific objective in a networked-organizations environment.

Keywords: trust evolution, trust sustainability, networked organizations, dynamic trust

Procedia PDF Downloads 435
17775 SISSLE in Consensus-Based Ripple: Some Improvements in Speed, Security, Last Mile Connectivity and Ease of Use

Authors: Mayank Mundhra, Chester Rebeiro

Abstract:

Cryptocurrencies are rapidly finding wide application in areas such as Real Time Gross Settlements and Payments Systems. Ripple is a cryptocurrency that has gained prominence with banks and payment providers. It solves the Byzantine Generals' Problem with its Ripple Protocol Consensus Algorithm (RPCA), where each server maintains a list of servers, called the Unique Node List (UNL), that represents the network for the server and will not collectively defraud it. The server believes that the network has come to a consensus when members of the UNL come to a consensus on a transaction. In this paper, we improve Ripple to achieve better speed, security, last mile connectivity, and ease of use. We implement guidelines and automated systems for building and maintaining UNLs for resilience, robustness, improved security, and efficient information propagation. We enhance the system to ensure that each server receives information from across the whole network rather than just from UNL members. We also introduce the paradigm of UNL overlap as a function of information propagation and the trust a server assigns to its own UNL. Our design not only reduces vulnerabilities such as eclipse attacks but also makes it easier to identify malicious behaviour and entities attempting to fraudulently double-spend or stall the system. We provide experimental evidence of the benefits of our approach over the current Ripple scheme. We observe ≥ 4.97x and 98.22x improvements in speedup and success rate for information propagation, respectively, and ≥ 3.16x and 51.70x in speedup and success rate for consensus.

Keywords: Ripple, Kelips, unique node list, consensus, information propagation

Procedia PDF Downloads 153
17774 A Performance Study of a Solar Heating System on the Microclimate of an Agricultural Greenhouse

Authors: Nora Arbaoui, Rachid Tadili

Abstract:

This study focuses on a solar system designed to heat an agricultural greenhouse. The solar system is based on heating a transfer fluid that circulates inside the greenhouse through a solar copper coil integrated into the roof of the greenhouse. The thermal energy stored during the day is released during the night to improve the microclimate of the greenhouse. The system was tested in a small agricultural greenhouse in order to optimize the different operational parameters. The climatic and agronomic results obtained with this system are significant in comparison with a greenhouse that has no heating system.

Keywords: solar system, agricultural greenhouse, heating, storage, drying

Procedia PDF Downloads 92
17773 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following an online hard negative mining strategy for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
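
A sketch of the triplet-loss training signal on encoder embeddings, using PyTorch's built-in TripletMarginLoss; the ResNet-18 encoder, hard-negative mining, and data loading are omitted, and the tensors are random stand-ins:

```python
# Triplet loss pulls anchor/positive embeddings together and pushes negatives apart.
import torch
import torch.nn as nn

embedding_dim = 128
anchor = torch.randn(32, embedding_dim)      # embeddings of query crops
positive = torch.randn(32, embedding_dim)    # same product class
negative = torch.randn(32, embedding_dim)    # negatives from other classes

criterion = nn.TripletMarginLoss(margin=1.0)
loss = criterion(anchor, positive, negative)
print(loss.item())
```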

Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition

Procedia PDF Downloads 162
17772 Effects of AI-driven Applications on Bank Performance in West Africa

Authors: Ani Wilson Uchenna, Ogbonna Chikodi

Abstract:

This study examined the impact of artificial intelligence-driven applications on banks' performance in West Africa, using Nigeria and Ghana as case studies. Specifically, the study examined the extent to which the deployment of smart automated teller machines impacted the banks' net worth within the reference period in Nigeria and Ghana. It ascertained the impact of point of sale services on banks' net worth within the reference period in Nigeria and Ghana. Thirdly, it verified the extent to which web pay services influence banks' performance in Nigeria and Ghana, and finally, it determined the impact of mobile pay services on banks' performance in Nigeria and Ghana. The study used automated teller machines (ATM), point of sale services (POS), mobile pay services (MOP), and web pay services (WBP) as proxies for the explanatory variables, while bank net worth was used as the explained variable. The data for this study were sourced from the Central Bank of Nigeria (CBN) Statistical Bulletin, the Bank of Ghana Statistical Bulletin, the Ghana payment systems oversight annual report, and the World Development Indicators (WDI). Furthermore, the mixed order of integration observed from the panel unit root test justified the autoregressive distributed lag (ARDL) approach to data analysis, which the study adopted. While the cointegration test showed the existence of cointegration among the studied variables, the bound test result confirmed the presence of a long-run relationship among the series. Again, the ARDL error correction estimate established a satisfactory speed of adjustment (13.92%) from long-run disequilibrium back to the short-run dynamic relationship. The study found that automated teller machines (ATM) had a statistically significant impact on the bank net worth (BNW) of Nigeria and Ghana, point of sale services (POS) had a statistically significant impact on bank net worth within the study period, and mobile pay services had a statistically significant impact on changes in the bank net worth of the countries studied, while web pay services (WBP) had no statistically significant impact on the bank net worth of the countries of reference. The study concluded that artificial intelligence-driven applications have a significant and positive impact on bank performance, with the exception of web pay, which had a negative impact on bank net worth. The study recommended that the management of banks in both Nigeria and Ghana encourage more investment in AI-powered smart ATMs aimed at delivering more secure banking services in order to increase revenue, discourage excessive queuing in the banking hall, reduce fraud, and minimize errors in processing transactions. Banks within the scope of this study should leverage modern technologies to check the excesses of private POS operators in order to build more confidence among potential customers. Government should turn mobile pay services into a counter-terrorism tool by restricting over-the-counter withdrawals to a minimum amount and placing sanctions on withdrawals above that limit.
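
A hedged sketch of an ARDL estimation of the kind described above, using statsmodels' ARDL class on synthetic single-country series (the study itself used panel data for Nigeria and Ghana, so this is an illustration of the model form only):

```python
# ARDL(1,1) regression of bank net worth on the four AI-application proxies.
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

rng = np.random.default_rng(0)
n = 80
exog = pd.DataFrame({
    "ATM": rng.normal(size=n).cumsum(),
    "POS": rng.normal(size=n).cumsum(),
    "MOP": rng.normal(size=n).cumsum(),
    "WBP": rng.normal(size=n).cumsum(),
})
bank_net_worth = pd.Series(0.5 * exog["ATM"] + rng.normal(size=n), name="BNW")

model = ARDL(bank_net_worth, lags=1, exog=exog, order=1)
result = model.fit()
print(result.summary())
```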

Keywords: artificial intelligence (AI), bank performance, automated teller machines (ATM), point of sale (POS)

Procedia PDF Downloads 17
17771 Correlation Between Ore Mineralogy and the Dissolution Behavior of K-Feldspar

Authors: Adrian Keith Caamino, Sina Shakibania, Lena Sunqvist-Öqvist, Jan Rosenkranz, Yousef Ghorbani

Abstract:

Feldspar minerals are one of the main components of the earth's crust. They are tectosilicates, meaning that they mainly contain aluminum and silicon. Besides aluminum and silicon, they contain either potassium, sodium, or calcium. Accordingly, feldspar minerals are categorized into three main groups: K-feldspar, Na-feldspar, and Ca-feldspar. In recent years, the trend toward using K-feldspar has grown tremendously, considering its potential to produce potash and alumina. However, feldspar minerals in general are difficult to decompose for the dissolution of their metallic components. Several methods, including intensive milling, leaching under elevated pressure and temperature, thermal pretreatment, and the use of corrosive leaching reagents, have been proposed to improve their low dissolution efficiency. In this study, as part of the POTASSIAL EU project, mechanical activation using intensive milling followed by leaching with hydrochloric acid (HCl) was applied to overcome the low dissolution efficiency of the K-feldspar components. The grinding operational parameters, namely time, rotational speed, and ball-to-sample weight ratio, were studied using the Taguchi optimization method. Then, the mineralogy of the ground samples was analyzed using a scanning electron microscope (SEM) equipped with automated quantitative mineralogy. After grinding, the prepared samples were subjected to HCl leaching. In the end, the dissolution efficiency of the main elements and impurities of the different samples was correlated with the mineralogical characterization results. K-feldspar component dissolution correlates with ore mineralogy, which provides insight into how best to optimize leaching conditions for selective dissolution. Further, it will affect the subsequent purification steps and the final value recovery procedures.

Keywords: K-feldspar, grinding, automated mineralogy, impurity, leaching

Procedia PDF Downloads 80