Search results for: Human-centered automated system

17579 A Large Language Model-Driven Method for Automated Building Energy Model Generation

Authors: Yake Zhang, Peng Xu

Abstract:

The development of building energy models (BEM) required for architectural design and analysis is a time-consuming and complex process, demanding a deep understanding and proficient use of simulation software. To streamline the generation of complex building energy models, this study proposes an automated method that uses a large language model and a BEM library, aimed at improving the efficiency of model generation. The method leverages the large language model to parse user-specified requirements for the target building model, extracting key features such as building location, window-to-wall ratio, and thermal performance of the building envelope. The BEM library is used to retrieve energy models that match the target building’s characteristics; these serve as reference information for the large language model, enhancing the accuracy and relevance of the generated model and allowing the creation of a building energy model that adapts to the user’s modeling requirements. This study enables the automatic creation of building energy models from natural language inputs, reducing the professional expertise required for model development while significantly decreasing the time and complexity of manual configuration. In summary, this study provides an efficient and intelligent solution for building energy analysis and simulation, demonstrating the potential of large language models in the field of building simulation and performance modeling.
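
A minimal sketch of the two steps described above (feature extraction via an LLM, then retrieval from a BEM library), not the authors' implementation: call_llm is a hypothetical stand-in for whatever model API is used, and the library entries and attribute names are assumptions.

```python
# Sketch only: LLM-based feature extraction plus nearest-match retrieval from a
# toy BEM library. `call_llm`, the prompt, and all attribute names are assumed.

import json

def call_llm(prompt: str) -> str:
    """Placeholder for a large language model call returning a JSON string."""
    raise NotImplementedError("wire this to an actual LLM API")

EXTRACTION_PROMPT = """Extract building-model features from the request below.
Return JSON with keys: location, window_to_wall_ratio, wall_u_value.
Request: {request}"""

def extract_features(user_request: str) -> dict:
    return json.loads(call_llm(EXTRACTION_PROMPT.format(request=user_request)))

# Toy BEM library; a real library would hold full simulation-ready models.
bem_library = [
    {"id": "office_shanghai", "location": "Shanghai", "window_to_wall_ratio": 0.35, "wall_u_value": 0.60},
    {"id": "office_beijing", "location": "Beijing", "window_to_wall_ratio": 0.25, "wall_u_value": 0.45},
]

def retrieve_reference(features: dict) -> dict:
    """Pick the library model whose numeric features are closest to the target."""
    def distance(model):
        return (abs(model["window_to_wall_ratio"] - features["window_to_wall_ratio"])
                + abs(model["wall_u_value"] - features["wall_u_value"]))
    return min(bem_library, key=distance)

# Example with features as they might be parsed by the LLM.
target = {"window_to_wall_ratio": 0.30, "wall_u_value": 0.50}
print(retrieve_reference(target)["id"])
```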

Keywords: artificial intelligence, building energy modelling, building simulation, large language model

Procedia PDF Downloads 17
17578 BIM-Based Tool for Sustainability Assessment and Certification Documents Provision

Authors: Taki Eddine Seghier, Mohd Hamdan Ahmad, Yaik-Wah Lim, Samuel Opeyemi Williams

Abstract:

The assessment of building sustainability against a specific green benchmark and the preparation of the documents required to receive a green building certification are both considered major challenges for a green building design team. However, this labor-intensive and time-consuming process can take advantage of available Building Information Modeling (BIM) features such as material take-off and scheduling. Furthermore, the workflow can be automated in order to track potentially achievable credit points and provide rating feedback for several design options by using integrated Visual Programming (VP) to handle the parameters stored within the BIM model. Hence, this study proposes a BIM-based tool that uses the Green Building Index (GBI) rating system requirements as its input case to evaluate building sustainability in the design stage of the building project life cycle. The tool comprises two key models: first, a model for data extraction, calculation, and classification of achievable credit points in a green template; second, a model for the generation of the documents required for green building certification. The tool was validated on a BIM model of a residential building, and it serves as proof of concept that building sustainability assessment for GBI certification can be automatically evaluated and documented through BIM.
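
To illustrate the credit-tracking step, here is a hypothetical sketch of scoring a single daylighting-style credit from quantities a VP routine might extract from the BIM model; the parameter names and point bands are placeholders, not actual GBI criteria.

```python
# Illustrative sketch only: scoring a hypothetical credit from quantities assumed
# to be extracted from the BIM model (names and thresholds are placeholders).

def window_to_wall_ratio(window_area_m2: float, wall_area_m2: float) -> float:
    """Ratio of glazed area to gross external wall area."""
    return window_area_m2 / wall_area_m2

def score_wwr_credit(wwr: float) -> int:
    """Map a window-to-wall ratio to credit points (placeholder banding)."""
    if 0.25 <= wwr <= 0.40:
        return 2          # assumed optimal band
    if 0.15 <= wwr < 0.25 or 0.40 < wwr <= 0.50:
        return 1          # assumed acceptable band
    return 0

# Quantities that a VP routine (e.g., a Dynamo-style graph) would pull from the model.
extracted = {"window_area_m2": 320.0, "wall_area_m2": 1050.0}

wwr = window_to_wall_ratio(**extracted)
print(f"WWR = {wwr:.2f}, credit points = {score_wwr_credit(wwr)}")
```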

Keywords: green building rating system, GBRS, building information modeling, BIM, visual programming, VP, sustainability assessment

Procedia PDF Downloads 322
17577 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often at a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using an SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA performs hyperspectral mapping and saves an X-ray spectrum for each pixel or each segment. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy-dispersive spectroscopy. Re-evaluation of existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also introduces a newly extended open-source data format that allows other applications to extract, process, and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
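
As a hypothetical example of the scripted export customization mentioned above, the sketch below enriches a tabular particle-by-particle export with user-defined properties; the file name and column names are assumptions, not the actual TIMA schema.

```python
# Sketch only: post-processing a particle-by-particle CSV export with pandas.
# Column names below are assumed for illustration.

import pandas as pd

particles = pd.read_csv("tima_particle_export.csv")  # placeholder export file

# Assumed columns: particle_id, area_um2, perimeter_um, mass_pct_chalcopyrite
particles["equivalent_circle_diameter_um"] = 2 * (particles["area_um2"] / 3.14159265) ** 0.5
particles["circularity"] = 4 * 3.14159265 * particles["area_um2"] / particles["perimeter_um"] ** 2

# Example user-defined flag: coarse, well-liberated chalcopyrite particles.
particles["coarse_liberated_cpy"] = (
    (particles["equivalent_circle_diameter_um"] > 50)
    & (particles["mass_pct_chalcopyrite"] > 90)
)

particles.to_csv("tima_particle_export_enriched.csv", index=False)
```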

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 105
17576 New Technologies in Corporate Finance Management in the Digital Economy: Case of Kyrgyzstan

Authors: Marat Kozhomberdiev

Abstract:

The research will investigate the modern corporate finance management technologies currently used in the era of digitalization of the global economy and the degree to which financial institutions in Kyrgyzstan are utilizing these new technologies in the field of corporate finance management. The main purpose of the research is to reveal the role of financial management technologies such as joint service centers, intercompany banks, and specialized payment centers in a third-world country. In particular, the applicability of automated corporate finance management systems such as enterprise resource planning (ERP) and treasury management systems (TMS) will be analyzed. Moreover, the research will investigate the role of cloud accounting systems in corporate finance management in Kyrgyz banks and whether they have any impact on improving corporate finance management. The study will collect data by surveying three banks in Kyrgyzstan, namely Mol-Bulak, RSK, and KICB. The banks were chosen based on their ownership structure: a state bank, a private bank with local authorized capital, and a private bank with international capital. Regression analysis will be used to reveal the correlation between bank ownership and the use of new financial management technologies. The research will provide policy recommendations to both private and state banks on developing strategies for switching to and utilizing modern corporate finance management technologies in their daily operations.
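
The regression step could look roughly like the sketch below, run on respondent-level survey data; the variable names, adoption scale, and in-line data are purely illustrative assumptions.

```python
# Sketch only: regressing a hypothetical technology-adoption score on bank
# ownership category. Data and variable names are invented for illustration.

import pandas as pd
import statsmodels.formula.api as smf

survey = pd.DataFrame({
    "ownership": ["state", "state", "local_private", "local_private",
                  "international", "international"],
    "erp_tms_usage": [2, 3, 4, 4, 5, 5],     # assumed 1-5 adoption score per respondent
})

model = smf.ols("erp_tms_usage ~ C(ownership)", data=survey).fit()
print(model.summary())
```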

Keywords: digital economy, corporate finance, digital environment, digital technologies, cloud technologies, financial management

Procedia PDF Downloads 68
17575 Automated, Objective Assessment of Pilot Performance in Simulated Environment

Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt

Abstract:

Nowadays, flight simulators offer tremendous possibilities for safe and cost-effective pilot training through the utilization of powerful computational tools. However, because technology has outpaced methodology, the vast majority of training-related work is still done by human instructors. This makes assessment inefficient and vulnerable to instructors’ subjectivity. The research presents an Objective Assessment Tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres, defined and integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission-Task-Elements (MTEs). The core element of the gOAT is an enhanced algorithm that provides the instructor with a new set of information: objective flight parameters fused with a report on the psychophysical state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, data registered by the simulator software (position, orientation, velocity, etc.), as well as measurements of changes in the pilot’s psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor’s station and shown in a dedicated graphical interface. The presented tool is based on open-source solutions and is flexible for editing. Additional manoeuvres can easily be added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithm and measurements used allow not only the implementation of basic stress-level measurements but also a significant reduction of the instructor’s workload. The tool can be used for training purposes as well as for periodical checks of the aircrew. Its flexibility and ease of modification allow wide-ranging further development and customization. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).
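
A minimal sketch (not the authors' algorithm) of how one manoeuvre segment might be graded against desired/adequate tolerance bands in the spirit of ADS-33 MTEs; the tolerance values, targets, and simulated data are illustrative placeholders.

```python
# Grading a time series of flight parameters against tolerance bands.

import numpy as np

def grade_parameter(samples: np.ndarray, target: float,
                    desired_tol: float, adequate_tol: float) -> str:
    """Grade one registered flight parameter by its maximum deviation from target."""
    max_dev = np.max(np.abs(samples - target))
    if max_dev <= desired_tol:
        return "desired"
    if max_dev <= adequate_tol:
        return "adequate"
    return "unsatisfactory"

# Simulated registered data for a hover task (altitude in ft, heading in deg).
rng = np.random.default_rng(0)
altitude = rng.normal(20.0, 1.0, 500)
heading = rng.normal(90.0, 2.0, 500)

scores = {
    "altitude": grade_parameter(altitude, target=20.0, desired_tol=3.0, adequate_tol=6.0),
    "heading": grade_parameter(heading, target=90.0, desired_tol=5.0, adequate_tol=10.0),
}
print(scores)
```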

Keywords: automated assessment, flight simulator, human factors, pilot training

Procedia PDF Downloads 147
17574 A System Functions Set-Up through Near Field Communication of a Smartphone

Authors: Jaemyoung Lee

Abstract:

We present a method to set up system functions through near field communication (NFC) of a smartphone. The short communication distance of NFC, usually less than 4 cm, can prevent interference from other devices and establish a secure communication channel between a system and the smartphone. The proposed set-up method for system function values is demonstrated for a black box system in a car. In the demonstration, the system functions of the black box manipulated through the NFC of a smartphone are controls of image quality, sound level, shock-sensing level to store images, etc. The proposed set-up method for system function values can be used for any device with NFC.

Keywords: system set-up, near field communication, smartphone, android

Procedia PDF Downloads 334
17573 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers. Evaluating a student’s output requires knowledge from a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but these have received negative responses from teachers and students alike due to unreliable scoring, lack of feedback, and other issues. The study aims to create an application that is able to check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system shall be able to extract keywords or phrases from an individual’s answer and match them against a corpus of words (as defined by the instructor), which shall be the basis of the evaluation of the individual’s answer. The proposed system shall also enable the teacher to provide feedback and re-evaluate the student’s output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, teachers’ time spent checking and evaluating student output shall be lessened, thus making them more productive and their work easier.
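
A minimal sketch of the weighted keyword-matching idea described above, not the deployed system; the keywords, weights, passing threshold, and sample answer are illustrative.

```python
# Weighted fraction of instructor-defined keywords/phrases found in an answer.

import re

def score_answer(answer: str, keyword_weights: dict[str, float], passing: float = 0.6):
    """Return (score, passed) where score is the weighted share of keywords matched."""
    text = answer.lower()
    total = sum(keyword_weights.values())
    earned = sum(w for kw, w in keyword_weights.items()
                 if re.search(r"\b" + re.escape(kw.lower()) + r"\b", text))
    score = earned / total if total else 0.0
    return score, score >= passing

keywords = {"photosynthesis": 2.0, "chlorophyll": 1.0, "carbon dioxide": 1.0, "glucose": 1.0}
answer = "Plants use chlorophyll to capture light and turn carbon dioxide into glucose."

print(score_answer(answer, keywords))   # (0.6, True)
```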

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 424
17572 Experimental Study and Evaluation of Farm Environmental Monitoring System Based on the Internet of Things, Sudan

Authors: Farid Eltom A. E., Mustafa Abdul-Halim, Abdalla Markaz, Sami Atta, Mohamed Azhari, Ahmed Rashed

Abstract:

Smart environment sensors integrated with Internet of Things (IoT) technology can provide a new concept for tracking, sensing, and monitoring objects in the environment. The aim of the study is to evaluate a farm environmental monitoring system based on IoT and to realize the automated management of agriculture and the implementation of precision production. Until now, irrigation monitoring operations in Sudan have been carried out using traditional methods, which are costly and unreliable. By utilizing soil moisture sensors, however, irrigation can be conducted only when needed, without fear of plant water stress. The results showed that the software application allows farmers to display current and historical data on soil moisture and nutrients in the form of line charts. The system measures the soil factors moisture, electrical conductivity, humidity, temperature, pH, phosphorus, and potassium; these factors, together with a timestamp, are sent to the data server using the LoRaWAN interface. It is widely accepted in the modern era that artificial intelligence can help arrange the procedures necessary to take care of the terrain, predict the quality and quantity of production through deep analysis of the various operations in agricultural fields, and support the monitoring of weather conditions.
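
To illustrate the sensing-to-uplink step, here is a small sketch (not the deployed system) of packing one set of soil readings and a timestamp into a compact payload, as an end node might do before an uplink to the data server; the field order and scaling factors are assumptions.

```python
# Packing readings into a small fixed-size payload (a common LPWAN practice).

import struct
import time

reading = {
    "moisture_pct": 34.2,
    "ec_ms_cm": 1.8,
    "temperature_c": 27.5,
    "ph": 6.9,
}

# Scale floats to integers to keep the payload small; 12 bytes total here.
payload = struct.pack(
    ">IHHHH",
    int(time.time()),                       # UNIX timestamp
    int(reading["moisture_pct"] * 10),
    int(reading["ec_ms_cm"] * 100),
    int(reading["temperature_c"] * 10),
    int(reading["ph"] * 100),
)
print(len(payload), "bytes:", payload.hex())
```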

Keywords: smart environment, monitoring systems, IoT, LoRa Gateway, center pivot

Procedia PDF Downloads 45
17571 Fake News Detection for Korean News Using Machine Learning Techniques

Authors: Tae-Uk Yun, Pullip Chung, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Fake news is defined as news articles that are intentionally and verifiably false and could mislead readers. The spread of fake news may provoke anxiety, chaos, fear, or irrational decisions among the public. Thus, detecting fake news and preventing its spread has become a very important issue in our society. However, due to the huge amount of fake news produced every day, it is almost impossible for humans to identify all of it. In this context, researchers have tried to develop automated fake news detection using machine learning techniques over the past years. To the best of our knowledge, however, no prior study has proposed an automated fake news detection method for Korean news. In this study, we aim to detect Korean fake news using text mining and machine learning techniques. Our proposed method consists of two steps. In the first step, the news content to be analyzed is converted to quantified values using various text mining techniques (topic modeling, TF-IDF, and so on). In step 2, classifiers are trained using the values produced in step 1. As classifiers, machine learning techniques such as logistic regression, backpropagation networks, support vector machines, and deep neural networks can be applied. To validate the effectiveness of the proposed method, we collected about 200 short Korean news articles from Seoul National University’s FactCheck, which provides detailed analysis reports from 20 media outlets and links to source documents for each case. Using this dataset, we will identify which text features are important as well as which classifiers are effective in detecting Korean fake news.
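
A sketch of step 1 and step 2 described above, using TF-IDF features and one of the candidate classifiers; the tiny in-line dataset is purely illustrative, and character n-grams are used here only to avoid needing a Korean tokenizer.

```python
# TF-IDF features + logistic regression, evaluated with cross-validation.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

texts = [
    "정부가 새 정책을 발표했다",          # placeholder "real" articles
    "올해 경제 성장률이 상승했다",
    "외계인이 서울에 착륙했다는 주장",     # placeholder "fake" articles
    "비밀 조직이 선거를 조작했다는 소문",
] * 10
labels = [0, 0, 1, 1] * 10               # 0 = real, 1 = fake

clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # character n-grams
    LogisticRegression(max_iter=1000),
)
print(cross_val_score(clf, texts, labels, cv=5).mean())
```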

Keywords: fake news detection, Korean news, machine learning, text mining

Procedia PDF Downloads 273
17570 Investor Sentiment and Satisfaction in Automated Investment: A Sentimental Analysis of Robo-Advisor Platforms

Authors: Vertika Goswami, Gargi Sharma

Abstract:

The rapid evolution of fintech has led to the rise of robo-advisor platforms that utilize artificial intelligence (AI) and machine learning to offer personalized investment solutions efficiently and cost-effectively. This research paper conducts a comprehensive sentiment analysis of investor experiences with these platforms, employing natural language processing (NLP) and sentiment classification techniques. The study investigates investor perceptions, engagement, and satisfaction, identifying key drivers of positive sentiment such as clear communication, low fees, consistent returns, and robust security. Conversely, negative sentiment is linked to issues like inconsistent performance, hidden fees, poor customer support, and a lack of transparency. The analysis reveals that addressing these pain points—through improved transparency, enhanced customer service, and ongoing technological advancements—can significantly boost investor trust and satisfaction. This paper contributes valuable insights into the fields of behavioral finance and fintech innovation, offering actionable recommendations for stakeholders, practitioners, and policymakers. Future research should explore the long-term impact of these factors on investor loyalty, the role of emerging technologies, and the effects of ethical investment choices and regulatory compliance on investor sentiment.
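
As a toy illustration of the sentiment-classification step (not the study's NLP pipeline), the sketch below scores hypothetical robo-advisor reviews with a tiny hand-made lexicon; both the lexicon and the reviews are invented for illustration.

```python
# Minimal lexicon-based sentiment scorer over example review texts.

POSITIVE = {"transparent", "low", "consistent", "secure", "easy", "helpful"}
NEGATIVE = {"hidden", "inconsistent", "poor", "slow", "confusing", "expensive"}

def sentiment(review: str) -> str:
    words = review.lower().replace(",", " ").replace(".", " ").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

reviews = [
    "Low fees and consistent returns, very easy to use.",
    "Hidden fees and poor customer support, performance was inconsistent.",
]
for r in reviews:
    print(sentiment(r), "-", r)
```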

Keywords: artificial intelligence in finance, automated investment, financial technology, investor satisfaction, investor sentiment, robo-advisors, sentimental analysis

Procedia PDF Downloads 9
17569 Retrospective Study of Bronchial Secretions Cultures Carried out in the Microbiology Department of General Hospital of Ioannina in 2017

Authors: S. Mantzoukis, M. Gerasimou, P. Christodoulou, N. Varsamis, G. Kolliopoulou, N. Zotos

Abstract:

Purpose: Patients in Intensive Care Units (ICU) are exposed to a different spectrum of microorganisms than the rest of the hospital. Because the majority of these patients are intubated, their bronchial secretions should be examined. Material and Method: Bronchial secretions should be collected with care so as not to be mixed with sputum or saliva. The bronchial secretions are placed in a sterile container and then inoculated onto blood, MacConkey No. 2, chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. After incubation, if any microbial colonies are detected, Gram staining is performed, and the isolated organisms are identified by biochemical techniques in the automated MicroScan system (Siemens), followed by a sensitivity test in the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test is verified by a Kirby-Bauer test. Results: In 2017, the Laboratory of Microbiology received 365 samples of bronchial secretions from the Intensive Care Unit, of which 237 were found positive. S. epidermidis was identified in 1 specimen, A. baumannii in 60, K. pneumoniae in 42, P. aeruginosa in 50, C. albicans in 40, P. mirabilis in 4, E. coli in 4, S. maltophilia in 6, S. marcescens in 6, S. aureus in 12, S. pneumoniae in 1, S. haemolyticus in 4, P. fluorescens in 1, E. aerogenes in 1, and E. cloacae in 5. Conclusions: The majority of ICU patients appear to be a fertile ground for the development of infections. The nature of the findings suggests that a significant part of the bacteria found comes from the unit (nosocomial infection).

Keywords: bronchial secretions, cultures, infections, intensive care units

Procedia PDF Downloads 184
17568 Defining a Reference Architecture for Predictive Maintenance Systems: A Case Study Using the Microsoft Azure IoT-Cloud Components

Authors: Walter Bernhofer, Peter Haber, Tobias Mayer, Manfred Mayr, Markus Ziegler

Abstract:

Current preventive maintenance measures are cost-intensive and inefficient. With the sensor data available from state-of-the-art Internet of Things devices, new possibilities for automated data processing emerge. Current advances in data science and machine learning enable new, so-called predictive maintenance technologies, which empower data scientists to forecast possible system failures. The goal of this approach is to cut expenses in preventive maintenance by automating the detection of possible failures and to improve the efficiency and quality of maintenance measures. Additionally, a centralization of sensor data monitoring can be achieved by using this approach. This paper describes the approach of three students to define a reference architecture for a predictive maintenance solution in the Internet of Things domain with a connected smartphone app for service technicians. The reference architecture is validated by a case study implemented with current Microsoft Azure cloud technologies. The results of the case study show that the reference architecture is valid and can be used to achieve a system for predictive maintenance execution with the cloud components of Microsoft Azure. The concepts used are technology-platform agnostic and can be reused on many different cloud platforms, and the architecture can be applied to many use cases, such as gas station maintenance, elevator maintenance, and more.
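
A minimal, platform-agnostic sketch of one predictive-maintenance-style check (not the reference architecture itself): a rolling z-score over a single sensor channel that flags readings drifting away from recent behaviour; the synthetic data and alert threshold are assumptions.

```python
# Rolling z-score drift detection over a synthetic sensor stream.

import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
values = np.concatenate([rng.normal(50, 1, 500), rng.normal(58, 1, 50)])  # drift at the end
series = pd.Series(values, index=pd.date_range("2024-01-01", periods=550, freq="min"))

rolling_mean = series.rolling("60min").mean()
rolling_std = series.rolling("60min").std()
zscore = (series - rolling_mean) / rolling_std

alerts = series[zscore.abs() > 4]           # assumed alert threshold
print(f"{len(alerts)} readings flagged for the service technician")
print(alerts.head())
```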

Keywords: case study, internet of things, predictive maintenance, reference architecture

Procedia PDF Downloads 243
17567 Steady State Analysis of Distribution System with Wind Generation Uncertainty

Authors: Zakir Husain, Neem Sagar, Neeraj Gupta

Abstract:

Due to the increased penetration of renewable energy resources in the distribution system, the system is no longer passive in nature. In this paper, a steady-state analysis of the distribution system has been carried out with the inclusion of wind generation. The wind turbine generating system and wind generator have been modeled to obtain the average active and reactive power injections into the system. The study has been conducted on an IEEE 33-bus system with two wind generators. The present research work is useful not only to utilities but also to customers.
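
To make the "average power injection" step concrete, here is an illustrative sketch (not the paper's model) that estimates a single wind generator's average active power from a Weibull wind-speed model and a piecewise power curve, with reactive power derived from an assumed power factor; all parameter values are placeholders.

```python
# Average active/reactive power injection of one wind generator.

import numpy as np

rng = np.random.default_rng(0)
v = rng.weibull(2.0, 100_000) * 7.5          # wind-speed samples (shape k=2, scale c=7.5 m/s)

def turbine_power(v, v_ci=3.0, v_r=12.0, v_co=25.0, p_rated=1.0):
    """Piecewise power curve in MW: cut-in, cubic ramp, rated, cut-out."""
    p = np.zeros_like(v)
    ramp = (v >= v_ci) & (v < v_r)
    p[ramp] = p_rated * (v[ramp]**3 - v_ci**3) / (v_r**3 - v_ci**3)
    p[(v >= v_r) & (v <= v_co)] = p_rated
    return p

p_avg = turbine_power(v).mean()               # average active power injection (MW)
pf = 0.95                                     # assumed operating power factor
q_avg = p_avg * np.tan(np.arccos(pf))         # corresponding reactive power (MVAr)
print(f"P_avg = {p_avg:.3f} MW, Q_avg = {q_avg:.3f} MVAr")
```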

Keywords: distributed generation, distribution network, radial network, wind turbine generating system

Procedia PDF Downloads 401
17566 Comparison of Growth Medium Efficiency into Stevia (Stevia rebaudiana Bertoni) Shoot Biomass and Stevioside Content in Thin-Layer System, TIS RITA® Bioreactor, and Bubble Column Bioreactor

Authors: Nurhayati Br Tarigan, Rizkita Rachmi Esyanti

Abstract:

Stevia (Stevia rebaudiana Bertoni) has great potential to be used as a natural sweetener because it contains steviol glycoside, which is approximately 100-300 times sweeter than sucrose yet low in calories. Vegetative and generative propagation of S. rebaudiana is inefficient for producing stevia biomass and stevioside. One alternative for stevia propagation is in vitro shoot culture. This research was conducted to optimize the best medium for shoot growth and to compare the bioconversion efficiency and stevioside production of S. rebaudiana shoot cultures cultivated in thin-layer culture (TLC), a recipient for automated temporary immersion system (TIS RITA®) bioreactor, and a bubble column bioreactor. The results showed that 1 ppm of kinetin produced healthy shoots and the highest number of leaves compared to BAP. Shoots were then cultivated in TLC, the TIS RITA® bioreactor, and the bubble column bioreactor. Growth medium efficiency was determined by yield and productivity. TLC produced the highest growth medium efficiency of S. rebaudiana: the yield was 0.471 ± 0.117 gbiomass.gsubstrate-1, and the productivity was 0.599 ± 0.122 gbiomass.Lmedium-1.day-1. The TIS RITA® bioreactor produced the lowest yield and productivity, 0.182 ± 0.024 gbiomass.gsubstrate-1 and 0.041 ± 0.0002 gbiomass.Lmedium-1.day-1, respectively. The yield of the bubble column bioreactor was 0.354 ± 0.204 gbiomass.gsubstrate-1, and the productivity was 0.099 ± 0.009 gbiomass.Lmedium-1.day-1. The stevioside content, from highest to lowest, was obtained from stevia shoots cultivated in TLC, the TIS RITA® bioreactor, and the bubble column bioreactor: 93.44 μg/g, 42.57 μg/g, and 23.03 μg/g, respectively. All three systems could be used to produce stevia shoot biomass, but optimization of nutrient amounts and oxygen intake is required in each system.

Keywords: bubble column, growth medium efficiency, Stevia rebaudiana, stevioside, TIS RITA®, TLC

Procedia PDF Downloads 264
17565 Way to Successful Enterprise Resource Planning System Implementation in Developing Countries: Case of Public Sector Unit

Authors: Suraj Kumar Mukti

Abstract:

An Enterprise Resource Planning (ERP) system is a management tool that integrates all departments in an organization. It integrates business processes, manages resources efficiently, and provides an appropriate decision support system to management. ERP system implementation is a complex, time-consuming, and costly process. Articles related to the key success factors of ERP system implementation are available in the literature, but few authors have focused on a roadmap for successful ERP system implementation. Postponement is better if the organization is not ready to implement the ERP system properly; hence, checking the organization’s preparedness to adopt the new system is an important prerequisite to ensure the success of ERP system implementation. The next question is what constitutes the success of ERP system implementation. Benefits achieved by an ERP system may be categorized into two categories, viz. tangible and intangible benefits. This research article presents a roadmap to ensure the success of ERP system implementation and the benefits achieved through the new system as a success indicator. A case study is presented to evaluate the success and benefits achieved through the new system. The article gives a comprehensive approach to academicians and a roadmap to organizations seeking to implement an ERP system.

Keywords: ERP system, decision support system, tangible, intangible

Procedia PDF Downloads 326
17564 A Tool for Rational Assessment of Dynamic Trust in Networked Organizations

Authors: Simon Samwel Msanjila

Abstract:

Networked environments, which provide platforms and environments for business organizations, are configured in different forms depending on many factors, including lifetime, member characteristics, communication structure, and business objectives, among others. With continuing advances in digital technologies, distance has become less of a barrier to business-minded collaboration among organizations. Given the need for, and ease of, business collaboration nowadays, organizations are sometimes forced to work with others that are either unknown or little known to them in terms of history and performance. A promising approach for sustaining established collaboration has been the establishment of trust relationships among organizations based on the assessed trustworthiness of each participating organization. It has been stated in research that trust in organizations is dynamic, and thus the assessment of trust levels must address this dynamic nature. This paper assesses relevant aspects of trust and applies the concepts to propose a semi-automated system for assessing the sustainability and evolution of trust in organizations participating in a specific objective in a networked-organizations environment.
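
One way to picture a dynamic trust assessment is a time-decayed, weighted score in which recent performance evidence counts more than old evidence; the sketch below is illustrative only (not the paper's model), and the criteria names, weights, and decay rate are assumptions.

```python
# Time-decayed, weighted trust score from a history of rated evidence.

def trust_score(evidence, now, half_life_days=90.0, weights=None):
    """evidence: list of (timestamp_days, criterion, rating in [0, 1])."""
    weights = weights or {"delivery": 0.5, "quality": 0.3, "transparency": 0.2}
    num = den = 0.0
    for t, criterion, rating in evidence:
        decay = 0.5 ** ((now - t) / half_life_days)     # exponential forgetting
        w = weights.get(criterion, 0.0) * decay
        num += w * rating
        den += w
    return num / den if den else 0.0

history = [(0, "delivery", 0.9), (40, "quality", 0.6), (170, "transparency", 0.8)]
print(round(trust_score(history, now=180), 3))
```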

Keywords: trust evolution, trust sustainability, networked organizations, dynamic trust

Procedia PDF Downloads 425
17563 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of computer vision and pattern recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirements by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following an online hard-negative mining strategy for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
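
A sketch of the two-stage detect-then-recognize idea described above, not the authors' released code: it uses generic pretrained torchvision models, and the rack image path and product gallery are placeholders.

```python
# Detect regions with Faster-RCNN, embed crops with ResNet-18, match to a gallery.

import torch
import torch.nn.functional as F
import torchvision
from torchvision import transforms
from PIL import Image

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()
encoder = torchvision.models.resnet18(weights="DEFAULT")
encoder.fc = torch.nn.Identity()              # 512-d embeddings instead of class logits
encoder.eval()

to_tensor = transforms.ToTensor()
prep = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

rack = Image.open("rack.jpg").convert("RGB")  # placeholder rack image

with torch.no_grad():
    det = detector([to_tensor(rack)])[0]
    boxes = det["boxes"][det["scores"] > 0.7]

    # Placeholder gallery {product_name: embedding}; a real one is built from query images.
    gallery = {"cereal_a": torch.randn(512), "soda_b": torch.randn(512)}
    names = list(gallery)
    gal = F.normalize(torch.stack([gallery[n] for n in names]), dim=1)

    for box in boxes:
        x1, y1, x2, y2 = box.int().tolist()
        crop = prep(rack.crop((x1, y1, x2, y2))).unsqueeze(0)
        emb = F.normalize(encoder(crop), dim=1)
        sims = emb @ gal.T                     # cosine similarity to each product
        print(names[int(sims.argmax())], float(sims.max()))
```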

Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition

Procedia PDF Downloads 150
17562 Correlation Between Ore Mineralogy and the Dissolution Behavior of K-Feldspar

Authors: Adrian Keith Caamino, Sina Shakibania, Lena Sunqvist-Öqvist, Jan Rosenkranz, Yousef Ghorbani

Abstract:

Feldspar minerals are one of the main components of the earth’s crust. They are tectosilicates, meaning that they mainly contain aluminum and silicon. Besides aluminum and silicon, they contain either potassium, sodium, or calcium; accordingly, feldspar minerals are categorized into three main groups: K-feldspar, Na-feldspar, and Ca-feldspar. In recent years, the trend to use K-feldspar has grown tremendously, considering its potential to produce potash and alumina. However, feldspar minerals in general are difficult to decompose for the dissolution of their metallic components. Several methods, including intensive milling, leaching under elevated pressure and temperature, thermal pretreatment, and the use of corrosive leaching reagents, have been proposed to improve their low dissolution efficiency. In this study, as part of the POTASSIAL EU project, mechanical activation by intensive milling followed by leaching with hydrochloric acid (HCl) was used to overcome the low dissolution efficiency of the K-feldspar components. The grinding operational parameters, namely time, rotational speed, and ball-to-sample weight ratio, were studied using the Taguchi optimization method. The mineralogy of the ground samples was then analyzed using a scanning electron microscope (SEM) equipped with automated quantitative mineralogy. After grinding, the prepared samples were subjected to HCl leaching. Finally, the dissolution efficiencies of the main elements and impurities of the different samples were correlated with the mineralogical characterization results. The dissolution of K-feldspar components is correlated with ore mineralogy, which provides insight into how best to optimize leaching conditions for selective dissolution. Furthermore, it will affect the subsequent purification steps and the final value recovery procedures.
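
For readers unfamiliar with the Taguchi step, the sketch below lays out an L9 (three-level) experimental plan for the three grinding factors; the factor levels shown are placeholders, not the values used in the study.

```python
# Taguchi L9 plan for three 3-level grinding factors.

import pandas as pd

# Standard L9(3^4) orthogonal array; only the first three columns are used here.
L9 = [
    [1, 1, 1], [1, 2, 2], [1, 3, 3],
    [2, 1, 2], [2, 2, 3], [2, 3, 1],
    [3, 1, 3], [3, 2, 1], [3, 3, 2],
]

levels = {
    "time_min": [10, 30, 60],              # assumed grinding times
    "speed_rpm": [300, 450, 600],          # assumed rotational speeds
    "ball_to_sample": [5, 10, 20],         # assumed ball-to-sample weight ratios
}

plan = pd.DataFrame(
    [{f: levels[f][row[i] - 1] for i, f in enumerate(levels)} for row in L9]
)
print(plan)
```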

Keywords: K-feldspar, grinding, automated mineralogy, impurity, leaching

Procedia PDF Downloads 73
17561 Commercial Winding for Superconducting Cables and Magnets

Authors: Glenn Auld Knierim

Abstract:

Automated robotic winding of high-temperature superconductors (HTS) addresses the precision, efficiency, and reliability critical to the commercialization of products. Today’s HTS materials are mature and commercially promising but require manufacturing attention. In particular, given the exaggerated rectangular cross-section (very thin by very wide), winding precision is critical to managing the stress that can crack the fragile ceramic superconductor (SC) layer and destroy the SC properties. Damage potential is highest during peak operations, where winding stress magnifies operational stress. Another challenge is operational parameters, such as magnetic field alignment, affecting design performance. Winding process performance, including precision, capability for geometric complexity, and efficient repeatability, is required for commercial production of current HTS. Due to winding limitations, current HTS magnets focus on simple pancake configurations. HTS motors, generators, MRI/NMR, fusion, and other projects are awaiting robotically wound solenoid, planar, and spherical magnet configurations. As with conventional power cables, full transposition winding is required for long-length alternating current (AC) and pulsed power cables. Robotic production is required for transposition, the periodic swapping of cable conductors and their placement into precise positions, which allows the minimized reactance required by power utilities. A fully transposed SC cable, in theory, has no transmission length limits for AC and variable transient operation due to no resistance (a problem with conventional cables), negligible reactance (a problem for helically wound HTS cables), and no long-length manufacturing issues (a problem with both stamped and twisted stacked HTS cables). The Infinity Physics team is solving manufacturing problems by developing automated manufacturing to produce the first reliable, utility-grade commercial SC cables and magnets. Robotic winding machines combine mechanical and process design, specialized sensing and observers, and state-of-the-art optimization and control sequencing to carefully manipulate individual fragile SCs, especially HTS, into previously unattainable, complex geometries with electrical geometry equivalent to commercially available conventional conductor devices.

Keywords: automated winding manufacturing, high temperature superconductor, magnet, power cable

Procedia PDF Downloads 137
17560 SISSLE in Consensus-Based Ripple: Some Improvements in Speed, Security, Last Mile Connectivity and Ease of Use

Authors: Mayank Mundhra, Chester Rebeiro

Abstract:

Cryptocurrencies are rapidly finding wide application in areas such as real-time gross settlement and payment systems. Ripple is a cryptocurrency that has gained prominence with banks and payment providers. It solves the Byzantine Generals’ Problem with its Ripple Protocol Consensus Algorithm (RPCA), in which each server maintains a list of servers, called its Unique Node List (UNL), that represents the network for that server and will not collectively defraud it. The server believes that the network has come to a consensus when members of its UNL come to a consensus on a transaction. In this paper, we improve Ripple to achieve better speed, security, last-mile connectivity, and ease of use. We implement guidelines and automated systems for building and maintaining UNLs for resilience, robustness, improved security, and efficient information propagation. We enhance the system to ensure that each server receives information from across the whole network rather than just from its UNL members. We also introduce the paradigm of UNL overlap as a function of information propagation and the trust a server assigns to its own UNL. Our design not only reduces vulnerabilities such as eclipse attacks but also makes it easier to identify malicious behaviour and entities attempting to fraudulently double-spend or stall the system. We provide experimental evidence of the benefits of our approach over the current Ripple scheme. We observe speedups and success rates of ≥ 4.97x and 98.22x, respectively, for information propagation, and ≥ 3.16x and 51.70x, respectively, in consensus.
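
A small sketch (not from the paper) of the UNL-overlap idea: measuring how strongly two servers' Unique Node Lists intersect, which the abstract ties to information propagation and the trust a server places in its own UNL; the server and validator names are placeholders.

```python
# Pairwise Jaccard overlap between Unique Node Lists.

def unl_overlap(unl_a: set[str], unl_b: set[str]) -> float:
    """Jaccard overlap between two UNLs (1.0 = identical, 0.0 = disjoint)."""
    if not unl_a and not unl_b:
        return 1.0
    return len(unl_a & unl_b) / len(unl_a | unl_b)

unls = {
    "s1": {"v1", "v2", "v3", "v4"},
    "s2": {"v2", "v3", "v4", "v5"},
    "s3": {"v6", "v7"},
}
for a in unls:
    for b in unls:
        if a < b:
            print(a, b, round(unl_overlap(unls[a], unls[b]), 2))
```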

Keywords: Ripple, Kelips, unique node list, consensus, information propagation

Procedia PDF Downloads 140
17559 Iterative Method for Lung Tumor Localization in 4D CT

Authors: Sarah K. Hagi, Majdi Alnowaimi

Abstract:

In the last decade, there have been immense advancements in medical imaging modalities. These advancements make it possible to scan the whole volume of the lung in high-resolution images within a short time. Owing to this performance, physicians can clearly identify the complicated anatomical and pathological structures of the lung. These advancements therefore provide large opportunities for advancing all types of available lung cancer treatment and will increase the survival rate. However, lung cancer is still one of the major causes of death, accounting for around 19% of all cancer patients. Several factors may affect the survival rate. One serious effect is the breathing process, which can affect the accuracy of diagnosis and the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize the 3D lung tumor positions across all respiratory data during respiratory motion. The algorithm can be divided into two stages: first, lung tumor segmentation for the first phase of the 4D computed tomography (CT), performed using an active contours method; second, localization of the tumor’s 3D position across all subsequent phases using an affine transformation with 12 degrees of freedom. Two data sets were used in this study: a simulated 4D CT based on the extended cardiac-torso (XCAT) phantom and clinical 4D CT data sets. The results and error calculation are presented as the root mean square error (RMSE). The average error over the data sets is 0.94 mm ± 0.36. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm is presented. The results obtained from the proposed localization algorithm show promise for localizing a lung tumor in 4D CT data.
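
A minimal sketch (not the authors' code) of the second stage: applying a 12-degree-of-freedom affine transform to segmented tumor points from the first phase and reporting RMSE against reference points in a later phase; the coordinates and transform values are synthetic examples.

```python
# Apply a 3x4 affine transform to tumor points and compute RMSE.

import numpy as np

rng = np.random.default_rng(1)
pts_phase1 = rng.uniform(0, 50, size=(200, 3))        # segmented tumor points (mm)

A = np.array([[1.02, 0.01, 0.00, 1.5],                # 12 DOF: rotation/scale/shear + translation
              [0.00, 0.99, 0.02, -2.0],
              [0.01, 0.00, 1.01, 0.8]])

def apply_affine(points: np.ndarray, affine: np.ndarray) -> np.ndarray:
    homog = np.hstack([points, np.ones((len(points), 1))])
    return homog @ affine.T

pts_predicted = apply_affine(pts_phase1, A)
pts_reference = pts_predicted + rng.normal(0, 0.5, pts_predicted.shape)  # stand-in ground truth

rmse = np.sqrt(np.mean(np.sum((pts_predicted - pts_reference) ** 2, axis=1)))
print(f"RMSE = {rmse:.2f} mm")
```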

Keywords: automated algorithm, computed tomography, lung tumor, tumor localization

Procedia PDF Downloads 599
17558 Representation of the Solution of One Dynamical System on the Plane

Authors: Kushakov Kholmurodjon, Muhammadjonov Akbarshox

Abstract:

This paper is devoted to a system of second-order nonlinear differential equations with a special right-hand side: a linear part plus a third-order polynomial of a special form. It is shown that, for some relations between the parameters, there is a second-order curve such that trajectories starting at points of this curve remain on it; thus, the curve is invariant with respect to the given system. Moreover, this system is invariant under a non-degenerate linear transformation of variables. The form of this curve, depending on the relations between the parameters and the eigenvalues of the matrix, is established. All solutions of this system of differential equations are presented analytically.

Keywords: dynamic system, ellipse, hyperbola, Hess system, polar coordinate system

Procedia PDF Downloads 191
17557 Sensor Data Analysis for a Large Mining Major

Authors: Sudipto Shanker Dasgupta

Abstract:

One of the largest mining companies wanted to look at health analytics for its driverless trucks. These trucks were key to its supply chain logistics. The automated trucks had multi-level sub-assemblies that would send out sensor information. The use case was to capture the sensor signals from the truck subcomponents and analyze the health of the trucks from a repair and replacement perspective. Open-source software was used to stream the data into a clustered Hadoop setup in the Amazon Web Services cloud, and Apache Spark SQL was used to analyze the data. All of this was achieved on a 10-node Amazon setup (32 cores, 64 GB RAM); real-time analytics was achieved on 300 million records. To check the scalability of the system, the cluster was increased to a 100-node setup. This talk will highlight how open-source software was used to achieve the above use case and will share insights on achieving high data throughput in a cloud setup.
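
An illustrative PySpark sketch of the kind of aggregation involved (not the original pipeline): per-truck, per-sub-assembly health metrics used to flag candidates for repair or replacement; the storage path, schema, and thresholds are placeholders.

```python
# Aggregate streamed sensor readings and flag sub-assemblies needing inspection.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("truck-health").getOrCreate()

readings = spark.read.json("s3://example-bucket/truck-sensors/")  # placeholder location
# assumed columns: truck_id, subassembly, vibration, temperature, event_time

health = (
    readings.groupBy("truck_id", "subassembly")
    .agg(
        F.avg("vibration").alias("avg_vibration"),
        F.max("temperature").alias("max_temperature"),
        F.count("*").alias("n_readings"),
    )
    .withColumn(
        "needs_inspection",
        (F.col("avg_vibration") > 4.5) | (F.col("max_temperature") > 95.0),  # assumed limits
    )
)

health.filter("needs_inspection").show(20, truncate=False)
```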

Keywords: streaming analytics, data science, big data, Hadoop, high throughput, sensor data

Procedia PDF Downloads 401
17556 From Linear to Nonlinear Deterrence: Deterrence for Rising Power

Authors: Farhad Ghasemi

Abstract:

Along with the transformation of the international system into a complex and chaotic system, a fundamental question arises: how can deterrence be conceptually and theoretically reconstructed in this system model? The deterrence system is much more complex today than it was seven decades ago. This article proposes that the perception of deterrence as a linear system is a fundamental mistake because it does not take into account the new dynamics of the international system, including network power dynamics. The author aims to develop this point by focusing on complexity and chaos theories, especially their nonlinearity and cascading failure principles. The author recognizes deterrence as a nonlinear system and introduces it as a concept in strategic studies.

Keywords: complexity, international system, deterrence, linear deterrence, nonlinear deterrence

Procedia PDF Downloads 136
17555 A Semi-Automated GIS-Based Implementation of Slope Angle Design Reconciliation Process at Debswana Jwaneng Mine, Botswana

Authors: K. Mokatse, O. M. Barei, K. Gabanakgosi, P. Matlhabaphiri

Abstract:

The mining of pit slopes is often associated with some level of deviation from design recommendations, and this may translate to changes in the stability of the excavated pit slopes. Slope angle design reconciliations are therefore essential for assessing and monitoring the compliance of excavated pit slopes with accepted slope designs. These changes in slope stability may be reflected by changes in the calculated factors of safety and/or probabilities of failure. Reconciliations of as-mined and design slope profiles are conducted periodically to assess the implications of these deviations for pit slope stability. Currently, the slope design reconciliation process implemented at Jwaneng Mine involves the measurement of as-mined and design slope angles along vertical sections cut along the established geotechnical design section lines in the GEOVIA GEMS™ software. Bench retention is calculated as the percentage of available catchment area, less over-mined and under-mined areas, relative to the designed catchment area. This process has proven to be tedious and requires a lot of manual effort and time to execute. Consequently, a new semi-automated mine-to-design reconciliation approach that utilizes laser scanning and GIS-based tools is being proposed at Jwaneng Mine. This method involves high-resolution scanning of targeted bench walls, the subsequent creation of 3D surfaces from the point cloud data, and the derivation of slope toe lines and crest lines in the Maptek I-Site Studio software. The toe lines and crest lines are then exported to the ArcGIS software, where distance offsets between the design and actual bench toe lines and crest lines are calculated. Retained bench catchment capacity is measured as the distance between the toe lines and crest lines at the same bench elevation. The assessment of the performance of the inter-ramp and overall slopes entails the measurement of excavated and design slope angles along vertical sections in the ArcGIS software. Excavated and design toe-to-toe or crest-to-crest slope angles are measured for inter-ramp stack slope reconciliations, and crest-to-toe slope angles are measured for overall slope angle reconciliations. The proposed approach allows for a more automated, accurate, quick, and easy workflow for carrying out slope angle design reconciliations, and it has proved highly effective and timely in the assessment of slope performance at Jwaneng Mine. This paper presents the newly proposed process for assessing compliance with slope angle designs at Jwaneng Mine.
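
To illustrate one of the measurements described above, here is a small sketch (not the mine's ArcGIS workflow) of computing an excavated slope angle from a toe point and a crest point along a vertical section and comparing it with a design angle; all coordinates and the design value are made-up examples.

```python
# Slope angle from toe/crest coordinates and deviation from design.

import math

def slope_angle_deg(toe_xyz, crest_xyz):
    """Angle from horizontal between a toe point and a crest point (degrees)."""
    dx = crest_xyz[0] - toe_xyz[0]
    dy = crest_xyz[1] - toe_xyz[1]
    dz = crest_xyz[2] - toe_xyz[2]
    horizontal = math.hypot(dx, dy)
    return math.degrees(math.atan2(dz, horizontal))

toe = (1000.0, 2000.0, 850.0)        # as-mined toe (easting, northing, elevation)
crest = (1032.0, 2001.0, 880.0)      # as-mined crest on the same section

actual = slope_angle_deg(toe, crest)
design = 45.0                        # assumed design angle for this sector
print(f"actual = {actual:.1f} deg, deviation from design = {actual - design:+.1f} deg")
```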

Keywords: slope angle designs, slope design recommendations, slope performance, slope stability

Procedia PDF Downloads 228
17554 Challenges in the Characterization of Black Mass in the Recovery of Graphite from Spent Lithium Ion Batteries

Authors: Anna Vanderbruggen, Kai Bachmann, Martin Rudolph, Rodrigo Serna

Abstract:

Recycling of lithium-ion batteries has attracted a lot of attention in recent years and focuses primarily on valuable metals such as cobalt, nickel, and lithium. Despite the growth in graphite consumption and the fact that it is classified as a critical raw material in the European Union, the USA, and Australia, there is little work focusing on graphite recycling. Thus, graphite is usually considered waste in recycling treatments, where graphite particles are concentrated in the “black mass”, a fine fraction below 1 mm which also contains the foils and the active cathode particles such as LiCoO2 or LiNiMnCoO2. To characterize the material, various analytical methods are applied, including X-ray fluorescence (XRF), X-ray diffraction (XRD), atomic absorption spectrometry (AAS), and SEM-based automated mineralogy. The latter combines scanning electron microscopy (SEM) image analysis with energy-dispersive X-ray spectroscopy (EDS). It is a powerful and well-known method for primary material characterization; however, it has not yet been applied to secondary materials such as black mass, which is a challenging material to analyze due to fine alloy particles and the lack of an existing dedicated database. The aim of this research is to characterize the black mass depending on the metals recycling process in order to understand the liberation mechanisms of the active particles from the foils and their effect on the graphite particle surfaces, and to understand their impact on the subsequent graphite flotation. Three industrial processes were taken into account: purely mechanical, pyrolysis-mechanical, and mechanical-hydrometallurgical. In summary, this article explores various common challenges for graphite and secondary material characterization.

Keywords: automated mineralogy, characterization, graphite, lithium ion battery, recycling

Procedia PDF Downloads 245
17553 Two Years Retrospective Study of Body Fluid Cultures Obtained from Patients in the Intensive Care Unit of General Hospital of Ioannina

Authors: N. Varsamis, M. Gerasimou, P. Christodoulou, S. Mantzoukis, G. Kolliopoulou, N. Zotos

Abstract:

Purpose: Body fluids (pleural, peritoneal, synovial, pericardial, cerebrospinal) are an important element in the detection of microorganisms. For this reason, it is important to examine them in Intensive Care Unit (ICU) patients. Material and Method: Body fluids are transported in sterile containers and enriched as soon as possible with Tryptic Soy Broth (TSB). After one day of incubation, the broth is poured onto selective media: blood, MacConkey No. 2, chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. The above selective media are incubated directly for 2 days. After this period, if any microbial colonies are detected, Gram staining is performed. The isolated organisms are then identified by biochemical techniques in the automated MicroScan system (Siemens), followed by a sensitivity test on the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test is verified by a Kirby-Bauer-based plate test. Results: In 2017, the Laboratory of Microbiology received 60 samples of body fluids from the ICU: 6 peritoneal fluid specimens, 18 pleural fluid specimens, and 36 cerebrospinal fluid specimens. 36 positive cultures were identified. S. epidermidis was identified in 18 specimens, S. haemolyticus in 6, and E. faecium in 12. Conclusions: The results show low detection of microorganisms in body fluid cultures.

Keywords: body fluids, culture, intensive care unit, microorganisms

Procedia PDF Downloads 199
17552 A Performance Study of a Solar Heating System on the Microclimate of an Agricultural Greenhouse

Authors: Nora Arbaoui, Rachid Tadili

Abstract:

This study focuses on a solar system designed to heat an agricultural greenhouse. The solar system is based on heating a transfer fluid that circulates inside the greenhouse through a solar copper coil integrated into the roof of the greenhouse. The thermal energy stored during the day is released during the night to improve the microclimate of the greenhouse. This system was tested in a small agricultural greenhouse in order to optimize the different operational parameters. The climatic and agronomic results obtained with this system are significant in comparison with a greenhouse with no heating system.

Keywords: solar system, agricultural greenhouse, heating, storage, drying

Procedia PDF Downloads 80
17551 The Impact of System and Data Quality on Organizational Success in the Kingdom of Bahrain

Authors: Amal M. Alrayes

Abstract:

Data and system quality play a central role in organizational success, and the quality of any existing information system has a major influence on the effectiveness of overall system performance. Given the importance of system and data quality to an organization, it is relevant to highlight their importance for organizational performance in the Kingdom of Bahrain. This research aims to discover whether system quality and data quality are related, and to study the impact of system and data quality on organizational success. A theoretical model based on previous research is used to show the relationship between data quality, system quality, and organizational impact. We hypothesize, first, that system quality is positively associated with organizational impact; second, that system quality is positively associated with data quality; and finally, that data quality is positively associated with organizational impact. A questionnaire was administered among public and private organizations in the Kingdom of Bahrain. The results show that there is a strong association between data and system quality that affects organizational success.

Keywords: data quality, performance, system quality, Kingdom of Bahrain

Procedia PDF Downloads 489
17550 Automated Pothole Detection Using Convolutional Neural Networks and 3D Reconstruction Using Stereovision

Authors: Eshta Ranyal, Kamal Jain, Vikrant Ranyal

Abstract:

Potholes are a severe threat to road safety and a major contributing factor to road distress. In the Indian context, they are a major road hazard. Timely detection of potholes and subsequent repair can prevent roads from deteriorating. To facilitate roadway authorities in the timely detection and repair of potholes, we propose a pothole detection methodology using convolutional neural networks. The YOLOv3 model is used as it is fast and accurate in comparison with other state-of-the-art models. You Only Look Once v3 (YOLOv3) is a state-of-the-art, real-time object detection system that features multi-scale detection. A mean average precision (mAP) of 73% was obtained on a training dataset of 200 images. The dataset was then increased to 500 images, resulting in an increase in mAP. We further calculated the depth of the potholes using stereoscopic vision by reconstructing the potholes in 3D. This enables calculating the pothole volume and extent, which can then be used to rate the pothole severity as low, moderate, or high.
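
An illustrative sketch (not the paper's implementation) of the stereoscopic depth step: computing disparity from a rectified stereo pair and converting it to metric depth inside a detected pothole box; the camera focal length, baseline, image paths, and detection box are placeholder values.

```python
# Disparity-based depth estimation and a rough pothole depth inside a detection box.

import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # rectified left image (placeholder)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified right image (placeholder)

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM returns fixed-point

focal_px = 700.0        # assumed focal length in pixels
baseline_m = 0.12       # assumed stereo baseline in metres

valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]   # Z = f * B / d

# Pothole depth relative to the surrounding road surface inside a detected box
# (x, y, w, h would come from the YOLOv3 detector).
x, y, w, h = 300, 250, 120, 80                     # placeholder detection
road_level = np.median(depth_m[valid])             # crude road-plane estimate
box_depths = depth_m[y:y+h, x:x+w][valid[y:y+h, x:x+w]]
pothole_depth = np.max(box_depths) - road_level
print(f"approximate pothole depth: {pothole_depth:.3f} m")
```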

Keywords: CNN, pothole detection, pothole severity, YOLO, stereovision

Procedia PDF Downloads 133