Search results for: automated workflow
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 997

637 Grid Pattern Recognition and Suppression in Computed Radiographic Images

Authors: Igor Belykh

Abstract:

Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. Those artifacts may be visible or may cause a Moiré effect when a digital image is resized on a diagnostic monitor. In this paper, we propose an automated algorithm for detecting and suppressing grid artifacts, a problem that remains open. Grid artifact detection is based on a statistical approach in the spatial domain. Grid artifact suppression is based on the design and application of a Kaiser band-stop filter transfer function that avoids ringing artifacts. Experimental results are discussed, and the paper concludes with a description of advantages over existing approaches.
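
A minimal sketch of the suppression step described above, assuming a one-dimensional Kaiser-window band-stop FIR filter designed with SciPy around a detected grid frequency; the sampling convention, the detected frequency, and the ripple and transition-width values below are illustrative, not the authors' settings.

```python
import numpy as np
from scipy.signal import firwin, kaiserord, filtfilt

def suppress_grid(row: np.ndarray, grid_freq: float, fs: float = 1.0) -> np.ndarray:
    """Attenuate a narrow band around grid_freq (cycles/pixel) in one image row."""
    # Kaiser design: 40 dB stop-band attenuation, transition width of 0.02 cycles/pixel.
    numtaps, beta = kaiserord(ripple=40.0, width=0.02 / (0.5 * fs))
    numtaps |= 1                                   # band-stop FIR needs an odd tap count
    taps = firwin(numtaps, [grid_freq - 0.02, grid_freq + 0.02],
                  window=("kaiser", beta), pass_zero="bandstop", fs=fs)
    return filtfilt(taps, [1.0], row)              # zero-phase filtering limits ringing

# Synthetic demo: a noisy row with a 0.25 cycles/pixel grid component.
row = np.random.rand(2048) + 0.2 * np.sin(2 * np.pi * 0.25 * np.arange(2048))
clean = suppress_grid(row, grid_freq=0.25)
```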

Keywords: grid, computed radiography, pattern recognition, image processing, filtering

Procedia PDF Downloads 256
636 A Comparative Study of Medical Image Segmentation Methods for Tumor Detection

Authors: Mayssa Bensalah, Atef Boujelben, Mouna Baklouti, Mohamed Abid

Abstract:

Image segmentation has a fundamental role in analysis and interpretation for many applications. The automated segmentation of organs and tissues throughout the body using computed imaging has been increasing rapidly. Indeed, it represents one of the most important parts of clinical diagnostic tools. In this paper, we present a thorough literature review of recent methods for tumor segmentation from medical images, briefly explaining the contributions of various researchers. The study then compares these methods in order to define new directions for developing and improving the performance of tumor-area segmentation from medical images.

Keywords: features extraction, image segmentation, medical images, tumor detection

Procedia PDF Downloads 141
635 Northern Nigeria Vaccine Direct Delivery System

Authors: Evelyn Castle, Adam Thompson

Abstract:

Background: In 2013, the Kano State Primary Health Care Management Board redesigned its routine immunization supply chain from a diffused pull model to direct-delivery push. The redesign addressed stockouts and reduced the time health facility staff spent collecting vaccines and reporting on vaccine usage. The board sought the help of a third-party logistics provider (3PL) for twice-monthly deliveries from its cold store to 484 facilities across 44 local governments. eHA's Health Delivery Systems group formed a 3PL to serve 326 of these facilities in partnership with the State, and we focused on designing and implementing a technology system throughout. Basic methodologies: GIS mapping: planning the delivery of vaccines to hundreds of health facilities requires detailed route planning for delivery vehicles. Mapping the road networks across Kano and Bauchi with a custom routing tool provided information for the optimization of deliveries, reducing the number of kilometers driven each round by 20% and cutting cost and delivery time. Direct Delivery Information System: vaccine direct deliveries are facilitated through pre-round planning (driven by a health facility database, extensive GIS, and inventory workflow rules), a manager and driver control panel for customizing delivery routines and reporting, a progress dashboard, schedules and routes, packing lists, delivery reports, and driver data-collection applications. MOVE (Last Mile Logistics Management System): MOVE has made vaccine supply information management timely, accurate, and actionable. It provides stock-management workflow support, alerts management for cold chain exceptions and stockouts, and on-device analytics for health and supply chain staff. The software was built to be offline-first, with a user-validated interface and experience. Deployed to hundreds of vaccine storage sites, the improved information tools help facilitate the process of system redesign and change management. Findings: stockouts reduced from 90% to 33%; health systems redesigned and vaccine supply managed for 68% of Kano's wards; near-real-time reporting and data availability to track stock; paperwork burdens on health staff dramatically reduced; medicine available when the community needs it; consistent vaccination dates for children under one to prevent polio, yellow fever, and tetanus; higher immunization rates and therefore lower infection rates; hundreds of millions of Naira worth of vaccines successfully transported; fortnightly service to 326 facilities in 326 wards across 30 local government areas; 6,031 cumulative deliveries; over 3.44 million doses transported; minimum travel distance in a delivery round of 2,000 km and maximum of 6,297 km; 153,409 km travelled by 6 drivers; 500 facilities in 326 wards; data captured and synchronized for the first time; data-driven decision-making now possible. Conclusion: eHA's vaccine direct delivery has met the challenges in Kano and Bauchi States and provided a reliable delivery service of vaccinations that ensures health facilities can run vaccination clinics for children under one. eHA uses innovative technology that delivers vaccines from Northern Nigerian zonal stores straight to healthcare facilities; it has helped healthcare workers spend less time managing supplies and more time delivering care, and it will be rolled out nationally across Nigeria.

Keywords: direct delivery information system, health delivery system, GIS mapping, Northern Nigeria, vaccines

Procedia PDF Downloads 344
634 A Study of Issues and Mitigations on Distributed Denial of Service and Medical Internet of Things Devices

Authors: Robin Singh, Jing-Chiou Liou

Abstract:

Internet of Things (IoT) devices are used heavily as part of our everyday routines. Through improved communication and automated procedures, their popularity has helped users raise the quality of their work. These devices are used in healthcare to better collect patients' data for treatment. They are generally considered safe and secure; however, loopholes may exist that manufacturers need to identify before a hacker takes advantage of them. For this study, we focused on two medical IoT devices: pacemakers and hearing aids. The aim of this paper is to identify the likelihood of these medical devices being hijacked and used as a botnet in Distributed Denial-of-Service attacks. Moreover, some mitigation strategies are proposed to better secure these devices.

Keywords: cybersecurity, DDoS, IoT, medical devices

Procedia PDF Downloads 58
633 Increasing Health Education Tools Satisfaction in Nursing Staffs

Authors: Lu Yu Jyun

Abstract:

Background: Health education is important nursing work that aims to strengthen the self-care ability of patients and their family members. Our department delivers education through three methods: verbal instruction, flyers, and demonstration videos. The satisfaction rate for health education tool use was 54.3% among nursing staff; the main reason was that there had been no storage area for flyers, creating extra workload when accessing them. The satisfaction rate for health education among patients and families was 70.7%. We aimed to improve this situation between 13 April and 6 June 2021. Method: We introduced the ECRS (eliminate, combine, rearrange, simplify) method to remove repetitive and redundant actions and redesigned the health education tool usage workflow to improve nursing staff efficiency and further enhance care quality and working satisfaction. Result: The satisfaction rate for health education tool use among nursing staff rose from 54.3% to 92.5%, and the satisfaction rate for health education among patients and families rose from 70.7% to 90.2%. Conclusion: The time needed to access health education tools dropped from 10 minutes to 3 minutes, which significantly reduced the nursing staff's workload. An estimated 1,213 sheets of paper are saved each month, or 14,556 a year, which also benefits the environment. The health education map has been implemented in other nursing departments since October because of its high efficiency and because it makes health education tools more user-friendly.

Keywords: health, education tools, satisfaction, nursing staff

Procedia PDF Downloads 123
632 Business Intelligence Proposal to Improve Decision Making in Companies Using Google Cloud Platform and Microsoft Power BI

Authors: Joel Vilca Tarazona, Igor Aguilar-Alonso

Abstract:

The problem addressed by this research is the lack of a business intelligence tool that supports automated and efficient financial analysis for decision-making and allows evaluation of the financial statements; as a result, it is difficult to make relevant information available to managers and users as an instrument for financial and administrative decision-making. For them, a business intelligence solution is proposed that will reduce information access time and personnel costs and automate processes, based on a four-layer architecture derived from the review carried out in the research methodology.

Keywords: decision making, business intelligence, Google Cloud, Microsoft Power BI

Procedia PDF Downloads 76
631 The Platform for Digitization of Georgian Documents

Authors: Erekle Magradze, Davit Soselia, Levan Shughliashvili, Irakli Koberidze, Shota Tsiskaridze, Victor Kakhniashvili, Tamar Chaghiashvili

Abstract:

Since the beginning of active publishing activity in Georgia, voluminous printed material has accumulated, and its digitization is an important task. Digitized materials will be available to the audience, making it possible to search their text and conduct various factual research. Digitization involves scanning documents, extracting text from the scanned documents, and processing the text with a corresponding language model to detect inaccuracies and grammatical errors. Implementing these stages requires a unified, scalable, and automated platform in which the digital service developed for each stage performs the task assigned to it; at the same time, these services can be developed dynamically so that there is no interruption in the work of the platform.
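
A brief sketch of one stage of such a pipeline, the text-extraction (OCR) step, assuming pytesseract/Pillow and Tesseract's Georgian language pack ("kat") as stand-ins for the platform's actual services.

```python
from PIL import Image
import pytesseract

def extract_text(scan_path: str) -> str:
    """Run OCR on one scanned page and return raw text for downstream language-model checks."""
    image = Image.open(scan_path)
    # "kat" is Tesseract's Georgian traineddata; the platform's own OCR service may differ.
    return pytesseract.image_to_string(image, lang="kat")

# text = extract_text("scanned_page.png")
# The extracted text would then be passed to a language-model service (e.g., a
# BERT-based checker) to flag OCR inaccuracies and grammatical errors.
```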

Keywords: NLP, OCR, BERT, Kubernetes, transformers

Procedia PDF Downloads 120
630 Formex Algebra Adaptation into Parametric Design Tools: Dome Structures

Authors: Réka Sárközi, Péter Iványi, Attila B. Széll

Abstract:

The aim of this paper is to present the adaptation of the dome construction tool of formex algebra to the parametric design software Grasshopper. Formex algebra is a mathematical system, used together with the programming language Formian, primarily for planning structural systems such as truss-grid domes and vaults. The goal of the research is to allow architects to plan truss-grid structures easily with parametric design tools based on the versatile formex algebra mathematical system. To produce regular structures, coordinate system transformations are used, and the dome structures are defined in a spherical coordinate system. Owing to the abilities of the parametric design software, it is possible to apply further modifications to the structures and obtain special forms. The paper covers the basic dome types as well as additional dome-based structures that use special solutions based on spherical coordinate systems. It also presents additional structural possibilities, such as double-layer grids, for all geometric forms. The adaptation of formex algebra and the parametric workflow of Grasshopper together make possible the quick and easy design and optimization of special truss-grid domes.
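
An illustrative sketch (not the authors' Grasshopper definition) of the underlying idea: generating dome node coordinates in a spherical coordinate system and transforming them to Cartesian points. The radius and division counts are example parameters.

```python
import numpy as np

def dome_nodes(radius=10.0, n_meridians=16, n_rings=6, max_polar_deg=80.0):
    """Return an (n_rings+1, n_meridians, 3) array of dome node coordinates."""
    theta = np.radians(np.linspace(0.0, max_polar_deg, n_rings + 1))            # polar angle
    phi = np.radians(np.linspace(0.0, 360.0, n_meridians, endpoint=False))      # azimuth
    T, P = np.meshgrid(theta, phi, indexing="ij")
    # Spherical-to-Cartesian transformation of every node of the grid.
    x = radius * np.sin(T) * np.cos(P)
    y = radius * np.sin(T) * np.sin(P)
    z = radius * np.cos(T)
    return np.stack([x, y, z], axis=-1)

nodes = dome_nodes()
print(nodes.shape)  # (7, 16, 3): rings x meridians x xyz
```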

Keywords: parametric design, structural morphology, space structures, spherical coordinate system

Procedia PDF Downloads 223
629 Music Listening in Dementia: Current Developments and the Potential for Automated Systems in the Home: Scoping Review and Discussion

Authors: Alexander Street, Nina Wollersberger, Paul Fernie, Leonardo Muller, Ming Hung HSU, Helen Odell-Miller, Jorg Fachner, Patrizia Di Campli San Vito, Stephen Brewster, Hari Shaji, Satvik Venkatesh, Paolo Itaborai, Nicolas Farina, Alexis Kirke, Sube Banerjee, Eduardo Reck Miranda

Abstract:

Escalating neuropsychiatric symptoms (NPS) in people with dementia may lead to earlier care home admission. Music listening has been reported to stimulate cognitive function, potentially reducing agitation in this population. We present a scoping review, reporting on current developments and discussing the potential for music listening with related technology in managing agitation in dementia care. Of two searches for music listening studies, one focused on older people or people living with dementia, where music listening interventions, including technology, were delivered in participants' homes or in institutions to address neuropsychiatric symptoms, quality of life, and independence. The second included any population and focused on the use of music technology for health and wellbeing. In the first search, 70 of 251 full texts were included. The majority reported either statistical significance (6, 8.5%), significance (17, 24.2%), or improvements (26, 37.1%); agitation was specifically reported in 36 (51.4%). The second search included 51 of 99 full texts, reporting improvement (28, 54.9%), significance (11, 21.5%), statistical significance (1, 1.9%), and no difference compared to the control (6, 11.7%). The majority in the first search focused on mood and agitation, and in the second on mood and psychophysiological responses. Five studies used AI or machine learning systems to select music, all involving healthy controls and reporting benefits. Most studies in both reviews were not conducted in a home environment (review 1: 12, 17.1%; review 2: 11, 21.5%). Preferred music listening may help manage NPS in care home settings. Based on these and other data extracted in the review, a reasonable progression would be to co-design and test music listening systems and protocols for NPS in all settings, including people's homes. Machine learning and automated technology for music selection and arousal adjustment, driven by live biodata, have not been explored in dementia care. Such approaches may help deliver the right music at the appropriate time in the required dosage, reducing the use of medication and improving quality of life.

Keywords: music listening, dementia, agitation, scoping review, technology

Procedia PDF Downloads 79
628 Building and Tree Detection Using Multiscale Matched Filtering

Authors: Abdullah H. Özcan, Dilara Hisar, Yetkin Sayar, Cem Ünsalan

Abstract:

In this study, an automated building and tree detection method is proposed using DSM data and a true orthophoto image. Multiscale matched filtering is used on the DSM data. To this end, a watershed transform is first applied. Then, Otsu's thresholding method is used as an adaptive threshold to segment each watershed region. Detected objects are masked with NDVI to separate buildings from trees. The proposed method is able to detect buildings and trees without requiring any elevation threshold. We tested our method on the ISPRS semantic labeling dataset and obtained promising results.
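
A rough sketch of the described pipeline (watershed segmentation of the DSM, per-region Otsu thresholding, NDVI masking), using scikit-image and SciPy as stand-ins for the authors' implementation; the marker seeding, the band inputs, and the 0.3 NDVI cut-off are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed

def detect_objects(dsm: np.ndarray, nir: np.ndarray, red: np.ndarray):
    """Return boolean masks (buildings, trees) for an aligned DSM / orthophoto pair."""
    # Watershed regions on the inverted DSM so elevated objects form catchment basins.
    markers, _ = ndi.label(dsm > np.percentile(dsm, 75))
    regions = watershed(-dsm, markers=markers)

    # Adaptive (Otsu) threshold applied inside each watershed region.
    elevated = np.zeros_like(dsm, dtype=bool)
    for label in np.unique(regions):
        region = regions == label
        values = dsm[region]
        if values.min() < values.max():
            elevated[region] = dsm[region] > threshold_otsu(values)

    # NDVI separates vegetation (trees) from man-made objects (buildings).
    ndvi = (nir - red) / (nir + red + 1e-6)
    trees = elevated & (ndvi > 0.3)
    buildings = elevated & ~trees
    return buildings, trees

# buildings, trees = detect_objects(dsm, orthophoto_nir, orthophoto_red)
```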

Keywords: building detection, local maximum filtering, matched filtering, multiscale

Procedia PDF Downloads 298
627 Profiling Risky Code Using Machine Learning

Authors: Zunaira Zaman, David Bohannon

Abstract:

This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, which used natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite was used to predict specific vulnerabilities such as OS command injection, cryptographic, and cross-site scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS command injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
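
An illustrative sketch of building path-context representations from an AST, in the spirit of Code2Vec but demonstrated on Python source with the standard ast module; the study targets Java and C++ ASTs, so this is an analogy rather than the authors' pipeline.

```python
import ast
from itertools import combinations

def leaf_path_contexts(source: str, max_contexts: int = 20):
    """Return (leaf, path, leaf) triples, the raw material for Code2Vec-style embeddings."""
    leaves = []  # each entry: (token, root-to-leaf chain of AST node type names)

    def walk(node, chain):
        chain = chain + [type(node).__name__]
        if isinstance(node, (ast.Name, ast.Constant)):
            token = node.id if isinstance(node, ast.Name) else repr(node.value)
            leaves.append((token, chain))
            return
        for child in ast.iter_child_nodes(node):
            walk(child, chain)

    walk(ast.parse(source), [])

    contexts = []
    for (tok_a, chain_a), (tok_b, chain_b) in combinations(leaves, 2):
        # Path: up from leaf a to the lowest common ancestor, then down to leaf b.
        lca = 0
        while lca < min(len(chain_a), len(chain_b)) and chain_a[lca] == chain_b[lca]:
            lca += 1
        path = chain_a[lca:][::-1] + [chain_a[lca - 1]] + chain_b[lca:]
        contexts.append((tok_a, "^".join(path), tok_b))
    return contexts[:max_contexts]

print(leaf_path_contexts("def f(cmd):\n    return os.system(cmd)"))
```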

Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties

Procedia PDF Downloads 79
626 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
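
A minimal sketch of one early stage of such a reconstruction, extracting planar elements (candidate walls or floors) with RANSAC; Open3D is an assumed stand-in library, and the file name and thresholds are illustrative only.

```python
import open3d as o3d

def extract_planes(path: str, max_planes: int = 5, distance_threshold: float = 0.02):
    """Iteratively segment dominant planes (candidate walls/floors) from a point cloud."""
    cloud = o3d.io.read_point_cloud(path)
    planes = []
    for _ in range(max_planes):
        if len(cloud.points) < 1000:
            break
        model, inliers = cloud.segment_plane(
            distance_threshold=distance_threshold, ransac_n=3, num_iterations=1000
        )
        planes.append((model, cloud.select_by_index(inliers)))   # (a, b, c, d) + inlier points
        cloud = cloud.select_by_index(inliers, invert=True)      # remove inliers and repeat
    return planes  # plane normals can then be used to label candidates as walls or floors

# planes = extract_planes("scan.ply")
```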

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 25
625 Design and Development of Motorized Placer for Balloon Uterine Stents in Gynecology

Authors: Metehan Mutlu, Meltem Elitas

Abstract:

This study aims to provide an automated method for placing balloon uterine stents after hysteroscopic adhesiolysis. Currently, there are no automated tools to place the balloon uterine stent; therefore, surgeons fit it into the endometrial cavity manually. However, it is very hard to pass the balloon stent through the cervical canal, which is roughly 10 mm after the surgery. Our method aims to provide an effective and practical way of placing the stent by automating the procedure through our designed device. Furthermore, our device performs the required tasks faster than traditional methods, reduces narcosis time, and decreases the risk of bacterial contamination.

Keywords: balloon uterine stent, endometrial cavity, hysteroscopy, motorized-tool

Procedia PDF Downloads 257
624 Intelligent Recognition Tools for Industrial Automation

Authors: Amin Nazerzadeh, Afsaneh Nouri Houshyar , Azadeh Noori Hoshyar

Abstract:

With the rapid growth of information technology, industrial and manufacturing systems are becoming more automated. Therefore, achieving highly accurate automatic systems with reliable security is becoming more critical. Biometrics, which refers to identifying individuals based on physiological or behavioral traits, provides unique identifiers that offer high reliability and security in different industrial systems. As biometric traits cannot easily be transferred between individuals or copied, they have been receiving extensive attention. Given the importance of security applications, this paper provides an overview of biometrics and discusses the background, types, and applications of biometrics as an effective tool for industrial applications.

Keywords: industrial and manufacturing applications, intelligence and security, information technology, recognition, security technology, biometrics

Procedia PDF Downloads 131
623 Integration Process and Analytic Interface of different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is therefore increasing. However, each government has its own procedures for publishing its data, which results in a variety of data set formats because there are no international standards specifying the formats of data sets in Open Data bases. Due to this variety, we must build a data integration process that is able to bring together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time. In the same way, other governments have published Open Data sets related to the environment (such as Andalucia or Bilbao). All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can be initiated in a computationally better way. The tool presented in this work therefore has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language as a mature open-source technology. R is a powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance. There are also R libraries, such as Shiny, for building a graphic interface. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain to any developer so that they can build their own applications.
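
A hedged sketch of the integration idea in Python (the project itself uses Java/Oracle and R): one small adapter per government maps data sets that arrive in different formats onto a common schema. The file names, field names, and schema below are illustrative, not the project's.

```python
import pandas as pd

COMMON_COLUMNS = ["city", "timestamp", "indicator", "value"]

def load_madrid_csv(path: str) -> pd.DataFrame:
    """Adapter for a semicolon-separated CSV export (illustrative column names)."""
    df = pd.read_csv(path, sep=";")
    return pd.DataFrame({
        "city": "Madrid",
        "timestamp": pd.to_datetime(df["fecha"]),
        "indicator": df["magnitud"],
        "value": df["valor"],
    })[COMMON_COLUMNS]

def load_bilbao_json(path: str) -> pd.DataFrame:
    """Adapter for a JSON export with different field names (illustrative)."""
    df = pd.read_json(path)
    return pd.DataFrame({
        "city": "Bilbao",
        "timestamp": pd.to_datetime(df["date"]),
        "indicator": df["parameter"],
        "value": df["measurement"],
    })[COMMON_COLUMNS]

# The concatenated, uniformly formatted frame then feeds the analytic interface:
# integrated = pd.concat([load_madrid_csv("madrid.csv"), load_bilbao_json("bilbao.json")])
```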

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 288
622 The Exploitation of the MOSES Project Outcomes on Supply Chain Optimisation

Authors: Reza Karimpour

Abstract:

Ports play a decisive role in the EU's external and internal trade, as about 74% of imports and exports and 37% of exchanges go through ports. Although ports, especially Deep Sea Shipping (DSS) ports, are integral nodes within multimodal logistic flows, Short Sea Shipping (SSS) and inland waterways are not so well integrated. The Automated Vessels and Supply Chain Optimisations for Sustainable Shortsea Shipping (MOSES) project aims to enhance the short sea shipping component of the European supply chain by addressing the vulnerabilities and strains related to the operation of large containerships. The MOSES concept can be described briefly as a large containership (mother vessel) approaching a DSS port (or a large container terminal). Upon her arrival, she is handled by a combined intelligent mega-system consisting of the MOSES autonomous tugboat swarm for manoeuvring and the MOSES adapted AutoMoor system. Container handling processes are then ready to start moving containers to their destination via hinterland connections (trucks and/or rail) or to be shipped to destinations near small ports (on the mainland or islands). In the first case, containers are stored in a dedicated port area (storage area), waiting to be moved via trucks and/or rail. In the second case, containers are stacked by existing port equipment near dedicated berths of the DSS port. They are then loaded on the MOSES Innovative Feeder Vessel, equipped with the MOSES Robotic Container-Handling System that provides (semi-)autonomous (un)loading of the feeder. The Robotic Container-Handling System is remotely monitored through a Shore Control Centre. When the MOSES Innovative Feeder Vessel approaches the small port, where her docking is achieved without tugboats, she automatically unloads the containers using the Robotic Container-Handling System onto the quay or directly onto trucks. As a result, ports with minimal or no available infrastructure may be effectively integrated into the container supply chain. The MOSES Innovative Feeder Vessel then continues her voyage to the next small port or returns to the DSS port. The MOSES exploitation activity mainly aims to exploit research outcomes beyond the project, facilitate utilisation of the pilot results by others, and continue the pilot service after the project ends. Now, at the mid-point of the project's lifetime, the exploitation plan introduces the reader to the MOSES project and its key exploitable results, and it provides a plan for delivering the MOSES innovations to the market as part of the overall exploitation plan.

Keywords: automated vessels, exploitation, shortsea shipping, supply chain

Procedia PDF Downloads 85
621 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement to maintain data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human errors. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as adjunct therapy in COVID-19 regimens. Unfortunately, zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1 M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, the relevant formulae were input into two spreadsheets to automate the calculations. Further checks were built into the automated system to ensure the validity of replicate analyses in the titrimetric procedures. Validation was conducted using five data sets of manually computed assay results, and the acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at a 95% confidence interval) were obtained from Student's t-test evaluation of the mean values for manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets. Human errors were minimized in calculations when procedures were automated in quality control laboratories. The assay procedure for the formulation was achieved in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
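
An illustrative sketch, not the validated spreadsheet itself, of the two automated calculations: the molarity of the EDTA titrant from a zinc standard, and the percent of label claim for a tablet sample, plus a simple replicate (%RSD) check. The molecular weights, the assumed heptahydrate form, and the example figures are for demonstration only and are not taken from the USP monograph.

```python
from statistics import mean, stdev

ZN_ATOMIC_WEIGHT = 65.38          # g/mol
ZNSO4_7H2O_MW = 287.56            # g/mol, assumed heptahydrate form

def edta_molarity(zinc_std_mg: float, titre_ml: float) -> float:
    """EDTA molarity from a zinc standard: 1 mol EDTA chelates 1 mol Zn."""
    return (zinc_std_mg / ZN_ATOMIC_WEIGHT) / titre_ml       # mmol Zn per mL of titrant

def percent_label_claim(titre_ml: float, edta_m: float, sample_mg: float,
                        label_mg: float, avg_tablet_mg: float) -> float:
    """Percent of labelled ZnSO4.7H2O recovered in the assayed sample portion."""
    znso4_mg_found = titre_ml * edta_m * ZNSO4_7H2O_MW        # mmol EDTA == mmol ZnSO4.7H2O
    znso4_mg_expected = label_mg * (sample_mg / avg_tablet_mg)
    return 100.0 * znso4_mg_found / znso4_mg_expected

# Three replicate titrations of the same sample portion; flag the run if %RSD is high.
replicates = [percent_label_claim(v, 0.0998, 250.0, 87.7, 250.0) for v in (3.02, 3.05, 3.04)]
rsd = 100.0 * stdev(replicates) / mean(replicates)
print(f"mean assay {mean(replicates):.1f}% of label claim, RSD {rsd:.2f}%")
```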

Keywords: data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets

Procedia PDF Downloads 150
620 The Synergistic Effects of Blockchain and AI on Enhancing Data Integrity and Decision-Making Accuracy in Smart Contracts

Authors: Sayor Ajfar Aaron, Sajjat Hossain Abir, Ashif Newaz, Mushfiqur Rahman

Abstract:

Investigating the convergence of blockchain technology and artificial intelligence, this paper examines their synergistic effects on data integrity and decision-making within smart contracts. By implementing AI-driven analytics on blockchain-based platforms, the research identifies improvements in automated contract enforcement and decision accuracy. The paper presents a framework that leverages AI to enhance transparency and trust while blockchain ensures immutable record-keeping, culminating in significantly optimized operational efficiencies in various industries.

Keywords: artificial intelligence, blockchain, data integrity, smart contracts

Procedia PDF Downloads 14
619 Design and Implementation of Agricultural Machinery Equipment Scheduling Platform Based On Case-Based Reasoning

Authors: Wen Li, Zhengyu Bai, Qi Zhang

Abstract:

The demand for a smart scheduling platform in agriculture, particularly for the scheduling of machinery equipment, is high. With the continuous development of agricultural machinery equipment technology, a large number of agricultural machinery equipment and agricultural machinery cooperative service organizations continue to appear in China. The large area of cultivated land and the large number of agricultural activities in the central and western regions of China have made the demand for smart and efficient agricultural machinery equipment scheduling platforms even more intense. In this study, we design and implement a platform for agricultural machinery equipment scheduling to allocate agricultural machinery equipment resources reasonably. With the agricultural machinery equipment scheduling platform taken as the research object, we discuss its research significance and value, use service blueprint technology to analyze and characterize the agricultural machinery equipment scheduling workflow, apply the analytic network process (ANP) to obtain the platform's functional requirements, and divide the platform functions through a platform function division diagram. Based on the case-based reasoning (CBR) algorithm, the equipment scheduling module of the platform is then realized; finally, a design scheme for the platform architecture is provided, and the visualization interface of the platform is built with the VB programming language. This work provides design ideas and theoretical support for the construction of a modern agricultural equipment information scheduling platform.
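
A hedged sketch of the case-based reasoning step: retrieving the most similar historical scheduling case with a weighted nearest-neighbour similarity. The case attributes, weights, and values are illustrative, not taken from the platform.

```python
from dataclasses import dataclass

@dataclass
class Case:
    field_area_ha: float      # size of the job
    crop: str                 # crop type
    distance_km: float        # distance from the machinery depot
    assigned_machines: int    # the solution stored with the case

WEIGHTS = {"field_area_ha": 0.5, "crop": 0.2, "distance_km": 0.3}

def similarity(query: Case, case: Case) -> float:
    """Weighted global similarity in [0, 1] built from local attribute similarities."""
    area = 1.0 - abs(query.field_area_ha - case.field_area_ha) / max(query.field_area_ha, case.field_area_ha)
    crop = 1.0 if query.crop == case.crop else 0.0
    dist = 1.0 - abs(query.distance_km - case.distance_km) / max(query.distance_km, case.distance_km)
    return WEIGHTS["field_area_ha"] * area + WEIGHTS["crop"] * crop + WEIGHTS["distance_km"] * dist

case_base = [Case(120, "wheat", 15, 3), Case(60, "corn", 40, 2), Case(200, "wheat", 10, 5)]
query = Case(110, "wheat", 20, assigned_machines=0)          # new, unsolved job
best = max(case_base, key=lambda c: similarity(query, c))    # retrieve step of CBR
print(f"reuse plan from most similar case: {best.assigned_machines} machines")
```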

Keywords: case-based reasoning, service blueprint, system design, ANP, VB programming language

Procedia PDF Downloads 144
618 Digital Design and Fabrication: A Review of Trend and Its Impact in the African Context

Authors: Mohamed Al Araby, Amany Salman, Mostafa Amin, Mohamed Madbully, Dalia Keraa, Mariam Ali, Marah Abdelfatah, Mariam Ahmed, Ahmed Hassab

Abstract:

In recent years, the architecture, engineering, and construction (A.E.C.) industry has been exposed to important innovations, most notably the global integration of digital design and fabrication (D.D.F.) processes into the industry's workflow. Despite this evolution in the sector, Africa has been excluded from the examination of this development. The reason behind this exclusion is the preconceived view of it as a developing region that still employs traditional methods of construction. The primary objective of this review is to investigate the trend of digital construction (D.C.) in the African environment and the difficulties in its regular utilization. This objective can be attained by recognizing the notion of digital construction in Africa and evaluating the impact of the projects deploying this technology on both their immediate and broader contexts. The paper's methodology begins with the collection of data from 224 initiatives throughout Africa. Then, 50 of these projects were selected based on the criteria of recency, typology variety, and location diversity. After that, a literature-based comparative analysis was undertaken. The study's findings reveal a pattern of motivation for applying digital fabrication processes. Moreover, it is essential to evaluate the socio-economic effects of these projects on the population living near the analyzed subject. The last step in this study is identifying the influence on neighboring nations.

Keywords: Africa, digital construction, digital design, fabrication

Procedia PDF Downloads 130
617 Disaster and Crisis Management Using Geographical Information System (GIS) during the Operation and Maintenance Stages of the Hyderabad Metro Rail in India

Authors: Sai Rajeev Reddy, Ishita Roy, M. Anji Reddy

Abstract:

The paper describes the importance of preventive measures and immediate emergency logistics during accidents and disasters for the Hyderabad Metro Rail in its various stages of construction. This is a need of the modern era, in which accidents, explosions, attacks, and sudden crises are frequent and take a huge toll on life. The paper utilizes the workflow and application of a Geographical Information System (GIS) to provide information about problems and crisis structures for efficient metro transportation in the city. The study analyzes the difficulties and problems that cause accidents during the operation and maintenance stages of the metro rail. The paper focuses on intermediate and firsthand crisis information, using GIS technology to share disaster data so that cyber police stations, emergency responders, hospitals, and first aid centres can take effective measures, act immediately, and save lives. The results and conclusions have proved very informative and useful for the safety board authorities of the Hyderabad Metro Rail. Operation and maintenance are integral stages in the development of any multipurpose transportation project and are usually prone to various disasters and tragedies. Hence, GIS technologies, together with web technologies and advanced software, help distribute information among the masses to prevent and manage crises widely and in a cost-beneficial manner.

Keywords: Geographical Information System, emergency assessment, accident zones, surveillance

Procedia PDF Downloads 540
616 E-Service and the Nigerian Banking Sector: A Review of ATM Architecture and Operations

Authors: Bashir Aliyu Yauri, Rufai Aliyu Yauri

Abstract:

With the introduction of the cash-less society policy by the Central Bank of Nigeria, the concept of e-banking services has experienced significant improvement over the years. Today, quite a number of people are embracing e-banking activities, especially the ATM, thereby moving away from the conventional banking system. This paper presents a review of the underlying architectural layout of intra-bank and inter-bank ATM connectivity in Nigeria. The paper further investigates and discusses factors affecting intra-bank and inter-bank ATM connectivity in Nigeria, and possible solutions to these factors affecting ATM connectivity and operations are proposed.

Keywords: architectural layout, automated teller machine, e-services, postilion

Procedia PDF Downloads 605
615 Bug Localization on Single-Line Bugs of Apache Commons Math Library

Authors: Cherry Oo, Hnin Min Oo

Abstract:

Software bug localization is one of the most costly tasks in program repair. Therefore, there is high demand for automated bug localization techniques that can guide programmers to the locations of bugs with minimal human intervention. Spectrum-based bug localization aims to help software developers discover bugs rapidly by investigating abstractions of program traces to produce a ranking list of the most likely buggy modules. Using the Apache Commons Math library project, we study diagnostic accuracy using our spectrum-based bug localization metric. Our results show that the better performance of a specific similarity coefficient, used to inspect the program spectra, is mostly effective for localizing single-line bugs.
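
A small sketch of how spectrum-based localization ranks program elements from per-test coverage, using the Ochiai coefficient as one commonly used similarity measure; the paper's own metric may differ.

```python
from math import sqrt

def ochiai_ranking(coverage: dict, failing: set) -> list:
    """coverage maps test name -> set of covered line ids; failing holds failing test names."""
    all_lines = set().union(*coverage.values())
    total_failed = len(failing)
    scores = []
    for line in all_lines:
        ef = sum(1 for t, lines in coverage.items() if line in lines and t in failing)
        ep = sum(1 for t, lines in coverage.items() if line in lines and t not in failing)
        # Ochiai: ef / sqrt((ef + nf) * (ef + ep)), where ef + nf is the total failed count.
        denom = sqrt(total_failed * (ef + ep))
        scores.append((line, ef / denom if denom else 0.0))
    return sorted(scores, key=lambda s: s[1], reverse=True)   # most suspicious first

coverage = {"t1": {10, 11, 12}, "t2": {10, 12}, "t3": {10, 13}}
print(ochiai_ranking(coverage, failing={"t2"}))   # line 12 ranks first
```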

Keywords: software testing, bug localization, program spectra, bug

Procedia PDF Downloads 118
614 Identification of Arglecins B and C and Actinofuranosin A from a Termite Gut-Associated Streptomyces Species

Authors: Christian A. Romero, Tanja Grkovic, John. R. J. French, D. İpek Kurtböke, Ronald J. Quinn

Abstract:

A high-throughput and automated 1H NMR metabolic fingerprinting dereplication approach was used to accelerate the discovery of unknown bioactive secondary metabolites. The applied dereplication strategy accelerated the discovery of natural products, provided rapid and competent identification and quantification of the known secondary metabolites and avoided time-consuming isolation procedures. The effectiveness of the technique was demonstrated by the isolation and elucidation of arglecins B (1), C (2) and actinofuranosin A (3) from a termite-gut associated Streptomyces sp. (USC 597) grown under solid state fermentation. The structures of these compounds were elucidated by extensive interpretation of 1H, 13C and 2D NMR spectroscopic data. These represent the first report of arglecin analogs isolated from a termite gut-associated Streptomyces species.

Keywords: actinomycetes, actinofuranosin, antibiotics, arglecins, NMR spectroscopy

Procedia PDF Downloads 33
613 Designing AI-Enabled Smart Maintenance Scheduler: Enhancing Object Reliability through Automated Management

Authors: Arun Prasad Jaganathan

Abstract:

In today's rapidly evolving technological landscape, the need for efficient and proactive maintenance management solutions has become increasingly evident across various industries. Traditional approaches often suffer from drawbacks such as reactive strategies, leading to potential downtime, increased costs, and decreased operational efficiency. In response to these challenges, this paper proposes an AI-enabled approach to object-based maintenance management aimed at enhancing reliability and efficiency. The paper contributes to the growing body of research on AI-driven maintenance management systems, highlighting the transformative impact of intelligent technologies on enhancing object reliability and operational efficiency.

Keywords: AI, machine learning, predictive maintenance, object-based maintenance, expert team scheduling

Procedia PDF Downloads 22
612 Integrating Computer-Aided Manufacturing and Computer-Aided Design for Streamlined Carpentry Production in Ghana

Authors: Benson Tette, Thomas Mensah

Abstract:

As a developing country, Ghana has high potential to harness the economic value of every industry. Two of the industries that produce below capacity are handicrafts (for instance, carpentry) and information technology (i.e., computer science). To boost production and maintain competitiveness, the carpentry sector in Ghana needs manufacturing procedures that are more effective and more affordable. This issue can be addressed using computer-aided manufacturing (CAM) technology, which automates the fabrication process and decreases the amount of time and labor needed to make wood goods. Yet the integration of CAM in carpentry-related production is rarely explored. To streamline the manufacturing process, this research investigates the equipment and technology currently used in the Ghanaian carpentry sector for automated fabrication. The research looks at the various CAM technologies, such as computer numerical control (CNC) routers, laser cutters, and plasma cutters, that are accessible to Ghanaian carpenters yet remain unexplored, and it investigates their potential to enhance the production process. To achieve this objective, 150 carpenters, 15 software engineers, and 10 policymakers were interviewed using structured questionnaires. The responses provided by the 175 respondents were processed to eliminate outliers, and omissions were corrected using multiple imputation techniques. The processed responses were analyzed through thematic analysis. The findings showed that adapting and integrating CAD software with CAM technologies would speed up the design-to-manufacturing process for carpenters. Achieving such results entails first examining the capabilities of current CAD software and then determining what new functions and resources are required to improve the software's suitability for carpentry tasks. Responses from both carpenters and computer scientists showed that it is highly practical and achievable to streamline the design-to-manufacturing process through steps such as modifying and combining CAD software with CAM technology. Making the carpentry-software integration program more useful for carpentry projects would necessitate investigating the capabilities of the current CAD software and identifying the additional features and tools required in the Ghanaian ecosystem. In conclusion, the Ghanaian carpentry sector has an opportunity to increase productivity and competitiveness through the integration of CAM technology with CAD software. Carpentry companies may lower labor costs and boost production capacity by automating the fabrication process, giving them a competitive advantage. This study offers implementation-ready and representative recommendations for successful implementation as well as important insights into the equipment and technologies available for automated fabrication in the Ghanaian carpentry sector.

Keywords: carpentry, computer-aided manufacturing (CAM), Ghana, information technology (IT)

Procedia PDF Downloads 66
611 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges in geotechnical engineering in underground spaces arise from uncertainties and different probabilities. The collection, collation, and collaboration of existing data, so that they can be incorporated in analysis and design for a given prospect evaluation, would be a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data in order to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) has been proposed in this model. Subsequently, the stages of the model workflow methodology are described. In order to train the data and deploy the LoK models, an ML platform has been implemented. IBM Watson Studio, a leading data science tool and data-driven cloud integration ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
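
A hedged illustration of the three-dimensional risk matrix idea: a conventional likelihood-by-consequence score extended by a level-of-knowledge (LoK) axis. The scales, the LoK adjustment rule, and the class boundaries are assumptions made for this sketch, not the model proposed in the paper.

```python
def risk_rating(likelihood: int, consequence: int, lok: int):
    """likelihood and consequence on 1-5 scales; lok from 1 (poor knowledge) to 3 (good)."""
    base = likelihood * consequence                  # classic 5x5 matrix score
    adjusted = base * (1 + 0.25 * (3 - lok))         # low knowledge inflates the score
    if adjusted >= 15:
        return base, "high"
    if adjusted >= 6:
        return base, "medium"
    return base, "low"

# Example: a roof-fall hazard judged moderately likely and severe but supported by
# limited monitoring data (LoK = 1) is escalated to the high class.
print(risk_rating(likelihood=3, consequence=4, lok=1))   # -> (12, 'high')
```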

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 247
610 Evaluating 8D Reports Using Text-Mining

Authors: Benjamin Kuester, Bjoern Eilert, Malte Stonis, Ludger Overmeyer

Abstract:

Increasing quality requirements make reliable and effective quality management indispensable. This includes complaint handling, in which the 8D method is widely used. The 8D report, as the written documentation of the 8D method, is one of the key quality documents, as it internally secures the quality standards and acts as a communication medium to the customer. In practice, however, 8D reports are often faulty and of poor quality, and there is no quality control of 8D reports today. This paper describes the use of natural language processing for the automated evaluation of 8D reports. Based on semantic analysis and text-mining algorithms, the presented system is able to uncover content-related and formal quality deficiencies and thus increases the quality of complaint processing in the long term.
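
A minimal sketch of a formal-quality check on an 8D report: verifying that each of the eight disciplines (D1-D8) is present and contains more than a token amount of text. The heading pattern and length threshold are assumptions, and the described system additionally applies semantic (content) analysis beyond this kind of formal check.

```python
import re

# D1..D8: team, problem description, containment, root cause, corrective actions,
# implementation, prevention, closure.
DISCIPLINES = [f"D{i}" for i in range(1, 9)]

def check_8d_report(text: str, min_words: int = 10) -> dict:
    """Return, per discipline, whether its section is present and non-trivial."""
    parts = re.split(r"\b(D[1-8])\s*[:\-]", text)           # split at headings like "D4:"
    sections = {parts[i]: parts[i + 1] for i in range(1, len(parts) - 1, 2)}
    return {d: len(sections.get(d, "").split()) >= min_words for d in DISCIPLINES}

report = "D1: quality engineer, line operator and supplier representative ... D2: ..."
print(check_8d_report(report))   # flags disciplines that are missing or too short
```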

Keywords: 8D report, complaint management, evaluation system, text-mining

Procedia PDF Downloads 282
609 On Musical Information Geometry with Applications to Sonified Image Analysis

Authors: Shannon Steinmetz, Ellen Gethner

Abstract:

In this paper, a theoretical foundation is developed for patterned segmentation of audio using the geometry of music and a statistical manifold. We demonstrate image content clustering using conic space sonification. The algorithm takes a geodesic curve as a model estimator of the three-parameter Gamma distribution. The random variable is parameterized by musical centricity and centric velocity. Model parameters predict audio segmentation in the form of duration and frame count based on the likelihood of a musical geometry transition. We provide an example using a database of randomly selected images, resulting in statistically significant clusters of similar image content.

Keywords: sonification, musical information geometry, image, content extraction, automated quantification, audio segmentation, pattern recognition

Procedia PDF Downloads 191
608 Testing of Electronic Control Unit Communication Interface

Authors: Petr Šimek, Kamil Kostruk

Abstract:

This paper deals with the problem of testing an Electronic Control Unit (ECU) for validation of specified functions. Modern ECUs have many functions which need to be tested, and this process requires tracking between the test and the specification. The technique discussed in this paper explores a system for automating this process. The paper, in its Chapter IV, introduces the problem in general and then describes the proposed test system concept and its principle. It looks at how the ECU interface specification file is processed for automated interface testing and test tracking. Finally, possible future development of the project is discussed.
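
A hedged sketch of the core idea: reading an interface specification file and generating one boundary-value test case per signal, each tagged with the requirement it came from so that tests can be traced back to the specification. The CSV columns and the echo-style expected value are placeholders, not the system described in the paper.

```python
import csv

def generate_boundary_tests(spec_path: str) -> list:
    """Build one boundary-value test case per specified signal, tagged with its requirement id."""
    tests = []
    with open(spec_path, newline="") as handle:
        for row in csv.DictReader(handle):
            for value in (float(row["min"]), float(row["max"])):
                tests.append({
                    "requirement_id": row["requirement_id"],   # traceability back to the spec
                    "signal": row["signal"],
                    "stimulus": value,                          # value sent on the interface
                    "expected": value,                          # placeholder: echo-style check
                })
    return tests

# A test runner would iterate these cases, send each stimulus on the CAN or Ethernet
# interface, compare the response against "expected", and log the result under its
# requirement_id so the specification-to-test mapping stays intact.
# tests = generate_boundary_tests("ecu_interface_spec.csv")
```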

Keywords: electronic control unit testing, embedded system, test generation, test automation, process automation, CAN bus, Ethernet

Procedia PDF Downloads 84