Search results for: automated processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4259

4049 Post-Processing Method for Performance Improvement of Aerial Image Parcel Segmentation

Authors: Donghee Noh, Seonhyeong Kim, Junhwan Choi, Heegon Kim, Sooho Jung, Keunho Park

Abstract:

In this paper, we describe an image post-processing method that enhances the performance of the deep learning-based parcel segmentation of aerial images developed in previous studies. The results were evaluated using a confusion matrix, IoU, Precision, Recall, and F1-Score. In the confusion matrix, the false positive count, i.e., the misclassified pixels, was greatly reduced by the image post-processing. The average IoU rose from 0.8362 for the deep learning output alone to 0.9688 after post-processing, and the F1-Score likewise rose from 0.8850 to 0.9822. The experiments show that the proposed technique positively complements the deep learning results in segmenting the parcels of interest.
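
As a point of reference, all of the reported metrics can be derived from the pixel-level confusion matrix of a binary segmentation mask. A minimal sketch (illustrative, not the authors' code):

```python
import numpy as np

def segmentation_scores(pred, truth):
    """Compute IoU, precision, recall, and F1 for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # true positives
    fp = np.logical_and(pred, ~truth).sum()   # false positives
    fn = np.logical_and(~pred, truth).sum()   # false negatives
    iou = tp / (tp + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return iou, precision, recall, f1
```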

Keywords: aerial image, image processing, machine vision, open field smart farm, segmentation

Procedia PDF Downloads 46
4048 Design Off-Campus Interactive Cloud-Based Learning Model

Authors: Osamah Al Qadoori

Abstract:

The use of cloud computing in educational sectors is growing rapidly in the UAE. In a cloud-learning environment, students can join the online classroom remotely, whenever and wherever they are; at the same time, cloud-based learning greatly decreases infrastructure and maintenance costs. Nowadays, cloud-based teaching and learning environments are in growing demand in many schools (K-12), institutes, colleges, and universities in the UAE. Yet many students do not use the available online educational resources effectively. The challenging question is to what extent these educational resources installed in the cloud environment are valuable and constructive. In this paper, the researcher seeks to design an expert agent prototype in which the vast information accommodated inside the cloud environment goes through expert filtration before being utilized by clients (students). To achieve this goal, the present research focuses on two different directions: educational human expertise and automated educational expert systems.

Keywords: cloud computing, cloud-learning environment, online-classroom, the educational human expertise, the automated-educational expert systems

Procedia PDF Downloads 515
4047 The Human Process of Trust in Automated Decisions and Algorithmic Explainability as a Fundamental Right in the Exercise of Brazilian Citizenship

Authors: Paloma Mendes Saldanha

Abstract:

Access to information is a prerequisite for democracy while also guiding the material construction of fundamental rights. The exercise of citizenship requires knowing, understanding, questioning, advocating for, and securing rights and responsibilities. In other words, it goes beyond mere active electoral participation and materializes through awareness and the struggle for rights and responsibilities in the various spaces occupied by the population in their daily lives. In times of hyper-cultural connectivity, active citizenship is shaped through ethical trust processes, most often established between humans and algorithms. Automated decisions, so prevalent in various everyday situations, such as purchase preference predictions, virtual voice assistants, reduction of accidents in autonomous vehicles, content removal, resume selection, etc., have already found their place as a normalized discourse that sometimes does not reveal or make clear what violations of fundamental rights may occur when algorithmic explainability is lacking. In other words, technological and market development promotes a normalization for the use of automated decisions while silencing possible restrictions and/or breaches of rights through a culturally modeled, unethical, and unexplained trust process, which hinders the possibility of the right to a healthy, transparent, and complete exercise of citizenship. In this context, the article aims to identify the violations caused by the absence of algorithmic explainability in the exercise of citizenship through the construction of an unethical and silent trust process between humans and algorithms in automated decisions. As a result, it is expected to find violations of constitutionally protected rights such as privacy, data protection, and transparency, as well as the stipulation of algorithmic explainability as a fundamental right in the exercise of Brazilian citizenship in the era of virtualization, facing a threefold foundation called trust: culture, rules, and systems. To do so, the author will use a bibliographic review in the legal and information technology fields, as well as the analysis of legal and official documents, including national documents such as the Brazilian Federal Constitution, as well as international guidelines and resolutions that address the topic in a specific and necessary manner for appropriate regulation based on a sustainable trust process for a hyperconnected world.

Keywords: artificial intelligence, ethics, citizenship, trust

Procedia PDF Downloads 31
4046 Thermo-Mechanical Treatment of Chromium Alloyed Low Carbon Steel

Authors: L. Kučerová, M. Bystrianský, V. Kotěšovec

Abstract:

Thermo-mechanical processing with various processing parameters was applied to a 0.2%C-0.6%Mn-2%Si-0.8%Cr low-alloyed high-strength steel. The aim of the processing was to achieve microstructures typical of transformation-induced plasticity (TRIP) steels. The thermo-mechanical processing used in this work incorporated two or three deformation steps. In all cases, the deformations were carried out during cooling from the soaking temperatures to various bainite hold temperatures. In this way, 4-10% retained austenite was preserved in the final microstructures, which further consisted of ferrite, bainite, martensite, and pearlite. The complex character of the TRIP steel microstructure is responsible for its good combination of strength and ductility. The strengths achieved in this work were in the range of 740-836 MPa, with ductility A5mm of 31-41%.

Keywords: pearlite, retained austenite, thermo-mechanical treatment, TRIP steel

Procedia PDF Downloads 266
4045 Microstructure and Mechanical Evaluation of PMMA/Al₂O₃ Nanocomposite Fabricated via Friction Stir Processing

Authors: Reham K. El Sawah, N. S. M. El-Tayeb

Abstract:

This study aims to produce a polymer matrix composite reinforced with Al₂O₃ nanoparticles in order to enhance the mechanical properties of PMMA. The composite was fabricated via friction stir processing to ensure homogeneous dispersion of the Al₂O₃ nanoparticles in the polymer, and the processing was submerged to prevent the sputtering of nanoparticles. The surface quality, microstructure, impact energy, and hardness of the prepared samples were investigated. Good surface quality and dispersion of nanoparticles were attained by employing suitable processing conditions. The experimental results indicated that as the percentage of nanoparticles increased, the impact energy and hardness increased, reaching 2 kJ/m² and 14.7 HV at a nanoparticle concentration of 25%, meaning that the toughness and hardness of the produced polymer-ceramic composite are higher than those of unprocessed PMMA by 66% and 33%, respectively.

Keywords: friction stir processing, polymer matrix nanocomposite, mechanical properties, microstructure

Procedia PDF Downloads 143
4044 Biosensors as Analytical Tools in Legume Processing

Authors: S. V. Ncube, A. I. O. Jideani, E. T. Gwata

Abstract:

The plight of food insecurity in developing countries has led to renewed interest in underutilized legumes. Their nutritional versatility, desirable functionality, pharmaceutical value, and inherent bioactive compounds have drawn the attention of researchers. This has prompted the development of value-added products aimed at commercially exploiting their full potential. However, processing of these legumes leads to changes in nutritional composition, as affected by processing variables such as pH, temperature, and pressure. There is therefore a need for process control and quality assurance during production of the value-added products. Conventional methods for microbiological and biochemical identification, however, are labour-intensive and time-consuming. Biosensors offer rapid and affordable methods to assure the quality of the products. They may be used to quantify nutrients and anti-nutrients in the products while manipulating and monitoring variables such as pH, temperature, pressure, and oxygen that affect the quality of the final product. This review gives an overview of the types of biosensors used in the food industry, their advantages and disadvantages, and their possible application in the processing of legumes.

Keywords: legume processing, biosensors, quality control, nutritional versatility

Procedia PDF Downloads 463
4043 Improvement of Image Summarization Using Image Processing and Particle Swarm Optimization Algorithm

Authors: Hooman Torabifard

Abstract:

In the last few years, with the progress of technology and the entry of computers and artificial intelligence into all kinds of scientific and industrial fields, human lifestyles and, in general, the way humans live and work have changed and developed considerably. Many of these changes have occurred in the context of digital images and image processing, and this development still continues. Besides all the benefits, however, there have been disadvantages. One of these is the proliferation of high-volume image data; the focus of this paper is on improving and developing a method for summarizing these images and enhancing their usefulness. The method used for this purpose consists of a set of techniques based on data obtained from image processing, combined with the PSO (particle swarm optimization) algorithm. In the remainder of this paper, the method used is elaborated in detail.
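
The abstract does not detail how PSO is coupled to the image data; a common construction, shown here as an assumed sketch, is to let each particle propose a threshold and score it with an Otsu-style intra-class variance objective:

```python
import numpy as np

def within_class_variance(img, t):
    """Otsu-style objective: weighted intra-class variance for threshold t."""
    fg, bg = img[img >= t], img[img < t]
    if fg.size == 0 or bg.size == 0:
        return np.inf
    return fg.size * fg.var() + bg.size * bg.var()

def pso_threshold(img, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Search the 0-255 intensity range for the best threshold with PSO."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 255, n_particles)    # each particle is a threshold
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_val = np.array([within_class_variance(img, t) for t in pos])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, 255)
        vals = np.array([within_class_variance(img, t) for t in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()]
    return gbest   # e.g. t = pso_threshold(gray_image.astype(float))
```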

Keywords: image summarization, particle swarm optimization, image threshold, image processing

Procedia PDF Downloads 104
4042 Drawing Building Blocks in Existing Neighborhoods: An Automated Pilot Tool for an Initial Approach Using GIS and Python

Authors: Konstantinos Pikos, Dimitrios Kaimaris

Abstract:

Although designing building blocks is a procedure used by many planners around the world, there is no automated tool that helps planners and designers achieve their goals with less effort. The difficulty of the subject lies in the repetitive process of manually drawing lines while maintaining the desired offset and keeping the impact on the existing building stock as small as possible. In this paper, an automated tool built with Geographical Information Systems (GIS) and the Python programming language, integrated into ArcGIS Pro, is presented. Despite its simple environment and the lack of specialized building legislation due to the complex state of the field, a planner who is aware of such technical information can use the tool to draw an initial approximation of the final building blocks in an area with pre-existing buildings, in an attempt to organize the usually sprawling suburbs of a city or any continuously developing area. The tool uses ESRI's ArcPy library to handle the spatial data, while interaction with the user is handled through Tkinter. The main process consists of modifying the coordinates of building edges, using the NumPy library, in an effort to draw the line of best fit, so the user can get optimal results for each side of a block. Finally, after the tool runs successfully, a table of primary planning information is shown, such as the area of the building block and its coverage rate. Regardless of the early stage of the tool's development, it is a solid base in which potential planners with programming skills could invest, adapting the tool to their individual needs. An example of the entire procedure in a test area is provided, highlighting both the strengths and weaknesses of the final results.
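
A minimal sketch of the line-of-best-fit step described above, with the ArcPy and Tkinter layers omitted. It uses a total least squares fit via SVD (an assumption, since the exact fitting method is not stated), which also copes with near-vertical block sides:

```python
import numpy as np

def best_fit_line(edge_points):
    """Fit a line to building-edge vertices by total least squares (PCA):
    returns a point on the line (centroid) and its unit direction."""
    pts = np.asarray(edge_points, dtype=float)
    centroid = pts.mean(axis=0)
    # principal direction of the centered coordinates
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    return centroid, direction   # line: centroid + t * direction

# hypothetical edge vertices along one side of a block
side = [(10.0, 4.9), (20.0, 5.2), (30.0, 4.8), (40.0, 5.1)]
print(best_fit_line(side))
```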

Keywords: arcPy, GIS, python, building blocks

Procedia PDF Downloads 156
4041 A High Content Screening Platform for the Accurate Prediction of Nephrotoxicity

Authors: Sijing Xiong, Ran Su, Lit-Hsin Loo, Daniele Zink

Abstract:

The kidney is a major target for the toxic effects of drugs, industrial and environmental chemicals, and other compounds. Typically, nephrotoxicity is detected late during drug development, and regulatory animal models have not solved this problem; validated or accepted in silico or in vitro methods for the prediction of nephrotoxicity are not available. We have established the first and currently only pre-validated in vitro models for the accurate prediction of nephrotoxicity in humans, and the first predictive platforms based on renal cells derived from human pluripotent stem cells. To further improve the efficiency of our predictive models, we recently developed a high content screening (HCS) platform. This platform employs automated imaging in combination with automated quantitative phenotypic profiling and machine learning methods. 129 image-based phenotypic features were analyzed with respect to their predictive performance in combination with 44 compounds of different chemical structures, including drugs, environmental and industrial chemicals, and herbal and fungal compounds, whose nephrotoxicity in humans is well characterized. A combination of chromatin and cytoskeletal features resulted in high predictivity with respect to nephrotoxicity in humans: test balanced accuracies of 82% or 89% were obtained with human primary or immortalized renal proximal tubular cells, respectively. Furthermore, our results revealed that a DNA damage response is commonly induced by different PTC-toxicants with diverse chemical structures and injury mechanisms. Together, the results show that the automated HCS platform allows efficient and accurate nephrotoxicity prediction for compounds with diverse chemical structures.
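
The paper does not name the specific machine learning method; the sketch below, with randomly generated stand-in data of the stated dimensions (44 compounds x 129 features) and an assumed random-forest classifier, shows how a test balanced accuracy of the kind reported is computed:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

# hypothetical stand-ins: 44 compounds x 129 image-based phenotypic features
rng = np.random.default_rng(0)
X = rng.normal(size=(44, 129))
y = rng.integers(0, 2, size=44)          # 1 = nephrotoxic in humans

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test balanced accuracy:", balanced_accuracy_score(y_te, clf.predict(X_te)))
```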

Keywords: high content screening, in vitro models, nephrotoxicity, toxicity prediction

Procedia PDF Downloads 286
4040 A Review of Research on Pre-training Technology for Natural Language Processing

Authors: Moquan Gong

Abstract:

In recent years, with the rapid development of deep learning, pre-training technology for natural language processing has made great progress. Early work in natural language processing relied on word vector methods such as Word2Vec to encode text, and these word vector methods can be regarded as static pre-training techniques. However, such context-free text representations bring very limited improvement to subsequent natural language processing tasks and cannot resolve word polysemy. ELMo proposed a context-sensitive text representation method that can effectively handle polysemy. Since then, pre-trained language models such as GPT and BERT have been proposed one after another. Among them, the BERT model significantly improved performance on many typical downstream tasks, greatly advancing the field of natural language processing and ushering it into the era of dynamic pre-training technology. Subsequently, a large number of pre-trained language models based on BERT and XLNet have continued to emerge, and pre-training has become an indispensable mainstream technique in natural language processing. This article first gives an overview of pre-training technology and its development history, and introduces in detail the classic pre-training techniques in natural language processing, including early static pre-training techniques and classic dynamic pre-training techniques; it then briefly reviews a series of subsequent pre-training techniques, including improved models based on BERT and XLNet; on this basis, it analyzes the problems faced by current pre-training research; finally, it looks ahead to the future development trends of pre-training technology.
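
The polysemy argument can be made concrete: a static embedding assigns "bank" one vector regardless of context, while a dynamic model produces context-dependent vectors. A sketch assuming the Hugging Face transformers library and the bert-base-uncased checkpoint:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence, word):
    """Return BERT's contextual vector for `word` inside `sentence`."""
    enc = tok(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]
    idx = enc.input_ids[0].tolist().index(tok.convert_tokens_to_ids(word))
    return hidden[idx]

a = embed("she deposited cash at the bank", "bank")
b = embed("they fished from the river bank", "bank")
# cosine similarity well below 1: the two "bank"s get different vectors,
# something a single static Word2Vec vector cannot express
print(torch.cosine_similarity(a, b, dim=0))
```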

Keywords: natural language processing, pre-training, language model, word vectors

Procedia PDF Downloads 22
4039 Integrated Target Tracking and Control for Automated Car-Following of Truck Platforms

Authors: Fadwa Alaskar, Fang-Chieh Chou, Carlos Flores, Xiao-Yun Lu, Alexandre M. Bayen

Abstract:

This article proposes a perception model for enhancing the accuracy and stability of car-following control of a longitudinally automated truck. We applied a fusion-based tracking algorithm to measurements of the single preceding vehicle needed for car-following control. This algorithm fuses two types of data, radar and LiDAR, to obtain a more accurate and robust longitudinal perception of the subject vehicle in various weather conditions. The filter's resulting signals are fed to the gap control algorithm at every tracking loop, which is composed of a high-level gap controller and a lower-level acceleration tracking system. Several highway tests have been performed with two trucks. The tests show accurate and fast tracking of the target, which positively impacts the gap control loop. The experiments also show the fulfilment of control design requirements, such as fast tracking of speed variations and robust time-gap following.
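
The abstract does not specify the fusion filter; a common assumption for radar/LiDAR gap tracking is a Kalman filter with sequential measurement updates, sketched here with made-up noise parameters:

```python
import numpy as np

# Constant-velocity model for the gap to the preceding truck.
dt = 0.05                                  # tracking-loop period [s], assumed
F = np.array([[1, dt], [0, 1]])            # state transition: (gap, gap-rate)
Q = np.diag([0.01, 0.1])                   # process noise, assumed
H = np.array([[1.0, 0.0]])                 # both sensors measure the gap

x = np.array([20.0, 0.0])                  # initial gap estimate [m], [m/s]
P = np.eye(2)

def kf_step(x, P, measurements):
    """Predict once, then fuse each available range measurement in turn."""
    x, P = F @ x, F @ P @ F.T + Q
    for z, r in measurements:              # r: sensor variance (radar vs. LiDAR)
        S = H @ P @ H.T + r
        K = P @ H.T / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = kf_step(x, P, [(19.8, 0.25), (19.95, 0.04)])   # (radar, LiDAR) ranges
print(x)   # fused gap and gap-rate fed to the gap controller
```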

Keywords: object tracking, perception, sensor fusion, adaptive cruise control, cooperative adaptive cruise control

Procedia PDF Downloads 202
4038 The KAPSARC Energy Policy Database: Introducing a Quantified Library of China's Energy Policies

Authors: Philipp Galkin

Abstract:

Government policy is a critical factor in the understanding of energy markets. Regardless, it is rarely approached systematically from a research perspective. Gaining a precise understanding of what policies exist, their intended outcomes, geographical extent, duration, evolution, etc. would enable the research community to answer a variety of questions that, for now, are either oversimplified or ignored. Policy, on its surface, also seems a rather unstructured and qualitative undertaking. There may be quantitative components, but incorporating the concept of policy analysis into quantitative analysis remains a challenge. The KAPSARC Energy Policy Database (KEPD) is intended to address these two energy policy research limitations. Our approach is to represent policies within a quantitative library of the specific policy measures contained within a set of legal documents. Each of these measures is recorded into the database as a single entry characterized by a set of qualitative and quantitative attributes. Initially, we have focused on the major laws at the national level that regulate coal in China. However, KAPSARC is engaged in various efforts to apply this methodology to other energy policy domains. To ensure scalability and sustainability of our project, we are exploring semantic processing using automated computer algorithms. Automated coding can provide a more convenient input data for human coders and serve as a quality control option. Our initial findings suggest that the methodology utilized in KEPD could be applied to any set of energy policies. It also provides a convenient tool to facilitate understanding in the energy policy realm enabling the researcher to quickly identify, summarize, and digest policy documents and specific policy measures. The KEPD captures a wide range of information about each individual policy contained within a single policy document. This enables a variety of analyses, such as structural comparison of policy documents, tracing policy evolution, stakeholder analysis, and exploring interdependencies of policies and their attributes with exogenous datasets using statistical tools. The usability and broad range of research implications suggest a need for the continued expansion of the KEPD to encompass a larger scope of policy documents across geographies and energy sectors.
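
To illustrate the kind of single-entry structure described above, here is a hypothetical record layout; the field names are illustrative, not the actual KEPD schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PolicyMeasure:
    """One entry in the policy library: a single measure extracted from a
    legal document, with qualitative and quantitative attributes."""
    document: str                     # source legal document
    issuer: str                       # issuing authority
    sector: str                       # e.g. "coal"
    instrument: str                   # e.g. "capacity cap", "tax"
    target: Optional[float] = None    # quantified goal, if any
    unit: Optional[str] = None
    start_year: Optional[int] = None
    end_year: Optional[int] = None
    regions: list = field(default_factory=list)

m = PolicyMeasure("Coal Law (hypothetical)", "NEA", "coal", "capacity cap",
                  target=1100.0, unit="Mt/year", start_year=2016,
                  regions=["national"])
```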

Keywords: China, energy policy, policy analysis, policy database

Procedia PDF Downloads 297
4037 Rough Neural Networks in Adapting Cellular Automata Rule for Reducing Image Noise

Authors: Yasser F. Hassan

Abstract:

The reduction or removal of noise in a color image is an essential part of image processing, whether the final information is used for human perception or for automatic inspection and analysis. This paper describes a modeling system based on the rough neural network model that adapts cellular automata for various image processing tasks and noise removal. We consider the problem of object processing in colored images using rough neural networks to help derive the rules which will then be used by the cellular automata for noise removal. The proposed method is compared with some classical and recent methods. The results demonstrate that the new model is capable of being trained to perform many different tasks, and that the quality of these results is comparable to or better than that of established specialized algorithms.
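
In the paper the update rules are derived by the rough neural network; as a stand-in to show the cellular-automaton structure, the sketch below applies a fixed neighbourhood rule that replaces outlier cells with the local median:

```python
import numpy as np

def ca_denoise_step(img, thresh=40):
    """One cellular-automaton step on a grayscale channel: a cell whose value
    deviates strongly from its 3x3 neighbourhood median is replaced by that
    median (a fixed rule standing in for the learned one)."""
    padded = np.pad(img, 1, mode="edge")
    out = img.copy()
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            med = np.median(padded[i:i + 3, j:j + 3])
            if abs(int(img[i, j]) - int(med)) > thresh:
                out[i, j] = med
    return out   # iterate until the grid stops changing
```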

Keywords: rough sets, rough neural networks, cellular automata, image processing

Procedia PDF Downloads 400
4036 An Automated Approach to Consolidate Galileo System Availability

Authors: Marie Bieber, Fabrice Cosson, Olivier Schmitt

Abstract:

Europe's Global Navigation Satellite System, Galileo, provides worldwide positioning and navigation services. The satellites in space are only one part of the Galileo system; an extensive ground infrastructure is essential to oversee the satellites and ensure accurate navigation signals. High reliability and availability of the entire Galileo system are crucial to continuously provide positioning information of high quality to users. Outages are tracked, and operational availability is regularly assessed. A highly flexible and adaptive tool has been developed to automate the Galileo system availability analysis. Not only does it enable quick availability consolidation, but it also provides first steps towards improving the data quality of the maintenance tickets used for the analysis. This includes data import and data preparation, with a focus on processing the strings used for classification and on identifying faulty data. Furthermore, the tool can handle small amounts of data, a major constraint when the aim is to provide accurate statistics.
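
A sketch of the consolidation idea, assuming a hypothetical ticket layout and the pandas library; counting only unplanned outages against availability is an assumption here, not a documented rule of the tool:

```python
import pandas as pd

# hypothetical maintenance tickets: free-text category plus outage window
tickets = pd.DataFrame({
    "element": ["uplink station A", "uplink station A", "control centre"],
    "category": [" Unplanned ", "planned", "unplanned"],
    "start": pd.to_datetime(["2023-01-03 10:00", "2023-02-01 08:00",
                             "2023-03-10 22:00"]),
    "end":   pd.to_datetime(["2023-01-03 14:30", "2023-02-01 12:00",
                             "2023-03-11 01:00"]),
})

# data preparation: normalise the string field used for classification
tickets["category"] = tickets["category"].str.strip().str.lower()

tickets["downtime_h"] = (tickets["end"] - tickets["start"]).dt.total_seconds() / 3600
period_hours = 365 * 24   # reporting period

outage = (tickets.loc[tickets["category"] == "unplanned"]
          .groupby("element")["downtime_h"].sum())
print(1 - outage / period_hours)   # operational availability per element
```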

Keywords: availability, data quality, system performance, Galileo, aerospace

Procedia PDF Downloads 126
4035 Dynamic Store Procedures in Database

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

In recent years, different methods have been proposed to optimize query processing in databases. The problem with most of these methods is that they discard the query execution plan after executing the query. This research attempts to solve that problem by combining methods of communicating with the database (queries embedded in the program code and stored procedures) with adaptive query processing in the database, and proposes a new approach to the optimization of query processing by introducing the idea of dynamic stored procedures. The research creates dynamic stored procedures in the database according to the proposed algorithm. The method has been tested on applied software, and the results show a significant improvement in query processing time as well as a reduced DBMS workload. Other advantages of this algorithm include unifying the programming environment, eliminating the parametric limitations of stored procedures in the database, and making the stored procedures in the database dynamic.

Keywords: relational database, agent, query processing, adaptable, communication with the database

Procedia PDF Downloads 343
4034 Wavelet Based Signal Processing for Fault Location in Airplane Cable

Authors: Reza Rezaeipour Honarmandzad

Abstract:

Wavelet analysis is an exciting method for solving difficult problems in mathematics, physics, and engineering, with modern applications as diverse as wave propagation, data compression, signal processing, image processing, and pattern recognition. Wavelets allow complex information such as signals, images, and patterns to be decomposed into elementary forms at different positions and scales and subsequently reconstructed with high precision. In this paper, a wavelet-based signal processing algorithm for airplane cable fault location is proposed. An orthogonal discrete wavelet decomposition and reconstruction algorithm is used to eliminate the noise in the aircraft cable fault signal. The experimental results show that the characteristics of the emission pulse and the reflected pulse used to locate the cable fault point are preserved, while the high-frequency noise is eliminated by the proposed algorithm.
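
A minimal sketch of orthogonal discrete wavelet denoising using the PyWavelets library; the wavelet family (db4) and the universal soft threshold are assumptions, since the paper does not state them:

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Orthogonal discrete wavelet decomposition, soft-thresholding of the
    detail coefficients, then reconstruction of the cable test signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))     # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]
```

The emission and reflected pulses survive the thresholding because their energy concentrates in a few large coefficients, while broadband noise spreads over many small ones.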

Keywords: wavelet analysis, signal processing, orthogonal discrete wavelet, noise, aircraft cable fault signal

Procedia PDF Downloads 487
4033 Design of Semi-Automatic Vent and Flash Remover

Authors: Inba Blesso P., Senthil Kumar P.

Abstract:

A main consideration in any tire manufacturing process is wear resistance. One of the factors that cause tire wear is improper removal of vents and flash from the tire surface. The contact point between the tire surface and a vent is particularly prone to wear: when the vehicle runs at high speed under heavy load, the vent and flash wear first and take some of the surrounding tire surface material with them. Hence, provision must be made for efficient removal of vents and flash, thereby reducing tire wear. Manual trimming of tire vents is time-consuming and produces inaccurate results, which reduces the production rate and profit. The development of an automated system can minimize the time consumed and provide a path to profitable production. A semi-automated system that employs pneumatic actuators and sequencing circuits is the focus of this study. By implementing it, accurate results can be achieved in less time and with profitable output.

Keywords: tire manufacturing, pneumatic system, vent and flash removal, engineering and technology

Procedia PDF Downloads 345
4032 Design and Implementation of Automated Car Anti-Collision System Device Using Distance Sensor

Authors: Mehrab Masayeed Habib, Tasneem Sanjana, Ahmed Amin Rumel

Abstract:

The automated car anti-collision system is a trending technology; a car anti-collision system is an automobile safety system. The aim of this paper is to describe the design of a car anti-collision device that reduces the severity of an accident. The purpose of the device is to prevent collisions between cars and objects and thereby reduce accidental deaths. The project provides for a secure and smooth car journey as well as greater safety for human life. The system is controlled by a PIC microcontroller. A Sharp distance sensor is used to detect any object within the danger range. A crystal oscillator produces the oscillation and generates the clock pulse of the microcontroller. An LCD gives information about the safe distance, and a buzzer is used as an alarm. An actuator serves as the automatic brake; inside the actuator, a motor driver runs the actuator. The code was written in 'microC PRO for PIC', and 'Proteus Design Suite version 8' was used for simulation.

Keywords: sharp distance sensor, microcontroller, MicroC PRO for PIC, proteus, actuator, automobile anti-collision system

Procedia PDF Downloads 438
4031 Development of an Autonomous Automated Guided Vehicle with Robot Manipulator under Robot Operation System Architecture

Authors: Jinsiang Shaw, Sheng-Xiang Xu

Abstract:

This paper presents the development of an autonomous automated guided vehicle (AGV) with a robot arm attached on top of it within the framework of the Robot Operating System (ROS). ROS provides libraries and tools, including hardware abstraction, device drivers, visualizers, message passing, and package management. For this reason, this AGV can provide automatic navigation, parts transportation, and pick-and-place tasks using the robot arm, for typical industrial production line use. More specifically, the AGV is controlled by an on-board host computer running ROS software. Command signals for vehicle and robot arm control and measurement signals from various sensors are transferred to respective microcontrollers. Users can operate the AGV remotely through the TCP/IP protocol and perform SLAM (Simultaneous Localization and Mapping). An RGBD camera and LIDAR sensors are installed on the AGV to perceive the environment. For SLAM, Gmapping is used to construct the environment map with a Rao-Blackwellized particle filter, and AMCL (Adaptive Monte Carlo Localization) is employed for mobile robot localization. In addition, the current AGV position and orientation can be visualized with the ROS toolkit. As for robot navigation and obstacle avoidance, A* is implemented for global path planning and the dynamic window approach for local planning. The developed ROS AGV with a robot arm on it has been tested in the university factory. 2-D and 3-D maps of the factory were successfully constructed by the SLAM method. Based on these maps, robot navigation through the factory with and without dynamic obstacles is shown to perform well. Finally, pick-and-place of parts using the robot arm and subsequent delivery in the factory by the mobile robot are also accomplished.
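
For illustration, a typical way to command such a ROS navigation stack is to send a goal pose to move_base through actionlib; a minimal rospy sketch with hypothetical coordinates, not the authors' code:

```python
#!/usr/bin/env python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("agv_goal_sender")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"       # frame of the SLAM map
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 3.5         # hypothetical pick-up station
goal.target_pose.pose.position.y = 1.2
goal.target_pose.pose.orientation.w = 1.0

client.send_goal(goal)      # global (A*) and local (DWA) planners take over
client.wait_for_result()
```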

Keywords: automated guided vehicle, navigation, robot operation system, Simultaneous Localization and Mapping

Procedia PDF Downloads 122
4030 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers, and evaluating a student's output requires knowledge across a wide array of domains. Scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but they have received negative feedback from teachers and students alike due to unreliable scoring, lack of feedback, and other shortcomings. This study aims to create an application able to check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system extracts keywords or phrases from an individual's answers and matches them against a corpus of words defined by the instructor, which is the basis for evaluating the answer. The proposed system also enables the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the teacher's time spent checking and evaluating student output is lessened, making the teacher more productive and the task easier.
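
A minimal sketch of the weighted keyword-matching idea, with a hypothetical rubric; a real deployment would add stemming, synonym handling, and multi-word phrase matching:

```python
import re

def score_answer(answer, rubric):
    """Score a short answer against instructor-defined weighted keywords.
    `rubric` maps each keyword/phrase to its weight."""
    text = answer.lower()
    earned = sum(w for kw, w in rubric.items()
                 if re.search(r"\b" + re.escape(kw.lower()) + r"\b", text))
    return earned / sum(rubric.values())

rubric = {"photosynthesis": 2, "chlorophyll": 1, "sunlight": 1}  # hypothetical
print(score_answer("Plants use sunlight and chlorophyll.", rubric))  # 0.5
```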

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 405
4029 Water Quality Calculation and Management System

Authors: H. M. B. N Jayasinghe

Abstract:

Water is found almost everywhere on Earth, yet water resources contain a great deal of pollution, and some diseases can spread to living beings through water. Water must therefore undergo a number of treatments to make it drinkable, purification technology is essential for wastewater, and wastewater treatment plants play a major role in these issues. The procedures that follow the water treatment process have traditionally been based on manual calculations and recordings; water purification plants involve many manual processes, which makes the overall process time-consuming and delays the final evaluation and the chemical and biological treatment steps. To prevent such drawbacks, computerized, programmable calculation and analytical techniques are to be introduced to the laboratory staff, and an automated system offers a solution that guarantees rational selection. A decision support system is a way to model data and make quality decisions based upon it; it is widely used around the world for various kinds of process automation. Decision support systems that just collect data and organize it effectively are usually called passive models: they do not suggest a specific decision but only reveal information. This web-based system builds on global positioning data, adding a map location facility. Its most valuable feature is an SMS and e-mail alert service to inform the appropriate person of a critical issue. The technologies behind the system are HTML, MySQL, PHP, and other web development technologies. Existing computerized water chemistry analysis tools are not very advanced; one example is the swimming pool water quality calculator. The validity of the system has been verified by test runs and by comparison with data from an existing plant. The automated system will make work easier, both productively and qualitatively.

Keywords: automated system, wastewater, purification technology, map location

Procedia PDF Downloads 224
4028 Effect of Sub Supercritical CO2 Processing on Microflora and Shelf Life Tempe

Authors: M. Kustyawati, F. Pratama, D. Saputra, A. Wijaya

Abstract:

Tempe is composed not only of molds but also of bacteria and yeasts, and these microorganisms need to remain in balanced numbers for the tempe to keep an acceptable quality for an extended time. Sub-supercritical carbon dioxide can be a promising preservation method for tempe, as it induces microbial inactivation while avoiding alterations of its quality attributes. Fresh tempe was processed using supercritical and sub-supercritical CO2 for defined holding times, and then the growth ability of the molds and bacteria was analyzed. The results showed that supercritical CO2 processing for 5 minutes reduced the number of bacteria and molds by 0.30 log cycles and 1.17 log cycles, respectively. In addition, sub-supercritical CO2 processing for 20 minutes had a fungicidal effect on the tempe molds, whereas sub-supercritical CO2 for 10 minutes reduced the tempe bacteria and had a fungistatic effect on the molds. This suggests that sub-supercritical CO2 processing for 10 min could be a useful alternative technique for the preservation of tempe.

Keywords: tempe, sub supercritical CO2, fungistatic effect, preservation

Procedia PDF Downloads 243
4027 Business-to-Business Deals Based on a Co-Utile Collaboration Mechanism: Designing Trust Company of the Future

Authors: Riccardo Bonazzi, Michaël Poli, Abeba Nigussie Turi

Abstract:

This paper presents applied research on a new module for the financial administration and management industry, the Personalizable and Automated Checklists Integrator, Overseeing Legal Investigations (PACIOLI). It aims at designing the business model of the trust company of the future. By identifying the key stakeholders, we draw a general business process design for the industry. The business model focuses on disintermediating the traditional form of business through the new technological solutions of a software company based in Switzerland, thereby creating a new interactive platform. The key stakeholders of this interactive platform are IT experts, legal experts, and the New Edge Trust Company (NATC). The mechanism we design and propose is of great importance in improving the efficiency of the financial business administration and management industry, and it also helps to foster the provision of high-value-added services in the sector.

Keywords: new edge trust company, business model design, automated checklists, financial technology

Procedia PDF Downloads 335
4026 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout

Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini

Abstract:

The dynamic facility layout problem is a critical issue in the competitive industrial market; solving it requires robust design and effective simulation systems, and sustainable simulation requires reliable and accurate input data. This paper therefore describes an automated system, integrated into the real environment, that measures the duration of material handling operations, collects the data in real time, and determines the variances between the actual and estimated time schedules of the operations in order to update the simulation software and redesign the facility layout periodically. The automated method-time measurement system collects the real data using Radio Frequency Identification (RFID) and Internet of Things (IoT) technologies: attaching an RFID antenna reader and RFID tags enables the system to identify the location of objects and gather timing data. The durations gathered are processed by calculating the moving-average duration of the material handling operations, choosing the shortest material handling path, and then updating the simulation software to redesign the facility layout to match the shortest, actually observed operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system relying on analysis of historical data. The case study of this methodology was carried out in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, the methodology is promising and can be significantly useful in redesigning manufacturing layouts.
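
A sketch of the moving-average and shortest-path-selection step, with an assumed window size and hypothetical RFID timestamps:

```python
from collections import defaultdict, deque

WINDOW = 10                      # moving-average window, assumed
history = defaultdict(lambda: deque(maxlen=WINDOW))

def record_operation(path_id, t_start, t_end):
    """Store the duration of one RFID-timed handling operation on a path."""
    history[path_id].append(t_end - t_start)

def shortest_path():
    """Pick the handling path with the lowest moving-average duration,
    which is then fed back to the layout simulation."""
    return min(history, key=lambda p: sum(history[p]) / len(history[p]))

record_operation("A", 0.0, 42.0)
record_operation("A", 100.0, 139.0)
record_operation("B", 0.0, 55.0)
print(shortest_path())           # -> "A"
```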

Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation

Procedia PDF Downloads 99
4025 Autism Disease Detection Using Transfer Learning Techniques: Performance Comparison between Central Processing Unit vs. Graphics Processing Unit Functions for Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

Neural network approaches are machine learning methods used in many domains, such as healthcare and cyber security, and they are particularly well known for dealing with image datasets. While training on images, a neural network carries out a number of fundamental mathematical operations, including derivatives, convolutions, and matrix inversion and transposition. Such operations require far more processing power than typical computer usage. A Central Processing Unit (CPU) is not appropriate for large image datasets, as it is built for serial processing, whereas a Graphics Processing Unit (GPU) has parallel processing capabilities and therefore higher speed. This paper uses advanced neural network techniques, such as VGG16, ResNet50, DenseNet, InceptionV3, Xception, MobileNet, XGBOOST-VGG16, and our proposed models, to compare CPU and GPU resources. A system for classifying autism from face images of autistic and non-autistic children was used to compare performance during testing, with evaluation metrics including accuracy, F1 score, precision, recall, and execution time. It was observed that the GPU ran faster than the CPU in all tests performed. Moreover, the accuracy of the neural network models increased on GPU compared to CPU.
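
As an example of the transfer-learning setup, a sketch using one of the listed backbones (VGG16) in Keras; the classification head and input size are assumptions, not the paper's exact configuration:

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# transfer learning: frozen ImageNet features plus a small binary head
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # autistic vs. non-autistic
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# TensorFlow places ops on the GPU automatically when one is visible;
# an empty list here means training falls back to the CPU
print(tf.config.list_physical_devices("GPU"))
```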

Keywords: autism disease, neural network, CPU, GPU, transfer learning

Procedia PDF Downloads 84
4024 The Integration and Automation of EDA Tools in an Integrated Circuit Design Environment

Authors: Rohaya Abdul Wahab, Raja Mohd Fuad Tengku Aziz, Nazaliza Othman, Sharifah Saleh, Nabihah Razali, Rozaimah Baharim, M. Hanif M. Nasir

Abstract:

This paper discusses how EDA tools are integrated and automated in an integrated circuit design environment. One problem in our current environment is that users need to manually configure library paths, start-up files, and project directories. Certain manual processes between the users and the applications can be automated, but the automation must remain transparent to the users; for example, users should be able to run the applications directly after login without knowing the locations of library paths and start-up files. The solution to these problems is to automate the processes using standard configuration files, which benefits both the users and EDA support. This paper discusses how the implementation automates the process using scripting languages such as Perl, Tcl, Scheme, and shell script. These scripting tools are great assets for design engineers building a robust and powerful design flow, and this technique is widely used to integrate all the tools together.
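
The paper implements this with Perl/Tcl/Shell scripting; as an assumed Python equivalent, a launcher might read the standard configuration file and set up the environment before starting a tool (the file path and key names are hypothetical):

```python
import configparser
import os
import subprocess

# standard configuration file maintained by EDA support (hypothetical layout)
cfg = configparser.ConfigParser()
cfg.read("/eda/config/project.ini")

env = os.environ.copy()
env["LM_LICENSE_FILE"] = cfg["tool"]["license"]      # assumed key names
env["PROJECT_DIR"] = cfg["project"]["workdir"]

# launch the tool with its start-up file; the user never edits paths by hand
subprocess.run([cfg["tool"]["binary"], "-init", cfg["tool"]["startup_file"]],
               env=env, cwd=env["PROJECT_DIR"], check=True)
```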

Keywords: EDA tools, Integrated Circuits, scripting, integration, automation

Procedia PDF Downloads 290
4023 Adapting Liability in the Era of Automated Decision-Making: A South African Labour Law Perspective

Authors: Aisha Adam

Abstract:

This study critically examines the transformative impact of automated decision-making (ADM) and artificial intelligence (AI) systems on South African labour law. As AI technologies increasingly infiltrate workplaces, existing liability frameworks face challenges in addressing the unique complexities presented by these innovations. This article explores the necessity of redefining liability to accommodate the nuanced landscape of ADM and AI within South African labour law. It emphasises the importance of ensuring responsible deployment and safeguarding the rights of workers amid evolving technological dynamics. This research investigates the central concern of fairness, bias, and discrimination in ADM and AI decision-making. Focusing on algorithmic bias and discriminatory outcomes, the paper advocates for the integration of mechanisms within the South African legal framework, particularly under the Promotion of Equality and Prevention of Unfair Discrimination Act (PEPUDA) and the Employment Equity Act (EEA). The study scrutinises the shifting dynamics of the employment relationship, calling for clear guidelines on the responsibilities and liabilities of employers, employees, and technology providers. Furthermore, the article analyses legal and policy responses to ADM and AI within South African labour law, exploring potential amendments to legislation, guidelines, and codes of practice. It assesses the role of regulatory bodies, specifically the Commission for Conciliation, Mediation, and Arbitration (CCMA), in overseeing and enforcing responsible practices in the workplace. Lastly, the research evaluates the impact of ADM and AI on human and social rights in the South African context. Emphasising the protection of constitutional rights, including fair labour practices, privacy, and equality, the study proposes remedies and safeguards. It advocates for a multidisciplinary approach involving legal, technological, and ethical considerations to redefine liability in South African labour law effectively. The article contends that a shift from accountability to responsibility is crucial for promoting fairness, antidiscrimination, and the protection of human and social rights in the age of automated decision-making. It calls for collaborative efforts among stakeholders to shape responsible practices and redefine liability in this evolving technological landscape.

Keywords: automated decision-making, artificial intelligence, labour law, vicarious liability

Procedia PDF Downloads 45
4022 Analysis and Improvement of Efficiency for Food Processing Assembly Lines

Authors: Mehmet Savsar

Abstract:

Several factors affect the productivity of Food Processing Assembly Lines (FPAL). Engineers and line managers often do not recognize some of these factors and underutilize their production/assembly lines. In this paper, a special food processing assembly line is studied in detail, and procedures are presented to illustrate how the productivity and efficiency of such lines can be increased. The assembly line considered produces ten different types of freshly prepared salads on the same line, which is called a mixed-model assembly line. Problems causing delays and inefficiencies on the line are identified. Line balancing and related tools are used to increase line efficiency and minimize balance delays, as sketched below. The procedure and the approach utilized in this paper can be useful for operation managers and industrial engineers dealing with similar assembly lines in the food processing industry.
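
As an illustration of the line-balancing step, a sketch of the largest-candidate-rule heuristic on made-up task data (not the studied salad line's actual tasks and times):

```python
# Largest-candidate-rule line balancing: fill each station up to the cycle
# time, respecting precedence, always taking the longest ready task first.
tasks = {"wash": 12, "chop": 18, "mix": 10, "dress": 8, "pack": 14}  # seconds
preds = {"wash": [], "chop": ["wash"], "mix": ["chop"],
         "dress": ["mix"], "pack": ["dress"]}
cycle_time = 20

stations, assigned = [], set()
while len(assigned) < len(tasks):
    load, station = 0, []
    while True:
        ready = [t for t in tasks if t not in assigned
                 and all(p in assigned for p in preds[t])
                 and tasks[t] + load <= cycle_time]
        if not ready:
            break
        t = max(ready, key=tasks.get)          # largest candidate first
        station.append(t)
        assigned.add(t)
        load += tasks[t]
    stations.append((station, load))

for i, (s, load) in enumerate(stations, 1):
    print(f"station {i}: {s} (load {load}s, idle {cycle_time - load}s)")
```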

Keywords: assembly lines, line balancing, production efficiency, bottleneck

Procedia PDF Downloads 353
4021 Generation of Automated Alarms for Plantwide Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

Early detection of incipient abnormal operations is quite necessary for plant-wide process management in order to improve product quality and process safety, and generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data includes measurement noise and unwanted variations unrelated to the true process behavior, so the elimination of these unnecessary patterns is executed in a data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly irrespective of the size and location of abnormal events.
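
The paper's methodology is nonlinear and empirical; as a simplified linear stand-in, the sketch below shows the generic pattern of fitting a data-driven model to normal operation and raising an alarm on a squared-prediction-error statistic:

```python
import numpy as np
from sklearn.decomposition import PCA

# train on normal plant-wide operating data (hypothetical stand-in)
rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 30))                  # 500 samples x 30 sensors
pca = PCA(n_components=5).fit(normal)

def spe(x):
    """Squared prediction error: distance of a sample from the model plane."""
    recon = pca.inverse_transform(pca.transform(x.reshape(1, -1)))
    return float(((x - recon.ravel()) ** 2).sum())

# empirical alarm limit from the normal training data
limit = np.quantile([spe(s) for s in normal], 0.99)

sample = normal[0] + np.r_[5.0, np.zeros(29)]        # simulated sensor fault
if spe(sample) > limit:
    print("ALARM: abnormal operation detected")
```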

Keywords: detection, monitoring, process data, noise

Procedia PDF Downloads 215
4020 Building Information Modeling Applied for the Measurement of Water Footprint of Construction Supplies

Authors: Julio Franco

Abstract:

Water is used, directly and indirectly, in all activities of the construction production chain, making it a subject of worldwide relevance for sustainable development. The ongoing expansion of urban areas leads to a high demand for natural resources, which in turn causes significant environmental impacts. The present work proposes the application of BIM tools to assist the measurement of the water footprint (WF) of civil construction supplies. Data were inserted into the model as element properties, allowing them to be analyzed per element or for the whole model. The WF calculation was automated using parameterization in Autodesk Revit. The parameterization was associated with the materials of each element in the model, so that any changes in these elements directly alter the results of the WF calculations. As a case study, we applied the method to a building project model to test the parameterized WF calculation. Results show that the proposed parameterization successfully automated the WF calculations in response to design changes. We envision this tool assisting the measurement and rationalization of the environmental impact of construction projects in terms of WF.
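
Outside Revit, the parameterized roll-up reduces to multiplying each element's material quantity by a per-material WF factor; a sketch with illustrative numbers, not actual WF coefficients:

```python
# In Revit the factors would live on material parameters and be read through
# the API; here plain dictionaries stand in for the model.
WF_FACTORS = {"concrete": 180.0, "steel": 2500.0, "brick": 90.0}  # L per unit, illustrative

elements = [   # element material and quantity taken off the BIM model
    {"id": "wall-01", "material": "brick",    "quantity": 12.5},
    {"id": "slab-02", "material": "concrete", "quantity": 30.0},
    {"id": "beam-07", "material": "steel",    "quantity": 0.8},
]

per_element = {e["id"]: e["quantity"] * WF_FACTORS[e["material"]]
               for e in elements}
print(per_element)                                   # WF by element
print("total WF:", sum(per_element.values()), "L")
# changing an element's material or quantity changes the WF automatically,
# mirroring the parameter association described above
```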

Keywords: building information modeling, BIM, sustainable development, water footprint

Procedia PDF Downloads 121