Search results for: automated teller machines (atm)
1252 Classification of Sequential Sports Using Automata Theory
Authors: Aniket Alam, Sravya Gurram
Abstract:
This paper proposes a categorization of sports based on the systems of rules that each sport must adhere to. We focus on these rule systems to examine how a winner is produced in different sports. The rules of a sport dictate the game play and the direction it takes. We propose to break the game play down into events. At this juncture, we observe two kinds of events that constitute the game play of a sport: those that follow sequential logic and those that do not. Our focus is confined to sports composed of sequential events. To examine these events further and understand how a winner emerges, we draw on the finite-state automaton from the theory of computation (automata theory). We show how sequential sports can be represented as finite-state machines, which we depict as state diagrams. We examine these state diagrams to observe how a team or player reaches the final states of the sport, with special attention to one final state: the state that determines the winner. This exercise has been carried out for the following sports: hurdles, track, shot put, long jump, bowling, badminton, Pac-Man, and weightlifting (snatch). Based on our observations of how this winning final state is achieved, we propose a categorization of sports.
Keywords: sport classification, sport modelling, ontology, automata theory
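As an illustration of the idea above, a sequential sport can be sketched as a finite-state machine whose states are score configurations and whose accepting state determines the winner. The minimal sketch below models badminton with a simplified first-to-21 rule (deuce and extension rules are omitted; this rule set is an illustrative assumption, not the paper's exact model):

```python
# A sequential sport (simplified badminton) as a finite-state machine.
# States are (score_a, score_b) pairs; reaching 21 points is the
# accepting ("winner") final state.

def badminton_fsm(rallies, target=21):
    """Run the FSM over a sequence of rally winners ('A' or 'B').

    Returns the winner once a final state is reached, else None."""
    state = (0, 0)                      # initial state: love-all
    for winner in rallies:
        a, b = state
        state = (a + 1, b) if winner == "A" else (a, b + 1)
        if state[0] == target:
            return "A"                  # accepting final state for A
        if state[1] == target:
            return "B"                  # accepting final state for B
    return None                         # still in a non-final state

# Example: A wins 21 rallies straight.
print(badminton_fsm(["A"] * 21))
```

Feeding the machine a rally sequence traces exactly the path through the state diagram that the abstract describes.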
Procedia PDF Downloads 119
1251 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of sugar distribution in melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields detection capabilities that cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detecting fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT has been improved with kernel machines to increase its nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by treating fat as the target class to be separated from the remaining classes (the clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine fat percentage. The experimental studies indicate that the proposed technique achieves high detection performance for fat ratio in ground meat.
Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
Procedia PDF Downloads 431
1250 Machine Learning Predictive Models for Hydroponic Systems: A Case Study Nutrient Film Technique and Deep Flow Technique
Authors: Kritiyaporn Kunsook
Abstract:
Machine learning algorithms (MLAs) such as artificial neural networks (ANNs), decision trees, support vector machines (SVMs), naïve Bayes, and an ensemble classifier by voting are powerful data-driven methods that remain relatively little used for classifying hydroponic system techniques, and thus have not been thoroughly evaluated against one another in this field. The performances of this series of MLAs in prospectively modelling hydroponic system techniques are compared based on the accuracy of each model. The classification covers test samples from vegetables grown with the nutrient film technique (NFT) and the deep flow technique (DFT). The features are vegetable characteristics comprising harvest height and width, temperature, required light, and color. The results indicate classification accuracies of 98% for the ANNs, 98% for the decision tree, 97.33% for the SVMs, 96.67% for naïve Bayes, and 98.96% for the ensemble classifier by voting, respectively.
Keywords: artificial neural networks, decision tree, support vector machines, naïve Bayes, ensemble classifier by voting
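The "ensemble classifier by voting" idea can be sketched in a few lines: several base classifiers each predict a hydroponic technique (NFT vs. DFT) and the majority label wins. The threshold rules below are hypothetical stand-ins for the trained models, not the paper's actual classifiers:

```python
# Hard-voting ensemble sketch: majority vote over simple base classifiers.
from collections import Counter

def classify_by_height(sample):     # hypothetical decision stump
    return "NFT" if sample["height_cm"] < 30 else "DFT"

def classify_by_temp(sample):       # hypothetical decision stump
    return "NFT" if sample["temp_c"] < 25 else "DFT"

def classify_by_light(sample):      # hypothetical decision stump
    return "NFT" if sample["light_h"] < 12 else "DFT"

def vote(sample, classifiers):
    """Return the most common prediction among the classifiers."""
    votes = Counter(clf(sample) for clf in classifiers)
    return votes.most_common(1)[0][0]

sample = {"height_cm": 25, "temp_c": 27, "light_h": 10}
print(vote(sample, [classify_by_height, classify_by_temp, classify_by_light]))
# -> NFT (two of three votes)
```

A hard-voting ensemble can outperform its individual members, which is consistent with the 98.96% accuracy the abstract reports for the voting classifier.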
Procedia PDF Downloads 372
1249 Application of the Material Point Method as a New Fast Simulation Technique for Textile Composites Forming and Material Handling
Authors: Amir Nazemi, Milad Ramezankhani, Marian Kӧrber, Abbas S. Milani
Abstract:
The excellent strength-to-weight ratio of woven fabric composites, along with their high formability, is one of the primary design parameters driving their increased use in modern manufacturing processes, including those in aerospace and automotive. However, for emerging automated preform processes under the smart manufacturing paradigm, the complex geometries of finished components continue to pose several challenges for designers coping with manufacturing defects on site. Wrinkling, e.g., is a common defect occurring during the forming process and the handling of semi-finished textile composites. One of the main reasons for this defect is the weak bending stiffness of the fibers in the unconsolidated state, which allows excessive relative motion between them. Further challenges arise in the automated handling of large-area fiber blanks with specialized gripper systems. For fabric composite forming simulations, the finite element (FE) method is a longstanding tool used for the prediction and mitigation of manufacturing defects. Such simulations are intended not only to predict the onset, growth, and shape of wrinkles but also to determine the processing conditions that yield optimized positioning of the fibers upon forming (or robot handling, in the case of automated processes). However, the need for small time steps in explicit FE codes, numerical instabilities, and long computational times are notable drawbacks of current FE tools, hindering their extensive use as fast yet efficient digital twins in industry. This paper presents a novel woven fabric simulation technique based on the material point method (MPM), which permits much larger time steps and suffers fewer numerical instabilities, and hence can run significantly faster and more efficient simulations of fabric material handling and forming processes.
This method can therefore support the development of automated fiber handling and preform processes by calculating the physical interactions between the MPM fiber models and rigid tool components, enabling designers to virtually develop, test, and optimize their processes using either algorithmic or machine learning applications. As a preliminary case study, the forming of a hemispherical plain weave is shown, and the results are compared to FE simulations as well as experiments.
Keywords: material point method, woven fabric composites, forming, material handling
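To show the particle-grid-particle transfer that MPM is built on, here is a deliberately tiny 1-D sketch: a single material point falls under gravity on a background grid with linear shape functions. Internal (fiber bending) forces and contact are omitted, so this is a toy illustration of the algorithmic skeleton, not the authors' fabric model:

```python
# One explicit MPM step in 1-D: particle-to-grid (P2G), grid update,
# grid-to-particle (G2P), for a single point under gravity.

def mpm_step(x, v, m, dt, g=-9.81, h=1.0):
    i = int(x // h)                     # left grid node index
    w_left = 1.0 - (x / h - i)          # linear shape-function weights
    w_right = 1.0 - w_left
    nodes = {i: w_left, i + 1: w_right}

    # P2G: scatter particle mass and momentum to the grid nodes.
    grid_m = {j: m * w for j, w in nodes.items()}
    grid_p = {j: m * v * w for j, w in nodes.items()}

    # Grid update: apply gravity to nodal momenta, get nodal velocities.
    grid_v = {j: (grid_p[j] + dt * grid_m[j] * g) / grid_m[j] for j in nodes}

    # G2P: gather velocity back to the particle, then advect it.
    v_new = sum(w * grid_v[j] for j, w in nodes.items())
    return x + dt * v_new, v_new

x, v = 5.5, 0.0
for _ in range(100):                    # 100 steps of 1 ms
    x, v = mpm_step(x, v, m=1.0, dt=1e-3)
print(round(v, 4))                      # free fall: v ≈ g * t = -0.981
```

With no internal forces the transfers reproduce free fall exactly, which makes the sketch easy to verify; a fabric model would add stress-based grid forces in the grid-update step.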
Procedia PDF Downloads 181
1248 Finite Element Method Analysis of a Modified Rotor 6/4 Switched Reluctance Motor and Comparison with a Brushless Direct Current Motor in Pan-Tilt Applications
Authors: Umit Candan, Kadir Dogan, Ozkan Akin
Abstract:
In this study, the use of a modified rotor 6/4 Switched Reluctance Motor (SRM) and a Brushless Direct Current (BLDC) motor in pan-tilt systems is compared. Pan-tilt systems are critical mechanisms that enable the precise orientation of cameras and sensors, and their performance largely depends on the characteristics of the motors used. The aim of the study is to determine how the performance of the SRM can be improved through rotor modifications and how these improvements can compete with BLDC motors. Using Finite Element Method (FEM) analyses, the design characteristics and magnetic performance of the 6/4 SRM are examined in detail. The modified SRM is found to offer increased torque capacity and efficiency while standing out for its simple construction and robustness. The FEM analysis results indicate that, considering its cost-effectiveness and the performance improvements achieved through modification, the SRM is a strong alternative for certain pan-tilt applications. This study aims to provide engineers and researchers with a performance comparison of the modified rotor 6/4 SRM and BLDC motors in pan-tilt systems, helping them make more informed and effective motor selections.
Keywords: reluctance machines, switched reluctance machines, pan-tilt application, comparison, FEM analysis
Procedia PDF Downloads 59
1247 CO₂ Capture by Clay and Its Adsorption Mechanism
Authors: Jedli Hedi, Hedfi Hachem, Abdessalem Jbara, Slimi Khalifa
Abstract:
Natural and modified clays were used as adsorbents for CO2 capture. A clay sample was subjected to acid treatment to improve its textural properties, namely its surface area and pore volume. The modifications were carried out by heating the clay at 120 °C and then treating it with a 3 M sulphuric acid solution at boiling temperature for 10 h. The CO2 adsorption measurements on the acid-treated clay were performed in a batch reactor. It was found that the clay sample treated with 3 M H2SO4 exhibited the highest Brunauer-Emmett-Teller (BET) surface area (16.29-24.68 m2/g) and pore volume (0.056-0.064 cm3/g). After the acid treatment, the CO2 adsorption capacity of the clay increased. The samples were characterized by SEM, FTIR, DTA-TGA, and the BET method. To describe the CO2 adsorption on these materials, the adsorption isotherms were modeled using the Freundlich and Langmuir models. The CO2 adsorption was found to be attributable to physical adsorption.
Keywords: clay, acid treatment, CO2 capture, adsorption mechanism
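The Langmuir fit mentioned above is commonly done in linearized form, Ce/qe = Ce/qmax + 1/(qmax·KL), so ordinary least squares recovers qmax and KL. The sketch below uses synthetic data points generated from assumed parameters, not the paper's measurements:

```python
# Linearized Langmuir isotherm fit via ordinary least squares.

def linear_fit(xs, ys):
    """Least squares for y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def langmuir_params(ce, qe):
    """Recover (qmax, KL) from equilibrium data via Ce/qe vs. Ce."""
    slope, intercept = linear_fit(ce, [c / q for c, q in zip(ce, qe)])
    qmax = 1.0 / slope
    return qmax, 1.0 / (intercept * qmax)

# Synthetic data generated from qmax = 2.0, KL = 0.5:
ce = [1.0, 2.0, 4.0, 8.0]
qe = [2.0 * 0.5 * c / (1 + 0.5 * c) for c in ce]
qmax, kl = langmuir_params(ce, qe)
print(round(qmax, 3), round(kl, 3))     # -> 2.0 0.5
```

A Freundlich fit follows the same pattern on log-transformed data (ln qe vs. ln Ce).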
Procedia PDF Downloads 211
1246 Reliability Analysis of Variable Stiffness Composite Laminate Structures
Authors: A. Sohouli, A. Suleman
Abstract:
This study focuses on reliability analysis of variable stiffness composite laminate structures to investigate their potential structural improvement over conventional (straight-fiber) composite laminate structures. A computational framework was developed that consists of a deterministic design step and a reliability analysis. The optimization part uses Discrete Material Optimization (DMO), and the reliability of the structure is computed by Monte Carlo simulation (MCS) after applying the Stochastic Response Surface Method (SRSM). The design driver in the deterministic optimization is maximum stiffness, while the optimization also respects certain manufacturing constraints to attain industrial relevance: the change of orientation between adjacent patches cannot be too large, and the number of successive plies of a particular fiber orientation must not be too high. Variable stiffness composites may be manufactured by Automated Fiber Placement (AFP) machines, which provide consistent quality at good production rates. However, the laps and gaps that arise when steering fibers are the most important challenges, as they affect the performance of the structures. In this study, the optimal curved fiber paths at each layer of the composite are first designed by DMO, and then the reliability analysis is applied to investigate the sensitivity of the structure under different standard deviations, compared to straight-fiber-angle composites. The random variables are the material properties and the loads on the structures. The results show that variable stiffness composite laminate structures are much more reliable, even for high standard deviations of the material properties, than conventional composite laminate structures.
The reason is that variable stiffness composite laminates allow tailoring of the stiffness and provide the possibility of adjusting the stress and strain distributions favorably within the structure.
Keywords: material optimization, Monte Carlo simulation, reliability analysis, response surface method, variable stiffness composite structures
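The Monte Carlo step described above can be sketched with a generic limit state g = R - S (resistance minus load): sample both random variables and count how often g < 0. The normal distributions below are illustrative assumptions, not the paper's laminate model:

```python
# Monte Carlo simulation of a failure probability for g(R, S) = R - S.
import random

def mcs_failure_probability(n, mu_r=10.0, sd_r=1.0,
                            mu_s=7.0, sd_s=1.0, seed=42):
    """Estimate P(R - S < 0) by sampling both normal variables."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0
    )
    return failures / n

# Analytic check: beta = (10 - 7) / sqrt(1 + 1) ≈ 2.121,
# so Pf = Phi(-beta) ≈ 0.017.
print(mcs_failure_probability(200_000))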
Procedia PDF Downloads 520
1245 Arsenic(III) Removal from Aqueous Solutions by Adsorption onto Fly Ash
Authors: Olushola Ayanda, Simphiwe Nelana, Eliazer Naidoo
Abstract:
In the present study, the kinetics, equilibrium, and thermodynamics of the adsorption of As(III) ions from aqueous solution onto fly ash (FA) were investigated in a batch adsorption system. Prior to the adsorption studies, the FA was characterized by means of X-ray fluorescence (XRF), X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), and Brunauer-Emmett-Teller (BET) surface area determination. The effects of contact time, initial As(III) concentration, FA dosage, stirring speed, solution pH, and temperature on the adsorption rate were examined. Experimental results showed very good compliance with the pseudo-second-order equation, while the equilibrium study showed that the sorption of As(III) ions onto FA fitted the Langmuir and Freundlich isotherms. The adsorption process is endothermic and spontaneous; moreover, the maximum As(III) removal achieved with approx. 2.5 g FA mixed with 25 mL of 100 mg/L As(III) solution was 65.4% at pH 10, 60 min contact time, a temperature of 353 K, and a stirring speed of 120 rpm.
Keywords: arsenic, fly ash, kinetics, isotherm, thermodynamics
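The pseudo-second-order compliance reported above is usually checked through the linearized form t/qt = 1/(k·qe²) + t/qe: plotting t/qt against t gives qe from the slope and k from the intercept. The data below are synthetic (qe = 5 mg/g, k = 0.02 g/mg/min), for illustration only:

```python
# Linearized pseudo-second-order kinetic fit.

def linear_fit(ts, ys):
    """Least squares for y = slope*t + intercept."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
            sum((t - mt) ** 2 for t in ts)
    return slope, my - slope * mt

def pso_params(ts, qt):
    """Recover (qe, k) from a t/qt vs. t regression."""
    slope, intercept = linear_fit(ts, [t / q for t, q in zip(ts, qt)])
    qe = 1.0 / slope
    return qe, 1.0 / (intercept * qe * qe)

ts = [5.0, 10.0, 20.0, 40.0, 60.0]
qt = [(0.02 * 25 * t) / (1 + 0.02 * 5 * t) for t in ts]  # exact PSO curve
qe, k = pso_params(ts, qt)
print(round(qe, 3), round(k, 3))           # -> 5.0 0.02
```

An R² close to 1 on this regression is what "very good compliance with the pseudo-second-order equation" typically means in practice.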
Procedia PDF Downloads 241
1244 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion
Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro
Abstract:
In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulty meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all of the above-mentioned issues and helps organizations improve efficiency and deliver faster, without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while the need for repetitive work and manual effort is reduced. Implementing scalable CI/CD for development using cloud services like ECS (Elastic Container Service), AWS Fargate, ECR (Elastic Container Registry, to store Docker images with all dependencies), serverless computing, cloud logging (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (a CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments and accommodate dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need.
Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Building scalable automation testing on cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding the detection of race conditions and performance issues and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands and allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources actually used, and it increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment
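The parallel fan-out of 500+ test cases described above can be sketched locally with Python's concurrent.futures: test cases run concurrently and the results are aggregated, mirroring what containerized runners do at cloud scale. The test function here is a hypothetical stand-in, not part of any real suite:

```python
# Fan out a (hypothetical) test suite across worker threads and
# aggregate pass/fail results.
from concurrent.futures import ThreadPoolExecutor

def run_test(case_id):
    """Stand-in for one automated test case; returns (id, passed)."""
    return case_id, case_id % 50 != 0       # assumption: every 50th fails

def run_suite(n_cases, workers=8):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(run_test, range(1, n_cases + 1)))
    failed = [cid for cid, ok in results if not ok]
    return len(results), failed

total, failed = run_suite(500)
print(total, len(failed))                   # -> 500 10
```

In the cloud setup the abstract describes, each worker would be a Fargate task running a Docker image from ECR rather than a local thread, but the shard-and-aggregate pattern is the same.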
Procedia PDF Downloads 44
1243 Machines Hacking Humans: Performance Practices in Electronic Music during the 21st Century
Authors: Zimasa Siyasanga Gysman
Abstract:
This paper assesses the history of electronic music and its performance to illustrate that machines and technology have largely shaped how humans perform electronic music. Histories of electronic music mainly focus on its composition and production, with little to no attention paid to its performance by the majority of scholars in the field. Establishing a history of performance therefore involves investigating what compositions of electronic music demanded of their performers. This investigation of seminal works in the history of electronic music illustrates the aesthetics of electronic music performance, and the aesthetics established at the very beginnings of the practice remain prevalent today. The key aesthetics are the repurposing of technology and the hybridisation of technology. Performers take familiar technology (technology that society has become accustomed to using in daily life), not necessarily related to music or performance, and use it as an instrument in their performances, such as a rotary-dial telephone. Likewise, since the beginnings of electronic music, producers have always experimented with the latest technologies available to them in their compositions and performances. The spirit of electronic music performance therefore revolves around repurposing familiar technologies and using them in new ways, whilst similarly experimenting with new technologies. This process of hybridisation played a key role in the production and performance of electronic music in the twentieth century. Through various interviews with performers of electronic music, it is shown that these aesthetics are driving performance practices in the twenty-first century.
Keywords: body, hybridisation, performance, sound
Procedia PDF Downloads 161
1242 Degree Tracking System (DTS) to Improve the Efficiency and Effectiveness of Open Distance Learning System: A Case Study of Islamabad Allama Iqbal Open University (AIOU)
Authors: Hatib Shabbir
Abstract:
Student support services play an important role in providing technical and motivational support to distance learners. ICT-based systems have improved the efficiency and effectiveness of such support services. In distance education, students, being at a distance, require quick responses from their institution. In a manual system, it is practically impossible to give a prompt response to each and every student, and students suffer as a result. The best way to minimize these inefficiencies is to use automated systems. This project involves the development of centralized automated software that not only replaces the manual degree issuance system for the 1.3 million students studying at AIOU but also provides online tracking to all students applying for degrees. DTS is also a first step towards the paperless culture adopted by major organizations around the world. DTS saves not only university costs but also students' cost and time by conveying all information and objections through email and SMS. Moreover, DTS monitors the performance of every individual working in the examination department of AIOU and generates daily, monthly, and yearly reports on each of them, which greatly helps in the continuous performance monitoring of employees.
Keywords: AIOU DTS, DTS AIOU, DTS, degree tracking AIOU
Procedia PDF Downloads 218
1241 Machine Learning Approach for Automating Electronic Component Error Classification and Detection
Authors: Monica Racha, Siva Chandrasekaran, Alex Stojcevski
Abstract:
Engineering programs focus on promoting students' personal and professional development by ensuring that students acquire technical and professional competencies during their four-year studies. The traditional engineering laboratory provides an opportunity for students to "practice by doing," and laboratory facilities aid them in obtaining insight into and understanding of their discipline. Due to rapid technological advancements and the COVID-19 outbreak, traditional labs are being transformed into virtual learning environments. Aim: To address the limitations of the physical laboratory, this research study uses a machine learning (ML) algorithm that interfaces with the Augmented Reality HoloLens and predicts image behavior in order to classify and detect electronic components. The automated error classification and detection system locates and classifies all components on a breadboard using the ML algorithm. This research will assist first-year undergraduate engineering students in conducting laboratory practice without supervision. With the help of the HoloLens and the ML algorithm, students will reduce component placement errors on a breadboard and increase the efficiency of simple laboratory practice conducted virtually. Method: Images of breadboards, resistors, capacitors, transistors, and other electrical components will be collected using the HoloLens 2 and stored in a database. The collected image dataset will then be used for training a machine learning model. The raw images will be cleaned, processed, and labeled to facilitate further analysis for component error classification and detection. For instance, when students conduct laboratory experiments, the HoloLens captures images of students placing different components on a breadboard. The images are forwarded to the server for detection in the background.
A hybrid Convolutional Neural Network (CNN) and Support Vector Machine (SVM) algorithm will be used to train the dataset for object recognition and classification: the convolution layers extract image features, which are then classified using an SVM. With adequately labeled training data, the model predicts, categorizes, and assesses whether students place components correctly. The data acquired through the HoloLens thus includes images of students assembling electronic components. The system constantly checks whether students position components appropriately on the breadboard and connect them so that the circuit functions. When students misplace any component, the HoloLens predicts the error before the user places the component in the incorrect position and prompts students to correct their mistake. This hybrid CNN-SVM approach to automating electronic component error classification and detection eliminates component connection problems and minimizes the risk of component damage. Conclusion: These augmented reality smart glasses powered by machine learning provide a wide range of benefits to supervisors, professionals, and students. They help customize the learning experience, which is particularly beneficial in large classes with limited time, and they determine the accuracy with which machine learning algorithms can forecast whether students are making the correct decisions and completing their laboratory tasks.
Keywords: augmented reality, machine learning, object recognition, virtual laboratories
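The SVM half of the hybrid pipeline can be sketched with a linear soft-margin SVM trained by the Pegasos subgradient method on toy 2-D points standing in for CNN-extracted features. This is an illustrative implementation of a linear SVM, not the authors' trained model, and the data are invented:

```python
# Linear SVM via the Pegasos subgradient method on toy "CNN features".
import random

def pegasos(data, labels, lam=0.01, epochs=200, seed=0):
    """Train weights w minimizing hinge loss + (lam/2)*||w||^2."""
    rng = random.Random(seed)
    w = [0.0, 0.0]
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(data)), len(data)):
            t += 1
            eta = 1.0 / (lam * t)
            x, y = data[i], labels[i]
            margin = y * (w[0] * x[0] + w[1] * x[1])
            w = [wj * (1 - eta * lam) for wj in w]    # shrink step
            if margin < 1:                            # hinge subgradient
                w = [wj + eta * y * xj for wj, xj in zip(w, x)]
    return w

data = [(2.0, 2.0), (3.0, 2.5), (-2.0, -2.0), (-3.0, -1.5)]
labels = [1, 1, -1, -1]                 # +1 = correct placement, -1 = error
w = pegasos(data, labels)
preds = [1 if w[0] * x + w[1] * y > 0 else -1 for x, y in data]
print(preds)                            # -> [1, 1, -1, -1]
```

In the full pipeline the two feature columns would be replaced by a high-dimensional CNN feature vector, but the hinge-loss training step is unchanged.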
Procedia PDF Downloads 134
1240 Intelligent Campus Monitoring: YOLOv8-Based High-Accuracy Activity Recognition
Authors: A. Degale Desta, Tamirat Kebamo
Abstract:
Background: Recent advances in computer vision and pattern recognition have significantly improved activity recognition through video analysis, particularly with the application of deep convolutional neural networks (CNNs). One-stage detectors now enable efficient video-based recognition by simultaneously predicting object categories and locations. Such advancements are highly relevant in educational settings, where CCTV surveillance could automatically monitor academic activities, enhancing security and classroom management. However, current datasets and recognition systems lack the specific focus on campus environments necessary for practical application in these settings. Objective: This study aims to address this gap by developing a dataset and testing an automated activity recognition system specifically tailored for educational campuses. The EthioCAD dataset was created to capture various classroom activities and teacher-student interactions, facilitating reliable recognition of academic activities using deep learning models. Method: EthioCAD, a novel video-based dataset, was created with a design science research approach to encompass teacher-student interactions across three domains and 18 distinct classroom activities. Using the Roboflow AI framework, the data was processed, with 4.224 KB of frames and 33.485 MB of images managed for frame extraction, labeling, and organization. The Ultralytics YOLOv8 model was then implemented within Google Colab to evaluate the dataset's effectiveness, achieving high mean average precision (mAP) scores. Results: The YOLOv8 model demonstrated robust activity recognition within campus-like settings, achieving an mAP50 of 90.2% and an mAP50-95 of 78.6%. These results highlight the potential of EthioCAD, combined with YOLOv8, to provide reliable detection and classification of classroom activities, supporting automated surveillance needs on educational campuses.
Discussion: The high performance of YOLOv8 on the EthioCAD dataset suggests that automated activity recognition for surveillance is feasible within educational environments. This system addresses current limitations in campus-specific data and tools, offering a tailored solution for academic monitoring that could enhance the effectiveness of CCTV systems in these settings. Conclusion: The EthioCAD dataset, alongside the YOLOv8 model, provides a promising framework for automated campus activity recognition. This approach lays the groundwork for future advancements in CCTV-based educational surveillance systems, enabling more refined and reliable monitoring of classroom activities.
Keywords: deep CNN, EthioCAD, deep learning, YOLOv8, activity recognition
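The mAP50 metric reported above counts a detection as correct when its box overlaps the ground truth with intersection-over-union (IoU) of at least 0.5. A minimal sketch of that IoU computation for axis-aligned boxes given as (x1, y1, x2, y2):

```python
# IoU of two axis-aligned bounding boxes, the overlap test behind mAP50.

def iou(a, b):
    """Intersection-over-union of boxes a and b, each (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 0, 15, 10)))   # -> 0.3333333333333333
print(iou((0, 0, 10, 10), (0, 0, 10, 10)))   # -> 1.0
```

mAP50-95 averages the same precision computation over IoU thresholds from 0.5 to 0.95, which is why it is the stricter of the two numbers.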
Procedia PDF Downloads 12
1239 Online Handwritten Character Recognition for South Indian Scripts Using Support Vector Machines
Authors: Steffy Maria Joseph, Abdu Rahiman V, Abdul Hameed K. M.
Abstract:
Online handwritten character recognition is a challenging field in artificial intelligence. The classification success rate of current techniques decreases when the dataset involves similarity and complexity in stroke styles, number of strokes, and variations in stroke characteristics. Malayalam is a complex South Indian language spoken by about 35 million people, mainly in Kerala and the Lakshadweep islands. In this paper, we consider significant feature extraction for the similar stroke styles of Malayalam. The extracted feature set is also suitable for the recognition of other handwritten South Indian languages such as Tamil, Telugu, and Kannada. A classification scheme based on support vector machines (SVM) is proposed to improve the accuracy of classification and recognition of online Malayalam handwritten characters. SVM classifiers are well suited to real-world applications. The contribution of various features towards recognition accuracy is analysed, and performance for different SVM kernels is also studied. A graphical user interface has been developed for reading and displaying the characters. Different writing styles were collected for each of the 44 letters. Various features are extracted and used for classification after preprocessing of the input data samples. The highest recognition accuracy of 97% is obtained experimentally with the best feature combination and a polynomial kernel in the SVM.
Keywords: SVM, MATLAB, Malayalam, South Indian scripts, online handwritten character recognition
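The polynomial kernel that gave the best accuracy above has the form K(x, y) = (x · y + c)^d. A small sketch computing it on two illustrative feature vectors (the degree and constant are assumptions, not the paper's tuned values):

```python
# Polynomial kernel, the similarity function behind the best-performing SVM.

def poly_kernel(x, y, degree=3, c=1.0):
    """K(x, y) = (x . y + c)^degree for feature vectors x and y."""
    return (sum(a * b for a, b in zip(x, y)) + c) ** degree

x, y = [1.0, 2.0], [0.5, 1.0]
print(poly_kernel(x, y))        # -> (0.5 + 2.0 + 1.0)^3 = 42.875
```

Replacing the dot product with this kernel lets a linear SVM separate classes whose boundary is a polynomial surface in the original stroke-feature space.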
Procedia PDF Downloads 574
1238 Soil Compaction by a Forwarder in Timber Harvesting
Authors: Juang R. Matangaran, Erianto I. Putra, Iis Diatin, Muhammad Mujahid, Qi Adlan
Abstract:
Industrial plantation forests are the main producers of logs in Indonesia. Several industrial plantation forest companies have successfully planted fast-growing species and have entered their annual harvesting periods. Heavy machines such as forwarders are used in timber harvesting to extract logs from the stump to the landing site. The negative impacts of using such machines are loss of topsoil and soil compaction. Compacted soil is considered unfavorable for plant growth. The research objectives were to analyze the soil bulk density, rutting, and cone index of the soil caused by forwarder passes, and to analyze the relation between the number of forwarder passes and the increase in soil bulk density. A Valmet forwarder was used in this research. Soil bulk density at the soil surface and the cone index from the surface down to a depth of 50 cm were measured in the harvested area. The results showed that soil bulk density increased with the number of Valmet forwarder passes; maximum soil bulk density occurred after five passes. The cone index tended to increase from the surface down to the 50 cm depth. The ruts formed and the high soil bulk density indicate that soil compaction occurred as a result of the forwarder operation.
Keywords: bulk density, forwarder Valmet, plantation forest, soil compaction, timber harvesting
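Bulk density in the abstract is dry soil mass per core-sampler volume; a small worked sketch with illustrative (assumed) sampler dimensions and masses, not the study's measurements:

```python
# Bulk density from a cylindrical core sample: dry mass / core volume.
import math

def bulk_density(dry_mass_g, core_diameter_cm, core_height_cm):
    """Return bulk density in g/cm^3 for a cylindrical soil core."""
    volume = math.pi * (core_diameter_cm / 2) ** 2 * core_height_cm
    return dry_mass_g / volume

before = bulk_density(110.0, 5.0, 5.0)  # undisturbed plot (assumed values)
after = bulk_density(130.0, 5.0, 5.0)   # after repeated passes (assumed)
print(round(before, 2), round(after, 2),
      f"{(after / before - 1) * 100:.1f}% increase")
```

Comparing the two densities is the same before/after calculation used to quantify compaction from successive forwarder passes.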
Procedia PDF Downloads 146
1237 Importance of Occupational Safety and Health in Dam Construction Site
Authors: Naci Büyükkaraciğan, Yildirim Akyol
Abstract:
Large structures that close off a river valley and accumulate its water, serving many purposes such as energy production, drinking and irrigation water supply, economic benefits, and flood protection, are called dams. The place that brings together manpower, machines, and materials in order to carry out the construction of a project at the lowest cost and in an economical manner is called the site. Dam construction sites combine many trades in one place. Therefore, many workers and machines can be found there, and many accidents occur on this type of construction site. Systematic and scientific studies are thus necessary, for various reasons, to protect workers from conditions that could damage their health during the execution of work on construction sites. The field of occupational health and safety has been addressed with increasing weight in the European Union since the 1980s. In particular, Directive 89/391/EEC on occupational health and safety, issued in 1989, established the framework for the field, and a large number of individual directives were subsequently issued on the basis of this framework directive. In Turkey, Law No. 6331 on the subject entered into force in June 2012. In this study, the occupational safety and health measures that should be taken on dam construction sites are examined, and recommendations on the subject are put forward.
Keywords: civil engineering, dam, occupational safety and health, site organizations
Procedia PDF Downloads 333
1236 Fourier Transform and Machine Learning Techniques for Fault Detection and Diagnosis of Induction Motors
Authors: Duc V. Nguyen
Abstract:
Induction motors are widely used in different industry areas and can experience various kinds of faults in stators and rotors. In general, fault detection and diagnosis techniques for induction motors can be supervised by measuring quantities such as noise, vibration, and temperature. The installation of mechanical sensors in order to assess the health condition of a machine is typically only done for expensive or load-critical machines, where the high cost of a continuous monitoring system can be justified. Nevertheless, induced current monitoring can be implemented inexpensively on machines of arbitrary size by using current transformers. In this regard, effective and low-cost fault detection techniques can be implemented, hence reducing the maintenance and downtime costs of motors. This work proposes a method for fault detection and diagnosis of induction motors which combines the classical fast Fourier transform with modern/advanced machine learning techniques. The proposed method is validated on real-world data and achieves a precision of 99.7% for fault detection and 100% for fault classification with minimal expert knowledge requirements. In addition, this approach allows users to optimize/balance risks and maintenance costs to achieve the highest benefit based on their requirements. These are the key requirements of a robust prognostics and health management system.
Keywords: fault detection, FFT, induction motor, predictive maintenance
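The spectral side of such a pipeline can be sketched in a few lines: fault signatures in motor current analysis typically appear as sideband components near the supply frequency. The sketch below uses a naive DFT on a synthetic signal with an injected 40 Hz sideband; the signal, frequencies, and threshold are our assumptions, not the authors' data or model:

```python
# Illustrative sketch (not the authors' pipeline): expose a weak sideband
# in a synthetic stator-current signal with a naive DFT magnitude.
import math

def dft_magnitude(signal, k):
    """Magnitude of the k-th DFT bin of a real-valued signal."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
    im = -sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
    return math.hypot(re, im)

fs, n = 1000, 1000                                 # 1 kHz sampling, 1 s window -> 1 Hz bins
t = [i / fs for i in range(n)]
current = [math.sin(2 * math.pi * 50 * x)          # 50 Hz supply component
           + 0.2 * math.sin(2 * math.pi * 40 * x)  # weak fault sideband
           for x in t]

# Flag a fault if a non-supply bin exceeds a fraction of the supply bin.
supply = dft_magnitude(current, 50)
sideband = dft_magnitude(current, 40)
print(sideband / supply > 0.1)  # True
```

A production system would use an FFT library and feed the resulting spectral features to a trained classifier rather than a fixed threshold.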
Procedia PDF Downloads 170
1235 Knowledge Diffusion via Automated Organizational Cartography (Autocart)
Authors: Mounir Kehal
Abstract:
The post-globalization epoch has placed businesses everywhere in new and different competitive situations, where knowledgeable, effective, and efficient behavior has come to provide the competitive and comparative edge. Enterprises have turned to explicit, and even to conceptualizing tacit, knowledge management to elaborate a systematic approach to developing and sustaining the intellectual capital needed to succeed. To be able to do that, one has to be able to visualize the organization as consisting of nothing but knowledge and knowledge flows, presented in a graphical and visual framework referred to as automated organizational cartography. This creates the ability to further and actively classify existing organizational content evolving from and within data feeds, in an algorithmic manner, potentially yielding insightful schemes and dynamics by which organizational know-how is visualized. We discuss and elaborate on the most recent and applicable definitions and classifications of knowledge management, representing a wide range of views, from mechanistic (systematic, data-driven) to more socially (psychologically, cognitively/metadata-driven) orientated. More elaborate continuum models, for knowledge acquisition and reasoning purposes, are used to effectively represent the domain of information that an end user may draw on in their decision-making process when utilizing available organizational intellectual resources (i.e., Autocart). In this paper, we present an empirical research study conducted previously to explore knowledge diffusion in a specialist knowledge domain.
Keywords: knowledge management, knowledge maps, knowledge diffusion, organizational cartography
Procedia PDF Downloads 309
1234 Quality Control Assessment of X-Ray Equipment in Hospitals of Katsina State, Nigeria
Authors: Aminu Yakubu Umar
Abstract:
X-ray is the major contributor to the effective dose of both patients and personnel. Because of the radiological risks involved, it is usually recommended that the dose to the patient from X-rays be kept as low as reasonably achievable (ALARA) with adequate image quality. The implementation of quality assurance in diagnostic radiology can help greatly in achieving that, as it is a technique designed to reduce X-ray doses to patients undergoing radiological examinations. In this study, quality control was carried out in six hospitals; it involved a kVp test, evaluation of total filtration, a test for constancy of radiation output, and a check of mA linearity. The equipment used included a kVp meter, a Rad-Check meter, and aluminum sheets (0.1–1.0 mm). The results of this study indicate that the age of the X-ray machines in the hospitals ranges from 3 to 13 years, GH1 and GH2 being the oldest and FMC the newest. In the evaluation of total filtration, the HVL of the X-ray machines varied among the hospitals, ranging from 2.3 to 5.2 mm. The HVL was highest in AHC (5.2 mm) and lowest in GH3 (2.3 mm). All HVL measurements were done at 80 kVp. The variation in voltage accuracy in the hospitals ranges from 0.3% to 127.5%. Only in GH1 was the percentage variation below the allowed limit. The test for constancy of radiation output showed that the coefficient of variation ranges from 0.005 to 0.550. In GH3, FMC, and AHC, the coefficients of variation were less than the allowed limit, while in GH1, GH2, and GH4 they exceeded the allowed limit. As regards mA linearity, FMC and AHC had coefficients of linearity of 0.12 and 0.10, respectively, which were within the accepted limit, while GH1, GH3, and GH4 had coefficients of 0.16, 0.69, and 0.98, respectively, which exceeded the allowed limit.
Keywords: radiation, X-ray output, quality control, half-value layer, mA linearity, kVp variation
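Two of the quantities above can be worked through numerically. The coefficient of variation (standard deviation over mean of repeated exposures) checks output constancy, and a linearity coefficient of the form (Xmax - Xmin)/(Xmax + Xmin) over output-per-mAs values is a common way to check mA linearity; the numbers below are illustrative, not measured values from this study:

```python
# Hypothetical worked example of two quality-control statistics.
import statistics

def coefficient_of_variation(outputs):
    """CV of repeated exposure outputs (e.g., mGy) at fixed settings."""
    return statistics.pstdev(outputs) / statistics.mean(outputs)

def linearity_coefficient(outputs_per_mas):
    """(Xmax - Xmin) / (Xmax + Xmin) of output normalised per mAs."""
    hi, lo = max(outputs_per_mas), min(outputs_per_mas)
    return (hi - lo) / (hi + lo)

repeated = [2.10, 2.12, 2.09, 2.11]     # repeated exposures, same settings
per_mas = [0.105, 0.106, 0.104, 0.118]  # output per mAs at several mA stations

print(round(coefficient_of_variation(repeated), 4))  # 0.0053
print(linearity_coefficient(per_mas) < 0.1)          # True (within a common limit)
```

With these illustrative numbers both checks would pass; the study's GH1, GH3, and GH4 values (0.16, 0.69, 0.98) would fail the linearity check.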
Procedia PDF Downloads 610
1233 Development of an Automatic Monitoring System Based on the Open Architecture Concept
Authors: Andrii Biloshchytskyi, Serik Omirbayev, Alexandr Neftissov, Sapar Toxanov, Svitlana Biloshchytska, Adil Faizullin
Abstract:
Kazakhstan has adopted a carbon neutrality strategy to 2060. In accordance with this strategy, it is necessary to introduce various tools to maintain environmental safety. The use of IoT, in combination with the characteristics and requirements of Kazakhstan's environmental legislation, makes it possible to develop a modern environmental monitoring system. The article proposes a solution for developing an example of an automated system, based on an open architecture, for the continuous collection of data on the concentration of pollutants in the atmosphere. An Arduino-based device acts as the microcontroller. It should be noted that the transmission of measured values is carried out via an open wireless communication protocol. The architecture of the system, which was used to build a prototype based on sensors, an Arduino microcontroller, and a wireless data transmission module, is presented. The selection of elementary components may change depending on the requirements of the system; the introduction of new units is limited by the number of ports. The openness of the solution allows the configuration to be changed depending on the conditions. The advantages of the solution are openness, low cost, versatility, and mobility. However, no comparison of the working processes of the proposed solution with traditional ones is given.
Keywords: environmental monitoring, greenhouse gas emissions, environmental pollution, Industry 4.0, IoT, microcontroller, automated monitoring system
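One small but concrete piece of such a node is packaging a reading for wireless transmission. The sketch below is a hedged illustration: the field names, JSON payload format, and station identifier are our assumptions, not taken from the paper, and the hardware read is mocked with a constant:

```python
# Minimal sketch of how a monitoring node might package one pollutant
# reading for transmission. Hardware access is mocked; the payload schema
# is an assumption, not the system described in the abstract.
import json
import time

def read_co2_ppm():
    # Placeholder for an actual sensor/ADC driver call on the microcontroller.
    return 412.5

def build_payload(station_id, gas, value, unit):
    """Serialize one pollutant measurement as a compact JSON record."""
    return json.dumps({
        "station": station_id,
        "gas": gas,
        "value": value,
        "unit": unit,
        "ts": int(time.time()),
    }, separators=(",", ":"))

packet = build_payload("NODE-01", "CO2", read_co2_ppm(), "ppm")
print(packet.startswith('{"station":"NODE-01"'))  # True
```

Keeping the payload schema this simple is what makes the "open architecture" claim practical: any receiver that can parse JSON can consume the stream.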
Procedia PDF Downloads 48
1232 Automated Human Balance Assessment Using Contactless Sensors
Authors: Justin Tang
Abstract:
Balance tests are frequently used to diagnose concussions on the sidelines of sporting events. Manual scoring, however, is labor-intensive and subjective, and many concussions go undetected. This study introduces a novel approach to conducting the Balance Error Scoring System (BESS) more quantitatively using Microsoft’s gaming system Kinect, which uses a contactless sensor and several cameras to receive data and estimate body limb positions. Using a machine learning approach, Visual Gesture Builder, and a deterministic approach, MATLAB, we tested whether the Kinect can differentiate between “correct” and erroneous stances of the BESS. We created the two separate solutions by recording test videos to teach the Kinect correct stances and by developing code in Java. Twenty-two subjects were asked to perform a series of BESS tests while the Kinect was collecting data. The Kinect recorded the subjects and mapped key joints onto their bodies to obtain angles and measurements that are interpreted by the software. Through VGB and MATLAB, the videos were analyzed to enumerate the number of errors committed during testing. The resulting statistics demonstrate a high correlation between the manual scoring and Kinect approaches, indicating the viability of remote tracking devices for conducting concussion tests.
Keywords: automated, concussion detection, contactless sensors, Microsoft Kinect
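The deterministic half of such a system reduces to geometry on tracked joints: a stance error can be flagged when the angle formed at a joint leaves an allowed band. The sketch below is not the authors' VGB/MATLAB code; the coordinates and the threshold band are invented for illustration:

```python
# Hedged sketch: angle at joint b (degrees) from three tracked points,
# then a simple out-of-band check. Coordinates/thresholds are invented.
import math

def joint_angle(a, b, c):
    """Angle at point b formed by segments b->a and b->c, in degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# Hip angle from shoulder, hip, and knee positions (arbitrary 2-D coordinates).
angle = joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0))
print(angle)  # 90.0

is_error = not (160.0 <= angle <= 200.0)  # flag if the trunk is not near straight
print(is_error)  # True
```

Counting how many frames (or discrete stance checks) raise such flags over a trial is one plausible way the per-test error totals could be enumerated.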
Procedia PDF Downloads 317
1231 Automated Multisensory Data Collection System for Continuous Monitoring of Refrigerating Appliances Recycling Plants
Authors: Georgii Emelianov, Mikhail Polikarpov, Fabian Hübner, Jochen Deuse, Jochen Schiemann
Abstract:
Recycling refrigerating appliances plays a major role in protecting the Earth's atmosphere from ozone depletion and emissions of greenhouse gases. The performance of refrigerator recycling plants in terms of material retention is the subject of strict environmental certifications and is reviewed periodically through specialized audits. The continuous collection of refrigerator data required for the input-output analysis is still mostly manual, error-prone, and not digitalized. In this paper, we propose an automated data collection system for recycling plants in order to deduce the expected material contents of individual end-of-life refrigerating appliances. The system utilizes laser scanner measurements and optical data to extract attributes of individual refrigerators by applying transfer learning with pre-trained vision models and optical character recognition. Based on the recognized features, the system automatically provides material categories and target values of contained material masses, especially foaming and cooling agents. The presented data collection system paves the way for continuous performance monitoring and efficient control of refrigerator recycling plants.
Keywords: automation, data collection, performance monitoring, recycling, refrigerators
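The final step described, mapping recognized attributes to target material masses, amounts to a lookup keyed on appliance features. The sketch below illustrates only that idea; the attribute names, categories, and mass values are invented placeholders, not plant data or the paper's actual mapping:

```python
# Illustrative only: map recognized appliance attributes to expected
# recoverable material masses. All keys and numbers are invented.
TARGET_MASSES_KG = {
    # (blowing agent, size class) -> expected recoverable masses
    ("CFC", "large"): {"foaming_agent": 0.45, "refrigerant": 0.115},
    ("CFC", "small"): {"foaming_agent": 0.30, "refrigerant": 0.080},
    ("HC", "large"):  {"foaming_agent": 0.25, "refrigerant": 0.050},
}

def expected_masses(attributes):
    """Look up target material masses for one recognized appliance."""
    key = (attributes["agent"], attributes["size"])
    return TARGET_MASSES_KG.get(key)  # None if the combination is unknown

unit = {"agent": "CFC", "size": "large"}  # e.g., from OCR of the type plate
print(expected_masses(unit)["refrigerant"])  # 0.115
```

Summing such per-unit targets over a processing period gives the expected input masses against which retained output masses can be audited.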
Procedia PDF Downloads 164
1230 The Role of Twitter Bots in Political Discussion on 2019 European Elections
Authors: Thomai Voulgari, Vasilis Vasilopoulos, Antonis Skamnakis
Abstract:
The aim of this study is to investigate the manipulation of the European election campaigns (May 23-26, 2019) on Twitter achieved with artificial intelligence tools such as troll factories and automated inauthentic accounts. Our research focuses on the last European Parliamentary elections, which took place between 23 and 26 May 2019, specifically in Italy, Greece, Germany, and France. It is difficult to estimate how many Twitter users are actually bots (Echeverría, 2017), and the detection of fake accounts is becoming even more complicated as AI bots are made more advanced. A political bot can be programmed to post comments on a Twitter account for a political candidate, target journalists with manipulated content, or engage with politicians and artificially increase their impact and popularity. We analyze variables related to 1) the scope of activity of automated bot accounts, 2) the degree of coherence, and 3) the degree of interaction, taking into account different factors, such as the type of content of Twitter messages and their intentions, as well as their spread to the general public. For this purpose, we collected large volumes of tweets from the accounts of party leaders and MEP candidates between the 10th of May and the 26th of July, based on content analysis of tweets by hashtag, using an innovative network analysis tool known as MediaWatch.io (https://mediawatch.io/). According to our findings, one of the highest percentages (64.6%) of automated "bot" accounts during the 2019 European election campaigns was in Greece. In general terms, political bots aim at the proliferation of misinformation on social media; targeting voters is one way this can be achieved and contributes to social media manipulation. We found that political parties and individual politicians create and promote purposeful content on Twitter using algorithmic tools. Based on this analysis, online political advertising plays an important role in the process of spreading misinformation during election campaigns. Overall, inauthentic accounts and social media algorithms are being used to manipulate political behavior and public opinion.
Keywords: artificial intelligence tools, human-bot interactions, political manipulation, social networking, troll factories
Procedia PDF Downloads 138
1229 Reinforcement Learning For Agile CNC Manufacturing: Optimizing Configurations And Sequencing
Authors: Huan Ting Liao
Abstract:
In a typical manufacturing environment, computer numerical control (CNC) machining is essential for automating production through precise computer-controlled tool operations, significantly enhancing efficiency and ensuring consistent product quality. However, traditional CNC production lines often rely on manual loading and unloading, limiting operational efficiency and scalability. Although automated loading systems have been developed, they frequently lack sufficient intelligence and configuration efficiency, requiring extensive setup adjustments for different products and impacting overall productivity. This research addresses the job shop scheduling problem (JSSP) in CNC machining environments, aiming to minimize total completion time (makespan) and maximize CNC machine utilization. We propose a novel approach using reinforcement learning (RL), specifically the Q-learning algorithm, to optimize scheduling decisions. The study simulates the JSSP, incorporating robotic arm operations, machine processing times, and work order demand allocation to determine optimal processing sequences. The Q-learning algorithm enhances machine utilization by dynamically balancing workloads across CNC machines, adapting to varying job demands and machine states. This approach offers robust solutions for complex manufacturing environments by automating decision-making processes for job assignments. Additionally, we evaluate various layout configurations to identify the most efficient setup. By integrating RL-based scheduling optimization with layout analysis, this research aims to provide a comprehensive solution for improving manufacturing efficiency and productivity in CNC-based job shops. The proposed method's adaptability and automation potential promise significant advancements in tackling dynamic manufacturing challenges.
Keywords: job shop scheduling problem, reinforcement learning, operations sequence, layout optimization, q-learning
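The tabular Q-learning update at the heart of such a scheduler fits in a few lines. The sketch below shows the standard update rule, Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)); the toy states, actions, and reward are stand-ins, not the paper's JSSP formulation:

```python
# Minimal tabular Q-learning step; states/actions/rewards here are toy
# stand-ins for the job-assignment decisions described in the abstract.
def q_update(q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One Q-learning update on a dict-of-dicts Q-table."""
    best_next = max(q[next_state].values())
    q[state][action] += alpha * (reward + gamma * best_next - q[state][action])

# Two machine-assignment actions per state, Q-table initialised to zero.
q = {s: {"machine_a": 0.0, "machine_b": 0.0} for s in ("idle", "busy")}
q_update(q, "idle", "machine_a", reward=1.0, next_state="busy")
print(q["idle"]["machine_a"])  # 0.1
```

In a JSSP setting the state would encode job queues and machine status, the action would be a job-to-machine assignment, and the reward would penalize makespan or idle time.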
Procedia PDF Downloads 24
1228 Viability of Irrigation Water Conservation Practices in the Low Desert of California
Authors: Ali Montazar
Abstract:
California and the Colorado River Basin are facing increasing uncertainty concerning water supplies. The Colorado River is the main source of irrigation water in the low desert of California. Currently, due to increasing water-use competition and long-term drought in the Colorado River Basin, efficient use of irrigation water is one of the highest conservation priorities in the region. This study aims to present some of the current irrigation technologies and management approaches in the low desert and to assess the viability and potential of these water management practices. The results of several field experiments are used to assess five water conservation practices: sub-surface drip irrigation, automated surface irrigation, sprinkler irrigation, a tail-water recovery system, and a deficit irrigation strategy. The preliminary results of several ongoing studies at commercial fields are presented, particularly research in alfalfa, sugar beet, kleingrass, sunflower, and spinach fields. The findings indicate that all these practices have significant potential to conserve water (an average of 1 ac-ft/ac) and to enhance the efficiency of water use (15-25%). Further work is needed to better understand the feasibility of each of these applications and to help maintain a profitable and sustainable agricultural production system in the low desert as water and labor costs and environmental concerns increase.
Keywords: automated surface irrigation, deficit irrigation, low desert of California, sprinkler irrigation, sub-surface drip irrigation, tail-water recovery system
Procedia PDF Downloads 158
1227 Optimization of a Convolutional Neural Network for the Automated Diagnosis of Melanoma
Authors: Kemka C. Ihemelandu, Chukwuemeka U. Ihemelandu
Abstract:
The incidence of melanoma has been increasing rapidly over the past two decades, making melanoma a current public health crisis. Unfortunately, even as screening efforts continue to expand in an effort to ameliorate the death rate from melanoma, there is a need to improve diagnostic accuracy and decrease misdiagnosis. Artificial intelligence (AI), a new frontier in patient care, has the ability to improve the accuracy of melanoma diagnosis. The convolutional neural network (CNN), a form of deep neural network most commonly applied to analyze visual imagery, has been shown to outperform the human brain in pattern recognition. However, there are noted limitations in the accuracy of CNN models. Our aim in this study was the optimization of convolutional neural network algorithms for the automated diagnosis of melanoma. We hypothesized that optimal selection of the momentum and batch hyperparameters increases model accuracy. Our most successful model developed during this study showed that optimal selection of a momentum of 0.25 and a batch size of 2 led to superior performance and a faster model training time, with an accuracy of ~83% after nine hours of training. We did notice a lack of diversity in the dataset used, with a noted class imbalance favoring lighter vs. darker skin tones. Training-set image transformations did not result in superior model performance in our study.
Keywords: melanoma, convolutional neural network, momentum, batch hyperparameter
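The momentum hyperparameter tuned here belongs to the classical SGD-with-momentum update. The sketch below shows that update rule in isolation, with the reported momentum of 0.25; the tiny quadratic "loss" and the learning rate are our stand-ins, since actual CNN training is far outside a few lines:

```python
# Hedged sketch of SGD with momentum (the hyperparameter the study tunes).
# The quadratic objective and learning rate are illustrative assumptions.
def sgd_momentum_step(w, v, grad, lr=0.1, momentum=0.25):
    """One parameter update: velocity accumulates a fraction of its history."""
    v = momentum * v - lr * grad
    return w + v, v

# Minimise f(w) = w^2 (gradient 2w) starting from w = 1.0.
w, v = 1.0, 0.0
for _ in range(100):
    w, v = sgd_momentum_step(w, v, grad=2 * w)
print(abs(w) < 1e-3)  # True: the iterate has converged toward the minimum
```

In a deep-learning framework the same roles are played by the optimizer's `momentum` argument and the data loader's batch size; the study's finding is that 0.25 and 2, respectively, worked best for its model.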
Procedia PDF Downloads 101
1226 Sampling and Chemical Characterization of Particulate Matter in a Platinum Mine
Authors: Juergen Orasche, Vesta Kohlmeier, George C. Dragan, Gert Jakobi, Patricia Forbes, Ralf Zimmermann
Abstract:
Underground mining poses a difficult environment for both man and machines. At more than 1000 meters underneath the surface of the earth, ores and other mineral resources are still extracted by conventional and motorised mining. Adding to the hazards caused by blasting and stone-chipping, the working conditions are best described by high temperatures of 35-40°C and high humidity, at low air exchange rates. Separate ventilation shafts lead fresh air into a mine, and others lead spent air back to the surface. This is essential for humans and machines working deep underground. Nevertheless, mines are widely ramified, so the air flow rate at the far end of a tunnel can be close to zero. In recent years, conventional mining has been supplemented by mining with heavy diesel machines. These very flat machines, called Load Haul Dump (LHD) vehicles, accelerate and ease work in areas favourable for heavy machines. On the other hand, they emit unfiltered diesel exhaust, which constitutes an occupational hazard for the miners. Combined with the low air exchange, high humidity, and inorganic dust from the mining, this leads to 'black smog' underneath the earth. This work focuses on the air quality in mines employing LHDs. We therefore performed personal sampling (samplers worn by miners during their work), stationary sampling, and aethalometer (MicroAeth MA200, AethLabs) measurements in a platinum mine at around 1000 meters under the earth's surface. We compared areas of high diesel exhaust emission with areas of conventional mining where no diesel machines were operated. For a better assessment of the health risks caused by air pollution, we applied a separated gas-/particle-sampling system, with a first denuder section collecting intermediate volatile organic compounds (IVOCs). These multi-channel silicone rubber denuders are able to trap IVOCs while allowing particles ranging from 10 nm to 1 µm in diameter to be transmitted with an efficiency of nearly 100%.
The second section is a quartz fibre filter collecting particles and adsorbed semi-volatile organic compounds (SVOCs). The third part is a graphitized carbon black adsorber collecting the SVOCs that evaporate from the filter. The compounds collected in these three sections were analyzed in our labs with different thermal desorption techniques coupled with gas chromatography and mass spectrometry (GC-MS). VOCs and IVOCs were measured with a Shimadzu thermal desorption unit (TD20, Shimadzu, Japan) coupled to a GCMS-QP2010 Ultra system with a quadrupole mass spectrometer (Shimadzu). The GC was equipped with a 30 m BP-20 wax column (0.25 mm ID, 0.25 µm film) from SGE (Australia). Filters were analyzed with in-situ derivatization thermal desorption gas chromatography time-of-flight mass spectrometry (IDTD-GC-TOF-MS). The IDTD unit is a modified GL Sciences Optic 3 system (GL Sciences, Netherlands). The results showed black carbon concentrations, measured with the portable aethalometers, of up to several mg per m³. The organic chemistry was dominated by very high concentrations of alkanes. Typical diesel engine exhaust markers such as alkylated polycyclic aromatic hydrocarbons were detected, as well as typical lubrication oil markers such as hopanes.
Keywords: diesel emission, personal sampling, aethalometer, mining
Procedia PDF Downloads 157
1225 Stability Analysis of DFIG Stator Powers Control Based on Sliding Mode Approach
Authors: Abdelhak Djoudi, Hachemi Chekireb, El Madjid Berkouk
Abstract:
The doubly fed induction generator (DFIG) has recently received important consideration in the integration of medium- and high-power wind energy conversion systems, due to its advantages compared to other generator types. Stator power sliding mode control (SPSMC) demonstrates great efficiency compared with other control laws and schemes. In the SPSMC laws elaborated by several authors, only the sliding-surface tracking conditions are established using Lyapunov functions, and the boundedness of the DFIG states is never treated. Some works have validated their approaches with experimental results for specific machines, but such verifications are insufficient to generalize to other machine ranges. Moreover, demonstrating the boundedness of the DFIG states is widely suggested in order to ensure that, when the SPSMC is applied, the states evolve within their tolerable bounds. Our objective in the present paper is to highlight the efficiency of the SPSMC through stability analysis. The boundedness of DFIG states such as the stator current and rotor flux is discussed. Moreover, the state trajectories are found using analytical proofs, taking the SPSMC gains into consideration.
Keywords: doubly fed induction generator (DFIG), stator power sliding mode control (SPSMC), Lyapunov function, stability, state boundedness, mathematical proofs of trajectories
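The kind of Lyapunov-based surface-tracking argument referred to here can be sketched generically as follows; this is the standard sliding-mode template, not the authors' specific surfaces, functions, or gains:

```latex
% Generic sliding-mode reaching condition (illustrative template only).
% With sliding surface $s$ and Lyapunov candidate
V(s) = \tfrac{1}{2}\, s^{2},
% the reaching condition
\dot{V} = s\,\dot{s} \le -\eta\,\lvert s \rvert, \qquad \eta > 0,
% guarantees finite-time convergence to the surface $s = 0$, reached in
t_r \le \frac{\lvert s(0) \rvert}{\eta}.
```

The abstract's point is that this argument alone certifies surface tracking but says nothing about whether internal states such as the stator current and rotor flux stay within tolerable bounds, which is the gap the paper addresses.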
Procedia PDF Downloads 400
1224 Field Production Data Collection, Analysis and Reporting Using Automated System
Authors: Amir AlAmeeri, Mohamed Ibrahim
Abstract:
Various data points are constantly being measured in the production system, and due to the nature of the wells, these data points, such as pressure, temperature, and water cut, fluctuate constantly, which requires high-frequency monitoring and collection. It is a very difficult task to analyze these parameters manually using spreadsheets and email. An automated system greatly enhances efficiency, reduces errors and the need for constant emails that take up disk space, and frees up time for the operator to perform other critical tasks. A large variety of production data is recorded in an oil field, and this huge volume of data can be seen as irrelevant by some, especially when viewed on its own with no context. In order to fully utilize all this information, it needs to be properly collected, verified, stored in one common place, and analyzed for surveillance and monitoring purposes. This paper describes how data is recorded by different parties and departments in the field and verified numerous times as it is being loaded into a repository. Once it is loaded, a final check is done before it is entered into a production monitoring system. Once all this is collected, various calculations are performed to report allocated production. Calculated production data is used to report field production automatically. It is also used to monitor well and surface facility performance. Engineers can use it for their studies and analyses to ensure the field is performing as it should, to predict and forecast production, and to monitor any changes in wells that could affect field performance.
Keywords: automation, oil production, Cheleken, exploration and production (E&P), Caspian Sea, allocation, forecast
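The repeated verification step described can be pictured as a range check run on each incoming record before it enters the repository. The sketch below is a hedged illustration: the field names and the allowed limits are our assumptions, not the operator's actual validation rules:

```python
# Minimal sketch of a pre-load validation step for one production record.
# Field names and limits are illustrative assumptions.
LIMITS = {
    "pressure_psi": (0.0, 5000.0),
    "temperature_c": (-10.0, 150.0),
    "water_cut_pct": (0.0, 100.0),
}

def validate(record):
    """Return the fields whose values are missing or out of range."""
    errors = []
    for field, (lo, hi) in LIMITS.items():
        value = record.get(field)
        if value is None or not lo <= value <= hi:
            errors.append(field)
    return errors

reading = {"pressure_psi": 1850.0, "temperature_c": 88.0, "water_cut_pct": 104.2}
print(validate(reading))  # ['water_cut_pct']
```

Records that fail such checks would be flagged for review rather than loaded, which is what removes the error-prone manual spreadsheet-and-email step.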
Procedia PDF Downloads 156
1223 Friend or Foe: Decoding the Legal Challenges Posed by Artificial Intelligence in the Era of Intellectual Property
Authors: Latika Choudhary
Abstract:
“The potential benefits of Artificial Intelligence are huge. So are the dangers.” - Dave Waters. Artificial intelligence is one facet of the information technology domain which, despite several attempts, does not have a clear definition or ambit. However, it can be understood as technology that solves problems via automated decisions and predictions. Artificial intelligence is essentially an algorithm-based technology that analyses large amounts of data and then solves problems by detecting useful patterns. Owing to its automated nature, it would not be wrong to say that humans and AI together have more utility than humans alone or computers alone. For many decades AI experienced enthusiasm as well as setbacks, yet it has today become part and parcel of our everyday life, making it convenient or, at times, problematic. AI and related technology intersect with intellectual property in multiple ways, the most important being AI technology for the management of intellectual property, IP for protecting AI, and IP as a hindrance to the transparency of AI systems. Thus the relationship between the two is one of reciprocity, as IP influences AI and vice versa. While AI is a recent concept, the IP laws for protecting it, or even for dealing with its challenges, are relatively older, raising the need for revision to keep pace with technological advancements. This paper will analyze the relationship between AI and IP to determine how beneficial or conflictual it is, address how the old concepts of IP are being stretched to their maximum limits so as to accommodate the unwanted consequences of artificial intelligence, and propose ways to mitigate the situation so that AI becomes the friend it is and does not turn into the potential foe it appears to be.
Keywords: intellectual property rights, information technology, algorithm, artificial intelligence
Procedia PDF Downloads 87