Search results for: automated cleaning machines
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1806

1056 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings

Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey

Abstract:

Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in the safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, lend to a pharmacovigilance monitoring process that is often tedious and time-consuming. Objective: Develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI from the sponsor’s safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs, based on Hy’s law criteria and the pattern of injury (R-value). These criteria excluded non-eligible cases from the monthly line listings mined from the Argus safety database. The capabilities and limitations of these criteria were verified by comparing a manual review of all monthly cases with the system-generated monthly listings over six months. Results: On average, over the six-month period, the algorithm accurately identified 92% of DILI cases meeting the established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes, as opposed to the multiple hours consumed by a cognitively laborious, manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests, naming conventions, and/or incomplete or incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved extremely useful in detecting potential DILI cases while heightening the vigilance of the drug safety department. Additionally, the algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). It also carries the potential for universal application, thanks to its product-agnostic data and keyword mining features. Plans for the tool include developing it into a fully automated application, thereby completely eliminating the manual screening process.
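As a rough illustration of the kind of screening logic the abstract describes, the sketch below applies the standard Hy's law and R-value definitions to a tabular line listing. The column names and data layout are assumptions for demonstration; the authors' actual implementation uses SQL against the Argus schema and is not reproduced here.

```python
# Illustrative sketch of DILI screening via Hy's law and the R-value
# pattern-of-injury calculation. The structure of `cases` is an assumed
# placeholder, not the authors' Argus schema.
import pandas as pd

def screen_dili(cases: pd.DataFrame) -> pd.DataFrame:
    """Flag cases meeting Hy's law or showing a hepatocellular pattern."""
    # R-value: (ALT / ULN_ALT) / (ALP / ULN_ALP); R >= 5 suggests
    # hepatocellular injury, R <= 2 cholestatic, in between mixed.
    r_value = (cases["alt"] / cases["alt_uln"]) / (cases["alp"] / cases["alp_uln"])
    # Hy's law: ALT >= 3x ULN together with total bilirubin >= 2x ULN.
    hys_law = (cases["alt"] >= 3 * cases["alt_uln"]) & (
        cases["bilirubin"] >= 2 * cases["bili_uln"]
    )
    out = cases.assign(r_value=r_value, hys_law=hys_law)
    return out[out["hys_law"] | (out["r_value"] >= 5)]
```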

Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing

Procedia PDF Downloads 143
1055 Weight Comparison of Oil and Dry Type Distribution Transformers

Authors: Murat Toren, Mehmet Çelebi

Abstract:

Reducing the weight of transformers while maintaining good performance, reducing cost, and increasing efficiency is important. Weight is one of the most significant factors in all electrical machines, and as such, many transformer design parameters are related to weight calculations. This study presents a comparison of the weight of oil-type and dry-type distribution transformers. Oil-type transformers are mainly used in industry; however, dry-type transformers have become more widespread in recent years. MATLAB was used for designing the transformers and their design parameters (rated voltages, core loss, etc.), along with a design in ANSYS Maxwell. In line with other studies, this study found that the dry-type transformer option is limited. The 50 kVA distribution transformers commonly used in industry were therefore designed as both oil-type and dry-type and compared in terms of weight. The current preference for low-cost oil-type transformers could change if the costs of dry-type transformers became more competitive. The aim of this study was to compare the weight of transformers, which is a substantial cost factor, and to provide an evaluation of the prospects for increasing the use of dry-type transformers.

Keywords: weight, optimization, oil-type transformers, dry-type transformers

Procedia PDF Downloads 341
1054 A Survey on Ambient Intelligence in Agricultural Technology

Authors: C. Angel, S. Asha

Abstract:

Despite the advances made in various new technologies, applying them to agriculture remains a formidable task, as it involves integrating diverse domains to monitor the different processes involved in agricultural management. Advances in ambient intelligence technology represent one of the most powerful means of increasing the yield of agricultural crops, mitigating the impact of water scarcity and climate change, and improving methods for managing pests, weeds, and diseases. This paper proposes a GPS-assisted, machine-to-machine solution that combines information collected by multiple sensors for the automated management of paddy crops. To maintain the economic viability of paddy cultivation, the various techniques used in agriculture are discussed, and a novel system based on ambient intelligence techniques is proposed. The ambient-intelligence-based agricultural system offers great scope for further development.

Keywords: ambient intelligence, agricultural technology, smart agriculture, precise farming

Procedia PDF Downloads 593
1053 Model Based Optimization of Workplace Ergonomics by Workpiece and Resource Positioning

Authors: Edward Hage, Pieter Lietaert, Gabriel Abedrabbo

Abstract:

Musculoskeletal disorders are an important category of work-related diseases. They are often caused by working in non-ergonomic postures and are preventable with proper workplace design, possibly including human-machine collaboration. This paper presents a methodology and a supporting software prototype to design a simple assembly cell with minimal ergonomic risk. The methodology helps to determine the optimal position and orientation of workpieces and workplace resources for specific operator assembly actions. The methodology is tested on an industrial use case: a collaborative robot (cobot) assisted assembly of a clamping device. It is shown that the automated methodology results in a workplace design with significantly reduced ergonomic risk to the operator compared to a manual design of the cell.

Keywords: ergonomics optimization, design for ergonomics, workplace design, pose generation

Procedia PDF Downloads 114
1052 Conceptual Design of Gravity Anchor Focusing on Anchor Towing and Lowering

Authors: Vinay Kumar Vanjakula, Frank Adam, Nils Goseberg

Abstract:

Wind power is one of the leading renewable energy generation methods. Due to the abundant higher wind speeds far away from shore, the construction of offshore wind turbines began in the last decades. However, the installation of offshore foundation-based (monopile) wind turbines in deep waters is often associated with technical and financial challenges. To overcome such challenges, the concept of floating wind turbines, adopted from the oil and gas industry, has been expanded. In this research work, a universal heavyweight gravity anchor (UGA) is developed for floating Tension Leg Platform (TLP) sub-structures. It is funded by the German Federal Ministry of Education and Research within a three-year (2019-2022) research program called “Offshore Wind Solutions Plus (OWSplus) - Floating Offshore Wind Solutions Mecklenburg-Vorpommern,” carried out by a group of German institutions (universities, laboratories, and consulting companies). This part of the project focuses on the numerical modeling of the gravity anchor, which involves analyzing and solving fluid flow problems. In contrast to gravity-based torpedo anchors, these UGAs will be towed and lowered by controlled machines (tug boats) at lower speeds. This kind of UGA installation is new to the offshore wind industry, particularly for TLPs, and very few research works have been carried out on it in recent years. Conventional methods for transporting an anchor require a large transportation crane vessel, which involves great cost. The conceptual UGA consists of ballasting chambers that exploit buoyancy forces: the chambers are filled with just the amount of water that lets the anchor float for towing. After reaching the installation site, the chambers are ballasted with water for lowering. After its lifetime, the UGA can be unballasted (for recovery or replacement), causing it to rise to the sea surface by itself; the buoyancy chambers thus give the advantage of using a UGA without the need for heavy machinery. However, while being lowered toward or raised away from the seabed, the UGA experiences a harsh marine environment due to the interaction of waves and currents. This leads to the anchor drifting from the desired installation position and to damage to the lowering machines. To overcome these problems, a numerical model is built to investigate the influence of different outer contours and other flow-governing shapes that can be installed on the UGA to mitigate the turbulence and drifting. The presentation will highlight the importance of the Computational Fluid Dynamics (CFD) numerical model in OpenFOAM, an open-source software.
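The ballasting principle described above can be illustrated with a simple buoyancy balance: the anchor floats while its total weight is below the weight of the displaced water and sinks once the chambers are flooded. A minimal sketch, with all masses, volumes, and densities as illustrative assumptions rather than project data:

```python
# Back-of-the-envelope model of the ballasting chambers: positive net
# buoyancy means the anchor floats (towing); negative means it sinks
# (lowering). All figures are illustrative assumptions.
RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2

def net_buoyant_force(structure_mass_kg: float,
                      displaced_volume_m3: float,
                      chamber_volume_m3: float,
                      fill_fraction: float) -> float:
    """Buoyancy minus weight, in newtons, for a given chamber fill level."""
    ballast_mass = fill_fraction * chamber_volume_m3 * RHO_SEAWATER
    weight = (structure_mass_kg + ballast_mass) * G
    buoyancy = RHO_SEAWATER * displaced_volume_m3 * G
    return buoyancy - weight

# Example: a 400 t anchor displacing 500 m^3, with 150 m^3 of chambers.
print(net_buoyant_force(400e3, 500.0, 150.0, 0.0))  # unballasted: > 0, floats
print(net_buoyant_force(400e3, 500.0, 150.0, 1.0))  # fully ballasted: < 0, sinks
```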

Keywords: anchor lowering, towing, waves, currents, computational fluid dynamics

Procedia PDF Downloads 158
1051 Performance Analysis of Artificial Neural Network Based Land Cover Classification

Authors: Najam Aziz, Nasru Minallah, Ahmad Junaid, Kashaf Gul

Abstract:

Land cover classification using automated classification techniques on remotely sensed multi-spectral imagery is one of the promising areas of research. Different land conditions at different times are captured by satellite and monitored by applying different classification algorithms in a specific environment. In this paper, a SPOT-5 image provided by SUPARCO was studied and classified in ENVI (Environment for Visualizing Images), a tool widely used in remote sensing. An Artificial Neural Network (ANN) classification technique was then used to detect land cover changes in Abbottabad district. The obtained results were compared with a pixel-based Mahalanobis distance classifier. The results show that the ANN gives a better overall accuracy of 99.20% and a Kappa coefficient of 0.98 compared to the Mahalanobis distance classifier.
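A minimal sketch of such an ANN classification with overall accuracy and Kappa scoring, using scikit-learn stand-ins; the synthetic four-band data below is an assumption in place of the actual SPOT-5 imagery processed in ENVI:

```python
# ANN land-cover classification sketch: synthetic multi-spectral pixels
# (4 bands, 5 classes) stand in for the real satellite data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

X, y = make_classification(n_samples=5000, n_features=4, n_informative=4,
                           n_redundant=0, n_classes=5, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
ann.fit(X_tr, y_tr)
pred = ann.predict(X_te)
print("overall accuracy:", accuracy_score(y_te, pred))
print("kappa coefficient:", cohen_kappa_score(y_te, pred))
```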

Keywords: landcover classification, artificial neural network, remote sensing, SPOT 5

Procedia PDF Downloads 529
1050 Using AI for Analysing Political Leaders

Authors: Shuai Zhao, Shalendra D. Sharma, Jin Xu

Abstract:

This research uses advanced machine learning models to test a number of hypotheses regarding political executives. Specifically, it analyses the impact these powerful leaders have on economic growth, using leader data from the Archigos database from 1835 to the end of 2015. The data are processed with AutoGluon, an automated machine learning (AutoML) framework developed by Amazon, which can automatically extract features from the data and then train multiple classifiers on them. A linear regression model and a classification model are used to establish the relationship between leaders and economic growth (GDP per capita growth) and to clarify the relationship between leaders’ characteristics and economic growth from a machine learning perspective. Our work may serve as a model for collaboration between the fields of statistics and artificial intelligence (AI) and light the way for political researchers and economists.
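A minimal sketch of the AutoGluon workflow described above; the file name and column names are hypothetical placeholders, as the actual schema of the Archigos-derived data is not given in the abstract:

```python
# AutoGluon regression sketch: the label column and CSV export are assumed
# placeholders for the leader-level features used in the study.
import pandas as pd
from autogluon.tabular import TabularPredictor

data = pd.read_csv("leaders.csv")  # hypothetical export of leader features
train = data.sample(frac=0.8, random_state=0)
test = data.drop(train.index)

# AutoGluon preprocesses features and trains/ensembles multiple models
# automatically; here the target is GDP-per-capita growth.
predictor = TabularPredictor(label="gdp_pc_growth", problem_type="regression")
predictor.fit(train_data=train)
print(predictor.leaderboard(test))
print(predictor.feature_importance(test))
```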

Keywords: comparative politics, political executives, leaders’ characteristics, artificial intelligence

Procedia PDF Downloads 78
1049 [Keynote Talk]: Evidence Fusion in Decision Making

Authors: Mohammad Abdullah-Al-Wadud

Abstract:

In the current era of automation and artificial intelligence, systems increasingly depend on the decision-making capabilities of machines. Such systems and applications range from simple classifiers to sophisticated surveillance systems based on traditional sensors and related equipment, which are becoming more common in the Internet of Things (IoT) paradigm. However, the available data for such problems are usually imprecise and incomplete, which leads to uncertainty in decisions made with traditional probability-based classifiers. This calls for a robust fusion framework to combine the available information sources with some degree of certainty. The theory of evidence provides such a method for combining evidence from different (possibly unreliable) sources or observers. This talk addresses the employment of the Dempster-Shafer theory of evidence in some practical applications.
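A small sketch of Dempster's rule of combination, the core operation of the theory of evidence, for two sources over a binary frame of discernment; the sources and mass values are illustrative, not taken from the talk:

```python
# Dempster's rule of combination for two mass functions whose focal
# elements are frozensets over the frame {intrusion, no_intrusion}.
from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Combine two mass functions; conflicting mass K is renormalized away."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

theta = frozenset({"intrusion", "no_intrusion"})
sensor = {frozenset({"intrusion"}): 0.6, theta: 0.4}
camera = {frozenset({"intrusion"}): 0.7, frozenset({"no_intrusion"}): 0.1, theta: 0.2}
print(combine(sensor, camera))
```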

Keywords: decision making, dempster-shafer theory, evidence fusion, incomplete data, uncertainty

Procedia PDF Downloads 415
1048 Retraction Free Motion Approach and Its Application in Automated Robotic Edge Finishing and Inspection Processes

Authors: M. Nemer, E. I. Konukseven

Abstract:

In this paper, a motion generation algorithm for a six-degrees-of-freedom (DoF) robotic hand in a static environment is presented. The method was developed to generate end-effector paths for edge finishing and inspection processes by utilizing the CAD model of the considered workpiece; nonetheless, the proposed algorithm may be extended to other similar manufacturing processes. A software package programmed against the application programming interface (API) of SolidWorks generates tool path data for the robot. The proposed method significantly simplifies the given problem, resulting in a reduction in the CPU time needed to generate the path, and offers an efficient overall solution. The ABB IRB2000 robot is chosen for executing the generated tool path.

Keywords: CAD-based tools, edge deburring, edge scanning, offline programming, path generation

Procedia PDF Downloads 280
1047 Exergetic and Life Cycle Assessment Analyses of Integrated Biowaste Gasification-Combustion System: A Study Case

Authors: Anabel Fernandez, Leandro Rodriguez-Ortiz, Rosa RodríGuez

Abstract:

Due to the negative impact of fossil fuels, renewable energies are promising sources for limiting the global temperature rise and damage to the environment, and technology development is focused on obtaining energy products from renewable sources. In this study, a thermodynamic model including an exergy balance and a subsequent Life Cycle Assessment (LCA) were carried out for four subsystems of the integrated gasification-combustion of pinewood. The results of the exergy analysis and LCA showed the feasibility of the process in terms of exergy efficiency and global energy efficiency of the life cycle (GEELC). Moreover, the energy return on investment (EROI) index was calculated. The global exergy efficiency was 67%; for the pretreatment, reaction, cleaning, and electric generation subsystems, the results were 85, 59, 87, and 29%, respectively. The LCA results indicated that the emissions from electric generation caused the most damage to the atmosphere, water, and soil. The GEELC was 31.09% for the global process, which suggests the environmental feasibility of an integrated gasification-combustion system. The EROI was 3.15, which indicates the sustainability of the process.
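The headline EROI figure can be restated with the standard definition (energy delivered divided by energy invested); the absolute energy values below are illustrative assumptions chosen only to reproduce the reported 3.15:

```python
# EROI as a simple ratio; the magnitudes are illustrative, not study data.
energy_delivered_MJ = 315.0   # hypothetical useful energy over the life cycle
energy_invested_MJ = 100.0    # hypothetical energy spent to obtain it

eroi = energy_delivered_MJ / energy_invested_MJ
print(f"EROI = {eroi:.2f}")   # 3.15 > 1: the process is a net energy source

# Reported subsystem exergy efficiencies, as given in the abstract:
for name, eta in [("pretreatment", 0.85), ("reaction", 0.59),
                  ("cleaning", 0.87), ("electric generation", 0.29)]:
    print(f"{name:20s} {eta:.0%}")
```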

Keywords: exergy analysis, life cycle assessment (LCA), renewability, sustainability

Procedia PDF Downloads 201
1046 Mobile Agents-Based Framework for Dynamic Resource Allocation in Cloud Computing

Authors: Safia Rabaaoui, Héla Hachicha, Ezzeddine Zagrouba

Abstract:

Nowadays, cloud computing is becoming an ever more popular technology for companies and consumers, who benefit from its increased efficiency, cost optimization, data security, unlimited storage capacity, etc. One of the biggest challenges of cloud computing is resource allocation, whose efficiency directly influences the performance of the whole cloud environment; an effective method to address this critical issue and increase cloud performance is therefore needed. This paper proposes a mobile agents-based framework for dynamic resource allocation in cloud computing that minimizes both the cost of using virtual machines and the makespan. Furthermore, its impact on the best response time and power consumption has been studied. The simulation showed that our method gives better results than existing approaches.

Keywords: cloud computing, multi-agent system, mobile agent, dynamic resource allocation, cost, makespan

Procedia PDF Downloads 88
1045 Dry Relaxation Shrinkage Prediction of Bordeaux Fiber Using a Feed Forward Neural Network

Authors: Baeza S. Roberto

Abstract:

Knitted fabric suffers dimensional deformation due to stretching and tension factors, transverse and longitudinal respectively, during production on rectilinear knitting machines; it therefore undergoes a dry relaxation shrinkage procedure and a thermal prefixing action to obtain stable conditions in the knit. This paper presents a dry relaxation shrinkage prediction for Bordeaux fiber using feed-forward neural network and linear regression models. Six operational alternatives of shrinkage were predicted, with different repose conditions included. A comparison of the results found that the neural network models explain more of the variability and predict better. The models were obtained with the MATLAB neural network toolbox and Minitab software, using real data from a knitting company in southern Guanajuato. The results make it possible to predict the dry relaxation shrinkage of each operational alternative.
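A minimal sketch of the comparison described above, a feed-forward network versus linear regression for shrinkage prediction; the synthetic features and the nonlinear target are assumptions standing in for the company's real data:

```python
# Feed-forward NN vs. linear regression on a synthetic shrinkage-like
# target with a nonlinear term, to show why the NN can explain more variance.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(600, 3))   # e.g., tension, stitch length, repose time
y = 2.0 * X[:, 0] ** 2 - X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.05, 600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("linear regression", LinearRegression()),
                    ("feed-forward NN", MLPRegressor((16, 8), max_iter=2000,
                                                     random_state=0))]:
    model.fit(X_tr, y_tr)
    print(name, "R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
```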

Keywords: neural network, dry relaxation, knitting, linear regression

Procedia PDF Downloads 569
1044 3D Scanning Documentation and X-Ray Radiography Examination for Ancient Egyptian Canopic Jar

Authors: Abdelrahman Mohamed Abdelrahman

Abstract:

Canopic jars are funerary vessels used by the ancient Egyptians in the mummification process to hold the viscera of the mummified body after they were extracted from the body and treated. Canopic jars were made of several types of materials, such as limestone, alabaster, and pottery. The studied canopic jar dates back to the Late Period and is located in the Grand Egyptian Museum (GEM), Giza, Egypt. This jar is carved from limestone with carved hieroglyphic inscriptions, and it was filled and closed from the inside with mortar. Several forms of damage appeared on the jar, such as dust, dirt, calcification, a wide crack, and weakness of the limestone. In this study, we used modern documentation and investigation techniques, 3D scanning and X-ray radiography, to document and examine the jar. X-ray imaging showed that the mortar does not reach the base of the jar's interior, indicating that it was placed at a time when the jar probably still contained viscera. Through three-dimensional scanning, the jar was documented and a 3D model was obtained, so that any part of the jar can now be viewed on the computer in all its details. Conservation procedures were then applied with high accuracy, including mechanical, wet, and chemical cleaning; filling the wide crack in the body of the jar with a mortar consisting of calcium carbonate powder mixed with Primal E330 S; and consolidation with Paraloid B72 at 2% concentration, after which the limestone became strong.

Keywords: vessel, limestone, canopic jar, mortar, 3D scanning, X-ray radiography

Procedia PDF Downloads 63
1043 The Mechanical Properties of a Small-Size Seismic Isolation Rubber Bearing for Bridges

Authors: Yi F. Wu, Ai Q. Li, Hao Wang

Abstract:

Taking a novel type of bridge bearing with a diameter of 100 mm as an example, theoretical analysis, experimental research, and numerical simulation of the bearing were conducted. Since normal compression-shear machines cannot be applied to small-size bearings, an improved device to test the properties of the bearing was proposed and fabricated. In addition, the bearing was simulated with the explicit finite element software ANSYS/LS-DYNA, and some parameters of the bearing were modified in the finite element model to effectively reduce the computation cost. Results show that all of these research methods are capable of revealing the fundamental properties of small-size bearings, and a combined use of them can better capture both the integral properties and the detailed internal mechanical behavior of the bearing.

Keywords: ANSYS/LS-DYNA, compression shear, contact analysis, explicit algorithm, small-size

Procedia PDF Downloads 169
1042 Development on the Model Driven Architecture

Authors: Sahar Shahsavaripour Ghazanfarpour

Abstract:

As our daily life depends on the quality of the services provided by the systems and devices in our environment, the education and modeling of software quality are important. With the daily growth of software systems and their intensive use, improving the development process and evaluating requirements at the early stages of development, especially at the architecture level, become more important. Model-driven architecture transforms a platform-independent model into several platform-specific models, with the purpose of reducing the number of software changes on the way to an executable model. The software engineering design process is thus semi-automated. The quality attributes needed in architecture design, and their representation, are captured in the architecture models. The main problem is the relationship between the requirements and the elements, in some respects involving implicit models and input sources in the process, because there is no detection capability. The MARTE profile is used to describe real-time properties and to perform platform modeling.

Keywords: MDA, DW, OMG, UML, AKB, software architecture, ontology, evaluation

Procedia PDF Downloads 485
1041 Comparison of Machine Learning and Deep Learning Algorithms for Automatic Classification of 80 Different Pollen Species

Authors: Endrick Barnacin, Jean-Luc Henry, Jimmy Nagau, Jack Molinie

Abstract:

Palynology is a field of interest in many disciplines due to its multiple applications: chronological dating, climatology, allergy treatment, and honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. That is why the need for automating this task is urgent. Many studies have investigated the subject using different standard image-processing descriptors and sometimes hand-crafted ones. In this work, we make a comparative study between classical feature extraction methods (shape, GLCM, LBP, and others) and deep learning (CNNs, autoencoders, transfer learning) on a recognition task over 80 regional pollen species. It was found that transfer learning is more precise than the other approaches.
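A minimal transfer-learning sketch in the spirit of the study's best-performing approach: a frozen ImageNet backbone with a new classification head for 80 pollen classes. The backbone choice, image size, and data pipeline are assumptions, not the authors' configuration:

```python
# Transfer learning sketch: frozen MobileNetV2 features, new softmax head.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 80
base = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3),
                                         include_top=False, weights="imagenet")
base.trainable = False  # keep the pretrained features frozen

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # hypothetical datasets
```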

Keywords: pollen identification, feature extraction, pollen classification, automated palynology

Procedia PDF Downloads 124
1040 Authentic Connection between the Deity and the Individual Human Being Is Vital for Psychological, Biological, and Social Health

Authors: Sukran Karatas

Abstract:

Authentic energy network interrelations between the Creator and the creations, as well as among the creations themselves, are the most important points for the worlds of physics and metaphysics to unite and work in harmony within human beings. Human beings, on the other hand, have the ability to choose their own lifestyle voluntarily; this includes the automated, involuntary working systems of spirit, soul, and body, together with voluntary actions involving personal, cultural, and universal, rational or irrational, variable values. Therefore, it is necessary for human beings to know the methods of establishing authentic energy network connections to be able to communicate, correlate, and accommodate the physical and metaphysical entities as a properly functioning unity; this is essential for complete psychological, biological, and social well-being. Authentic knowledge is necessary for human beings to verify the position of the self within the self and with others, and to regulate conscious and voluntary actions accordingly in order to prevent oppression and friction within the self and between the self and others. Unfortunately, the absence of genuine individual and universal basic knowledge about how to establish an authentic energy network connection within the self, with the deity, and with the environment is the most problematic issue, even in the twenty-first century. The second most problematic issue is how to maintain freedom, equality, and justice among human beings during these strictly interwoven network connections, which naturally involve the physical, metaphysical, and behavioral actions of the self and others. The third and probably most complicated problem is the scientific identification and authentication of the deity, which not only gives choosers full power and control to set their life orders but also establishes perfect physical and metaphysical links as a fully coordinated, functional energy network. This indicates that choosing an authentic deity is the key point that influences automated, emotional, and behavioral actions altogether, shaping human perception, personal actions, and life orders. Therefore, we consider the existing ‘four types of energy wave end boundary behaviors’: free-end and fixed-end boundary behaviors, as well as boundary behaviors from a denser medium to a less dense medium and from a less dense medium to a denser medium. Consequently, this article aims to demonstrate that the authentication and choice of deity have an important effect on individual psychological, biological, and social health. It is hoped that it will encourage new research in the field of authentic energy network connections, to establish the best position and the most correct interrelations with the self and others without violating one another's authorized orders and borders, so as to live happier and healthier lives together. In addition, the book ‘Deity and Freedom, Equality, Justice in History, Philosophy, Science’ contains more detailed information for those interested in this subject.

Keywords: deity, energy network, power, freedom, equality, justice, happiness, sadness, hope, fear, psychology, biology, sociology

Procedia PDF Downloads 339
1039 A Machine Learning Approach to Detecting Evasive PDF Malware

Authors: Vareesha Masood, Ammara Gul, Nabeeha Areej, Muhammad Asif Masood, Hamna Imran

Abstract:

The universal use of PDF files has prompted hackers to use them for malicious purposes by hiding malicious code in the PDF files delivered to their victims’ machines. Machine learning has proven to be highly efficient at identifying benign files and detecting files carrying PDF malware. This paper proposes an approach using a decision tree classifier with tuned parameters. A modern, inclusive dataset, CIC-Evasive-PDFMal2022, produced by Lockheed Martin’s cyber security wing, is used; it is one of the most reliable datasets in this field. We designed a PDF malware detection system that achieved 99.2% accuracy. Compared with other cutting-edge models in the same field of study, the suggested model performs strongly in detecting PDF malware. Accordingly, this paper provides a fast, reliable, and efficient PDF malware detection approach.
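A minimal sketch of a decision-tree PDF-malware classifier of the kind described; the CSV file name, feature layout, and tree parameters are placeholder assumptions rather than the paper's tuned configuration:

```python
# Decision-tree classification sketch over static PDF features; the file
# and column names are assumed placeholders for the dataset's real layout.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("PDFMal2022.csv")          # hypothetical local export
X = df.drop(columns=["Class"])              # static PDF features
y = df["Class"]                             # benign / malicious label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=42)
clf = DecisionTreeClassifier(max_depth=12, min_samples_leaf=5, random_state=42)
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```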

Keywords: PDF, PDF malware, decision tree classifier, random forest classifier

Procedia PDF Downloads 80
1038 Automated 3D Segmentation System for Detecting Tumor and Its Heterogeneity in Patients with High Grade Ovarian Epithelial Cancer

Authors: Dimitrios Binas, Marianna Konidari, Charis Bourgioti, Lia Angela Moulopoulou, Theodore Economopoulos, George Matsopoulos

Abstract:

High-grade ovarian epithelial cancer (OEC) is a fatal gynecological cancer, and the poor prognosis of this entity is closely related to considerable intratumoral genetic heterogeneity. By examining imaging data, it is possible to assess the heterogeneity of tumorous tissue. This study proposes a methodology for aligning, segmenting, and finally visualizing information from various magnetic resonance imaging series in order to construct 3D models of heterogeneity maps of the same tumor in OEC patients. The proposed system may be used as an adjunct digital tool by health professionals for personalized medicine, as it allows an easy visual assessment of the heterogeneity of the examined tumor.
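One simple way to turn a segmented MR volume into a heterogeneity map is voxel-wise local intensity variance inside the tumor mask; the sketch below is a generic stand-in for illustration, not the authors' registration and segmentation pipeline:

```python
# Local-variance heterogeneity map over a synthetic 3D volume; the real
# system works on co-registered multi-series MR data.
import numpy as np
from scipy.ndimage import generic_filter

def heterogeneity_map(volume: np.ndarray, mask: np.ndarray,
                      window: int = 3) -> np.ndarray:
    """Voxel-wise local intensity variance, restricted to the tumor mask."""
    local_var = generic_filter(volume.astype(float), np.var, size=window)
    return np.where(mask, local_var, 0.0)

rng = np.random.default_rng(0)
vol = rng.normal(100, 5, size=(32, 32, 32))   # background tissue
mask = np.zeros_like(vol, dtype=bool)
mask[8:24, 8:24, 8:24] = True                 # synthetic 'tumor' region
vol[mask] += rng.normal(0, 20, size=mask.sum())  # heterogeneous core
hmap = heterogeneity_map(vol, mask)
print("mean heterogeneity inside tumor:", hmap[mask].mean())
```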

Keywords: image segmentation, ovarian epithelial cancer, quantitative characteristics, image registration, tumor visualization

Procedia PDF Downloads 195
1037 A Review of In-Vehicle Network for Cloud Connected Vehicle

Authors: Hanbhin Ryu, Ilkwon Yun

Abstract:

The automotive industry aims to improve safety and convenience by realizing fully autonomous vehicles. Toward fully automated driving, current vehicles already feature a variety of advanced driver assistance systems (ADAS) for safety and infotainment systems for the driver’s convenience. This paper presents the Cloud Connected Vehicle (CCV), which connects vehicles with a cloud data center via the access network to control the vehicle as the next form of autonomous driving, and describes its features. It also describes the shortcomings of the existing in-vehicle network (IVN) as a next-generation IVN for the CCV and organizes the research issues related to IEEE 802.3 Ethernet, the candidate next-generation IVN, to verify the feasibility of using Ethernet. Finally, the paper addresses additional considerations in adopting an Ethernet-based IVN for the CCV.

Keywords: autonomous vehicle, cloud connected vehicle, ethernet, in-vehicle network

Procedia PDF Downloads 467
1036 Condition Monitoring for Twin-Fluid Nozzles with Internal Mixing

Authors: C. Lanzerstorfer

Abstract:

Liquid sprays of water are frequently used in air pollution control for gas cooling purposes and for gas cleaning. Twin-fluid nozzles with internal mixing are often used for these purposes because of the small size of the drops produced. In these nozzles the liquid is dispersed by compressed air or another pressurized gas. In high efficiency scrubbers for particle separation, several nozzles are operated in parallel because of the size of the cross section. In such scrubbers, the scrubbing water has to be re-circulated. Precipitation of some solid material can occur in the liquid circuit, caused by chemical reactions. When such precipitations are detached from the place of formation, they can partly or totally block the liquid flow to a nozzle. Due to the resulting unbalanced supply of the nozzles with water and gas, the efficiency of separation decreases. Thus, the nozzles have to be cleaned if a certain fraction of blockages is reached. The aim of this study was to provide a tool for continuously monitoring the status of the nozzles of a scrubber based on the available operation data (water flow, air flow, water pressure and air pressure). The difference between the air pressure and the water pressure is not well suited for this purpose, because the difference is quite small and therefore very exact calibration of the pressure measurement would be required. Therefore, an equation for the reference air flow of a nozzle at the actual water flow and operation pressure was derived. This flow can be compared with the actual air flow for assessment of the status of the nozzles.
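The monitoring principle can be sketched as follows: the measured total air flow is compared with a reference air flow predicted for clean nozzles at the actual operating point. The fitted correlation and threshold below are placeholder assumptions; the paper derives its own nozzle-specific equation, which is not reproduced here:

```python
# Condition-monitoring sketch: flag a deviation of measured air flow from
# the clean-nozzle reference. The linear correlation is an assumed stand-in.
def reference_air_flow(water_flow, air_pressure, coeffs=(50.0, -0.8, 12.0)):
    """Placeholder fitted correlation for one clean nozzle (arbitrary units)."""
    c0, c1, c2 = coeffs
    return c0 + c1 * water_flow + c2 * air_pressure

def blockage_indicator(measured_air_flow, water_flow, air_pressure,
                       n_nozzles, tol=0.05):
    """Relative deviation of total air flow from the clean-nozzle reference.

    When a nozzle's water feed is blocked, that nozzle passes more air at the
    same pressure, so a sustained positive deviation suggests blockages."""
    expected = n_nozzles * reference_air_flow(water_flow / n_nozzles, air_pressure)
    deviation = measured_air_flow / expected - 1.0
    return deviation if abs(deviation) > tol else 0.0

print(blockage_indicator(measured_air_flow=1650.0, water_flow=40.0,
                         air_pressure=2.5, n_nozzles=12))
```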

Keywords: condition monitoring, dual flow nozzles, flow equation, operation data

Procedia PDF Downloads 260
1035 Generation of Knowledge with Self-Learning Methods for Ophthalmic Data

Authors: Klaus Peter Scherer, Daniel Knöll, Constantin Rieder

Abstract:

Problem and Purpose: Intelligent systems are available and helpful for supporting human decision processes, especially when complex surgical eye interventions are necessary and must be performed. Normally, such a decision support system consists of a knowledge-based module, which is responsible for the real assistance power, provided by explanation and logical reasoning processes. The interview-based acquisition and generation of the complex knowledge itself is crucial, because there are different correlations between the complex parameters. Therefore, in this project, (semi-)automated self-learning methods are researched and developed to enhance the quality of such a decision support system. Methods: For ophthalmic data sets of real patients in a hospital, advanced data mining procedures are very helpful. In particular, subgroup analysis methods are developed, extended, and used to analyze and find the correlations and conditional dependencies between the structured patient data. After causal dependencies are found, a ranking must be performed for the generation of rule-based representations. For this, anonymized patient data are transformed into a special machine-readable format. The imported data are used as input to conditional probability algorithms that calculate the parameter distributions with respect to a given goal parameter. Results: In the field of knowledge discovery, advanced methods and applications were applied to produce operation- and patient-related correlations. New knowledge was generated by finding causal relations between the operational equipment, the medical instances, and the patient-specific history via a dependency ranking process. After transformation into association rules, logic-based representations were available for the clinical experts to evaluate the new knowledge. The structured data sets take account of about 80 parameters as characteristic features per patient. For patient groups of different sizes (100, 300, 500), both single-target and multi-target values were set for the subgroup analysis, so the newly generated hypotheses could be interpreted with regard to their dependence on the number of patients. Conclusions: The aim and advantage of such a semi-automatic self-learning process is the extension of the knowledge base through newly found parameter correlations. The discovered knowledge is transformed into association rules and serves as the rule-based representation of the knowledge in the knowledge base. Moreover, more than one goal parameter of interest can be considered by the semi-automated learning process. With ranking procedures, the strongest premises and the conjunctively associated conditions can be found for concluding the goal parameter of interest. In this way, knowledge hidden in structured tables or lists can be extracted as a rule-based representation. This is a real assistance power for the communication with the clinical experts.

Keywords: expert system, knowledge-based support, ophthalmic decision support, self-learning methods

Procedia PDF Downloads 248
1034 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, a limitation on the time budget provided for quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For test case generation, this approach exploits historical data from test automation projects. The identified patterns are the foundation for predicting the implementation of unknown test case specifications. With this support, a test engineer only has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge in the form of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small real-world systems. The most prominent EC is ‘subset accuracy’. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still 60%, which is better than current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
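A minimal sketch of the multi-label setup described above, mapping test-step texts to sets of automation components and scoring with subset accuracy; the example steps, component names, and the TF-IDF/logistic-regression pipeline are illustrative assumptions:

```python
# Multi-label text classification sketch: test steps -> component sets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.metrics import accuracy_score  # on indicator labels = subset accuracy
from sklearn.pipeline import make_pipeline

steps = ["press ignition button", "check speed signal on bus",
         "press brake pedal", "check warning lamp status",
         "press ignition button and check lamp"]
components = [{"ButtonDriver"}, {"BusReader"}, {"PedalDriver"},
              {"LampReader"}, {"ButtonDriver", "LampReader"}]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(components)
clf = make_pipeline(TfidfVectorizer(),
                    OneVsRestClassifier(LogisticRegression(max_iter=1000)))
clf.fit(steps, Y)
pred = clf.predict(["press brake pedal and check lamp"])
print(mlb.inverse_transform(pred))            # predicted component set
print("subset accuracy:", accuracy_score(Y, clf.predict(steps)))
```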

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 118
1033 A High-Level Co-Evolutionary Hybrid Algorithm for the Multi-Objective Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a hybrid distributed algorithm is suggested for the multi-objective job shop scheduling problem. Many new approaches are used in the design steps of the distributed algorithm. The co-evolutionary structure of the algorithm and the competition between different communicating hybrid algorithms, which are executed simultaneously, lead to an efficient search. Using several machines to distribute the algorithms, at the iteration and solution levels, increases computational speed. The proposed algorithm is able to find the Pareto solutions of large problems in a shorter time than other algorithms in the literature. The Apache Spark and Hadoop platforms were used for distributing the algorithm. The suggested algorithm and its implementations were compared with the results of successful algorithms in the literature; the results demonstrate the efficiency and high speed of the algorithm.
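The distribution idea can be sketched with a few lines of PySpark: several search instances run in parallel and the non-dominated (Pareto) results are kept. The random-restart stand-in below replaces the paper's hybrid co-evolutionary algorithm:

```python
# Parallel multi-start sketch on Spark; the toy 'local_search' stands in
# for the paper's hybrid algorithm, and the objectives are (makespan, tardiness).
import random
from pyspark import SparkContext

def local_search(seed):
    """Toy stand-in: return the (makespan, tardiness) found from this seed."""
    rng = random.Random(seed)
    return (rng.uniform(90, 120), rng.uniform(5, 30))

def pareto_front(points):
    """Keep points not dominated in both objectives by any other point."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

sc = SparkContext(appName="coevolutionary-jssp-sketch")
results = sc.parallelize(range(64), numSlices=8).map(local_search).collect()
print(pareto_front(results))
sc.stop()
```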

Keywords: distributed algorithms, Apache Spark, Hadoop, job shop scheduling, multi-objective optimization

Procedia PDF Downloads 354
1032 Efficient Utilization of Unmanned Aerial Vehicle (UAV) for Fishing through Surveillance for Fishermen

Authors: T. Ahilan, V. Aswin Adityan, S. Kailash

Abstract:

UAVs are small remotely operated or automated aerial surveillance systems without a human pilot aboard. UAVs generally find use in military and special-operation applications; a recent growing trend finds them applied to several civil and non-military tasks, such as the inspection of power lines or pipelines. The objective of this paper is the augmentation of a UAV to replace the expensive sonar (sound navigation and ranging) based equipment currently used, for the benefit of small-scale fishermen whose access to sonar equipment is restricted by limited economic resources. The surveillance equipment on board the UAV relays data and the GPS location to a receiver on the fishing boat using RF signals, from which the location of schools of fish can be determined. In addition, an emergency beacon system is provided for rescue operations and drone recovery.

Keywords: UAV, Surveillance, RF signals, fishing, sonar, GPS, video stream, school of fish

Procedia PDF Downloads 448
1031 Machine Learning for Aiding Meningitis Diagnosis in Pediatric Patients

Authors: Karina Zaccari, Ernesto Cordeiro Marujo

Abstract:

This paper presents a Machine Learning (ML) approach to support meningitis diagnosis in patients at a children’s hospital in Sao Paulo, Brazil. The aim is to use ML techniques to reduce the use of invasive procedures, such as cerebrospinal fluid (CSF) collection, as much as possible. In this study, we focus on predicting the probability of meningitis given the results of blood and urine laboratory tests, together with the analysis of pain and other complaints from the patient. We tested a number of different ML algorithms, including Adaptive Boosting (AdaBoost), Decision Tree, Gradient Boosting, K-Nearest Neighbors (KNN), Logistic Regression, Random Forest, and Support Vector Machines (SVM). The Decision Tree algorithm performed best, with 94.56% and 96.18% accuracy on training and testing data, respectively. These results represent a significant aid to doctors in diagnosing meningitis as early as possible and in preventing expensive and painful procedures on some children.

Keywords: machine learning, medical diagnosis, meningitis detection, pediatric research

Procedia PDF Downloads 140
1030 Trabecular Bone Radiograph Characterization Using Fractal, Multifractal Analysis and SVM Classifier

Authors: I. Slim, H. Akkari, A. Ben Abdallah, I. Bhouri, M. Hedi Bedoui

Abstract:

Osteoporosis is a common disease characterized by low bone mass and deterioration of the micro-architecture of bone tissue, which provokes an increased risk of fracture. This work treats the texture characterization of trabecular bone radiographs. The aim was to analyze, in line with clinical research, a group of 174 subjects: 87 osteoporotic patients (OP) with various bone fracture types and 87 control cases (CC). To characterize osteoporosis, fractal and multifractal (MF) methods were applied to the images for feature (attribute) extraction. In order to improve the results, a new MF spectrum method based on the calculation of the q-structure function was proposed, and a combination of fractal and MF attributes was used. A Support Vector Machine (SVM) was applied as a classifier to distinguish between OP patients and CC subjects. The fusion of fractal and MF features allowed a good discrimination between the two groups, with an accuracy rate of 96.22%.
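A sketch of the flavor of this pipeline: a box-counting fractal dimension as a texture attribute fed into an SVM; the synthetic binary textures are placeholders for the bone radiographs, and the proposed q-structure multifractal spectrum is not reproduced:

```python
# Box-counting fractal dimension as a texture feature, classified with an SVM.
import numpy as np
from sklearn.svm import SVC

def box_counting_dimension(img: np.ndarray) -> float:
    """Estimate the fractal (box-counting) dimension of a binary image."""
    sizes = [2, 4, 8, 16, 32]
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    # Slope of log(count) vs. log(1/size) estimates the dimension.
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]

rng = np.random.default_rng(0)
# Denser random texture as a crude stand-in for denser trabecular structure.
healthy = [rng.random((64, 64)) < 0.5 for _ in range(20)]
osteop = [rng.random((64, 64)) < 0.2 for _ in range(20)]
X = [[box_counting_dimension(im)] for im in healthy + osteop]
y = [0] * 20 + [1] * 20
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```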

Keywords: fractal, micro-architecture analysis, multifractal, osteoporosis, SVM

Procedia PDF Downloads 383
1029 Application of Statistical Linearized Models for Investigations of Digital Dynamic Pulse-Frequency Control Systems

Authors: B. H. Aitchanov, Sh. K. Aitchanova, O. A. Baimuratov

Abstract:

This paper focuses on dynamic pulse-frequency modulation (DPFM) control systems. Currently, control laws based on DPFM control signals are widely used in the direct digital control subsystems introduced into automated control systems for technological processes. The statistical analysis of automatic control systems reduces to constructing functional relationships between the statistical characteristics of the error processes and those of the input processes. Structural and dynamic Volterra models of digital pulse-frequency control systems can be used to develop methods for generating these dependencies, which differ in accuracy, in the amount of information required about the statistical characteristics of the input processes, and in the computational effort of their use.

Keywords: digital dynamic pulse-frequency control systems, dynamic pulse-frequency modulation, control object, discrete filter, impulse device, microcontroller

Procedia PDF Downloads 479
1028 Lattice Network Model for Calculation of Eddy Current Losses in a Solid Permanent Magnet

Authors: Jan Schmidt, Pierre Köhring

Abstract:

Permanently excited machines are built with magnets made of highly energetic magnetic materials. Inherently, the permanent magnets warm up while the machine is operating. With increasing temperature, the electromotive force and hence the degree of efficiency decrease. The reasons for this are slot harmonics and distorted armature currents arising from frequency inverter operation. To prevent demagnetization of the permanent magnets, it is necessary to ensure that the magnets do not heat up excessively; demagnetization of permanent magnets is irreversible, and a breakdown of the electrical machine is then inevitable. For the design of an electrical machine, knowledge of the heating behavior of the permanent magnet under operating conditions is of crucial importance. Therefore, a calculation model is presented with which the machine designer can easily calculate the eddy current losses in the magnetic material.

Keywords: analytical model, eddy current, losses, lattice network, permanent magnet

Procedia PDF Downloads 414
1027 A Dirty Page Migration Method in Process of Memory Migration Based on Pre-copy Technology

Authors: Kang Zijian, Zhang Tingyu, Burra Venkata Durga Kumar

Abstract:

This article investigates the challenges of memory migration during the live migration of virtual machines. We identify three challenges that probably exist in pre-copy technology, one of the main ones being downtime: decreasing the downtime keeps the virtual machine working normally for longer. Although pre-copy technology greatly decreases the downtime, the machine still needs to be paused to finish the last round of data transfer. This paper provides an optimization scheme for the problems existing in pre-copy technology, mainly an optimization of the dirty page migration mechanism. Typical pre-copy technology copies the (n-1)-th round's dirty pages in the n-th round; our idea is to create a double-iteration method to address this.
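For reference, the baseline pre-copy loop that the paper optimizes can be modeled in a few lines; the dirtying model below (pages dirtied proportional to pages sent) is an illustrative assumption, and the double-iteration refinement itself is not reproduced:

```python
# Toy model of the standard pre-copy loop: each round resends the pages
# dirtied during the previous round until the dirty set is small enough for
# a brief stop-and-copy phase.
def precopy_migrate(total_pages=10000, dirty_per_sent=0.08, stop_threshold=50):
    """Pages dirtied in a round are modeled as proportional to pages sent,
    i.e., to the round's duration; convergence needs dirty_per_sent < 1."""
    dirty = total_pages                     # round 0: copy all memory
    round_no = 0
    while dirty > stop_threshold:
        sent = dirty
        dirty = int(sent * dirty_per_sent)  # guest dirties pages while sending
        round_no += 1
        print(f"round {round_no}: sent {sent}, still dirty {dirty}")
    # Final stop-and-copy: pause the VM and transfer the remaining pages.
    print(f"downtime transfer: {dirty} pages")

precopy_migrate()
```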

Keywords: virtual machine, pre-copy technology, memory migration process, downtime, dirty page migration method

Procedia PDF Downloads 125