Search results for: optimization algorithms
2572 Modeling and Optimizing of Sinker Electric Discharge Machine Process Parameters on AISI 4140 Alloy Steel by Central Composite Rotatable Design Method
Authors: J. Satya Eswari, J. Sekhar Babub, Meena Murmu, Govardhan Bhat
Abstract:
Electrical Discharge Machining (EDM) is an unconventional manufacturing process based on the removal of material from a part by means of a series of repeated electrical sparks, created by electric pulse generators at short intervals between an electrode tool and the part to be machined, immersed in a dielectric fluid. In this paper, a study is performed on the influence of the factors of peak current, pulse on time, interval time, and power supply voltage. The output responses measured were material removal rate (MRR) and surface roughness. Finally, the parameters were optimized for maximum MRR with the desired surface roughness. Response surface methodology (RSM) involves establishing mathematical relations between the design variables and the resulting responses and optimizing the process conditions. RSM is not free from problems when it is applied to multi-factor and multi-response situations. A design of experiments (DOE) technique is used to select the optimum machining conditions for machining AISI 4140 using EDM. The purpose of this paper is to determine the optimal factors of the electro-discharge machining (EDM) process and to investigate the feasibility of design of experiment techniques. The work pieces used were rectangular plates of AISI 4140 grade steel alloy. The study of optimized settings of key machining factors, such as pulse on time, gap voltage, flushing pressure, input current, and duty cycle, on the material removal rate and surface roughness has been carried out using central composite design. The objective is to maximize the material removal rate (MRR). Central composite design data are used to develop second-order polynomial models with interaction terms. The insignificant coefficients are eliminated from these models by using the Student t test, and the F test is used to check the goodness of fit. CCD is first used to determine the optimal factors of the electro-discharge machining (EDM) process for maximizing the MRR.
The responses are further treated through an objective function to establish the same set of key machining factors to satisfy the optimization problem of the electro-discharge machining (EDM) process. The results demonstrate the better performance of CCD data based RSM for optimizing the electro-discharge machining (EDM) process.
Keywords: electric discharge machining (EDM), modeling, optimization, CCRD
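The second-order polynomial model with interaction terms described above can be sketched as an ordinary least-squares fit over the design points; the two-factor design, coefficient values, and function name below are illustrative, not taken from the paper.

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of the second-order response surface
    y = b0 + sum_i bi*xi + sum_i bii*xi^2 + sum_{i<j} bij*xi*xj,
    the model form used with central composite designs."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]                    # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]               # pure quadratic terms
    cols += [X[:, i] * X[:, j]
             for i in range(k) for j in range(i + 1, k)]   # interaction terms
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta
```

Insignificant coefficients would then be screened with t tests before the surface is used for optimization, as the abstract describes.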
Procedia PDF Downloads 344
2571 Machine Learning for Aiding Meningitis Diagnosis in Pediatric Patients
Authors: Karina Zaccari, Ernesto Cordeiro Marujo
Abstract:
This paper presents a Machine Learning (ML) approach to support meningitis diagnosis in patients at a children’s hospital in Sao Paulo, Brazil. The aim is to use ML techniques to reduce the use of invasive procedures, such as cerebrospinal fluid (CSF) collection, as much as possible. In this study, we focus on predicting the probability of meningitis given the results of blood and urine laboratory tests, together with the analysis of pain and other complaints from the patient. We tested a number of different ML algorithms, including Adaptive Boosting (AdaBoost), Decision Tree, Gradient Boosting, K-Nearest Neighbors (KNN), Logistic Regression, Random Forest, and Support Vector Machines (SVM). The Decision Tree algorithm performed best, with 94.56% and 96.18% accuracy for training and testing data, respectively. These results represent a significant aid to doctors in diagnosing meningitis as early as possible and in preventing expensive and painful procedures on some children.
Keywords: machine learning, medical diagnosis, meningitis detection, pediatric research
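The core step a decision-tree learner repeats recursively, finding the feature/threshold split that minimises weighted Gini impurity, can be sketched in a few lines; the toy data and function names below are illustrative, not the hospital dataset.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(X, y):
    """Exhaustive search for the (feature, threshold) split minimising
    weighted Gini impurity -- the step a decision tree repeats recursively."""
    best = (None, None, float("inf"))
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best[2]:
                best = (f, t, score)
    return best  # (feature index, threshold, weighted impurity)
```

A full tree then applies `best_split` to each resulting subset until the leaves are pure or a depth limit is hit.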
Procedia PDF Downloads 152
2570 Cognition Technique for Developing a World Music
Authors: Haider Javed Uppal, Javed Yunas Uppal
Abstract:
In today's globalized world, it is necessary to develop a form of music that is able to evoke equal emotional responses among people from diverse cultural backgrounds. Indigenous cultures throughout history have developed their own music cognition, specifically in terms of the connections between music and mood. With the advancements in artificial intelligence technologies, it has become possible to analyze and categorize music features such as timbre, harmony, melody, and rhythm and relate them to the resulting mood effects experienced by listeners. This paper presents a model that utilizes a screenshot translator to convert music from different origins into waveforms, which are then analyzed using machine learning and information retrieval techniques. By connecting these waveforms with Thayer's matrix of moods, a mood classifier has been developed using fuzzy logic algorithms to determine the emotional impact of different types of music on listeners from various cultures.
Keywords: cognition, world music, artificial intelligence, Thayer’s matrix
Procedia PDF Downloads 86
2569 A Deep Reinforcement Learning-Based Secure Framework against Adversarial Attacks in Power System
Authors: Arshia Aflaki, Hadis Karimipour, Anik Islam
Abstract:
Generative Adversarial Attacks (GAAs) threaten critical sectors, ranging from fingerprint recognition to industrial control systems. Existing Deep Learning (DL) algorithms are not robust enough against this kind of cyber-attack. As one of the most critical industries in the world, the power grid is no exception. In this study, a Deep Reinforcement Learning-based (DRL) framework that assists the DL model in improving its robustness against generative adversarial attacks is proposed. Real-world smart grid stability data, as an IIoT dataset, are used to test our method, which improves the classification accuracy of a deep learning model from around 57 percent to 96 percent.
Keywords: generative adversarial attack, deep reinforcement learning, deep learning, IIoT, generative adversarial networks, power system
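The abstract does not specify how its adversarial examples are generated, but the kind of gradient-based input perturbation such defences guard against can be illustrated with the classic Fast Gradient Sign Method on a toy logistic classifier (FGSM and the model below are assumptions for illustration, not the paper's attack):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, y, w, b, eps=0.1):
    """Fast Gradient Sign Method on a logistic classifier: move the input a
    small step in the sign of the loss gradient, degrading the model's
    confidence in the true label y."""
    # For binary cross-entropy on a logistic model, d(loss)/dx = (p - y) * w.
    grad_x = (sigmoid(w @ x + b) - y) * w
    return x + eps * np.sign(grad_x)
```

Even this tiny perturbation lowers the model's confidence in the true class, which is the degradation a robust framework must resist.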
Procedia PDF Downloads 51
2568 Bioeconomic Modeling for the Sustainable Exploitation of Three Key Marine Species in Morocco
Authors: I. Ait El Harch, K. Outaaoui, Y. El Foutayeni
Abstract:
This study aims to deepen the understanding of, and to optimize, fishing activity in Morocco by holistically integrating biological and economic aspects. We develop a biological equilibrium model in which the three competing species follow natural growth described by logistic equations, taking into account density dependence and competition between them. The integration of human intervention adds a realistic dimension to our model: a company specifically targets the three species, thus influencing population dynamics through its fishing activities. The aim of this work is to determine the fishing effort that maximizes the company’s profit, taking into account the constraints associated with conserving ecosystem equilibrium.
Keywords: bioeconomic modeling, optimization techniques, linear complementarity problem (LCP), biological equilibrium, maximizing profits
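A minimal sketch of this kind of model, logistic growth with interspecific competition and a harvesting term proportional to fishing effort, integrated by forward Euler; all parameter names and values here are illustrative, not the paper's calibration.

```python
def simulate_harvested_competition(r, K, a, E, q, x0, dt=0.01, steps=20000):
    """Forward-Euler simulation of n competing species with logistic growth
    and harvesting:
        dx_i/dt = r_i x_i (1 - x_i/K_i) - sum_{j!=i} a[i][j] x_i x_j - q_i E_i x_i
    where E_i is fishing effort and q_i catchability."""
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        dx = []
        for i in range(n):
            growth = r[i] * x[i] * (1 - x[i] / K[i])
            comp = sum(a[i][j] * x[i] * x[j] for j in range(n) if j != i)
            harvest = q[i] * E[i] * x[i]
            dx.append(growth - comp - harvest)
        x = [max(xi + dt * dxi, 0.0) for xi, dxi in zip(x, dx)]
    return x  # approximate equilibrium stocks
```

Profit maximization over the effort vector E, subject to the equilibrium staying positive, would sit on top of such a simulation (or, as in the paper, be posed as a linear complementarity problem).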
Procedia PDF Downloads 31
2567 Scale Up-Mechanochemical Synthesis of High Surface Area Alpha-Alumina
Authors: Sarah Triller, Ferdi Schüth
Abstract:
The challenges encountered in upscaling the mechanochemical synthesis of high surface area α-alumina are investigated in this study. After lab-scale experiments in shaker mills and planetary ball mills, the reaction parameters of the conversion were optimized in the smallest vessel of a scalable mill, named the Simoloyer. Furthermore, future perspectives for scaling up the conversion in several steps are described. Since abrasion from the steel equipment can be problematic, the process was transferred to a ceramically lined mill, which solved the contamination problem. The recovered alpha-alumina shows a high specific surface area at all investigated scales.
Keywords: mechanochemistry, scale-up, ball milling, ceramic lining
Procedia PDF Downloads 71
2566 Resource Allocation Modeling and Simulation in Border Security Application
Authors: Kai Jin, Hua Li, Qing Song
Abstract:
Homeland security and border safety are issues for any country. This paper takes the border security of the US as an example to discuss the usage and efficiency of simulation tools in homeland security applications. In this study, available resources and different illegal infiltration parameters are defined, including their individual behavior and objectives, in order to develop a model that describes the border patrol system. A simulation model is created in Arena and used to study the dynamic activities in border security. Possible factors that may affect the effectiveness of the border patrol system are proposed, individual and factorial analyses of these factors are conducted, and some suggestions are made.
Keywords: resource optimization, simulation, modeling, border security
Procedia PDF Downloads 520
2565 Optimization for the Hydraulic Clamping System of an Internal Circulation Two-Platen Injection Molding Machine
Authors: Jian Wang, Lu Yang, Jiong Peng
Abstract:
The internal circulation two-platen clamping system for injection molding machines (IMM) has many potential advantages for energy saving. In order to estimate its properties, experiments were carried out in this paper, and the displacement and pressure of the components were measured. For comparison, a model of the hydraulic clamping system was established using AMESim, from which the related parameters as well as the energy consumption could be calculated. According to the analysis, the hydraulic system was optimized in order to reduce the energy consumption.
Keywords: AMESim, energy-saving, injection molding machine, internal circulation
Procedia PDF Downloads 556
2564 Fairer Public Benefit in Copyright Law
Authors: Amanda Levendowski
Abstract:
In 1966, a court considered expressly whether a secondary use of copyrighted works served a public benefit. While public benefit has become a subfactor of the fair use doctrine, it remains undefined, uncodified, and undertheorized. After the recent Supreme Court decision in Google v. Oracle, however, it is also unavoidable: the Court stated that “we must take into account the public benefits the copying will likely produce.” Previously, courts invoked public benefit with some predictability in pivotal cases involving novel technologies, from home video recorders to digital libraries to algorithms. A hand-coded dataset of nineteen U.S. technology-related public benefit cases from 1966-2023 reveals five values that emerge from those cases: expression, knowledge, entertainment, competition, and/or efficiency. Forthcoming judicial decisions about the latest novel technology, artificial intelligence (AI), will be shaped by this precedent. However, a series of mid-aughts decisions about algorithms exposed an FU long lurking in fair use: name aside, there is nothing particularly fair about it. Those cases excused invasive, coercive, and biased AI systems as efficient “public benefits” when finding them to be fair use. Many scholars have written about the unfairness of fair use, and this article contributes to those conversations by using a feminist cyberlaw lens to critique the practice of dubbing technologies public benefits without acknowledging, let alone assessing, countervailing public harms. A public benefit that ignores public harm is incomplete. Purported fair uses, particularly those underpinning AI systems, can amplify bias, dis/misinformation, and environmental destruction: harms that are predictable, preventable, and presently passed over by public benefit. This article responds by recalibrating public benefits to better account for these and other public harms. It defines a fairer public benefit and develops a framework for realizing it.
The latter poses challenges. In courts, public harm has already happened when matters are litigated, placing a premium on compensation rather than prevention. Congress could codify public benefit, but it is unlikely that Congress could agree upon a satisfactory definition. To further complicate matters, neither judges nor legislators have duties of sociotechnical competency. But lawyers do. Client-centered counseling could facilitate a fairer public benefit if there were a framework for doing so. This article proposes one: FAIRR (pronounced “fairer”), a mnemonic for formalizing purposes, assessing benefits, identifying harms, reconsidering those benefits in light of those harms, and reporting to the client. Inspired by computer science’s threat modeling methodology, FAIRR represents a rigorous, repeatable method for analyzing how infringement liability, public perception, and social progress are affected by public benefits and public harms. By deconstructing the inequities embedded in public benefit as they exist now and developing a fairer alternative for the future, this article helps lawyers shape better technologies.
Keywords: intellectual property, copyright, fair use, public benefit, technology, artificial intelligence, bias, disinformation, environmental destruction
Procedia PDF Downloads 7
2563 The Utilization of Big Data in Knowledge Management Creation
Authors: Daniel Brian Thompson, Subarmaniam Kannan
Abstract:
The huge weight of knowledge in this world and within the repositories of organizations has already reached immense capacity and is constantly increasing as time goes by. To accommodate these constraints, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides the transformational changes which will bring tangible benefits to the individuals implementing these practices. Today, various organizations derive knowledge from observations and intuitions, where this information or data is translated into best practices for knowledge acquisition, generation, and sharing. Through the widespread usage of Big Data, the main intention is to provide information that has been cleaned and analyzed to nurture tangible insights for an organization to apply to its knowledge-creation practices based on facts and figures. The translation of data into knowledge will generate value for an organization, enabling it to make decisive decisions and proceed with the transition to best practices. Without a strong foundation of knowledge and Big Data, businesses are not able to grow and be enhanced within the competitive environment.
Keywords: big data, knowledge management, data driven, knowledge creation
Procedia PDF Downloads 120
2562 Advancements in Electronic Sensor Technologies for Tea Quality Evaluation
Authors: Raana Babadi Fathipour
Abstract:
Tea, second only to water in global consumption rates, holds a significant place as the beverage of choice for many around the world. The process of fermenting tea leaves plays a crucial role in determining its ultimate quality, traditionally assessed through meticulous observation by tea tasters and laboratory analysis. However, advancements in technology have paved the way for innovative electronic sensing platforms like the electronic nose (e-nose), electronic tongue (e-tongue), and electronic eye (e-eye). These cutting-edge tools, coupled with sophisticated data processing algorithms, not only expedite the assessment of tea's sensory qualities based on consumer preferences but also establish new benchmarks for this esteemed bioactive product to meet burgeoning market demands worldwide. By harnessing intricate data sets derived from electronic signals and deploying multivariate statistical techniques, these technological marvels can enhance accuracy in predicting and distinguishing tea quality with unparalleled precision. In this contemporary exploration, a comprehensive overview is provided of the most recent breakthroughs and viable solutions aimed at addressing forthcoming challenges in the realm of tea analysis. Utilizing bio-mimicking Electronic Sensory Perception systems (ESPs), researchers have developed innovative technologies that enable precise and instantaneous evaluation of the sensory-chemical attributes inherent in tea and its derivatives. These sophisticated sensing mechanisms are adept at deciphering key elements such as aroma, taste, and color profiles, transitioning valuable data into intricate mathematical algorithms for classification purposes. 
Through their adept capabilities, these cutting-edge devices exhibit remarkable proficiency in discerning various teas with respect to their distinct pricing structures, geographic origins, harvest epochs, fermentation processes, storage durations, quality classifications, and potential adulteration levels. While voltammetric and fluorescent sensor arrays have emerged as promising tools for constructing electronic tongue systems proficient in scrutinizing tea compositions, potentiometric electrodes continue to serve as reliable instruments for meticulously monitoring taste dynamics within different tea varieties. By implementing a feature-level fusion strategy within predictive models, marked enhancements can be achieved regarding efficiency and accuracy levels. Moreover, by establishing intrinsic linkages through pattern recognition methodologies between sensory traits and biochemical makeup found within tea samples, further strides are made toward enhancing our understanding of this venerable beverage's complex nature.
Keywords: classifier system, tea, polyphenol, sensor, taste sensor
Procedia PDF Downloads 10
2561 Using Self Organizing Feature Maps for Classification in RGB Images
Authors: Hassan Masoumi, Ahad Salimi, Nazanin Barhemmat, Babak Gholami
Abstract:
Artificial neural networks have gained a lot of interest as empirical models for their powerful representational capacity and multi-input and output mapping characteristics. In fact, most feed-forward networks with nonlinear nodal functions have been proved to be universal approximators. In this paper, we propose a new supervised method for color image classification based on self-organizing feature maps (SOFM). This algorithm is based on competitive learning. The method partitions the input space using self-organizing feature maps to introduce the concept of local neighborhoods. Our image classification system takes RGB images as input. Experiments with simulated data showed that the separability of classes increased with increasing training time. In addition, the results show the proposed algorithm is effective for color image classification.
Keywords: classification, SOFM algorithm, neural network, neighborhood, RGB image
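The competitive-learning core of a SOFM can be sketched in a few lines: each sample pulls its best-matching node, and that node's neighbours, toward it, with learning rate and neighbourhood radius decaying over time. This 1-D map on toy RGB-like vectors is an illustration, not the authors' implementation.

```python
import numpy as np

def train_som(data, n_nodes=10, epochs=100, lr0=0.5, radius0=3.0, seed=0):
    """Train a 1-D self-organizing feature map by competitive learning.
    data: (n_samples, dim) array, e.g. RGB triples scaled to [0, 1]."""
    rng = np.random.default_rng(seed)
    weights = rng.random((n_nodes, data.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                 # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in data:
            # Best-matching unit: node whose weights are closest to x.
            bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            dist = np.abs(np.arange(n_nodes) - bmu)
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))  # neighbourhood kernel
            weights += lr * h[:, None] * (x - weights)
    return weights
```

For classification, each trained node would then be assigned the majority label of the training samples it wins, so that a new pixel or image vector inherits the label of its best-matching node.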
Procedia PDF Downloads 487
2560 Design and Analysis of Active Rocket Control Systems
Authors: Piotr Jerzy Rugor, Julia Wajoras
Abstract:
The presented work regards a single-stage, aerodynamically controlled, solid-propulsion rocket. Steering a rocket to fly along a predetermined trajectory can be beneficial for minimizing aerodynamic losses and can be achieved by implementing an active control system on board. In this particular case, a canard configuration has been chosen, although other methods of control, including non-aerodynamic ones, have been considered and preemptively analyzed. The objective of this work is to create a system capable of guiding the rocket, focusing on roll stabilization. The paper describes the initial analysis of the problem, covers the main challenges of missile guidance, and presents data acquired during the experimental study.
Keywords: active canard control system, rocket design, numerical simulations, flight optimization
Procedia PDF Downloads 197
2559 An Algorithm for Removal of Noise from X-Ray Images
Authors: Sajidullah Khan, Najeeb Ullah, Wang Yin Chai, Chai Soo See
Abstract:
In this paper, we propose an approach to remove impulse and Poisson noise from X-ray images. Many filters have been used for impulse noise removal from color and gray-scale images, each with its own strengths and weaknesses, but X-ray images contain Poisson noise, and unfortunately there is no intelligent filter which can detect both impulse and Poisson noise in X-ray images. Our proposed filter uses an upgraded layer discrimination approach to detect both impulse and Poisson noise corrupted pixels in X-ray images and then restores only those detected pixels with a simple, efficient, and reliable one-line equation. Our proposed algorithm is very effective and much more efficient than all existing filters used only for impulse noise removal. The proposed method uses a new, powerful, and efficient noise detection method to determine whether the pixel under observation is corrupted or noise free. Results from computer simulations are used to demonstrate the pleasing performance of our proposed method.
Keywords: X-ray image de-noising, impulse noise, Poisson noise, PRWF
Procedia PDF Downloads 386
2558 Intelligent Adaptive Learning in a Changing Environment
Authors: G. Valentis, Q. Berthelot
Abstract:
Nowadays, the trend to develop ever more intelligent and autonomous systems often takes its inspiration from the living beings on Earth. Some simple isolated systems are able, once brought together, to form a strong and reliable system. When trying to adapt this idea to man-made systems, it is not possible to include in their program everything the system may encounter during its life cycle. It is, thus, necessary to make the system able to take decisions based on other criteria, such as its past experience, i.e., to make the system learn on its own. However, at some point the acquired knowledge also depends on the environment. So the question is: if the system's environment is modified, how can the system respond to it quickly and appropriately enough? Here, starting from reinforcement learning to rate its decisions, and using adaptive learning algorithms for gain and loss reward, the system is made able to respond to a changing environment and to adapt its knowledge as time passes. Application is made to a robot finding an exit in a labyrinth.
Keywords: reinforcement learning, neural network, autonomous systems, adaptive learning, changing environment
Procedia PDF Downloads 426
2557 Review of Transportation Modeling Software
Authors: Hassan M. Al-Ahmadi, Hamad Bader Almobayedh
Abstract:
Planning for urban transportation is essential for developing effective and sustainable transportation networks that meet the needs of various communities. Advanced modeling software is required for effective transportation planning, management, and optimization. This paper compares PTV VISUM, Aimsun, TransCAD, and Emme, four industry-leading software tools for transportation planning and modeling. Each package has strengths and limitations, and the project's needs, financial constraints, and level of technical expertise influence the choice of software. By utilizing these software tools, transportation experts can design and improve urban transportation systems that are effective, sustainable, and meet the changing needs of their communities.
Keywords: PTV VISUM, Aimsun, TransCAD, transportation modeling software
Procedia PDF Downloads 37
2556 Photovoltaic Water Pumping System Application
Authors: Sarah Abdourraziq
Abstract:
The photovoltaic (PV) water pumping system is one of the most used and important applications in the field of solar energy. However, the cost and the efficiency are still a concern, especially under the continual change of solar radiation and temperature; improving the efficiency of the system components is therefore a good path to reducing the cost. The use of maximum power point tracking (MPPT) algorithms to track the maximum power point (MPP) of the PV panel is very important for improving the efficiency of the whole system. In this paper, we present the operating principle of the MPPT technique and a detailed model of each component of the PV pumping system in Matlab-Simulink. The results show the influence of changes in solar radiation and temperature on the output characteristics of the PV panel, which in turn affects the efficiency of the system. Our system consists of a PV generator, a boost converter, a motor-pump set, and a storage tank.
Keywords: PV panel, boost converter, MPPT, MPP, PV pumping system
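The abstract does not name a specific MPPT algorithm, so as one common example, the classic Perturb and Observe scheme can be sketched as follows: nudge the operating voltage, keep the perturbation direction while power increases, and reverse it when power drops. The toy P-V curve in the test is an assumption, not the paper's panel model.

```python
def perturb_and_observe(pv_power, v0=15.0, dv=0.2, steps=200):
    """Classic Perturb & Observe MPPT. pv_power(v) models the panel's
    P-V curve; the routine climbs toward the maximum power point and then
    oscillates around it with amplitude ~dv."""
    v, direction = v0, +1
    p_prev = pv_power(v)
    for _ in range(steps):
        v += direction * dv
        p = pv_power(v)
        if p < p_prev:
            direction = -direction  # overshot the peak: reverse the perturbation
        p_prev = p
    return v
```

In a real system, `v` would be set indirectly through the boost converter's duty cycle rather than commanded directly.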
Procedia PDF Downloads 400
2555 Cardiovascular Disease Prediction Using Machine Learning Approaches
Abstract:
It is estimated that heart disease accounts for one in ten deaths worldwide, and deaths due to heart disease are among the leading causes of death in the United States, according to the World Health Organization. Cardiovascular diseases (CVDs) account for one in four U.S. deaths, according to the Centers for Disease Control and Prevention (CDC). According to statistics, women are more likely than men to die from heart disease as a result of strokes, and a 50% increase in men's mortality was reported by the World Health Organization in 2009. The consequences of cardiovascular disease are severe. The causes of heart disease include diabetes, high blood pressure, high cholesterol, abnormal pulse rate, etc. Machine learning (ML) can be used to make predictions and decisions in the healthcare industry; thus, scientists have turned to modern technologies like machine learning and data mining to predict diseases. The disease prediction here is based on four algorithms; compared with the others, AdaBoost is much more accurate.
Keywords: heart disease, cardiovascular disease, coronary artery disease, feature selection, random forest, AdaBoost, SVM, decision tree
Procedia PDF Downloads 159
2554 Solving 94-Bit ECDLP with 70 Computers in Parallel
Authors: Shunsuke Miyoshi, Yasuyuki Nogami, Takuya Kusaka, Nariyoshi Yamai
Abstract:
The elliptic curve discrete logarithm problem (ECDLP) is one of the problems on which the security of pairing-based cryptography is based. This paper considers Pollard's rho method to evaluate the security of ECDLP on the Barreto-Naehrig (BN) curve, an efficient pairing-friendly curve. Some techniques are proposed to make the rho method efficient; in particular, the group structure on the BN curve, the distinguished point method, and the Montgomery trick are well-known techniques. This paper applies these techniques and shows their optimization. According to the experimental results, for which a large-scale parallel system with MySQL was applied, a 94-bit ECDLP was solved in about 28 hours by parallelizing 71 computers.
Keywords: Pollard's rho method, BN curve, Montgomery multiplication
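The random-walk and collision idea behind Pollard's rho can be shown on a toy multiplicative group; this sketch omits the elliptic-curve arithmetic, distinguished points, and Montgomery trick that the paper uses at scale.

```python
import random
from math import gcd

def pollard_rho_dlog(g, h, p, n, seed=1):
    """Toy Pollard's rho for g^x = h (mod p), where g has order n.
    Tracks each walk element as x = g^a * h^b; a collision between the
    tortoise and hare yields a solvable relation for the discrete log."""
    rng = random.Random(seed)

    def step(x, a, b):
        s = x % 3                         # pseudo-random partition of the group
        if s == 0:
            return x * g % p, (a + 1) % n, b      # multiply by g
        if s == 1:
            return x * x % p, 2 * a % n, 2 * b % n  # square
        return x * h % p, a, (b + 1) % n          # multiply by h

    while True:
        a, b = rng.randrange(n), rng.randrange(n)
        x = pow(g, a, p) * pow(h, b, p) % p
        # Floyd's tortoise-and-hare cycle detection.
        tx, ta, tb = step(x, a, b)
        hx, ha, hb = step(*step(x, a, b))
        while tx != hx:
            tx, ta, tb = step(tx, ta, tb)
            hx, ha, hb = step(*step(hx, ha, hb))
        # g^ta h^tb = g^ha h^hb  =>  x = (ta - ha) / (hb - tb) mod n.
        d = (hb - tb) % n
        if gcd(d, n) != 1:
            continue                      # unusable relation: restart the walk
        cand = (ta - ha) * pow(d, -1, n) % n
        if pow(g, cand, p) == h:
            return cand
```

The parallel version in the paper runs many such walks at once and reports only distinguished points to a central database (here, MySQL), so that collisions across machines can be found.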
Procedia PDF Downloads 275
2553 Simulation of Obstacle Avoidance for Multiple Autonomous Vehicles in a Dynamic Environment Using Q-Learning
Authors: Andreas D. Jansson
Abstract:
The availability of inexpensive yet competent hardware allows for an increased level of automation and self-optimization in the context of Industry 4.0. However, such agents require high-quality information about their surroundings, along with a robust strategy for collision avoidance, as they may otherwise cause expensive damage to equipment or other agents. Manually defining a strategy to cover all possibilities is both time-consuming and counter-productive given the capabilities of modern hardware. This paper explores the idea of a model-free, self-optimizing obstacle avoidance strategy for multiple autonomous agents in a simulated dynamic environment using the Q-learning algorithm.
Keywords: autonomous vehicles, industry 4.0, multi-agent system, obstacle avoidance, Q-learning, simulation
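A minimal tabular Q-learning sketch for a single agent on a static grid illustrates the model-free idea (the paper considers multiple agents in a dynamic environment; the grid, rewards, and hyperparameters below are illustrative assumptions):

```python
import random

ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

def train_q(grid, start, goal, episodes=3000, alpha=0.5, gamma=0.9, eps=0.2):
    """Tabular Q-learning on a gridworld; '#' cells are obstacles.
    Collisions are penalised so the greedy policy learns to avoid them."""
    rows, cols = len(grid), len(grid[0])
    Q = {}
    q = lambda s, a: Q.get((s, a), 0.0)
    for _ in range(episodes):
        s = start
        for _ in range(200):
            a = random.choice(ACTIONS) if random.random() < eps \
                else max(ACTIONS, key=lambda act: q(s, act))   # epsilon-greedy
            r, c = s[0] + a[0], s[1] + a[1]
            if not (0 <= r < rows and 0 <= c < cols) or grid[r][c] == '#':
                s2, reward = s, -10.0      # collision: stay put, big penalty
            elif (r, c) == goal:
                s2, reward = (r, c), 10.0  # reached the goal
            else:
                s2, reward = (r, c), -1.0  # step cost encourages short paths
            target = reward + gamma * max(q(s2, b) for b in ACTIONS)
            Q[(s, a)] = q(s, a) + alpha * (target - q(s, a))
            s = s2
            if s == goal:
                break
    return Q

def greedy_path(Q, start, goal, limit=30):
    """Follow the learned policy greedily from start."""
    s, path = start, [start]
    while s != goal and len(path) < limit:
        a = max(ACTIONS, key=lambda act: Q.get((s, act), 0.0))
        s = (s[0] + a[0], s[1] + a[1])
        path.append(s)
    return path
```

The multi-agent, dynamic case extends this by putting other agents' positions into the state and retraining or fine-tuning as the environment changes.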
Procedia PDF Downloads 141
2552 Modeling and Validation of Microspheres Generation in the Modified T-Junction Device
Authors: Lei Lei, Hongbo Zhang, Donald J. Bergstrom, Bing Zhang, K. Y. Song, W. J. Zhang
Abstract:
This paper presents a model of a modified T-junction device for microsphere generation. The numerical model is developed using a commercial software package, COMSOL Multiphysics. In order to test the accuracy of the numerical model, multiple variables, such as the flow rate of the cross-flow, the fluid properties, and the structure and geometry of the microdevice, are varied. The results from the model are compared with the experimental results in terms of the diameter of the generated microspheres, and the comparison shows good agreement. The model is therefore useful for further optimization of the device and for feedback control of microsphere generation.
Keywords: CFD modeling, validation, microsphere generation, modified T-junction
Procedia PDF Downloads 710
2551 Day/Night Detector for Vehicle Tracking in Traffic Monitoring Systems
Authors: M. Taha, Hala H. Zayed, T. Nazmy, M. Khalifa
Abstract:
Recently, traffic monitoring has attracted the attention of computer vision researchers, and many algorithms have been developed to detect and track moving vehicles. In fact, vehicle tracking in daytime and in nighttime cannot be approached with the same techniques, due to the extremely different illumination conditions. Consequently, traffic-monitoring systems need a component to differentiate between daytime and nighttime scenes. In this paper, an HSV-based day/night detector is proposed for traffic monitoring scenes. The detector employs the hue histogram and the value histogram on the top half of the image frame. Experimental results show that extracting the brightness features along with the color features within the top region of the image is effective for classifying traffic scenes. In addition, the detector achieves high precision and recall rates and is feasible for real-time applications.
Keywords: day/night detector, daytime/nighttime classification, image classification, vehicle tracking, traffic monitoring
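The brightness cue on the top half of the frame can be sketched as follows; the thresholds are illustrative assumptions, and the paper's detector additionally uses the hue histogram for the color cue.

```python
import numpy as np

def is_daytime(rgb_frame, v_threshold=0.5, bright_fraction=0.4):
    """Classify a scene as daytime from the HSV value channel of the
    top half of the frame (the sky region in traffic footage)."""
    top = rgb_frame[: rgb_frame.shape[0] // 2].astype(float) / 255.0
    # The HSV "value" channel is the per-pixel max over the R, G, B channels.
    v = top.max(axis=2)
    # Daytime when a sufficient fraction of top-half pixels are bright.
    return float((v > v_threshold).mean()) > bright_fraction
```

A tracker would then switch between its daytime pipeline (appearance features) and nighttime pipeline (e.g. headlight detection) based on this flag.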
Procedia PDF Downloads 561
2550 An Approximation Algorithm for the Non Orthogonal Cutting Problem
Abstract:
We study the problem of cutting a rectangular material entity into smaller sub-entities of trapezoidal form with minimum waste of the material. This problem will be denoted TCP (Trapezoidal Cutting Problem). The TCP has many applications in the manufacturing processes of various industries: pipeline design (petrochemistry), airfoil design (aeronautics), or cutting the components of textile products. We introduce an orthogonal build to provide the optimal horizontal and vertical homogeneous strips. In this paper, we develop a general heuristic search based upon this orthogonal build. By solving two one-dimensional knapsack problems, we combine the horizontal and vertical homogeneous strips to give a non-orthogonal cutting pattern.
Keywords: combinatorial optimization, cutting problem, heuristic
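Each one-dimensional strip-selection subproblem is a 0/1 knapsack, which can be solved exactly by dynamic programming; the item widths and values below are illustrative, standing in for strip widths and the material area they recover.

```python
def knapsack(capacity, items):
    """Exact 0/1 knapsack by dynamic programming.
    items: list of (width, value); returns (best_value, chosen indices)."""
    n = len(items)
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i, (w, v) in enumerate(items, 1):
        for c in range(capacity + 1):
            dp[i][c] = dp[i - 1][c]                     # skip item i-1
            if w <= c and dp[i - 1][c - w] + v > dp[i][c]:
                dp[i][c] = dp[i - 1][c - w] + v         # take item i-1
    # Backtrack to recover which items were taken.
    c, chosen = capacity, []
    for i in range(n, 0, -1):
        if dp[i][c] != dp[i - 1][c]:
            chosen.append(i - 1)
            c -= items[i - 1][0]
    return dp[n][capacity], chosen[::-1]
```

Running one such knapsack over the horizontal strips and one over the vertical strips, then combining the selections, yields the non-orthogonal pattern the heuristic builds.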
Procedia PDF Downloads 543
2549 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning
Authors: Walid Cherif
Abstract:
Data mining has, over recent years, seen big advances because of the spread of the internet, which generates a tremendous volume of data every day, and also because of the immense advances in the technologies which facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining which determines the group to which each data instance belongs within a given dataset; they are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine learning-based, and each type has its own limits. Nowadays, data are becoming increasingly heterogeneous; consequently, current classification techniques encounter many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach which differs from existing algorithms in its reliability computations. Results of the proposed approach exceeded most common classification techniques, with an F-measure exceeding 97% on the Iris dataset.
Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification
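The paper's measure functions are its own contribution, but the general idea of similarity-based classification can be sketched with a simple nearest-centroid rule (an illustration under a plain inverse-distance resemblance, not the proposed measures):

```python
import math

def nearest_centroid_predict(train, labels, x):
    """Similarity-based classification: assign x to the class whose
    centroid it most resembles (here, by inverse Euclidean distance)."""
    classes = {}
    for row, lab in zip(train, labels):
        classes.setdefault(lab, []).append(row)
    best_lab, best_sim = None, -1.0
    for lab, rows in classes.items():
        centroid = [sum(col) / len(rows) for col in zip(*rows)]
        sim = 1.0 / (1.0 + math.dist(x, centroid))  # one simple resemblance measure
        if sim > best_sim:
            best_lab, best_sim = lab, sim
    return best_lab
```

The hybrid approach described would replace this single similarity with a combination of measures, weighted by their computed reliabilities.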
Procedia PDF Downloads 468
2548 Simscape Library for Large-Signal Physical Network Modeling of Inertial Microelectromechanical Devices
Authors: S. Srinivasan, E. Cretu
Abstract:
The information flow (e.g. block-diagram or signal flow graph) paradigm for the design and simulation of Microelectromechanical (MEMS)-based systems allows to model MEMS devices using causal transfer functions easily, and interface them with electronic subsystems for fast system-level explorations of design alternatives and optimization. Nevertheless, the physical bi-directional coupling between different energy domains is not easily captured in causal signal flow modeling. Moreover, models of fundamental components acting as building blocks (e.g. gap-varying MEMS capacitor structures) depend not only on the component, but also on the specific excitation mode (e.g. voltage or charge-actuation). In contrast, the energy flow modeling paradigm in terms of generalized across-through variables offers an acausal perspective, separating clearly the physical model from the boundary conditions. This promotes reusability and the use of primitive physical models for assembling MEMS devices from primitive structures, based on the interconnection topology in generalized circuits. The physical modeling capabilities of Simscape have been used in the present work in order to develop a MEMS library containing parameterized fundamental building blocks (area and gap-varying MEMS capacitors, nonlinear springs, displacement stoppers, etc.) for the design, simulation and optimization of MEMS inertial sensors. The models capture both the nonlinear electromechanical interactions and geometrical nonlinearities and can be used for both small and large signal analyses, including the numerical computation of pull-in voltages (stability loss). Simscape behavioral modeling language was used for the implementation of reduced-order macro models, that present the advantage of a seamless interface with Simulink blocks, for creating hybrid information/energy flow system models. 
Test bench simulations of the library models compare favorably with both analytical results and more in-depth finite element simulations performed in ANSYS. Separate MEMS-electronic integration tests were done on closed-loop MEMS accelerometers, where Simscape was used for modeling the MEMS device and Simulink for the electronic subsystem. Keywords: across-through variables, electromechanical coupling, energy flow, information flow, Matlab/Simulink, MEMS, nonlinear, pull-in instability, reduced order macro models, Simscape
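The pull-in instability mentioned above has a well-known closed-form solution for the idealized single-degree-of-freedom parallel-plate actuator. The following is a minimal sketch of that textbook formula, not the library's implementation; the spring constant, gap, and plate area are illustrative values:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, gap, area):
    """Classical pull-in voltage of a 1-DOF parallel-plate
    electrostatic actuator: V_pi = sqrt(8*k*g^3 / (27*eps0*A)).
    The stability loss occurs once the plate has travelled g/3."""
    return math.sqrt(8.0 * k * gap**3 / (27.0 * EPS0 * area))

# Example: spring k = 1 N/m, 2 um gap, 100 um x 100 um plate
v_pi = pull_in_voltage(k=1.0, gap=2e-6, area=(100e-6) ** 2)
```

A large-signal physical-network model reproduces this value numerically by sweeping the bias voltage until the electrostatic force overcomes the spring restoring force.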
Procedia PDF Downloads 142
2547 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression
Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras
Abstract:
In Industry 4.0, it is common to have a lot of sensor data. In this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help address this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources. Any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach. To this end, Isolation Forest, One-Class SVM and an auto-encoder were explored. For symbolic regression, the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method since, after training, the only requirement is the calculation of a polynomial, a useful feature in the digital twin context. Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression
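The "calculation of a polynomial" at inference time can be sketched as follows. This is an illustration rather than the authors' code: the coefficients stand in for a hypothetical expression recovered by PySR, and the residual threshold is arbitrary:

```python
def polynomial(x, coeffs):
    # Horner-scheme evaluation of the surrogate model found by
    # symbolic regression; coeffs are highest-degree first.
    result = 0.0
    for c in coeffs:
        result = result * x + c
    return result

def is_anomaly(x, y_measured, coeffs, threshold):
    # A sample is flagged when the sensor reading deviates from the
    # learned model by more than the calibrated threshold.
    return abs(y_measured - polynomial(x, coeffs)) > threshold

# Hypothetical model y ~ 2x^2 + 3 recovered by PySR, threshold 0.5
coeffs = [2.0, 0.0, 3.0]
normal = is_anomaly(1.0, 5.1, coeffs, 0.5)   # small residual -> False
faulty = is_anomaly(1.0, 9.0, coeffs, 0.5)   # large residual -> True
```

After training, this is the entire runtime cost, which is what makes the approach attractive as a low-resource detector inside a digital twin.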
Procedia PDF Downloads 125
2546 Flow Behavior and Performances of Centrifugal Compressor Stage Vaneless Diffusers
Authors: Y. Galerkin, O. Solovieva
Abstract:
Flow parameters are calculated in vaneless diffusers with a relative width of 0.014–0.10, constant along the radius. Inlet flow angles and similarity criteria were varied. Information about the flow structure is presented: meridian streamline configuration, the extent of full flow development, and flow separation. Polytropic efficiency and the loss and recovery coefficients are used to compare the diffusers’ effectiveness. An example of narrow-diffuser optimization by applying conical walls is presented. Three tapered variants of a wide diffuser are also compared. The work was carried out in the R&D laboratory “Gas dynamics of turbomachines” of TU SPb. Keywords: vaneless diffuser, relative width, flow angle, flow separation, loss coefficient, similarity criteria
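One common way to quantify diffuser effectiveness from the loss and recovery coefficients mentioned above is via inlet/outlet static and total pressures. This is an illustrative sketch of the standard definitions, not the authors' code, and the pressure values are hypothetical:

```python
def recovery_coefficient(p1, p01, p2):
    """Static pressure recovery: fraction of the inlet dynamic head
    (p01 - p1) converted into a static pressure rise (p2 - p1)."""
    return (p2 - p1) / (p01 - p1)

def loss_coefficient(p1, p01, p02):
    """Total pressure loss across the diffuser, normalized by the
    inlet dynamic head."""
    return (p01 - p02) / (p01 - p1)

# Illustrative pressures in Pa: inlet static/total, outlet static/total
cp = recovery_coefficient(p1=100000.0, p01=105000.0, p2=103000.0)
zeta = loss_coefficient(p1=100000.0, p01=105000.0, p02=104000.0)
```

A wide separated diffuser shows up as a low recovery coefficient and a high loss coefficient, which is why these two numbers are convenient for comparing variants.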
Procedia PDF Downloads 492
2545 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System
Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung-Hsien Lin
Abstract:
The RSA system is a great contribution to encryption and decryption. It is based on modular exponentiation, which amounts to calculation with very large numbers, and operating on such numbers is a very heavy burden for the CPU. To increase the computational speed, in addition to improving the algorithms themselves (the binary method, the sliding window method, the addition chain method, and so on), a cluster computer can be used. The cluster system is composed of laboratory computers on which MPICH2 is installed. The parallel procedures of the modular exponentiation can be processed by combining the sliding window method with the addition chain method. This significantly reduces the computational time of modular exponentiation whose operands are more than 512 bits, and even more than 1024 bits, long. Keywords: cluster system, modular exponentiation, sliding window, addition chain
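A minimal single-machine sketch of the sliding-window method mentioned above (the window size and names are illustrative; the paper's approach additionally combines it with addition chains and distributes the work across the MPICH2 cluster, which is not shown here):

```python
def mod_exp_sliding_window(base, exp, mod, k=4):
    """Left-to-right sliding-window modular exponentiation.
    Precomputes the odd powers base^1, base^3, ..., base^(2^k - 1)
    mod mod, then scans the exponent bits: one squaring per bit and
    one multiplication per window of up to k bits."""
    if mod == 1:
        return 0
    b2 = base * base % mod
    table = {1: base % mod}
    for i in range(3, 1 << k, 2):       # odd powers only
        table[i] = table[i - 2] * b2 % mod
    result = 1
    bits = bin(exp)[2:] if exp else "0"
    i, n = 0, len(bits)
    while i < n:
        if bits[i] == "0":
            result = result * result % mod   # skip over zero bits
            i += 1
        else:
            # longest window of <= k bits that ends in a 1
            j = min(i + k, n)
            while bits[j - 1] == "0":
                j -= 1
            window = int(bits[i:j], 2)
            for _ in range(j - i):
                result = result * result % mod
            result = result * table[window] % mod
            i = j
    return result
```

Compared with the plain binary method, grouping bits into windows trades a small precomputation table for roughly k-fold fewer multiplications, which is where the savings for 512- and 1024-bit operands come from.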
Procedia PDF Downloads 528
2544 Optimization and Evaluation of 177Lu-DOTATOC as a Potential Agent for Peptide Receptor Radionuclide Therapy
Authors: H. Yousefnia, M. S. Mousavi-Daramoroudi, S. Zolghadri, F. Abbasi-Davani
Abstract:
The high expression of somatostatin receptors on a wide range of human tumours makes them potential targets for peptide receptor radionuclide therapy. A series of octreotide analogues were synthesized, and [DOTA-DPhe1, Tyr3]octreotide (DOTATOC) showed advantageous properties in tumour models. In this study, 177Lu-DOTATOC was prepared with a radiochemical purity higher than 99% in 30 min under the optimized conditions. The biological behavior of the complex was studied after intravenous injection into Syrian rats. A major difference in uptake was observed compared to the 177LuCl3 solution, especially in somatostatin receptor-positive tissues such as the pancreas and adrenal glands. Keywords: biodistribution, 177Lu, octreotide, Syrian rats
Procedia PDF Downloads 452
2543 Split Monotone Inclusion and Fixed Point Problems in Real Hilbert Spaces
Authors: Francis O. Nwawuru
Abstract:
The convergence of algorithms for split monotone inclusion problems and fixed point problems of certain nonlinear mappings is analyzed in the setting of real Hilbert spaces. An inertial extrapolation term in the spirit of Polyak is incorporated to speed up the rate of convergence. Under standard assumptions, strong convergence of the proposed algorithm is established without computing the resolvent operator or invoking the Yosida approximation method. The stepsize involved in the algorithm does not depend on the spectral radius of the linear operator. Furthermore, applications of the proposed algorithm to some related optimization problems are also considered. Our result complements and extends numerous results in the literature. Keywords: fixed point, Hilbert space, monotone mapping, resolvent operators
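The effect of the inertial extrapolation term can be illustrated on a smooth toy problem. The sketch below is Polyak's classical heavy-ball iteration for minimizing a quadratic, not the paper's splitting algorithm; the step size alpha, inertia beta, and objective are illustrative:

```python
def heavy_ball_minimize(grad, x0, alpha=0.1, beta=0.5, iters=200):
    """Polyak-style inertial iteration:
    x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1}).
    The inertial term beta * (x_k - x_{k-1}) reuses the previous
    step direction to accelerate convergence."""
    x_prev, x = x0, x0
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Minimize f(x) = (x - 3)^2, so grad f(x) = 2 * (x - 3);
# the iterates overshoot slightly but spiral into the minimizer x = 3.
x_star = heavy_ball_minimize(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

In the inclusion-problem setting, the gradient step is replaced by a resolvent-free operator evaluation, but the extrapolation term plays the same accelerating role.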
Procedia PDF Downloads 57