Search results for: predicting models
Paper Count: 2800


2050 Generative Adversarial Network Based Fingerprint Anti-Spoofing Limitations

Authors: Yehjune Heo

Abstract:

Fingerprint anti-spoofing approaches have been actively developed and applied in real-world applications. One of the main problems in fingerprint anti-spoofing is a lack of robustness to unseen samples, especially in real-world scenarios. A possible solution is to generate artificial but realistic fingerprint samples and use them for training in order to achieve good generalization. This paper presents experimental and comparative results with currently popular GAN-based methods, using realistic synthesis of fingerprints during training in order to increase performance. Among the various GAN models, StyleGAN, the most popular, is used for the experiments. The CNN models were first trained on a dataset that did not contain generated fake images, and the accuracy and mean average error rate were recorded. Then the generated fake images (fake images of live fingerprints and fake images of spoof fingerprints) were each combined with the original images (real images of live fingerprints and real images of spoof fingerprints), and the various CNN models were retrained. The best performance of each CNN model trained with the dataset containing generated fake images was recorded, again in terms of accuracy and mean average error rate. We observe that current GAN-based approaches need significant improvement in anti-spoofing performance, although the overall quality of the synthesized fingerprints seems reasonable. We include an analysis of this performance degradation, especially with a small number of samples. In addition, we suggest several approaches towards improved generalization with a small number of samples, focusing on what GAN-based approaches should and should not learn.
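
As an illustration of the protocol described above (training CNN classifiers first on real live/spoof images only, then on a mixture of real and GAN-generated images, and comparing accuracy), the following is a minimal PyTorch sketch. The tensors stand in for pre-processed fingerprint patches; the StyleGAN synthesis step and the paper's actual CNN architectures are not reproduced, and all shapes and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: compare a CNN trained on real data only vs. real + GAN-generated data.
# The fingerprint patches below are random placeholders; StyleGAN synthesis is not shown.
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

def make_cnn():
    # Small stand-in CNN for live-vs-spoof classification (2 classes, 64x64 grayscale input).
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(), nn.Linear(32 * 16 * 16, 2))

def train_and_eval(train_set, test_set, epochs=5):
    model = make_cnn()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    loader = DataLoader(train_set, batch_size=32, shuffle=True)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    # Accuracy on the held-out set of real images.
    x_test, y_test = test_set.tensors
    with torch.no_grad():
        return (model(x_test).argmax(1) == y_test).float().mean().item()

# Dummy 64x64 patches; label 0 = live, 1 = spoof.
real_train = TensorDataset(torch.randn(400, 1, 64, 64), torch.randint(0, 2, (400,)))
fake_train = TensorDataset(torch.randn(400, 1, 64, 64), torch.randint(0, 2, (400,)))  # GAN output
test_set   = TensorDataset(torch.randn(200, 1, 64, 64), torch.randint(0, 2, (200,)))

print("real only      :", train_and_eval(real_train, test_set))
print("real + GAN fake:", train_and_eval(ConcatDataset([real_train, fake_train]), test_set))
```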

Keywords: Anti-spoofing, CNN, fingerprint recognition, GAN.

2049 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of the products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon the existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online hard-negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
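
A minimal sketch of such a two-stage pipeline is given below, assuming torchvision's Faster R-CNN and ResNet-18 implementations; the weights, gallery embeddings, thresholds, and crop handling are placeholders rather than the configuration used in the paper.

```python
# Minimal sketch of the two-stage pipeline: a Faster R-CNN localizer proposes product
# regions, and a ResNet-18 encoder trained with a triplet loss embeds each crop for
# nearest-neighbour matching against a gallery. All weights and data are placeholders.
import torch
import torch.nn as nn
import torchvision
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Stage 1: object localizer (pretrained weights omitted here for brevity).
detector = fasterrcnn_resnet50_fpn(num_classes=2)  # background vs. "product"
detector.eval()

# Stage 2: ResNet-18 image encoder producing 128-d embeddings.
encoder = torchvision.models.resnet18()
encoder.fc = nn.Linear(encoder.fc.in_features, 128)

triplet_loss = nn.TripletMarginLoss(margin=1.0)

def train_step(anchor, positive, negative, optimizer):
    """One update on an (anchor, positive, negative) crop triplet."""
    optimizer.zero_grad()
    loss = triplet_loss(encoder(anchor), encoder(positive), encoder(negative))
    loss.backward()
    optimizer.step()
    return loss.item()

def recognise(rack_image, gallery_embeddings, gallery_labels, score_thresh=0.7):
    """Detect product regions in a rack image and assign each to its nearest gallery class."""
    encoder.eval()
    results = []
    with torch.no_grad():
        det = detector([rack_image])[0]
        for box, score in zip(det["boxes"], det["scores"]):
            if score < score_thresh:
                continue
            x1, y1, x2, y2 = box.int().tolist()
            crop = rack_image[:, y1:y2, x1:x2].unsqueeze(0)
            crop = nn.functional.interpolate(crop, size=(224, 224))
            dists = torch.cdist(encoder(crop), gallery_embeddings)
            results.append((box.tolist(), gallery_labels[dists.argmin().item()]))
    return results
```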

Keywords: Retail stores, Faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition.

2048 Complex-Valued Neural Networks for Blind Equalization of Time-Varying Channels

Authors: Rajoo Pandey

Abstract:

Most of the commonly used blind equalization algorithms are based on the minimization of a nonconvex and nonlinear cost function, and a neural network gives a smaller residual error than a linear structure. The efficacy of complex-valued feedforward neural networks for blind equalization of linear and nonlinear communication channels has been confirmed by many studies. In this paper, we present two neural network models for blind equalization of time-varying channels for M-ary QAM and PSK signals. The complex-valued activation functions suitable for these signal constellations in a time-varying environment are introduced, and the learning algorithms based on the CMA cost function are derived. The improved performance of the proposed models is confirmed through computer simulations.
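
For reference, the CMA cost function minimized here is J = E[(|y(n)|^2 - R2)^2], where R2 = E[|s|^4]/E[|s|^2] is the constant-modulus dispersion constant of the constellation. The sketch below implements only the classical linear CMA baseline on a 4-QAM signal, not the complex-valued neural equalizers of the paper; the channel taps, step size, and noise level are illustrative.

```python
# Minimal sketch of the Constant Modulus Algorithm (CMA) for blind equalization of a
# 4-QAM signal over a fixed linear channel (classical baseline only).
import numpy as np

rng = np.random.default_rng(0)

# 4-QAM symbols through a simple complex channel plus noise.
symbols = (rng.choice([-1, 1], 4000) + 1j * rng.choice([-1, 1], 4000)) / np.sqrt(2)
channel = np.array([1.0, 0.4 + 0.3j, 0.2j])
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.01 * (rng.standard_normal(4000) + 1j * rng.standard_normal(4000))

n_taps, mu = 11, 1e-3
w = np.zeros(n_taps, dtype=complex)
w[n_taps // 2] = 1.0                                   # centre-spike initialization
R2 = np.mean(np.abs(symbols) ** 4) / np.mean(np.abs(symbols) ** 2)   # dispersion constant

for n in range(n_taps, len(received)):
    x = received[n - n_taps:n][::-1]                   # regressor vector
    y = np.vdot(w, x)                                  # equalizer output y = w^H x
    e = y * (np.abs(y) ** 2 - R2)                      # CMA error term
    w -= mu * e.conjugate() * x                        # stochastic-gradient step on J

eq = np.array([np.vdot(w, received[n - n_taps:n][::-1]) for n in range(n_taps, len(received))])
print("residual CM cost:", np.mean((np.abs(eq) ** 2 - R2) ** 2))
```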

Keywords: Blind Equalization, Neural Networks, Constant Modulus Algorithm, Time-varying channels.

2047 Fuzzy Control of Macroeconomic Models

Authors: Andre A. Keller

Abstract:

Optimal control is one possible approach to controlling a dynamic system, for example with a linear quadratic regulator based on Pontryagin's principle or on the dynamic programming method. Stochastic disturbances may affect the coefficients (multiplicative disturbances) or the equations (additive disturbances), provided that the shocks are not too great. Nevertheless, this approach encounters difficulties when uncertainties are very large or when the probability calculus is of no help with very imprecise data. Fuzzy logic contributes a pragmatic solution to such problems since it operates on fuzzy numbers. A fuzzy controller acts as an artificial decision maker that operates in a closed-loop system in real time. This contribution seeks to explore the tracking problem and the control of dynamic macroeconomic models using a fuzzy learning algorithm. A two-input, single-output (TISO) fuzzy model is applied to the linear fluctuation model of Phillips and to the nonlinear growth model of Goodwin.
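
A minimal sketch of a TISO fuzzy controller of this kind is shown below, driving a Samuelson-type multiplier-accelerator stand-in (not the Phillips or Goodwin calibrations of the paper) toward a target output. The membership functions, rule base, singleton consequents, and model coefficients are all illustrative assumptions.

```python
# Minimal sketch of a two-input single-output (TISO) fuzzy controller stabilizing a
# multiplier-accelerator fluctuation model. All parameters are illustrative placeholders.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Linguistic terms Negative / Zero / Positive on a normalized [-1, 1] universe.
MF = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
# Rule base: (error term, delta-error term) -> singleton output level.
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"): 0.0, ("Z", "P"): 0.5,
         ("P", "N"): 0.0, ("P", "Z"): 0.5, ("P", "P"): 1.0}

def fuzzy_control(e, de):
    """TISO fuzzy inference: min t-norm for rule firing, weighted-average defuzzification."""
    num = den = 0.0
    for (te, tde), u in RULES.items():
        w = min(tri(e, *MF[te]), tri(de, *MF[tde]))
        num, den = num + w * u, den + w
    return num / den if den > 0 else 0.0

# Samuelson-type multiplier-accelerator: Y_t = c(1+v)Y_{t-1} - c*v*Y_{t-2} + G_t
c, v, target = 0.6, 1.2, 100.0
Y, G = [80.0, 85.0], 32.0
prev_e = (target - Y[-1]) / 20.0
for t in range(40):
    e = (target - Y[-1]) / 20.0              # normalized tracking error
    de = e - prev_e                          # change of error
    G += 5.0 * fuzzy_control(e, de)          # fuzzy adjustment of government expenditure
    Y.append(c * (1 + v) * Y[-1] - c * v * Y[-2] + G)
    prev_e = e

print(f"output after {len(Y) - 2} periods: {Y[-1]:.1f} (target {target})")
```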

Keywords: fuzzy control, macroeconomic model, multiplier-accelerator, nonlinear accelerator, stabilization policy.

2046 A Scatter Search and Help Policies Approaches for a New Mixed Model Assembly Lines Sequencing Problem

Authors: N. Manavizadeh, M. Rabbani, H. Sotudian, F. Jolai

Abstract:

Mixed model production is the practice of assembling several distinct and different models of a product on the same assembly line without changeovers and then sequencing those models in a way that smooths the demand for upstream components. In this paper, we consider an objective function that minimizes total stoppage time and total idle time, with sequence-dependent setup times. Many studies have been done on mixed model assembly lines, but in this paper we specifically focus on reducing the idle times, which is possible through various help policies. To improve the solutions, several cases were developed and about 40 test problems were considered. We use scatter search for optimization, and experimental results show the behavior of the method and the efficiency of our algorithm. Scatter search combined with help policies can produce high-quality solutions, which is why this approach is adopted in this paper.
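
A compact scatter-search skeleton for a sequencing problem is sketched below. The cost function is a generic stand-in for the paper's stoppage/idle-time evaluation under help policies, and the reference-set size, iteration count, and instance data are illustrative assumptions.

```python
# Compact scatter-search skeleton for a model-sequencing problem (diversification,
# local improvement, reference set, subset combination). The objective is a placeholder.
import random

random.seed(1)
MODELS = list(range(8))            # 8 product models to sequence (placeholder instance)
COST_MATRIX = [[random.randint(1, 9) for _ in MODELS] for _ in MODELS]  # stand-in setup/idle costs

def cost(seq):
    return sum(COST_MATRIX[a][b] for a, b in zip(seq, seq[1:]))

def improve(seq):
    """Improvement method: pairwise swaps until no further gain."""
    best, improved = list(seq), True
    while improved:
        improved = False
        for i in range(len(best) - 1):
            for j in range(i + 1, len(best)):
                cand = best[:]
                cand[i], cand[j] = cand[j], cand[i]
                if cost(cand) < cost(best):
                    best, improved = cand, True
    return best

def combine(a, b):
    """Order-preserving combination of two reference solutions."""
    head = a[:len(a) // 2]
    return head + [m for m in b if m not in head]

# Diversification: random permutations, each improved locally.
population = [improve(random.sample(MODELS, len(MODELS))) for _ in range(20)]
ref_set = sorted(population, key=cost)[:5]            # reference set of best solutions

for _ in range(30):                                    # scatter-search iterations
    new_solutions = [improve(combine(a, b)) for a in ref_set for b in ref_set if a != b]
    ref_set = sorted(ref_set + new_solutions, key=cost)[:5]

print("best sequence:", ref_set[0], "cost:", cost(ref_set[0]))
```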

Keywords: mixed model assembly lines, scatter search, help policies, idle time, stoppage time.

2045 Inverse Heat Transfer Analysis of a Melting Furnace Using Levenberg-Marquardt Method

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents a simple inverse heat transfer procedure for predicting the wall erosion and the time-varying thickness of the protective bank that covers the inside surface of the refractory brick wall of a melting furnace. The direct problem is solved by using the Finite-Volume model. The melting/solidification process is modeled using the enthalpy method. The inverse procedure rests on the Levenberg-Marquardt method combined with the Broyden method. The effect of the location of the temperature sensors and of the measurement noise on the inverse predictions is investigated. Recommendations are made concerning the location of the temperature sensor.
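
A minimal sketch of this kind of inverse procedure is given below: an unknown bank thickness is recovered from simulated in-wall temperature readings using SciPy's Levenberg-Marquardt least-squares solver. The direct model here is a simple one-dimensional steady conduction network, not the paper's enthalpy-based finite-volume model, and all property values and sensor locations are assumptions.

```python
# Minimal sketch of the inverse procedure: recover an unknown bank thickness from
# simulated wall-temperature measurements via Levenberg-Marquardt least squares.
import numpy as np
from scipy.optimize import least_squares

k_bank, k_brick, L_brick = 1.5, 2.0, 0.30      # W/mK, W/mK, m (assumed properties)
T_bath, h_out, T_out = 1200.0, 15.0, 60.0      # melt temperature, outside convection, ambient

def sensor_temperature(bank_thickness, depth_in_brick):
    """Direct problem: temperature at a sensor embedded `depth_in_brick` from the hot face."""
    R_total = bank_thickness / k_bank + L_brick / k_brick + 1.0 / h_out
    q = (T_bath - T_out) / R_total                      # steady heat flux through the wall
    return T_bath - q * (bank_thickness / k_bank + depth_in_brick / k_brick)

sensor_depths = np.array([0.05, 0.15, 0.25])            # sensor locations inside the brick (m)
true_thickness = 0.08
measurements = sensor_temperature(true_thickness, sensor_depths)
measurements += np.random.default_rng(0).normal(0.0, 2.0, size=3)   # measurement noise

def residuals(params):
    return sensor_temperature(params[0], sensor_depths) - measurements

fit = least_squares(residuals, x0=[0.02], method="lm")  # Levenberg-Marquardt
print("estimated bank thickness [m]:", round(fit.x[0], 4))
```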

Keywords: Melting furnace, inverse heat transfer, enthalpy method, Levenberg–Marquardt Method.

2044 Equilibrium and Rate Based Simulation of MTBE Reactive Distillation Column

Authors: Debashish Panda, Kannan A.

Abstract:

Equilibrium and rate-based models have been applied in the simulation of methyl tertiary-butyl ether (MTBE) synthesis through reactive distillation. Temperature and composition profiles were compared for both models, and it was found that the two sets of profiles, though qualitatively similar in trend, differ significantly in quantitative terms. In the rate-based method (RBM), multicomponent mass transfer coefficients have been incorporated to describe interphase mass transfer. The MTBE mole fraction in the bottom stream is found to be 0.9914 for the equilibrium model (EQM) and only 0.9904 for the RBM when the same column configuration is preserved. The individual tray efficiencies were incorporated in the EQM and simulations were carried out. Dynamic simulations have also been carried out for the two column configurations and compared.

Keywords: Aspen Plus, equilibrium stage model, methyl tertiary-butyl ether, rate based model.

2043 A Control Model for the Dismantling of Industrial Plants

Authors: Florian Mach, Eric Hund, Malte Stonis

Abstract:

The dismantling of disused industrial facilities such as nuclear power plants or refineries is an enormous challenge for the planning and control of the logistic processes. Existing control models do not meet the requirements for a proper dismantling of industrial plants. Therefore, the paper presents an approach for the control of dismantling and post-processing processes (e.g. decontamination) in plant decommissioning. In contrast to existing approaches, the dismantling sequence and depth are selected depending on the capacity utilization of required post-processing processes by also considering individual characteristics of respective dismantling tasks (e.g. decontamination success rate, uncertainties regarding the process times). The results can be used in the dismantling of industrial plants (e.g. nuclear power plants) to reduce dismantling time and costs by avoiding bottlenecks such as capacity constraints.

Keywords: Dismantling management, logistics planning and control models, nuclear power plant dismantling, reverse logistics.

2042 Therapeutic Product Preparation Bioprocess Modeling

Authors: Mihai Caramihai, Irina Severin, Ana Aurelia Chirvase, Adrian Onu, Cristina Tanase, Camelia Ungureanu

Abstract:

An immunomodulator bioproduct is prepared in a batch bioprocess with a modified bacterium Pseudomonas aeruginosa. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract and sodium chloride. The optimal bioprocess parameters were determined: temperature 37 °C, agitation speed 300 rpm, aeration rate 40 L/min, pressure 0.5 bar, Dow Corning Antifoam M at a maximum of 4% of the medium volume, and duration 6 hours. Bioprocesses of this kind are considered difficult to control because their dynamic behavior is highly nonlinear and time varying. The aim of the paper is to present and compare different models based on experimental data. The analysis criteria were modeling error and convergence rate. The parameter estimation and modeling analysis were done using Table Curve 2D. The preliminary conclusions indicate Andrews' model, with a maximum specific growth rate of the bacterium in the range of 0.8 h⁻¹.
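
Andrews' (substrate-inhibition) model referred to above has the form mu(S) = mu_max * S / (Ks + S + S^2/Ki). A minimal fitting sketch with synthetic data (not the paper's measurements) is shown below.

```python
# Minimal sketch of fitting Andrews' kinetic model to batch growth-rate data.
# The data points and initial guesses are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def andrews(S, mu_max, Ks, Ki):
    return mu_max * S / (Ks + S + S**2 / Ki)

# Substrate concentration (g/L) vs. observed specific growth rate (1/h): synthetic data.
S_data  = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
mu_data = np.array([0.28, 0.45, 0.60, 0.70, 0.66, 0.52, 0.35])

params, _ = curve_fit(andrews, S_data, mu_data, p0=[0.8, 1.0, 20.0])
mu_max, Ks, Ki = params
print(f"mu_max = {mu_max:.2f} 1/h, Ks = {Ks:.2f} g/L, Ki = {Ki:.2f} g/L")
```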

Keywords: bioprocess modeling, Pseudomonas aeruginosa, kinetic models.

2041 Investigation of the Neutral Axis in the Positive Moment Region of Composite Beams

Authors: Su-Young Jeong, Won-Kee Hong, Seon-Chee Park, Gyun-Taek Lim, Eric Kim

Abstract:

Researchers investigate various strategies to develop composite beams and maximize their structural advantages. This study conducted experiments and analysis of changes in the neutral axis in the positive moment region of a Green Beam. Strain compatibility analysis was used, and its efficiency was demonstrated by comparing experimental and analytical values. In the comparison of the neutral axis, the difference between experimental and analytical values was found to range from 8.8% to 26.2%. It was determined that strain compatibility analysis can be useful for predicting the behavior of composite beams, being able to predict not only the elastic neutral axis location of the composite member but also the plastic location.

Keywords: Composite beam, Strain compatibility, Neutral axis, Green Beam

2040 Injury Prediction for Soccer Players Using Machine Learning

Authors: Amiel Satvedi, Richard Pyne

Abstract:

Injuries in professional sports occur on a regular basis. Some may be minor, while others can have a huge impact on a player's career and earning potential. In soccer, there is a high risk of players picking up injuries during game time. This research work seeks to help soccer players reduce the risk of getting injured by predicting the likelihood of injury in the near future and then providing recommendations for intervention. The injury prediction tool will use a soccer player's number of minutes played on the field, number of appearances, distance covered and performance data for the current and previous seasons as variables to conduct statistical analysis and provide injury predictions using a machine learning linear regression model.
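
A minimal sketch of such a predictor is shown below, assuming scikit-learn's LinearRegression; the feature names and the synthetic player data are placeholders for the workload and performance variables described above, not the study's dataset.

```python
# Minimal sketch: linear regression on per-player workload features to score injury likelihood.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_players = 200
X = np.column_stack([
    rng.integers(500, 3500, n_players),     # minutes played, current season
    rng.integers(5, 40, n_players),         # appearances
    rng.normal(250, 40, n_players),         # distance covered (km)
    rng.integers(0, 4, n_players),          # injuries in previous season
])
# Synthetic target: injury likelihood score in [0, 1].
y = np.clip(0.0001 * X[:, 0] + 0.05 * X[:, 3] + rng.normal(0, 0.05, n_players), 0, 1)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_train, y_train)
risk = model.predict(X_test)
print("R^2 on held-out players:", round(model.score(X_test, y_test), 3))
print("flag for intervention:", (risk > 0.3).sum(), "of", len(risk), "players")
```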

Keywords: Injury predictor, soccer injury prevention, machine learning in soccer, big data in soccer.

2039 Boosting Method for Automated Feature Space Discovery in Supervised Quantum Machine Learning Models

Authors: Vladimir Rastunkov, Jae-Eun Park, Abhijit Mitra, Brian Quanz, Steve Wood, Christopher Codella, Heather Higgins, Joseph Broz

Abstract:

Quantum Support Vector Machines (QSVM) have become an important tool in research on, and applications of, quantum kernel methods. In this work, we propose a boosting approach for building ensembles of QSVM models and assess the performance improvement across multiple datasets. This approach is derived from the best ensemble-building practices that worked well in traditional machine learning and thus should push the limits of quantum model performance even further. We find that in some cases a single QSVM model with tuned hyperparameters is sufficient to model the data, while in others an ensemble of QSVMs that are forced to explore the feature space via the proposed method is beneficial.

Keywords: QSVM, Quantum Support Vector Machines, quantum kernel, boosting, ensemble.

2038 An Examination of the Factors Influencing Software Development Effort

Authors: Zhizhong Jiang, Peter Naudé

Abstract:

Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4,106 developed projects, this study statistically examines the factors that influence development effort. The factors found to be significant for effort are project size, average number of developers that worked on the project, type of development, development language, development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the benefit of such tools is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for the prediction of effort which is both simple and more accurate than previous models.
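
A minimal sketch of a parsimonious parametric model of this kind (a log-linear regression of effort on project size in function points and team size) is shown below; the synthetic data and fitted coefficients are placeholders, not the study's model.

```python
# Minimal sketch: log-linear parametric effort model, effort ~ a * size^b * team^c.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
size_fp = rng.uniform(50, 3000, 300)                  # project size in function points
team = rng.integers(1, 25, 300)                       # average number of developers
# Synthetic "true" relation: effort grows roughly as size^1.05 * team^0.4 (person-hours).
effort = 8.0 * size_fp**1.05 * team**0.4 * rng.lognormal(0.0, 0.2, 300)

X = np.column_stack([np.log(size_fp), np.log(team)])
model = LinearRegression().fit(X, np.log(effort))
a, (b, c) = model.intercept_, model.coef_
print(f"effort ~ {np.exp(a):.1f} * size^{b:.2f} * team^{c:.2f}")
pred = np.exp(model.predict([[np.log(800), np.log(6)]])[0])
print("predicted effort for 800 FP, 6 developers:", round(float(pred)), "person-hours")
```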

Keywords: Development effort, function points, team size, development language, CASE tool, rapid application development.

2037 On the Mathematical Structure and Algorithmic Implementation of Biochemical Network Models

Authors: Paola Lecca

Abstract:

Modeling and simulation of biochemical reactions is of great interest in the context of systems biology. The central dogma of this re-emerging area states that it is the system dynamics and organizing principles of complex biological phenomena that give rise to the functioning and function of cells. Cell functions, such as growth, division, differentiation and apoptosis, are temporal processes that can be understood if they are treated as dynamic systems. Systems biology focuses on an understanding of functional activity from a system-wide perspective and, consequently, it is defined by two key questions: (i) how do the components within a cell interact so as to bring about its structure and functioning? (ii) How do cells interact so as to develop and maintain higher levels of organization and functions? In recent years, wet-lab biologists embraced mathematical modeling and simulation as two essential means toward answering the above questions. The credo of dynamical systems theory is that the behavior of a biological system is given by the temporal evolution of its state. Our understanding of the time behavior of a biological system can be measured by the extent to which a simulation mimics the real behavior of that system. Deviations of a simulation indicate either limitations or errors in our knowledge. The aim of this paper is to summarize and review the main conceptual frameworks in which models of biochemical networks can be developed. In particular, we review the stochastic molecular modelling approaches, by reporting the principal conceptualizations suggested by A. A. Markov, P. Langevin, A. Fokker, M. Planck, D. T. Gillespie, N. G. van Kampen, and recently by D. Wilkinson, O. Wolkenhauer, P. S. Jöberg and by the author.
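
As a concrete example of the stochastic molecular modelling approaches reviewed (here, Gillespie's direct method), the following sketch simulates a toy gene-expression network; the rate constants and species are illustrative and not taken from the paper.

```python
# Minimal sketch of Gillespie's stochastic simulation algorithm (SSA) for a toy network:
# DNA -> DNA + mRNA (k1), mRNA -> mRNA + P (k2), mRNA -> 0 (k3), P -> 0 (k4).
import numpy as np

rng = np.random.default_rng(0)
k = np.array([0.5, 2.0, 0.1, 0.05])          # stochastic rate constants
state = np.array([1, 0, 0])                  # [DNA, mRNA, protein]
stoich = np.array([[0, 1, 0],                # transcription
                   [0, 0, 1],                # translation
                   [0, -1, 0],               # mRNA decay
                   [0, 0, -1]])              # protein decay

def propensities(x):
    dna, mrna, prot = x
    return np.array([k[0] * dna, k[1] * mrna, k[2] * mrna, k[3] * prot])

t, t_end = 0.0, 100.0
while t < t_end:
    a = propensities(state)
    a0 = a.sum()
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)                      # time to next reaction
    reaction = rng.choice(len(a), p=a / a0)             # which reaction fires
    state = state + stoich[reaction]

print("state at t =", round(t, 1), ": DNA, mRNA, protein =", state.tolist())
```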

Keywords: Mathematical structure, algorithmic implementation, biochemical network models.

2036 Appraisal of Methods for Identifying, Mapping, and Modelling of Fluvial Erosion in a Mining Environment

Authors: F. F. Howard, I. Yakubu, C. B. Boye, J. S. Y. Kuma

Abstract:

Natural and human activities, such as mining operations, expose the natural soil to adverse environmental conditions, leading to contamination of soil, groundwater, and surface water, which has negative effects on humans, flora, and fauna. Bare or partly exposed soil is most liable to fluvial erosion. This paper enumerates various methods used to identify, map, and model fluvial erosion in a mining environment. Classical, Artificial Intelligence (AI), and GIS methods are reviewed. One of the many classical methods used to estimate river erosion is the Revised Universal Soil Loss Equation (RUSLE) model. The RUSLE model is easy to use, but its reliance on empirical relationships that may not always be applicable to specific circumstances or locations is a flaw. Other classical models for estimating fluvial erosion are the Soil and Water Assessment Tool (SWAT) and the Universal Soil Loss Equation (USLE). These models offer a more complete understanding of the underlying physical processes and encompass a wider range of situations. Although more difficult to use, they depend on the availability and dependability of input data for correctness. AI can help deal with multivariate and complex difficulties, can predict soil loss with higher accuracy than traditional methods, and can also be used to build unique models for identifying degraded areas. AI techniques have become popular as an alternative predictor for degraded environments. However, this research proposes a hybrid of classical, AI, and GIS methods for efficient and effective modelling of fluvial erosion.
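
For reference, the RUSLE estimate mentioned above is A = R * K * LS * C * P. A minimal sketch with illustrative factor values for a bare mine-site cell (not measured parameters) is shown below.

```python
# Minimal sketch of the RUSLE annual soil loss estimate A = R * K * LS * C * P.
def rusle_soil_loss(R, K, LS, C, P):
    """Annual soil loss A (t/ha/yr) from rainfall erosivity R, soil erodibility K,
    slope length-steepness LS, cover-management C, and support-practice P factors."""
    return R * K * LS * C * P

A = rusle_soil_loss(R=550.0,   # MJ mm / (ha h yr)
                    K=0.035,   # t ha h / (ha MJ mm)
                    LS=1.8,
                    C=0.45,    # sparse cover on disturbed ground
                    P=1.0)     # no support practice
print(f"estimated soil loss: {A:.1f} t/ha/yr")
```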

Keywords: Fluvial erosion, classical methods, Artificial Intelligence, Geographic Information System.

2035 Convection through Light Weight Timber Constructions with Mineral Wool

Authors: J. Schmidt, O. Kornadt

Abstract:

The major part of lightweight timber constructions consists of insulation. Mineral wool is the most commonly used insulation due to its cost efficiency and easy handling. The fiber orientation and porosity of this insulation material enable air flow-through: the air flow resistance is low. If leakage occurs in the insulated bay section, the convective flow may cause energy losses and infiltration of the exterior wall with moisture and particles. In particular, the infiltrated moisture may lead to thermal bridges and to the growth of health-endangering mould and mildew. In order to prevent this problem, different numerical calculation models have been developed. All models developed so far have potential for further improvement. The implementation of the flow-through properties of mineral wool insulation may help to improve the existing models. Assuming that the real pressure difference between the interior and exterior surface is larger than the pressure difference prescribed in the standard test procedure for mineral wool, ISO 9053 / EN 29053, measurements were performed using the measurement setup for research on convective moisture transfer "MSRCMT". These measurements show that structural inhomogeneities of mineral wool affect the permeability only at higher pressure differences, as applied in the MSRCMT. Additional microscopic investigations show that the location of a leak within the construction has a crucial influence on the air flow-through and the infiltration rate. The results clearly indicate that the empirical values for the acoustic resistance of mineral wool should not be used for the calculation of convective transfer mechanisms.

Keywords: convection, convective transfer, infiltration, mineral wool, permeability, resistance, leakage.

2034 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Petar Penchev

Abstract:

The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.

Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.

2033 A Study on the Secure ebXML Transaction Models

Authors: Dongkyoo Shin, Dongil Shin, Sukil Cha, Seyoung Kim

Abstract:

ebXML (Electronic Business using eXtensible Markup Language) is an e-business standard, sponsored by UN/CEFACT and OASIS, which enables enterprises to exchange business messages, conduct trading relationships, communicate data in common terms and define and register business processes. While there is tremendous e-business value in ebXML, security remains an unsolved problem and one of the largest barriers to adoption. XML security technologies that have emerged recently offer the extensibility and flexibility suitable for security implementations such as encryption, digital signatures, access control and authentication. In this paper, we propose ebXML business transaction models that allow trading partners to securely exchange XML-based business transactions by employing XML security technologies. We show how each XML security technology meets the ebXML standard by constructing test software and validating messages between the trading partners.

Keywords: Electronic commerce, e-business standard, ebXML, XML security, secure business transaction.

2032 Instructional Design Practitioners in Malaysia: Skills and Issues

Authors: Irfan N. Umar, Yong Su-Lyn

Abstract:

The purpose of this research is to determine the knowledge and skills possessed by instructional design (ID) practitioners in Malaysia. As ID is a relatively new field in the country and there seems to be an absence of any studies on its community of practice, the main objective of this research is to discover the tasks and activities performed by ID practitioners in educational and corporate organizations, as suggested by the International Board of Standards for Training, Performance and Instruction. This includes finding out the ID models applied in the course of their work. This research also attempts to identify the barriers and issues as to why some ID tasks and activities are rarely or never conducted. The methodology employed in this descriptive study was a survey questionnaire sent to 30 instructional designers nationwide. The results showed that the majority of the tasks and activities are carried out frequently enough, but omissions do occur due to reasons such as being outside the job scope, the decision having already been made at a higher level, and a lack of knowledge and skills. Further investigations of a qualitative nature should be conducted to achieve a more in-depth understanding of ID practices in Malaysia.

Keywords: instructional design, ID competencies, ID models, IBSTPI

2031 Data Envelopment Analysis under Uncertainty and Risk

Authors: P. Beraldi, M. E. Bruni

Abstract:

Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated by considering a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique by providing the decision maker with useful insights depending on his or her degree of risk aversion.
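
For context, the deterministic baseline that such stochastic and risk-averse formulations extend is the input-oriented CCR envelopment model: minimize theta subject to X·lambda <= theta·x_o, Y·lambda >= y_o, lambda >= 0. A minimal sketch solving it with SciPy's linprog on illustrative data is shown below; the discretely distributed outputs and the CVaR model of the paper are not reproduced.

```python
# Minimal sketch of the deterministic input-oriented CCR DEA model as a linear programme.
import numpy as np
from scipy.optimize import linprog

# 5 decision making units, 2 inputs (rows of X), 2 outputs (rows of Y): placeholder data.
X = np.array([[20, 30, 40, 20, 50],     # input 1 (e.g. labour)
              [30, 20, 60, 40, 30]])    # input 2 (e.g. capital)
Y = np.array([[40, 50, 70, 30, 60],     # output 1
              [35, 20, 40, 25, 50]])    # output 2

def ccr_efficiency(o):
    """Efficiency of DMU o: min theta s.t. X @ lam <= theta * x_o, Y @ lam >= y_o, lam >= 0."""
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                          # decision vector [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [o]], X])                    # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # -Y @ lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for o in range(X.shape[1]):
    print(f"DMU {o + 1}: efficiency = {ccr_efficiency(o):.3f}")
```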

Keywords: DEA, Stochastic Programming, Ex-ante evaluation technique, Conditional Value at Risk.

2030 Image Processing Using Color and Object Information for Wireless Capsule Endoscopy

Authors: Jin-Hee Park, Yong-Gyu Lee, Gilwon Yoon

Abstract:

Wireless capsule endoscopy provides real-time images of the digestive tract. Capsule images are usually of low resolution and are highly diverse because the capsule travels through various regions of the human body. Color information has been a primary reference in predicting abnormalities such as bleeding; often, color alone is not sufficient for this purpose. In this study, we took morphological shapes into account as an additional but important criterion. First, we processed gastric images in order to identify the various objects in the image. Then, we analyzed the color information within each object. In this way, we could remove unnecessary information and increase the accuracy. Compared to our previous investigations, we could handle images of various degrees of brightness and improve our diagnostic algorithm.
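
A minimal sketch of this two-step analysis (object identification followed by colour analysis within each object) is shown below using OpenCV; the input file name, thresholds, and the bleeding rule are illustrative placeholders, not the study's calibrated criteria.

```python
# Minimal sketch: segment candidate objects in a capsule frame, then analyse HSV colour
# statistics inside each object region. All thresholds are illustrative.
import cv2
import numpy as np

image = cv2.imread("capsule_frame.png")                # hypothetical capsule endoscopy frame
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

# Step 1: object identification -- Otsu threshold on the value channel, contour extraction.
_, mask = cv2.threshold(hsv[:, :, 2], 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Step 2: colour analysis restricted to each identified object.
for cnt in contours:
    if cv2.contourArea(cnt) < 200:                     # discard tiny regions
        continue
    region = np.zeros(mask.shape, dtype=np.uint8)
    cv2.drawContours(region, [cnt], -1, 255, thickness=cv2.FILLED)
    h_mean, s_mean, v_mean, _ = cv2.mean(hsv, mask=region)
    # Illustrative rule: reddish hue with high saturation flags possible bleeding.
    suspicious = (h_mean < 15 or h_mean > 165) and s_mean > 120
    print(f"object area={cv2.contourArea(cnt):.0f}  H={h_mean:.0f} S={s_mean:.0f}  "
          f"bleeding-suspect={suspicious}")
```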

Keywords: Capsule Endoscopy, HSV model, Image processing, Object Identification, Color Separation.

2029 Optimization of Electromagnetic Interference Measurement by Convolutional Neural Network

Authors: Hussam Elias, Ninovic Perez, Holger Hirsch

Abstract:

With the ever-increasing use of equipment, devices, or more generally any electrical or electronic system, the chance of electromagnetic incompatibility incidents has increased considerably, which demands more attention to manage the possible risks of these technologies. Therefore, complying with certain electromagnetic compatibility (EMC) rules and not exceeding an acceptable level of radiated emissions are of utmost importance for the diffusion of electronic products. In this paper, a purpose-developed measurement tool and a convolutional neural network were used to propose a method that reduces the time required to carry out the final measurement phase of electromagnetic interference (EMI) measurement according to the norm EN 55032, by predicting the radiated emission and determining the antenna height that meets the maximum radiation value.

Keywords: Antenna height, Convolutional Neural Network, Electromagnetic Compatibility, Mean Absolute Error, position error.

2028 Optimum Turbomachine Selection for Power Regeneration in Vapor Compression Cool Production Plants

Authors: S. B. Alavi, G. Cerri, L. Chennaoui, A. Giovannelli, S. Mazzoni

Abstract:

The Power Regeneration in Refrigeration Plants concept has been analyzed and has been shown to be capable of saving about 25% power in cryogenic plants with the Power Regeneration System (PRS) running under nominal conditions. The innovative component, a Compressor Expander Group (CEG) based on turbomachinery, has been designed and built by modifying a CETT compressor and expander, both selected for optimum plant performance. Experiments have shown the good response of the turbomachines when running with R404a as the working fluid. Power savings of up to 12% under PRS derated conditions (50% loading) have been demonstrated. Such experiments allow predicting a power saving of up to 25% at CEG full load.

Keywords: Compressor, Expander, Power Saving, Refrigeration Plant, Turbine, Turbomachinery Selection, Vapor Pressure Booster.

2027 WPRiMA Tool: Managing Risks in Web Projects

Authors: Thamer Al-Rousan, Shahida Sulaiman, Rosalina Abdul Salam

Abstract:

Risk management is an essential part of project management, which plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and the lack of process models that can serve as a guideline for the development of Web-based applications. To circumvent this problem, contemporary process models have been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment) as the tool used to implement RIAP, the risk identification architecture pattern model, which focuses upon the data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations in order to facilitate risk avoidance in a project, and to improve the prospects of early risk management.

Keywords: Architecture pattern model, risk factors, risk identification, web project, web project risk management assessment.

2026 Identification of the Key Sustainability Issues to Develop New Decision Support Tools in the Spanish Furniture Sector

Authors: P. Cordero, R. Poler, R. Sanchis

Abstract:

The environmental impacts caused by the current production and consumption models, together with the impact of the current economic crisis, make changes necessary in European industry toward new business models based on sustainability, which could allow companies to innovate and improve their competitiveness. This paper analyzes the key environmental issues and the current and future market trends in one of the most important industrial sectors in Spain, the furniture sector. It also proposes new decision support tools (a diagnostic kit, a roadmap and guidelines) to guide companies in implementing sustainability criteria in their organizations, including eco-design strategies and other economic and social strategies in accordance with the definition of sustainability, and shows how to use and combine them with other available tools, such as eco-labels and environmental management systems, to obtain the results the company expects and help improve its competitiveness.

Keywords: Furniture sector, eco-design, sustainability, economic crisis, market trends, roadmap.

2025 Performance Prediction of Multi-Agent Based Simulation Applications on the Grid

Authors: Dawit Mengistu, Lars Lundberg, Paul Davidsson

Abstract:

A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands an understanding of its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload explains the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline to estimate the performance characteristics of real-world simulation applications.

Keywords: Grid computing, Performance modeling, Performance prediction, Multi-agent simulation.

2024 Modeling Biology Inspired Reactive Agents Using X-machines

Authors: George Eleftherakis, Petros Kefalas, Anna Sotiriadou, Evangelos Kehris

Abstract:

Recent advances in both the testing and the verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied in a powerful way to the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need to provide a mechanism for translating the formal models into executable software in a simple and transparent way. This paper introduces the use of the X-machine formalism as a tool for modeling biology-inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
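
A minimal sketch of a stream X-machine as a data structure (a finite control structure whose transitions are labelled with processing functions acting on a memory and an input symbol) is shown below; the ant-like foraging agent is a toy illustration, not a model from the paper.

```python
# Minimal sketch of a stream X-machine: states, memory, and transitions labelled with
# processing functions that map (memory, input symbol) to (output, new memory).
from dataclasses import dataclass, field
from typing import Callable, Dict, Tuple

Memory = dict
ProcessingFn = Callable[[Memory, str], Tuple[str, Memory]]

@dataclass
class XMachine:
    states: set
    initial_state: str
    memory: Memory
    transitions: Dict[Tuple[str, str], str]   # (state, function name) -> next state
    functions: Dict[str, ProcessingFn]
    state: str = field(init=False)

    def __post_init__(self):
        self.state = self.initial_state

    def step(self, symbol: str) -> str:
        for (state, fn_name), next_state in self.transitions.items():
            if state == self.state:
                try:
                    output, self.memory = self.functions[fn_name](self.memory, symbol)
                except ValueError:            # function not applicable to this input
                    continue
                self.state = next_state
                return output
        raise ValueError(f"no applicable transition from {self.state} on {symbol!r}")

# Toy foraging agent: searches for food, then carries it back to the nest.
def find_food(mem, sym):
    if sym != "food":
        raise ValueError
    return "pick up", {**mem, "carrying": True}

def drop_food(mem, sym):
    if sym != "nest":
        raise ValueError
    return "drop", {**mem, "carrying": False, "stored": mem["stored"] + 1}

agent = XMachine(states={"searching", "returning"}, initial_state="searching",
                 memory={"carrying": False, "stored": 0},
                 transitions={("searching", "find_food"): "returning",
                              ("returning", "drop_food"): "searching"},
                 functions={"find_food": find_food, "drop_food": drop_food})

for observation in ["food", "nest", "food"]:
    print(observation, "->", agent.step(observation), agent.memory)
```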

Keywords: Biology inspired agent, formal methods, x-machines.

2023 Modelling and Analysis of a Robust Control of Manufacturing Systems: Flow-Quality Approach

Authors: Lotfi Nabli, Achraf Jabeur Telmoudi, Radhi M'hiri

Abstract:

This paper proposes a method for modeling the laws controlling manufacturing systems with temporal and non-temporal constraints. A methodology for robust control construction, generating the margins of passive and active robustness, is elaborated. Two principal models are presented in this paper. The first utilizes P-time Petri nets, which are used to manage flow-type disturbances. The second, the quality model, exploits the Intervals Constrained Petri Nets (ICPN) tool, which allows the system to preserve its quality specificities. The redundancy between the passive and active robustness of the elementary parameters is also used. The final model allows the correlation of temporal and non-temporal criteria by putting the two principal models in interaction. To do so, a set of definitions and theorems are employed and illustrated by application examples.

Keywords: Manufacturing systems control, flow, quality, robustness, redundancy, Petri Nets.

2022 Data-Driven Decision-Making in Digital Entrepreneurship

Authors: Abeba Nigussie Turi, Xiangming Samuel Li

Abstract:

Data-driven business models are more typical of established businesses than of early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers in the form of poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in that it encompasses startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.

Keywords: Startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship.

2021 Software Maintenance Severity Prediction with Soft Computing Approach

Authors: E. Ardil, Erdem Uçar, Parvinder S. Sandhu

Abstract:

As the majority of faults are found in only a few modules of a system, there is a need to investigate the modules that are severely affected compared to other modules, and proper maintenance needs to be done on time, especially for critical applications. In this paper, we have explored different predictor models applied to NASA's public-domain defect dataset, coded in the Perl programming language. Different machine learning algorithms belonging to the different learner categories of the WEKA project, including a Mamdani-based fuzzy inference system and a neuro-fuzzy based system, have been evaluated for the modeling of maintenance severity, or impact of fault severity. The results are recorded in terms of accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The results show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and hence can be used for the maintenance severity prediction of software.
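
A minimal sketch of the evaluation protocol (comparing predictor models on module-level metrics using MAE and RMSE) is shown below; classical scikit-learn regressors stand in for the WEKA learners and the neuro-fuzzy system, and the module data are synthetic placeholders, not the NASA dataset.

```python
# Minimal sketch: compare predictor models of maintenance severity using MAE and RMSE.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(3)
n = 500
# Module metrics: lines of code, cyclomatic complexity, number of operators (placeholders).
X = np.column_stack([rng.integers(20, 2000, n), rng.integers(1, 40, n), rng.integers(10, 800, n)])
severity = np.clip(0.002 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 0.5, n), 0, 5)  # 0..5 scale

X_tr, X_te, y_tr, y_te = train_test_split(X, severity, test_size=0.3, random_state=0)
for name, model in [("linear regression", LinearRegression()),
                    ("regression tree", DecisionTreeRegressor(max_depth=5, random_state=0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    rmse = np.sqrt(mean_squared_error(y_te, pred))
    print(f"{name:>18}:  MAE = {mae:.3f}  RMSE = {rmse:.3f}")
```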

Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.
