Search results for: graph algorithms
937 Using the Monte Carlo Simulation to Predict the Assembly Yield
Authors: C. Chahin, M. C. Hsu, Y. H. Lin, C. Y. Huang
Abstract:
Electronic products that achieve high levels of integrated communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the marketplace. Because of the high costs the industry may incur, and because high yield is directly proportional to high profit, IC (Integrated Circuit) manufacturers struggle to maximize yield; yet today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed to predict the assembly process. To evaluate the quality of upcoming circuits, yield models are used that not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors such as boards, placement, components, the material from which the components are made, and processes must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package and using solder ball locations, also called footprints. The only assumption the yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depends on repeated random sampling to compute results. The method is used to simulate the placement and assembly processes within a production line.
Keywords: Monte Carlo simulation, placement yield, PCB characterization, electronics assembly
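As a rough illustration of the Monte Carlo idea described in this abstract, the sketch below samples random placement offsets and estimates an assembly yield. All numbers (machine accuracy, pad tolerance, sample count) are hypothetical and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical process parameters (illustrative only)
N_PLACEMENTS = 100_000      # simulated component placements
SIGMA_XY = 0.020            # placement std. dev. per axis, mm (machine accuracy)
PAD_TOLERANCE = 0.050       # maximum allowed offset from the pad centre, mm

# Sample random placement offsets in x and y for every placement
dx = rng.normal(0.0, SIGMA_XY, N_PLACEMENTS)
dy = rng.normal(0.0, SIGMA_XY, N_PLACEMENTS)

# A placement "passes" if the radial offset stays within the pad tolerance
radial_offset = np.hypot(dx, dy)
passes = radial_offset <= PAD_TOLERANCE

yield_estimate = passes.mean()
print(f"Estimated placement yield: {yield_estimate:.4f}")
```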
936 Variable Step-Size APA with Decorrelation of AR Input Process
Authors: Jae Wook Shin, Ju-man Song, Hyun-Taek Choi, Poo Gyeon Park
Abstract:
This paper introduces a new variable step-size affine projection algorithm (APA) with decorrelation of the AR input process, based on MSD (mean square deviation) analysis. To achieve a fast convergence rate and a small steady-state estimation error, the proposed algorithm uses a variable step size determined by minimising the MSD. Experimental results show that the proposed algorithm achieves better performance than the other algorithms.
Keywords: adaptive filter, affine projection algorithm, variable step size.
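The affine projection update named in this abstract can be sketched as below. The step-size rule used here is a simple error-driven heuristic, not the MSD-minimising rule derived in the paper, and the filter dimensions and input model are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, mu_max, delta = 16, 4, 1.0, 1e-3   # taps, projection order, limits (assumed)

w_true = rng.normal(size=N)              # unknown system to identify
w = np.zeros(N)
u = np.zeros(N + K - 1)                  # sliding input buffer
mu = mu_max

for n in range(5000):
    u = np.roll(u, 1)
    u[0] = rng.normal()                  # white input for simplicity (the paper uses an AR input)
    U = np.column_stack([u[k:k + N] for k in range(K)])   # N x K matrix of the K most recent regressors
    d = U.T @ w_true + 0.01 * rng.normal(size=K)          # desired signals with observation noise
    e = d - U.T @ w                                        # a-priori error vector
    w += mu * U @ np.linalg.solve(U.T @ U + delta * np.eye(K), e)  # APA update
    # Crude variable step size: shrink mu as the error energy decreases (illustrative only)
    mu = min(mu_max, float(e @ e) / (float(e @ e) + 1e-2))

print("final misalignment (dB):",
      10 * np.log10(np.sum((w - w_true) ** 2) / np.sum(w_true ** 2)))
```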
935 Some Improvements on Kumlander's Maximum Weight Clique Extraction Algorithm
Authors: Satoshi Shimizu, Kazuaki Yamaguchi, Toshiki Saitoh, Sumio Masuda
Abstract:
Several fast exact algorithms for the maximum weight clique problem have been proposed; Östergård's algorithm is one of them. Kumlander claims that his algorithm is faster, but we confirmed that a straightforward implementation of Kumlander's algorithm is slower than Östergård's algorithm. We propose some improvements to Kumlander's algorithm.
Keywords: Maximum weight clique, exact algorithm, branch-and-bound, NP-hard.
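For readers unfamiliar with this class of methods, here is a minimal branch-and-bound solver for the maximum weight clique problem in the Carraghan-Pardalos style. It only illustrates the simple weight-sum bound; none of Östergård's or Kumlander's colouring-based refinements are shown, and the example graph is invented.

```python
def max_weight_clique(adj, weight):
    """adj: dict vertex -> set of neighbours; weight: dict vertex -> positive weight."""
    best_weight = 0
    best_clique = []

    def expand(candidates, current, current_weight):
        nonlocal best_weight, best_clique
        if not candidates:
            if current_weight > best_weight:
                best_weight, best_clique = current_weight, list(current)
            return
        for i, v in enumerate(candidates):
            remaining = candidates[i:]
            # Bound: even adding every remaining candidate cannot beat the incumbent.
            if current_weight + sum(weight[u] for u in remaining) <= best_weight:
                return
            expand([u for u in remaining[1:] if u in adj[v]],
                   current + [v], current_weight + weight[v])

    expand(sorted(adj), [], 0)
    return best_weight, best_clique


# Tiny example graph (assumed, for illustration only)
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
weight = {1: 2, 2: 3, 3: 4, 4: 6}
print(max_weight_clique(adj, weight))   # -> (10, [3, 4])
```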
934 Inheritance of Primary Yield Component Traits of Common Beans (Phaseolus vulgaris L.): Number of Seeds per Pod and 1000 Seed Weight in an 8x8 Diallel Cross Population
Authors: Atnaf Tiruneh Mulugeta, Mohammed Ali Hussein, Zelleke Habtamu
Abstract:
Thirty-six genotypes (8 parents and 28 F1 diallel crosses) were grown in a randomized complete block design during 2006 at Mandura, north-western Ethiopia. The experiment was carried out to study the inheritance of two primary yield component traits: number of seeds per pod and 1000 seed weight. Statistically significant differences were observed between genotypes, parents and crosses for these traits. The mean square due to GCA was significant for both traits, whereas the SCA mean square was significant only for number of seeds per pod. Thus both additive and non-additive gene actions were important in the inheritance of number of seeds per pod. A significant b1 component was obtained for this trait; the b2 and b3 components, however, were not significant, suggesting the absence of gene asymmetry. From the Wr/Vr graph, the inheritance of seeds per pod was governed by partial dominance with additive gene action.
Keywords: Diallel crosses, General combining ability, Phaseolus vulgaris L., Specific combining ability
933 Variable Regularization Parameter Normalized Least Mean Square Adaptive Filter
Authors: Young-Seok Choi
Abstract:
We present a normalized LMS (NLMS) algorithm with robust regularization. Unlike the conventional NLMS with a fixed regularization parameter, the proposed approach dynamically updates the regularization parameter. By exploiting a gradient descent direction, we derive a computationally efficient and robust update scheme for the regularization parameter. In simulations, we demonstrate that the proposed algorithm outperforms conventional NLMS algorithms in terms of convergence rate and misadjustment error.
Keywords: Regularization, normalized LMS, system identification, robustness.
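A minimal NLMS sketch with a dynamically adapted regularization term is shown below. The simple threshold-based update of delta is an illustrative stand-in, not the gradient-descent rule derived in the paper, and the filter length and signal model are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)
N, mu = 8, 0.5                       # filter length and step size (assumed)
w_true = rng.normal(size=N)          # unknown system
w = np.zeros(N)
x_buf = np.zeros(N)
delta = 1e-2                         # regularization parameter, adapted on-line

for n in range(20000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.normal()
    d = w_true @ x_buf + 0.01 * rng.normal()        # noisy desired signal
    e = d - w @ x_buf                               # a-priori error
    w += mu * e * x_buf / (x_buf @ x_buf + delta)   # regularized NLMS update
    # Illustrative rule (not the paper's gradient scheme): use little regularization
    # while the error is still large, more regularization near steady state.
    delta = np.clip(delta * (0.99 if e * e > 1e-3 else 1.01), 1e-4, 1.0)

print("misalignment (dB):",
      10 * np.log10(np.sum((w - w_true) ** 2) / np.sum(w_true ** 2)))
```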
932 Computations of Bezier Geodesic-like Curves on Spheres
Authors: Sheng-Gwo Chen, Wen-Haw Chen
Abstract:
Computing geodesics on a surface is an important problem in many fields. In practice, however, traditional discrete algorithms or numerical approaches can only find a list of discrete points. The first author proposed in 2010 a new, elegant and accurate method, the geodesic-like method, for approximating geodesics on a regular surface. This paper uses this method to compute Bezier geodesic-like curves on spheres.
Keywords: Geodesics, Geodesic-like curve, Spheres, Bezier.
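One concrete way to realise a "geodesic-like" Bezier curve on the unit sphere, in the spirit of the abstract, is to run the de Casteljau construction with spherical linear interpolation (slerp) in place of straight-line interpolation. This is a generic sketch, not the authors' geodesic-like method, and the control points are assumed.

```python
import numpy as np

def slerp(p, q, t):
    """Point at parameter t on the great-circle arc (geodesic) from p to q on the unit sphere."""
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    if theta < 1e-12:                      # p and q (almost) coincide
        return p
    return (np.sin((1 - t) * theta) * p + np.sin(t * theta) * q) / np.sin(theta)

def geodesic_bezier(control_points, t):
    """De Casteljau scheme with slerp: yields a Bezier-like curve lying on the sphere."""
    pts = [p / np.linalg.norm(p) for p in control_points]
    while len(pts) > 1:
        pts = [slerp(pts[i], pts[i + 1], t) for i in range(len(pts) - 1)]
    return pts[0]

# Example: cubic geodesic-like Bezier curve defined by four assumed control points
ctrl = [np.array([1.0, 0.0, 0.0]), np.array([0.7, 0.7, 0.1]),
        np.array([0.1, 0.9, 0.5]), np.array([0.0, 0.0, 1.0])]
curve = np.array([geodesic_bezier(ctrl, t) for t in np.linspace(0.0, 1.0, 50)])
print(np.allclose(np.linalg.norm(curve, axis=1), 1.0))   # every sample lies on the sphere
```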
931 Indoor Localization by Pattern Matching Method Based on Extended Database
Authors: Gyumin Hwang, Jihong Lee
Abstract:
This paper studies a CSS-based indoor localization system that is easy to implement, inexpensive to deploy, and covers a larger area than other systems. However, the system is affected by reflected distance data; this localization error is caused by the multi-path effect and is difficult to correct because the indoor environment cannot be fully described. To overcome the multi-path problem, we supplement the localization system with a pattern matching method based on an extended database, which improves the precision of the estimates. The method is verified by experiments in a gymnasium. The database was constructed at 1 m intervals, and 16 samples were collected from random positions inside the region of the database points. The results, presented in graphs and tables, show higher accuracy than the existing method.
Keywords: Chirp Spread Spectrum (CSS), Indoor Localization, Pattern-Matching, Time of Arrival (ToA), Multi-Path, Mahalanobis Distance, Reception Rate, Simultaneous Localization and Mapping (SLAM), Laser Range Finder (LRF).
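A bare-bones version of the database pattern matching step (nearest fingerprint under the Mahalanobis distance) is sketched below. The grid, anchor layout and covariance are synthetic placeholders, not the gymnasium data from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic database: grid positions (x, y) and the expected range measurements
# to 4 anchors at each position (values are placeholders).
db_positions = np.array([[x, y] for x in range(5) for y in range(5)], dtype=float)
anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
db_ranges = np.linalg.norm(db_positions[:, None, :] - anchors[None, :, :], axis=2)

# Covariance of the ranging errors (assumed); multi-path would inflate and correlate it.
cov = 0.09 * np.eye(anchors.shape[0])
cov_inv = np.linalg.inv(cov)

def locate(measured_ranges):
    """Return the database position whose fingerprint is closest in Mahalanobis distance."""
    diff = db_ranges - measured_ranges                      # (n_db, n_anchors)
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)      # squared Mahalanobis distances
    return db_positions[np.argmin(d2)]

true_pos = np.array([2.3, 1.7])
measured = np.linalg.norm(true_pos - anchors, axis=1) + rng.normal(0, 0.3, 4)
print("estimated grid position:", locate(measured))
```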
930 An Exact Solution to Support Vector Mixture
Authors: Monjed Ezzeddinne, Nicolas Lefebvre, Régis Lengellé
Abstract:
This paper presents a new version of the SVM mixture algorithm initially proposed by Kwok for classification and regression problems. For both cases, a slight modification of the mixture model leads to a standard SVM training problem and to the existence of an exact solution, and allows the direct use of well-known decomposition and working set selection algorithms. Only the regression case is considered in this paper, but classification has been addressed in a very similar way. The method has been successfully applied to engine pollutant emission modeling.
Keywords: Identification, Learning systems, Mixture of Experts, Support Vector Machines.
929 A Comparative Study on Different Approaches to Evaluate Ship Equilibrium Point
Authors: Alessandro A. Zizzari, Francesca Calabrese, Giovanni Indiveri, Andrea Coraddu, Diego Villa
Abstract:
The aim of this paper is to present a comparative study of two different methods for evaluating the equilibrium point of a ship, a core issue in designing an On Board Stability System (OBSS) module that, starting from the geometry of the ship hull (described by a discrete model in a standard format) and the distribution of all weights onboard, calculates the ship's floating condition (draught, heel and trim).
Keywords: Algorithms, Computer applications, Equilibrium, Marine applications, Stability System.
928 Artificial Neural Network based Modeling of Evaporation Losses in Reservoirs
Authors: Surinder Deswal, Mahesh Pal
Abstract:
An Artificial Neural Network based modeling technique has been used to study the influence of different combinations of meteorological parameters on evaporation from a reservoir. The data set used is taken from an earlier reported study. Several input combinations were tried in order to find out the importance of different input parameters in predicting evaporation. The prediction accuracy of the Artificial Neural Network has also been compared with that of linear regression, and the comparison demonstrated the superior performance of the Artificial Neural Network. The findings also revealed that all input parameters should be considered together, instead of individually as reported in earlier studies, when predicting evaporation. The highest correlation coefficient (0.960) along with the lowest root mean square error (0.865) was obtained with the input combination of air temperature, wind speed, sunshine hours and mean relative humidity. A graph of the actual versus predicted values of evaporation suggests that most of the values lie within a scatter of ±15% with all input parameters. The findings of this study suggest the usefulness of the ANN technique in predicting evaporation losses from reservoirs.
Keywords: Artificial neural network, evaporation losses, multiple linear regression, modeling.
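Since the original data set is not reproduced here, the sketch below uses synthetic data merely to show how the ANN-versus-linear-regression comparison described in the abstract could be set up with scikit-learn. The network size, the data-generating rule and all parameter ranges are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 500
# Columns: air temperature (C), wind speed (km/h), sunshine hours, mean relative humidity (%)
X = np.column_stack([rng.uniform(10, 40, n), rng.uniform(0, 30, n),
                     rng.uniform(0, 12, n), rng.uniform(20, 95, n)])
# Synthetic evaporation (mm/day): a made-up nonlinear mixture of the inputs plus noise
y = (0.15 * X[:, 0] + 0.05 * X[:, 1] + 0.2 * X[:, 2]
     - 0.03 * X[:, 3] + 0.002 * X[:, 0] * X[:, 2] + rng.normal(0, 0.3, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)).fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)

print("ANN R^2:", round(ann.score(X_te, y_te), 3))
print("OLS R^2:", round(lin.score(X_te, y_te), 3))
```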
927 Optimization of Inverse Kinematics of a 3R Robotic Manipulator using Genetic Algorithms
Authors: J. Ramírez A., A. Rubiano F.
Abstract:
In this paper, the direct kinematic model of a multi-purpose three-degree-of-freedom industrial manipulator was developed using homogeneous transformation matrices and the Denavit-Hartenberg parameters. The inverse kinematic model was developed using the same method, verifying that at the border of the workspace the inverse kinematics presents considerable errors; therefore, a genetic algorithm was implemented to optimize the model, greatly improving its efficiency.
Keywords: Direct Kinematics, Genetic Algorithm, Inverse Kinematics, Optimization, Robot Manipulator
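To make the genetic-algorithm step more concrete, here is a small sketch that evolves joint angles of a planar 3R arm towards a target position. The link lengths, GA parameters and the planar simplification are all assumptions, not the manipulator of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
L = np.array([0.4, 0.3, 0.2])            # link lengths (assumed), m
target = np.array([0.55, 0.35])          # desired end-effector position, m

def forward_kinematics(theta):
    """End-effector (x, y) of a planar 3R arm for joint angles theta (radians)."""
    cumulative = np.cumsum(theta)
    return np.array([np.sum(L * np.cos(cumulative)), np.sum(L * np.sin(cumulative))])

def fitness(theta):
    return -np.linalg.norm(forward_kinematics(theta) - target)   # minimise position error

# Simple real-coded GA: truncation selection, blend crossover, Gaussian mutation
POP, GEN = 60, 200
pop = rng.uniform(-np.pi, np.pi, size=(POP, 3))
for _ in range(GEN):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]                # keep the better half
    children = []
    while len(children) < POP - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        alpha = rng.uniform(0, 1, 3)
        child = alpha * a + (1 - alpha) * b                      # blend crossover
        child += rng.normal(0, 0.05, 3)                          # mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)])

best = max(pop, key=fitness)
print("best joint angles (rad):", np.round(best, 3),
      "| position error (m):", round(-fitness(best), 5))
```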
926 Techniques Used in String Matching for Network Security
Authors: Jamuna Bhandari
Abstract:
String matching, also known as pattern matching, is one of the primary concepts in network security. The effectiveness and efficiency of string matching algorithms is important for network security applications such as network intrusion detection, virus detection, signature matching and web content filtering. This paper presents a brief review of some string matching techniques used for network security.
Keywords: Filtering, honeypot, network telescope, pattern, string, signature.
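As background for the techniques reviewed here, the sketch below implements the classic Knuth-Morris-Pratt matcher and scans a payload against a small set of signatures. The signatures and payload are made up for illustration.

```python
def kmp_search(text, pattern):
    """Return the start indices of all occurrences of pattern in text (Knuth-Morris-Pratt)."""
    # Failure function: length of the longest proper prefix that is also a suffix.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k

    matches, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            matches.append(i - k + 1)
            k = fail[k - 1]
    return matches

# Toy intrusion-detection style scan against hypothetical signatures
signatures = ["cmd.exe", "/etc/passwd", "<script>"]
payload = "GET /index.php?p=../../etc/passwd HTTP/1.1"
for sig in signatures:
    hits = kmp_search(payload, sig)
    if hits:
        print(f"signature {sig!r} found at offsets {hits}")
```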
925 Ontology-based Domain Modelling for Consistent Content Change Management
Authors: Muhammad Javed, Yalemisew M. Abgaz, Claus Pahl
Abstract:
Ontology-based modelling of multi-formatted software application content is a challenging area in content management. When the number of software content units is huge and in a continuous process of change, content change management becomes important. The management of content in this context requires targeted access and manipulation methods. We present a novel approach to deal with model-driven content-centric information systems and access to their content. At the core of our approach is an ontology-based semantic annotation technique for diversely formatted content that can improve the accuracy of access and systems evolution. Domain ontologies represent domain-specific concepts and conform to metamodels. Different ontologies, from application domain ontologies to software ontologies, capture and model the different properties of and perspectives on a software content unit. Interdependencies between domain ontologies, the artifacts and the content are captured through a trace model. The annotation traces are formalised, and a graph-based system is selected for the representation of the annotation traces.
Keywords: Consistent Content Management, Impact Categorisation, Trace Model, Ontology Evolution
924 Unconditionally Secure Quantum Payment System
Authors: Essam Al-Daoud
Abstract:
A potentially serious problem with current payment systems is that their underlying hard problems from number theory may be solved by either a quantum computer or unanticipated future advances in algorithms and hardware. A new quantum payment system is proposed in this paper. The suggested system makes use of fundamental principles of quantum mechanics to ensure unconditional security without prior arrangements between customers and vendors. More specifically, the new system uses Greenberger-Horne-Zeilinger (GHZ) states and Quantum Key Distribution to authenticate the vendors and guarantee the transaction integrity.
Keywords: Bell state, GHZ state, Quantum key distribution, Quantum payment system.
923 A New Algorithm to Stereo Correspondence Using Rank Transform and Morphology Based on Genetic Algorithm
Authors: Razagh Hafezi, Ahmad Keshavarz, Vida Moshfegh
Abstract:
This paper presents a novel algorithm for stereo correspondence based on the rank transform. In this algorithm, a genetic algorithm is used to obtain an accurate disparity map. Genetic algorithms are efficient search methods based on principles of population genetics, i.e. mating, chromosome crossover, gene mutation, and natural selection. Finally, morphology is employed to remove errors and discontinuities.
Keywords: genetic algorithm, morphology, rank transform, stereo correspondence
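The rank transform mentioned in this abstract replaces each pixel by the number of neighbours in a window that are darker than it, which makes the subsequent matching robust to gain and bias differences between the two cameras. A small sketch (window size assumed) is given below.

```python
import numpy as np

def rank_transform(image, window=5):
    """Replace each pixel by the count of neighbours in a window x window
    neighbourhood whose intensity is strictly lower than the centre pixel."""
    half = window // 2
    padded = np.pad(image.astype(np.int32), half, mode="edge")
    out = np.zeros_like(image, dtype=np.int32)
    rows, cols = image.shape
    for dr in range(-half, half + 1):
        for dc in range(-half, half + 1):
            if dr == 0 and dc == 0:
                continue
            shifted = padded[half + dr:half + dr + rows, half + dc:half + dc + cols]
            out += (shifted < image).astype(np.int32)
    return out

# Toy example: the rank image is unchanged by a brightness/contrast change
img = np.array([[10, 12, 11], [14, 13, 15], [16, 18, 17]])
print(np.array_equal(rank_transform(img, 3), rank_transform(2 * img + 5, 3)))  # True
```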
922 A Combined Meta-Heuristic with Hyper-Heuristic Approach to Single Machine Production Scheduling Problem
Authors: C. E. Nugraheni, L. Abednego
Abstract:
This paper is concerned with the minimization of mean tardiness and flow time in a real single machine production scheduling problem. Two variants of genetic algorithm as meta-heuristics, combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are used to solve instances generated with real world data from a company. Encouraging results are reported.
Keywords: Hyper-heuristics, evolutionary algorithms, production scheduling.
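The two objectives named in the abstract are easy to state in code. The sketch below evaluates mean flow time and mean tardiness for a job sequence and compares a random order with the earliest-due-date rule on made-up job data; the GA and hyper-heuristic layers of the paper are not shown.

```python
import random

def evaluate(sequence, processing, due):
    """Mean flow time and mean tardiness of a job sequence on a single machine."""
    t, flow, tardiness = 0, 0, 0
    for job in sequence:
        t += processing[job]                 # completion time of this job
        flow += t
        tardiness += max(0, t - due[job])
    n = len(sequence)
    return flow / n, tardiness / n

random.seed(0)
jobs = list(range(8))
processing = {j: random.randint(1, 10) for j in jobs}       # made-up processing times
due = {j: random.randint(5, 40) for j in jobs}              # made-up due dates

random_order = jobs[:]
random.shuffle(random_order)
edd_order = sorted(jobs, key=lambda j: due[j])              # earliest-due-date heuristic

print("random order (mean flow, mean tardiness):", evaluate(random_order, processing, due))
print("EDD order    (mean flow, mean tardiness):", evaluate(edd_order, processing, due))
```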
921 Reliability Analysis of P-I Diagram Formula for RC Column Subjected to Blast Load
Authors: Masoud Abedini, Azrul A. Mutalib, Shahrizan Baharom, Hong Hao
Abstract:
This study was conducted to investigate the reliability of the pressure-impulse (P-I) equation for reinforced concrete columns published in previous studies. The equation involves three different levels of damage criteria, D = 0.2, D = 0.5 and D = 0.8: damage in the range 0-0.2 is known as minor, 0.2-0.5 as moderate and 0.5-0.8 as high, while at 0.8-1 the structure is considered to have failed. In this study, two types of reliability analysis were conducted. The first uses the pressure-impulse equation with different parameters; the parameters involved are the concrete strength, the depth, width and height of the column, the longitudinal reinforcement ratio and the transverse reinforcement ratio. In this first analysis, a new equation is derived to improve the previous equations. The second reliability analysis involves three types of columns used to derive the P-I curve diagram with the derived equation, which is compared with the equations derived by other researchers and with the minimum standoff versus weapon yield graph of the Federal Emergency Management Agency (FEMA). The results show that the derived equation agrees more closely with the FEMA standards than those of previous researchers.
Keywords: Blast load, RC column, P-I curve, Analytical formulae, Standard FEMA.
920 Combining Bagging and Additive Regression
Authors: Sotiris B. Kotsiantis
Abstract:
Bagging and boosting are among the most popular re-sampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as base learner. Boosting algorithms are considered stronger than bagging on noise-free data; however, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble by averaging bagging and boosting ensembles with 10 sub-learners each. We performed a comparison with simple bagging and boosting ensembles with 25 sub-learners on standard benchmark datasets, and the proposed ensemble gave better accuracy.
Keywords: Regressors, statistical learning.
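The averaging idea described in the abstract can be expressed in a few lines with scikit-learn. The data set here is synthetic and the sub-learner count simply mirrors the one mentioned (10 per ensemble), so this is a sketch of the scheme rather than the authors' exact setup.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=800, n_features=10, noise=15.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

bagging = BaggingRegressor(n_estimators=10, random_state=0).fit(X_tr, y_tr)
boosting = AdaBoostRegressor(n_estimators=10, random_state=0).fit(X_tr, y_tr)

# Combined model: simple average of the two ensembles' predictions
combined_pred = 0.5 * (bagging.predict(X_te) + boosting.predict(X_te))

for name, pred in [("bagging", bagging.predict(X_te)),
                   ("boosting", boosting.predict(X_te)),
                   ("averaged", combined_pred)]:
    print(f"{name:9s} MSE: {mean_squared_error(y_te, pred):.1f}")
```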
919 Financial Ethics: A Review of 2010 Flash Crash
Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid
Abstract:
Modern day stock markets have become almost entirely automated. Even though this means increased profits for investors, with algorithms acting upon the slightest price change within microseconds, it has also given birth to many ethical dilemmas, in the sense that the slightest mistake can cause people to lose their livelihoods. This paper reviews one such event, which happened on May 6, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen from it.
Keywords: Flash Crash, Market Crash, Stock Market, Stock Market Crash.
918 A Novel FFT-Based Frequency Offset Estimator for OFDM Systems
Authors: Mahdi Masoumi, Mehrdad Ardebilipoor, Seyed Aidin Bassam
Abstract:
This paper proposes a novel frequency offset (FO) estimator for orthogonal frequency division multiplexing (OFDM). Simplicity is the most significant feature of this algorithm, and it can be repeated to achieve acceptable accuracy. The fractional and integer parts of the FO are estimated jointly using the same algorithm. To do so, instead of using conventional algorithms that usually rely on a correlation function, we use the DFT of the received signal. Therefore, complexity is reduced and the synchronization procedure can be performed by the same hardware that is used to demodulate the OFDM symbol. Finally, computer simulations show that the accuracy of this method is better than that of other conventional methods.
Keywords: DFT, Estimator, Frequency Offset, IEEE802.11a, OFDM.
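A toy version of the DFT-based idea (locate the offset from the peak of the received signal's spectrum rather than from a correlation) is sketched below on a plain complex exponential. The OFDM-specific framing of the paper is omitted, the interpolation is a crude stand-in, and the parameters are assumed.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 64                                   # FFT size (assumed)
true_offset = 3.37                       # frequency offset in bin spacings (assumed)

# Received baseband samples: a complex exponential at the offset frequency plus noise
n = np.arange(N)
r = np.exp(2j * np.pi * true_offset * n / N) \
    + 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))

R = np.fft.fft(r)
k = int(np.argmax(np.abs(R)))            # integer part of the offset from the peak bin

# Fractional part from the two neighbouring bins (crude parabolic peak interpolation)
left, mid, right = np.abs(R[(k - 1) % N]), np.abs(R[k]), np.abs(R[(k + 1) % N])
frac = 0.5 * (left - right) / (left - 2 * mid + right)
estimate = (k + frac) % N

print(f"true offset: {true_offset:.2f} bins, estimated: {estimate:.2f} bins")
```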
917 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market
Authors: Cristian Păuna
Abstract:
In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investments became the primary tool for making a profit by speculation in financial markets. A significant number of traders and private or institutional investors participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. The trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed for building a reliable trend line, which is the base for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals and limit conditions that form a mathematical filter for investment opportunities, and presents the methodology to integrate all of these into automated investment software. The paper also reports trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a 1:6.12 risk-to-reward ratio was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model presented is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
Keywords: Algorithmic trading, automated investment system, DAX Deutscher Aktienindex.
916 Economy-Based Computing with WebCom
Authors: Adarsh Patil, David A. Power, John P. Morrison
Abstract:
Grid environments consist of the volatile integration of discrete heterogeneous resources. The notion of the Grid is to unite different users and organisations and pool their resources into one large computing platform where they can harness, inter-operate, collaborate and interact. If the Grid community is to achieve this objective, then participants (users and organisations) need to be willing to donate or share their resources and permit other participants to use them. Resources do not have to be shared at all times, since this may result in users not having access to their own resources. The idea of reward-based computing was developed to address the sharing problem in a pragmatic manner. Participants are offered a reward to donate their resources to the Grid. A reward may include monetary recompense or a pro rata share of available resources when constrained. This latter point may imply a quality of service, which in turn may require some globally agreed reservation mechanism. This paper presents a platform for economy-based computing using the WebCom Grid middleware. Using this middleware, participants can configure their resources at times and priority levels that suit their local usage policy. The WebCom system accounts for processing done on individual participants' resources and rewards them accordingly.
Keywords: WebCom, Economy-based computing, WebCom Grid Bank Reward, Condensed Graph, Distributor, Accounting, GridPoint.
915 Using Genetic Algorithm to Improve Information Retrieval Systems
Authors: Ahmed A. A. Radwan, Bahgat A. Abdel Latef, Abdel Mgeid A. Ali, Osman A. Sadek
Abstract:
This study investigates the use of genetic algorithms in information retrieval. The method is shown to be applicable to three well-known document collections, where more relevant documents are presented to users after the genetic modification. In this paper we present a new fitness function for approximate information retrieval which is faster and more flexible than the cosine similarity fitness function.
Keywords: Cosine similarity, Fitness function, Genetic Algorithm, Information Retrieval, Query learning.
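The baseline the authors compare against, a cosine-similarity fitness for a query represented as a term-weight vector, can be written compactly. The tiny document collection below is invented for illustration, and the genetic algorithm itself is not shown.

```python
import numpy as np

def cosine_fitness(query_vec, doc_matrix, relevant_mask):
    """Fitness of a candidate query vector: mean cosine similarity to the known relevant documents."""
    q_norm = np.linalg.norm(query_vec)
    d_norms = np.linalg.norm(doc_matrix, axis=1)
    sims = doc_matrix @ query_vec / (d_norms * q_norm + 1e-12)
    return sims[relevant_mask].mean()

# Toy term-document weights (rows = documents, columns = terms), invented
docs = np.array([[2.0, 0.0, 1.0, 0.0],
                 [1.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 3.0],
                 [1.0, 0.0, 2.0, 0.0]])
relevant = np.array([True, False, False, True])   # judged relevant documents

query = np.array([1.0, 0.0, 1.0, 0.0])            # a candidate query (one GA individual)
print("fitness:", round(cosine_fitness(query, docs, relevant), 3))
```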
914 Performance Evaluation of Packet Scheduling with Channel Conditioning Aware Based on WiMAX Networks
Authors: Elmabruk Laias, Abdalla M. Hanashi, Mohammed Alnas
Abstract:
Scheduling in Worldwide Interoperability for Microwave Access (WiMAX) has become one of the most challenging issues, since it is responsible for distributing the available resources of the network among all users; this has led to the demand for constructing and designing highly efficient scheduling algorithms in order to improve network utilization, increase network throughput, and minimize end-to-end delay. In this study, the proposed algorithm focuses on an efficient mechanism to serve non-real-time traffic in congested networks by considering channel status.
Keywords: WiMAX, Quality of Service (QoS), OPNE, Diff-Serv (DS).
913 The Role of Optimization and Machine Learning in e-Commerce Logistics in 2030
Authors: Vincenzo Capalbo, Gianpaolo Ghiani, Emanuele Manni
Abstract:
Global e-commerce sales have reached unprecedented levels in the past few years. As this trend is only predicted to continue upward into the 2020s, companies will face new challenges when planning and controlling e-commerce logistics. In this paper, we survey the related literature on Optimization and Machine Learning, as well as on combined methodologies. We also identify the distinctive features of next-generation planning algorithms, namely scalability, model-and-run features and learning capabilities, that will be fundamental to cope with the scale and complexity of logistics in the next decade.
Keywords: e-Commerce, Logistics, Machine Learning, Optimization.
912 Digital Image Watermarking in the Wavelet Transform Domain
Authors: Kamran Hameed, Adeel Mumtaz, S.A.M. Gilani
Abstract:
In this paper, we start by characterizing the most important and distinguishing features of wavelet-based watermarking schemes. We studied the overwhelming number of algorithms proposed in the literature. The copyright protection application scenario is considered and, building on the experience that was gained, two distinguishing watermarking schemes were implemented. A detailed comparison and the obtained results are presented and discussed. We conclude that Joo's [1] technique is more robust to standard noise attacks than Dote's [2] technique.
Keywords: Digital image, Copyright protection, Watermarking, Wavelet transform.
911 Effective Features for Disambiguation of Turkish Verbs
Authors: Zeynep Orhan, Zeynep Altan
Abstract:
This paper summarizes the results of some experiments on finding effective features for the disambiguation of Turkish verbs. Word sense disambiguation is a current area of investigation in which verbs play the dominant role. Verbs generally have more senses, on average, than other types of words, so detecting these features for verbs may lead to improvements for other word types as well. In this paper we have considered only syntactic features that can be obtained from the corpus, and tested them using some well-known machine learning algorithms.
Keywords: Word sense disambiguation, feature selection.
910 Maximum Power Point Tracking Using FLC Tuned with GA
Authors: Mohamed Amine Haraoubia, Abdelaziz Hamzaoui, Najib Essounbouli
Abstract:
The pursuit of the MPPT has led to the development of many kinds of controllers, one of which is the Fuzzy Logic Controller, which has proven its worth. To further tune this controller, this paper discusses and analyzes the use of Genetic Algorithms to tune the Fuzzy Logic Controller. It provides an introduction to both systems and tests their compatibility and performance.
Keywords: Fuzzy logic controller (FLC), fuzzy logic (FL), genetic algorithm (GA), maximum power point (MPP), maximum power point tracking (MPPT).
909 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Petar Penchev
Abstract:
The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.
908 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications
Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso
Abstract:
The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO radar sensor, an optical camera and a dedicated set of software algorithms encompassing interferometry, tomography and photogrammetry. The MIMO radar sensor proposed in this work provides an extremely high sensitivity to displacements, making the system able to react to tiny deformations (down to tens of microns) on a time scale that spans from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped onto the azimuth-range directions with notable resolution in both dimensions and with an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performing processing unit allows the observed scene to be sensed with remarkable refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing for easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user of the system with high-frequency three-dimensional motion/vibration estimation of each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
Keywords: Interferometry, MIMO RADAR, SAR, tomography.
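The interferometric displacement step mentioned at the end of the abstract essentially scales the repeat-pass phase difference of each pixel by the radar wavelength. A minimal sketch is shown below; the carrier frequency is an assumed value and the sign convention depends on the processing chain, neither being taken from the paper.

```python
import numpy as np

C = 3e8                      # speed of light, m/s
F_RADAR = 17.0e9             # radar carrier frequency, Hz (assumed, Ku band)
wavelength = C / F_RADAR

def phase_to_displacement(delta_phase_rad):
    """Line-of-sight displacement for a repeat-pass interferometric phase difference.
    The two-way propagation path gives the familiar lambda/(4*pi) scaling; the sign
    (towards or away from the sensor) is a convention of the processing chain."""
    return -wavelength / (4.0 * np.pi) * delta_phase_rad

# Example: a 10-degree phase change of one pixel between two acquisitions
dphi = np.deg2rad(10.0)
print(f"LOS displacement: {phase_to_displacement(dphi) * 1e6:.1f} micrometres")
```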