Search results for: sampling algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2018

1298 GridNtru: High Performance PKCS

Authors: Narasimham Challa, Jayaram Pradhan

Abstract:

Cryptographic algorithms play a crucial role in the information society by providing protection from unauthorized access to sensitive data. As information technology becomes increasingly pervasive, we can expect the emergence of ubiquitous computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size and bandwidth. In particular, the RSA algorithm is used in many applications for providing security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most smart-card chips cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials, which allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth degree Truncated polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problems, meaning that even given substantial computational resources and time, an adversary should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization create a present-day demand for applications that enforce security and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as high-end computing hardware. Peer-to-peer (P2P) and enterprise grids are proven approaches for developing high-end computing systems; by utilizing them, one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi, and present an analysis and comparison of its performance for various text files.
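
The core arithmetic NTRU relies on, as described above, is multiplication in a truncated polynomial ring with very small integer coefficients. The minimal Python sketch below shows only that convolution product in Z_q[x]/(x^N - 1); the parameters N and q, the sample polynomials and the function name are illustrative placeholders and are not part of the GridNtru scheme.

# Minimal sketch: convolution product in the ring Z_q[x]/(x^N - 1), the basic
# operation NTRU builds on. N, q and the sample inputs are illustrative
# placeholders, not parameters of the GridNtru scheme.
def ring_multiply(a, b, N, q):
    """Multiply polynomials a and b (coefficient lists of length N)
    modulo x^N - 1, reducing coefficients modulo q."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

if __name__ == "__main__":
    N, q = 11, 32
    f = [-1, 1, 1, 0, -1, 0, 1, 0, 0, 1, -1]   # small coefficients, as NTRU uses
    g = [-1, 0, 1, 1, 0, 1, 0, 0, -1, 0, -1]
    print(ring_multiply(f, g, N, q))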

Keywords: Alchemi, GridNtru, Ntru, PKCS.

1297 A Computational Model for Resolving Pronominal Anaphora in Turkish Using Hobbs' Naïve Algorithm

Authors: Pınar Tüfekçi, Yılmaz Kılıçaslan

Abstract:

In this paper we present a computational model for pronominal anaphora resolution in Turkish. The model is based on Hobbs' Naïve Algorithm [4, 5, 6], which exploits only the surface syntax of sentences in a given text.

Keywords: Anaphora Resolution, Pronoun Resolution, Syntax-based Algorithms, Naïve Algorithm.

1296 Iterative Solutions to Some Linear Matrix Equations

Authors: Jiashang Jiang, Hao Liu, Yongxin Yuan

Abstract:

In this paper, gradient based iterative algorithms are presented to solve the following four types of linear matrix equations: (a) AXB = F; (b) AXB = F, CXD = G; (c) AXB = F s.t. X = X^T; (d) AXB + CYD = F, where X and Y are unknown matrices and A, B, C, D, F, G are given constant matrices. It is proved that if the equation considered has a solution, then the unique minimum norm solution can be obtained by choosing a special kind of initial matrix. The numerical results show that the proposed method is reliable and attractive.
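
As a concrete illustration of a gradient based iteration for the simplest case (a), AXB = F, the sketch below applies the update X <- X + mu * A^T (F - A X B) B^T with a conservative step size. This is a generic gradient scheme written for illustration only; it is not the authors' exact algorithm or their special choice of initial matrix, and the test matrices are made up.

# Illustrative gradient iteration for AXB = F (case (a) above). A generic
# sketch, not the authors' exact algorithm or initialization.
import numpy as np

def solve_axb(A, B, F, mu=None, iters=5000):
    """Iterate X <- X + mu * A^T (F - A X B) B^T starting from X = 0."""
    if mu is None:
        # Conservative step size from the largest singular values of A and B.
        mu = 1.0 / (np.linalg.norm(A, 2) ** 2 * np.linalg.norm(B, 2) ** 2)
    X = np.zeros((A.shape[1], B.shape[0]))
    for _ in range(iters):
        X = X + mu * A.T @ (F - A @ X @ B) @ B.T
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.eye(3) + 0.1 * rng.standard_normal((3, 3))   # well-conditioned test data
    B = np.eye(3) + 0.1 * rng.standard_normal((3, 3))
    F = A @ rng.standard_normal((3, 3)) @ B              # guarantees a solution exists
    X = solve_axb(A, B, F)
    print(np.linalg.norm(A @ X @ B - F))                 # residual close to zero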

Keywords: Matrix equation, iterative algorithm, parameter estimation, minimum norm solution.

1295 Modeling Language for Machine Learning

Authors: Tsuyoshi Okita, Tatsuya Niwa

Abstract:

For a given specific problem, finding an efficient algorithm has traditionally been the matter of study. However, an alternative approach orthogonal to this one has emerged, called reduction. In general, for a given specific problem the reduction approach studies how to convert the original problem into subproblems. This paper proposes a formal modeling language to support this reduction approach. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem.
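
For readers unfamiliar with the reduction idea, a classic instance is converting a multi-class learning problem into one binary subproblem per class (one-vs-rest). The sketch below illustrates only that generic reduction; the tiny dataset and the least-squares base learner are placeholders, and this is not the modeling language proposed in the paper.

# Classic reduction example: multi-class classification reduced to one binary
# least-squares subproblem per class (one-vs-rest). Data and learner are
# illustrative placeholders only.
import numpy as np

def one_vs_rest_fit(X, y):
    """Reduce a multi-class problem to one binary least-squares problem per class."""
    Xb = np.c_[X, np.ones(len(X))]                        # add a bias column
    return {c: np.linalg.lstsq(Xb, (y == c).astype(float) * 2 - 1, rcond=None)[0]
            for c in np.unique(y)}

def one_vs_rest_predict(models, X):
    Xb = np.c_[X, np.ones(len(X))]
    classes = list(models)
    scores = np.column_stack([Xb @ models[c] for c in classes])
    return [classes[i] for i in scores.argmax(axis=1)]    # highest binary score wins

if __name__ == "__main__":
    X = np.array([[0.0, 0.1], [0.2, 0.0], [1.0, 1.1], [0.9, 1.0], [2.0, 0.1], [2.1, 0.0]])
    y = np.array(["a", "a", "b", "b", "c", "c"])
    models = one_vs_rest_fit(X, y)
    print(one_vs_rest_predict(models, X))    # recovers ['a', 'a', 'b', 'b', 'c', 'c']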

Keywords: Formal language, statistical inference problem, reduction.

1294 Advanced Technologies and Algorithms for Efficient Portfolio Selection

Authors: Konstantinos Liagkouras, Konstantinos Metaxiotis

Abstract:

In this paper we present a classification of the various technologies applied to the portfolio selection problem according to the discipline and the methodological framework followed. We provide a concise presentation of the categories that have emerged and try to identify which methods are considered obsolete and which lie at the heart of the debate. On top of that, we provide a comparative study of the different technologies applied for efficient portfolio construction and suggest potential paths for future work that lie at the intersection of the presented techniques.

Keywords: Portfolio selection, optimization techniques, financial models, stochastics, heuristics.

1293 ISTER (Immune System - Tumor Efficiency Rate): An Important Key for Planning in Radiotherapic Facilities

Authors: O. Sotolongo-Grau, D. Rodriguez-Perez, J. A. Santos-Miranda, M. M. Desco, O. Sotolongo-Costa, J. C. Antoranz

Abstract:

The use of the oncologic index ISTER allows for more effective planning of radiotherapy facilities in hospitals. Any change in the radiotherapy treatment due to unexpected stops may be accommodated by recalculating the doses for the new treatment duration while keeping the optimal prognosis. The results obtained from a simulation model over millions of patients allow the definition of optimal success probability algorithms.

Keywords: Mathematical model, radiation oncology, dynamical systems applications.

1292 Using the Monte Carlo Simulation to Predict the Assembly Yield

Authors: C. Chahin, M. C. Hsu, Y. H. Lin, C. Y. Huang

Abstract:

Electronic products that achieve high levels of integrated communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the marketplace. Because of the high costs an industry may incur, and because high yield is directly proportional to high profits, IC (Integrated Circuit) manufacturers struggle to maximize yield. At the same time, today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the assembly process. In order to evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors such as boards, placement, components, the materials from which the components are made, and processes must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depends on repeated random sampling to compute its results. This method is utilized to simulate the placement and assembly processes within a production line.
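
As a toy illustration of the repeated-random-sampling idea, the sketch below estimates a placement yield by drawing random x/y placement offsets and counting the fraction that land within a tolerance radius; the normal error model, sigma and tolerance values are hypothetical and are not taken from the paper.

# Toy Monte Carlo estimate of placement yield: sample random x/y offsets and
# count the fraction within tolerance. The error model, sigma and tolerance
# are hypothetical values for illustration, not data from the paper.
import numpy as np

def placement_yield(n_samples=1_000_000, sigma=0.02, tolerance=0.05, seed=0):
    rng = np.random.default_rng(seed)
    dx = rng.normal(0.0, sigma, n_samples)   # x placement error (mm)
    dy = rng.normal(0.0, sigma, n_samples)   # y placement error (mm)
    ok = np.hypot(dx, dy) <= tolerance       # device lands within tolerance
    return ok.mean()

if __name__ == "__main__":
    print(f"Estimated placement yield: {placement_yield():.4f}")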

Keywords: Monte Carlo simulation, placement yield, PCB characterization, electronics assembly.

1291 Variable Step-Size APA with Decorrelation of AR Input Process

Authors: Jae Wook Shin, Ju-man Song, Hyun-Taek Choi, Poo Gyeon Park

Abstract:

This paper introduces a new variable step-size affine projection algorithm (APA) with decorrelation of the AR input process, based on mean square deviation (MSD) analysis. To achieve a fast convergence rate and a small steady-state estimation error, the proposed algorithm uses a variable step size that is determined by minimizing the MSD. In addition, experimental results show that the proposed algorithm achieves better performance than the other algorithms.
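
For context, one iteration of the basic affine projection update with a fixed step size, i.e. the baseline that the variable-step modification replaces, can be written as below. The regularization constant, step size, projection order and signal model are illustrative placeholders rather than values from the paper.

# Basic affine projection algorithm (APA) update with a *fixed* step size,
# shown only as background for the variable step-size variant in the abstract.
import numpy as np

def apa_update(w, X, d, mu=0.5, delta=1e-4):
    """One APA iteration.
    w : current filter weights, shape (L,)
    X : matrix whose columns are the last P input vectors, shape (L, P)
    d : desired outputs for those P vectors, shape (P,)
    """
    e = d - X.T @ w                                          # a-priori error vector
    w = w + mu * X @ np.linalg.solve(X.T @ X + delta * np.eye(X.shape[1]), e)
    return w, e

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true, w = np.array([0.5, -0.3, 0.1, 0.05]), np.zeros(4)
    for _ in range(200):
        X = rng.standard_normal((4, 2))    # toy regressor block (projection order 2);
        d = X.T @ w_true                   # a real APA would use delayed input samples
        w, _ = apa_update(w, X, d)
    print(w)                               # approaches w_true in this noiseless toy case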

Keywords: adaptive filter, affine projection algorithm, variable step size.

1290 Some Improvements on Kumlander's Maximum Weight Clique Extraction Algorithm

Authors: Satoshi Shimizu, Kazuaki Yamaguchi, Toshiki Saitoh, Sumio Masuda

Abstract:

Some fast exact algorithms for the maximum weight clique problem have been proposed, Östergård's algorithm being one of them. Kumlander claims that his algorithm is faster, but we confirmed that a straightforward implementation of Kumlander's algorithm is slower than Östergård's algorithm. We propose some improvements on Kumlander's algorithm.
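
For readers unfamiliar with this family of algorithms, a bare-bones branch-and-bound for maximum weight clique is sketched below; it uses only a trivial weight-sum upper bound and does not reproduce the vertex-ordering and colouring bounds that distinguish Östergård's and Kumlander's algorithms.

# Bare-bones branch-and-bound for maximum weight clique, using a trivial upper
# bound (sum of remaining candidate weights). The colouring-based bounds of
# Östergård's and Kumlander's algorithms are not reproduced here.
def max_weight_clique(adj, weight):
    """adj: dict vertex -> set of neighbours; weight: dict vertex -> weight."""
    best = {"weight": 0, "clique": []}

    def expand(clique, cand, w):
        if w > best["weight"]:
            best["weight"], best["clique"] = w, list(clique)
        for v in sorted(cand, key=weight.get, reverse=True):
            # Prune: even taking every remaining candidate cannot beat the best.
            if w + sum(weight[u] for u in cand) <= best["weight"]:
                return
            expand(clique + [v], cand & adj[v], w + weight[v])
            cand = cand - {v}

    expand([], set(adj), 0)
    return best["clique"], best["weight"]

if __name__ == "__main__":
    adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2}}
    weight = {1: 3, 2: 2, 3: 4, 4: 5}
    print(max_weight_clique(adj, weight))   # clique {1, 2, 3} with total weight 9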

Keywords: Maximum weight clique, exact algorithm, branch-and-bound, NP-hard.

1289 Variable Regularization Parameter Normalized Least Mean Square Adaptive Filter

Authors: Young-Seok Choi

Abstract:

We present a normalized LMS (NLMS) algorithm with robust regularization. Unlike conventional NLMS with a fixed regularization parameter, the proposed approach dynamically updates the regularization parameter. By exploiting a gradient descent direction, we derive a computationally efficient and robust update scheme for the regularization parameter. In simulations, we demonstrate that the proposed algorithm outperforms conventional NLMS algorithms in terms of convergence rate and misadjustment error.
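
For reference, one step of the conventional NLMS recursion with a fixed regularization parameter, i.e. the baseline the proposed scheme improves on, is sketched below; the step size and regularization value are placeholders.

# One step of conventional NLMS with a *fixed* regularization parameter delta,
# the baseline the abstract improves on. mu and delta are placeholder values.
import numpy as np

def nlms_step(w, x, d, mu=1.0, delta=1e-3):
    """w: filter weights (L,), x: current input regressor (L,), d: desired sample."""
    e = d - w @ x                          # a-priori estimation error
    w = w + mu * e * x / (x @ x + delta)   # normalized, regularized update
    return w, e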

Keywords: Regularization, normalized LMS, system identification, robustness.

1288 Computations of Bezier Geodesic-like Curves on Spheres

Authors: Sheng-Gwo Chen, Wen-Haw Chen

Abstract:

Computing geodesics on a surface is an important problem in many fields. In practice, however, traditional discrete algorithms or numerical approaches can only find a list of discrete points. The first author proposed in 2010 a new, elegant and accurate method, the geodesic-like method, for approximating geodesics on a regular surface. Using this method, this paper presents a computation of Bezier geodesic-like curves on spheres.
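
One simple way to realize a curve of this flavour on the unit sphere, shown purely as an illustration and not as the geodesic-like method of the cited 2010 paper, is to run the de Casteljau recursion with spherical linear interpolation (slerp) along great-circle arcs in place of straight-line interpolation:

# De Casteljau recursion on the unit sphere with slerp replacing linear
# interpolation: an illustrative "geodesic-like" Bezier construction, not the
# geodesic-like method of the cited paper.
import numpy as np

def slerp(p, q, t):
    """Point at parameter t on the great-circle arc from p to q (unit vectors)."""
    omega = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    if omega < 1e-12:
        return p
    return (np.sin((1 - t) * omega) * p + np.sin(t * omega) * q) / np.sin(omega)

def spherical_bezier(control_points, t):
    """Evaluate a Bezier-like curve on the sphere at parameter t in [0, 1]."""
    pts = [np.asarray(p, dtype=float) for p in control_points]
    pts = [p / np.linalg.norm(p) for p in pts]       # project control points onto the sphere
    while len(pts) > 1:
        pts = [slerp(pts[i], pts[i + 1], t) for i in range(len(pts) - 1)]
    return pts[0]

if __name__ == "__main__":
    ctrl = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
    for t in (0.0, 0.5, 1.0):
        print(t, spherical_bezier(ctrl, t))          # every point lies on the unit sphere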

Keywords: Geodesics, Geodesic-like curve, Spheres, Bezier.

1287 An Exact Solution to Support Vector Mixture

Authors: Monjed Ezzeddinne, Nicolas Lefebvre, Régis Lengellé

Abstract:

This paper presents a new version of the SVM mixture algorithm initially proposed by Kwok for classification and regression problems. For both cases, a slight modification of the mixture model leads to a standard SVM training problem and to the existence of an exact solution, and allows the direct use of well-known decomposition and working set selection algorithms. Only the regression case is considered in this paper, but classification has been addressed in a very similar way. The method has been successfully applied to engine pollutant emission modeling.

Keywords: Identification, Learning systems, Mixture of Experts, Support Vector Machines.

1286 A Comparative Study on Different Approaches to Evaluate Ship Equilibrium Point

Authors: Alessandro A. Zizzari, Francesca Calabrese, Giovanni Indiveri, Andrea Coraddu, Diego Villa

Abstract:

The aim of this paper is to present a comparative study of two different methods for evaluating the equilibrium point of a ship, a core issue in designing an On Board Stability System (OBSS) module that, starting from the geometry of a ship hull (described by a discrete model in a standard format) and the distribution of all weights onboard, calculates the ship's floating conditions (draught, heel and trim).

Keywords: Algorithms, Computer applications, Equilibrium, Marine applications, Stability System.

1285 Optimization of Inverse Kinematics of a 3R Robotic Manipulator using Genetic Algorithms

Authors: J. Ramírez A., A. Rubiano F.

Abstract:

In this paper, the direct kinematic model of a three-degrees-of-freedom industrial manipulator for multiple applications was developed using homogeneous transformation matrices and the Denavit-Hartenberg parameters. The inverse kinematic model was developed using the same method, verifying that near the workspace border the inverse kinematics presents considerable errors; therefore, a genetic algorithm was implemented to optimize the model, greatly improving its efficiency.
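
A minimal sketch of the idea for a planar 3R arm is given below: forward kinematics from the joint angles, and a mutation-only evolutionary search (standing in for the genetic algorithm) that minimizes the end-effector position error. The unit link lengths, search settings and target point are illustrative assumptions, not the manipulator or parameters studied in the paper.

# Minimal evolutionary search for inverse kinematics of a *planar* 3R arm with
# unit links. Link lengths, search settings and the target are illustrative
# assumptions, not the industrial manipulator studied in the paper.
import numpy as np

L1 = L2 = L3 = 1.0                        # assumed link lengths

def forward(thetas):
    """End-effector (x, y) of a planar 3R arm for joint angles thetas."""
    t1, t2, t3 = thetas
    x = L1*np.cos(t1) + L2*np.cos(t1 + t2) + L3*np.cos(t1 + t2 + t3)
    y = L1*np.sin(t1) + L2*np.sin(t1 + t2) + L3*np.sin(t1 + t2 + t3)
    return np.array([x, y])

def evolve_ik(target, pop_size=60, generations=200, sigma=0.2, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-np.pi, np.pi, (pop_size, 3))              # random joint angles
    for _ in range(generations):
        errors = np.array([np.linalg.norm(forward(p) - target) for p in pop])
        parents = pop[np.argsort(errors)[:pop_size // 2]]        # selection of the fittest
        children = parents + rng.normal(0, sigma, parents.shape) # Gaussian mutation
        pop = np.vstack([parents, children])
    errors = np.array([np.linalg.norm(forward(p) - target) for p in pop])
    return pop[np.argmin(errors)]

if __name__ == "__main__":
    target = np.array([1.5, 1.0])
    best = evolve_ik(target)
    print(best, forward(best))   # joint angles and the reached end-effector position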

Keywords: Direct Kinematic, Genetic Algorithm, Inverse Kinematic, Optimization, Robot Manipulator.

1284 Techniques Used in String Matching for Network Security

Authors: Jamuna Bhandari

Abstract:

String matching, also known as pattern matching, is a primary concept in network security. In this area, the effectiveness and efficiency of string matching algorithms are important for applications such as network intrusion detection, virus detection, signature matching and web content filtering systems. This paper presents a brief review of some string matching techniques used for network security.
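
As one concrete example of the classical exact-matching algorithms such a review typically covers, a textbook Knuth-Morris-Pratt search is sketched below; it is a standard implementation, not code from the paper.

# Textbook Knuth-Morris-Pratt search: one of the classical exact string
# matching algorithms used in intrusion/virus signature matching.
def kmp_search(text, pattern):
    """Return all starting indices of pattern in text."""
    if not pattern:
        return []
    # Build the failure (prefix) table.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, reusing the table to avoid re-examining characters.
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]
    return hits

if __name__ == "__main__":
    print(kmp_search("GET /index.html HTTP/1.1", "index"))   # -> [5]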

Keywords: Filtering, honeypot, network telescope, pattern, string, signature.

1283 Unconditionally Secure Quantum Payment System

Authors: Essam Al-Daoud

Abstract:

A potentially serious problem with current payment systems is that their underlying hard problems from number theory may be solved by either a quantum computer or unanticipated future advances in algorithms and hardware. A new quantum payment system is proposed in this paper. The suggested system makes use of fundamental principles of quantum mechanics to ensure unconditional security without prior arrangements between customers and vendors. More specifically, the new system uses Greenberger-Horne-Zeilinger (GHZ) states and Quantum Key Distribution to authenticate the vendors and guarantee the transaction integrity.

Keywords: Bell state, GHZ state, Quantum key distribution, Quantum payment system.

1282 Soil Quality State and Trends in New Zealand’s Largest City after 15 Years

Authors: Fiona Curran-Cournane

Abstract:

Soil quality monitoring is a science-based soil management tool that assesses soil ecosystem health. A soil monitoring program in Auckland, New Zealand's largest city, extends from 1995 to the present. The objective of this study was firstly to determine changes in soil parameters (basic soil properties and heavy metals) that were assessed on rural land in 1995-2000 and re-assessed in 2008-2012. The second objective was to determine differences in soil parameters across various land uses, including native bush, rural (horticulture, pasture and plantation forestry) and urban land uses, using soil data collected in more recent years (2009-2013). Across rural land, mean concentrations of Olsen P had significantly increased in the second sampling period, and Olsen P was identified as the indicator of most concern, followed by soil macroporosity, particularly for horticultural and pastoral land. Mean concentrations of Cd were also greatest for pastoral and horticultural land, and a positive correlation existed between these two parameters, which highlights the importance of analysing basic soil parameters in conjunction with heavy metals. In contrast, mean concentrations of As, Cr, Pb, Ni and Zn were greatest for urban sites. Native bush sites had the lowest concentrations of heavy metals and were used to calculate a ‘pollution index’ (PI). The mean PI was classified as high (PI > 3) for Cd and Ni and moderate for Pb, Zn, Cr, Cu, As and Hg, indicating high levels of heavy metal pollution across both rural and urban soils. From a land use perspective, the mean ‘integrated pollution index’ was highest for urban sites at 2.9, followed by pasture, horticulture and plantation forests at 2.7, 2.6 and 0.9, respectively. It is recommended that soil sampling continues over time because a longer-spanning record will allow further identification of where soil problems exist and where resources need to be targeted in the future. Findings from this study will also inform policy and science direction in regional councils.

Keywords: Heavy metals, Pollution Index, Rural and Urban land use.

1281 A New Algorithm to Stereo Correspondence Using Rank Transform and Morphology Based On Genetic Algorithm

Authors: Razagh Hafezi, Ahmad Keshavarz, Vida Moshfegh

Abstract:

This paper presents a novel stereo correspondence algorithm based on the rank transform. In this algorithm we use a genetic algorithm to achieve an accurate disparity map. Genetic algorithms are efficient search methods based on principles of population genetics, i.e. mating, chromosome crossover, gene mutation, and natural selection. Finally, morphology is employed to remove errors and discontinuities.
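
For reference, the rank transform itself, the local transform on which the matching costs are computed, replaces each pixel by the number of neighbours in a window that are darker than the centre pixel. A small sketch follows; the window size is a placeholder, and the GA-based matching and morphology stages are not shown.

# Rank transform: each pixel is replaced by the count of pixels in a local
# window whose intensity is less than the centre pixel. Window size is a
# placeholder; the GA matching and morphology stages are not shown.
import numpy as np

def rank_transform(img, win=5):
    r = win // 2
    img = np.asarray(img, dtype=float)
    out = np.zeros(img.shape, dtype=int)
    padded = np.pad(img, r, mode="edge")
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = padded[r + dy : r + dy + img.shape[0],
                               r + dx : r + dx + img.shape[1]]
            out += (neighbour < img).astype(int)
    return out

if __name__ == "__main__":
    img = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
    print(rank_transform(img, win=3))   # centre pixel maps to 4 (four darker neighbours)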

Keywords: genetic algorithm, morphology, rank transform, stereo correspondence

1280 Moment Estimators of the Parameters of Zero-One Inflated Negative Binomial Distribution

Authors: Rafid Saeed Abdulrazak Alshkaki

Abstract:

In this paper, the zero-one inflated negative binomial distribution is considered, along with some of its structural properties, and its parameters are estimated using the method of moments. It is found that the method of moments is not a proper method for estimating the parameters of zero-one inflated negative binomial models and may give incorrect conclusions.

Keywords: Zero one inflated models, negative binomial distribution, moments estimator, non-negative integer sampling.

1279 A Combined Meta-Heuristic with Hyper-Heuristic Approach to Single Machine Production Scheduling Problem

Authors: C. E. Nugraheni, L. Abednego

Abstract:

This paper is concerned with the minimization of mean tardiness and mean flow time in a real single machine production scheduling problem. Two variants of a genetic algorithm as meta-heuristic, combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are used to solve instances generated from real-world data from a company. Encouraging results are reported.
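
The two objectives being minimized are straightforward to evaluate for a candidate job sequence, as the small helper below shows; the job data are made-up illustrative values and release times are assumed to be zero.

# Computing the two objectives from the abstract (mean flow time and mean
# tardiness) for a candidate sequence on a single machine. Job data are
# made-up illustrative values; release times are assumed to be zero.
def evaluate_sequence(sequence, processing_time, due_date):
    """sequence: job ids in processing order; times/due dates: dicts keyed by job id."""
    t, flow_times, tardiness = 0.0, [], []
    for job in sequence:
        t += processing_time[job]                  # completion time of this job
        flow_times.append(t)                       # flow time equals completion time here
        tardiness.append(max(0.0, t - due_date[job]))
    n = len(sequence)
    return sum(flow_times) / n, sum(tardiness) / n

if __name__ == "__main__":
    p = {"J1": 3, "J2": 5, "J3": 2}
    d = {"J1": 4, "J2": 12, "J3": 6}
    print(evaluate_sequence(["J1", "J3", "J2"], p, d))   # (mean flow time, mean tardiness)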

Keywords: Hyper-heuristics, evolutionary algorithms, production scheduling.

1278 Financial Ethics: A Review of 2010 Flash Crash

Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid

Abstract:

Modern-day stock markets have almost entirely become automated. Even though this means increased profits for investors, with algorithms acting upon the slightest price change within microseconds, it has also given birth to many ethical dilemmas, in the sense that the slightest mistake can cause people to lose their livelihoods. This paper reviews one such event, which happened on May 06, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen from it.

Keywords: Flash Crash, Market Crash, Stock Market, Stock Market Crash.

1277 A Novel FFT-Based Frequency Offset Estimator for OFDM Systems

Authors: Mahdi Masoumi, Mehrdad Ardebilipoor, Seyed Aidin Bassam

Abstract:

This paper proposes a novel frequency offset (FO) estimator for orthogonal frequency division multiplexing (OFDM). Simplicity is the most significant feature of this algorithm, and it can be repeated to achieve acceptable accuracy. The fractional and integer parts of the FO are also estimated jointly using the same algorithm. To do so, instead of using conventional algorithms that usually rely on a correlation function, we use the DFT of the received signal. Therefore, complexity is reduced and the synchronization procedure can be performed with the same hardware that is used to demodulate the OFDM symbol. Finally, computer simulation shows that the accuracy of this method is better than that of other conventional methods.
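
The DFT-based idea can be illustrated on a bare complex exponential: the FFT peak bin gives the integer part of the offset, and an interpolation of the two neighbouring DFT values refines the fractional part. The sketch below uses a generic fractional-bin estimator for illustration and is not the exact estimator proposed in the paper.

# DFT-based frequency offset estimation on a single noiseless complex tone:
# peak bin = integer part, two-bin interpolation = fractional part. A generic
# sketch, not the exact estimator proposed in the paper.
import numpy as np

def estimate_offset_bins(x):
    """Estimate the frequency of a single complex tone, in units of FFT bins."""
    N = len(x)
    X = np.fft.fft(x)
    k = int(np.argmax(np.abs(X)))                  # integer part: the peak bin
    Xm1, X0, Xp1 = X[(k - 1) % N], X[k], X[(k + 1) % N]
    delta = np.real((Xm1 - Xp1) / (2 * X0 - Xm1 - Xp1))   # fractional part
    return k + delta

if __name__ == "__main__":
    N = 64
    true_offset = 3.37                             # offset in subcarrier spacings
    n = np.arange(N)
    x = np.exp(2j * np.pi * true_offset * n / N)   # ideal noiseless received tone
    print(estimate_offset_bins(x))                 # close to 3.37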

Keywords: DFT, Estimator, Frequency Offset, IEEE802.11a, OFDM.

1276 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investments became the primary tool to make a profit by speculation in financial markets. A significant number of traders, private or institutional investors, participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed to build a reliable trend line, which is the basis for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals, how to set limit conditions to build a mathematical filter for investment opportunities, and the methodology to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a 1:6.12 risk-to-reward ratio was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model presented is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.

Keywords: Algorithmic trading, automated investment system, DAX Deutscher Aktienindex.

1275 Using Genetic Algorithm to Improve Information Retrieval Systems

Authors: Ahmed A. A. Radwan, Bahgat A. Abdel Latef, Abdel Mgeid A. Ali, Osman A. Sadek

Abstract:

This study investigates the use of genetic algorithms in information retrieval. The method is shown to be applicable to three well-known document collections, where more relevant documents are presented to users after the genetic modification. In this paper we present a new fitness function for approximate information retrieval which is faster and more flexible than the cosine similarity fitness function.
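
The baseline the new fitness function is compared against, cosine similarity between a query vector and candidate document vectors, is shown below for reference; the vocabulary and term-frequency vectors are made up for illustration, and the authors' new fitness function is not reproduced.

# Reference implementation of the cosine-similarity fitness the paper compares
# against, scoring documents for a query in a term-frequency vector space.
# Vocabulary and vectors are made up; the authors' new fitness is not shown.
import numpy as np

def cosine_fitness(query_vec, doc_vecs):
    """Return the cosine similarity of the query to each row of doc_vecs."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    return d @ q

if __name__ == "__main__":
    # Assumed vocabulary: ["genetic", "algorithm", "retrieval", "query"]
    query = np.array([1.0, 1.0, 0.0, 1.0])
    docs = np.array([[2.0, 1.0, 0.0, 0.0],
                     [0.0, 1.0, 3.0, 1.0],
                     [1.0, 1.0, 1.0, 1.0]])
    print(cosine_fitness(query, docs))   # higher score = more relevant to the query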

Keywords: Cosine similarity, Fitness function, Genetic Algorithm, Information Retrieval, Query learning.

1274 Performance Evaluation of Packet Scheduling with Channel Conditioning Aware Based On WiMAX Networks

Authors: Elmabruk Laias, Abdalla M. Hanashi, Mohammed Alnas

Abstract:

Packet scheduling in Worldwide Interoperability for Microwave Access (WiMAX) has become one of the most challenging issues, since it is responsible for distributing the available network resources among all users. This has led to the demand for constructing and designing highly efficient scheduling algorithms in order to improve network utilization, increase network throughput, and minimize end-to-end delay. In this study, the proposed algorithm focuses on an efficient mechanism to serve non-real-time traffic in congested networks by considering channel status.

Keywords: WiMAX, Quality of Services (QoS), OPNE, Diff-Serv (DS).

1273 The Role of Optimization and Machine Learning in e-Commerce Logistics in 2030

Authors: Vincenzo Capalbo, Gianpaolo Ghiani, Emanuele Manni

Abstract:

Global e-commerce sales have reached unprecedented levels in the past few years. As this trend is only predicted to go up as we continue into the ’20s, new challenges will be faced by companies when planning and controlling e-commerce logistics. In this paper, we survey the related literature on Optimization and Machine Learning as well as on combined methodologies. We also identify the distinctive features of next-generation planning algorithms - namely scalability, model-and-run features and learning capabilities - that will be fundamental to cope with the scale and complexity of logistics in the next decade.

Keywords: e-Commerce, Logistics, Machine Learning, Optimization.

1272 Digital Image Watermarking in the Wavelet Transform Domain

Authors: Kamran Hameed, Adeel Mumtaz, S.A.M. Gilani

Abstract:

In this paper, we start by characterizing the most important and distinguishing features of wavelet-based watermarking schemes, studying the overwhelming number of algorithms proposed in the literature. The copyright protection application scenario is considered and, building on the experience that was gained, two distinguishing watermarking schemes were implemented. A detailed comparison and the obtained results are presented and discussed. We conclude that Joo's [1] technique is more robust to standard noise attacks than Dote's [2] technique.
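
A generic additive embedding in a detail sub-band of a one-level DWT (using the PyWavelets library) illustrates the common skeleton of such schemes; the Haar wavelet, the scaling factor and the random watermark are arbitrary choices for this sketch, and neither Joo's nor Dote's scheme is reproduced here.

# Generic additive wavelet-domain watermark embedding with PyWavelets.
# Haar wavelet, scaling factor alpha and the random watermark are arbitrary
# illustrative choices; this is neither Joo's nor Dote's scheme.
import numpy as np
import pywt

def embed_watermark(image, watermark_bits, alpha=2.0, wavelet="haar"):
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), wavelet)
    wm = watermark_bits.reshape(cH.shape) * 2 - 1      # map bits {0,1} -> {-1,+1}
    cH_marked = cH + alpha * wm                        # embed in one detail sub-band
    return pywt.idwt2((cA, (cH_marked, cV, cD)), wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, (64, 64))
    bits = rng.integers(0, 2, 32 * 32)                 # one bit per cH coefficient
    marked = embed_watermark(image, bits)
    print(np.abs(marked - image).mean())               # small average distortion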

Keywords: Digital image, Copyright protection, Watermarking, Wavelet transform.

1271 Effective Features for Disambiguation of Turkish Verbs

Authors: Zeynep Orhan, Zeynep Altan

Abstract:

This paper summarizes the results of several experiments for finding effective features for the disambiguation of Turkish verbs. Word sense disambiguation is a current area of investigation in which verbs play the dominant role. Verbs generally have more senses on average than other types of words, and detecting effective features for verbs may lead to improvements for other word types as well. In this paper we consider only the syntactic features that can be obtained from the corpus, tested by using some well-known machine learning algorithms.

Keywords: Word sense disambiguation, feature selection.

1270 Maximum Power Point Tracking Using FLC Tuned with GA

Authors: Mohamed Amine Haraoubia, Abdelaziz Hamzaoui, Najib Essounbouli

Abstract:

The pursuit of the maximum power point (MPP) has led to the development of many kinds of controllers, one of which is the fuzzy logic controller, which has proven its worth. To tune this controller further, this paper discusses and analyzes the use of genetic algorithms to tune the fuzzy logic controller. It provides an introduction to both systems and tests their compatibility and performance.

Keywords: Fuzzy logic controller (FLC), fuzzy logic (FL), genetic algorithm (GA), maximum power point (MPP), maximum power point tracking (MPPT).

1269 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Petar Penchev

Abstract:

The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
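
One representative building block of such a pipeline, segmenting a planar element (for example, a wall) from a point cloud, can be sketched with a simple RANSAC plane fit; the thresholds, iteration count and synthetic data below are illustrative assumptions, not the algorithms developed in this research.

# RANSAC plane fitting, a representative building block for extracting planar
# elements (walls, floors) from a point cloud. Thresholds, iteration count and
# the synthetic data are illustrative assumptions, not the paper's algorithm.
import numpy as np

def ransac_plane(points, n_iters=500, threshold=0.02, seed=0):
    """Return (unit normal, d, inlier mask) for the plane n.x + d = 0."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(n_iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                            # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p1
        dist = np.abs(points @ normal + d)         # point-to-plane distances
        inliers = dist < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model[0], best_model[1], best_inliers

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    wall = np.c_[np.full(400, 2.0), rng.uniform(0, 5, 400), rng.uniform(0, 3, 400)]
    wall[:, 0] += rng.normal(0, 0.005, 400)        # scanner noise on the x = 2 wall
    clutter = rng.uniform(0, 5, (100, 3))
    n, d, mask = ransac_plane(np.vstack([wall, clutter]))
    print(n, d, mask.sum())                        # normal near [1, 0, 0], ~400 inliers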

Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.
