Search results for: quantum algorithms
1121 Multi-Criteria Decision Making Approaches for Facility Planning Problem Evaluation: A Survey
Authors: Ahmed M. El-Araby, Ibrahim Sabry, Ahmed El-Assal
Abstract:
The relationships between the industrial facilities, the capacity available for these facilities, and the costs involved are the main factors in the correct selection of a facility layout. In general, the facility layout problem is considered an unstructured decision-making problem. The objective of this work is to provide a survey that describes the techniques by which a facility planning problem can be solved and the effect of these techniques on the efficiency of the layout. According to previous research, multi-criteria decision making (MCDM) techniques can be classified into three categories: the use of a single MCDM method, the combination of two or more MCDM methods, and the integration of MCDM with another technique such as genetic algorithms (GA). This paper presents a review of the different MCDM techniques that have been proposed in the literature to select the most suitable layout design. These methods are particularly suitable for dealing with complex situations involving multiple criteria and conflicting goals that need to be optimized simultaneously.
Keywords: facility layout, MCDM, GA, literature review
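As a concrete illustration of how a single MCDM method ranks layout alternatives, here is a minimal weighted-sum sketch in Python; the criteria, weights, and scores are hypothetical and not taken from any surveyed study.

```python
# Minimal sketch of a weighted-sum MCDM ranking of candidate facility layouts.
# The criteria, weights, and scores below are hypothetical illustrations, not data from the survey.
import numpy as np

# Rows: candidate layouts; columns: criteria (material handling cost, flexibility, adjacency score).
scores = np.array([
    [0.70, 0.50, 0.80],   # layout A
    [0.55, 0.90, 0.60],   # layout B
    [0.85, 0.40, 0.70],   # layout C
])
weights = np.array([0.5, 0.3, 0.2])   # decision-maker preferences, must sum to 1
ranking = scores @ weights            # aggregate score per layout
best = ranking.argmax()
print(f"Aggregate scores: {ranking}, best layout index: {best}")
```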
Procedia PDF Downloads 205
1120 Unsteady 3D Post-Stall Aerodynamics Accounting for Effective Loss in Camber Due to Flow Separation
Authors: Aritras Roy, Rinku Mukherjee
Abstract:
The current study couples a quasi-steady Vortex Lattice Method with a camber-correcting technique, ‘Decambering’, for unsteady post-stall flow prediction. The wake is force-free and discrete, such that the wake lattices move with the free stream once shed from the wing. It is observed that the time-averaged unsteady coefficient of lift drops relative to its steady counterpart at some post-stall angles of attack. Multiple solutions occur in the post-stall regime, and three different algorithms for choosing solutions in these regimes show both unsteadiness and non-convergence of the iterations. The spanwise distribution of the coefficient of lift also shows a sawtooth pattern. The vorticity distribution changes both along the span and in the free-stream direction as the wake develops over time, with a distinct roll-up that grows with time.
Keywords: post-stall, unsteady, wing, aerodynamics
Procedia PDF Downloads 370
1119 Design of Non-uniform Circular Antenna Arrays Using Firefly Algorithm for Side Lobe Level Reduction
Authors: Gopi Ram, Durbadal Mandal, Rajib Kar, Sakti Prasad Ghoshal
Abstract:
This work deals with the design of non-uniform circular antenna arrays for maximum reduction of both the side lobe level (SLL) and the first null beam width (FNBW). The problem is modeled as a simple optimization problem. The firefly algorithm (FFA) is used to determine an optimal set of current excitation weights and inter-element separations that provide a radiation pattern with maximum SLL reduction and considerable improvement in FNBW as well. A circular antenna array laid on the x-y plane is assumed. FFA is applied to circular arrays of 8, 10, and 12 elements. Various simulation results are presented, and the side lobe and FNBW performances are analyzed. Experimental results show considerable reductions of both the SLL and FNBW with respect to the uniform case and to standard algorithms (GA, PSO, and SA) applied to the same problem.
Keywords: circular arrays, first null beam width, side lobe level, FFA
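For readers unfamiliar with FFA, the sketch below shows the core attraction-and-move update loop on a generic minimization objective; the population size, beta0, gamma, alpha values, and the placeholder cost function are illustrative assumptions, not the array-factor objective used in the paper.

```python
# Minimal sketch of the firefly algorithm (FFA) update loop, minimizing a generic cost function
# as a stand-in for the SLL/FNBW objective. All parameter values here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def cost(x):
    return np.sum(x ** 2)          # placeholder objective (lower is better)

n_fireflies, dim, n_iter = 20, 8, 100
beta0, gamma, alpha = 1.0, 1.0, 0.1
pop = rng.uniform(-1.0, 1.0, size=(n_fireflies, dim))

for _ in range(n_iter):
    intensity = np.array([cost(x) for x in pop])
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if intensity[j] < intensity[i]:            # firefly j is "brighter" (lower cost)
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)     # attractiveness decays with distance
                pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(dim) - 0.5)
                intensity[i] = cost(pop[i])

best = pop[np.argmin([cost(x) for x in pop])]
print("best cost:", cost(best))
```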
Procedia PDF Downloads 259
1118 Quantitative Analysis of Multiprocessor Architectures for Radar Signal Processing
Authors: Deepak Kumar, Debasish Deb, Reena Mamgain
Abstract:
Radar signal processing requires high number-crunching capability, which is most often achieved using a multiprocessor platform. Though a multiprocessor platform can meet real-time computational challenges, its architecture, along with the mapping of the algorithm onto that architecture, plays a vital role in using the platform efficiently. To this end, in addition to standard performance metrics, a few additional metrics are defined that help in evaluating the multiprocessor platform together with the algorithm mapping. A generic multiprocessor architecture cannot suit all processing requirements. Depending on the system requirements and the type of algorithms used, the most suitable architecture for the given problem is decided. In this paper, we study different architectures and quantify the different performance metrics, which enables comparison of the architectures on their merits. We also carried out case studies of different architectures and their efficiency depending on whether parallelism is exploited in the algorithm, the data, or both.
Keywords: radar signal processing, multiprocessor architecture, efficiency, load imbalance, buffer requirement, pipeline, parallel, hybrid, cluster of processors (COPs)
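A minimal sketch of how such metrics can be computed from per-processor timing data; the busy times and serial runtime below are hypothetical numbers, and the formulas are the standard speedup, efficiency, and load-imbalance definitions rather than the paper's additional metrics.

```python
# Sketch of standard efficiency and load-imbalance metrics for one processing interval,
# computed from assumed per-processor busy times (not measurements from the paper).
busy_ms = [42.0, 39.5, 44.0, 31.0]     # assumed busy time per processor (ms)
serial_ms = 150.0                      # assumed single-processor run time (ms)

parallel_ms = max(busy_ms)                                    # makespan set by the slowest processor
speedup = serial_ms / parallel_ms
efficiency = speedup / len(busy_ms)
load_imbalance = max(busy_ms) / (sum(busy_ms) / len(busy_ms)) - 1.0

print(f"speedup={speedup:.2f}, efficiency={efficiency:.2f}, load imbalance={load_imbalance:.2%}")
```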
Procedia PDF Downloads 412
1117 Time Series Regression with Meta-Clusters
Authors: Monika Chuchro
Abstract:
This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. In this case, clustering was performed to obtain subgroups of normally distributed time series data from wastewater treatment plant inflow data, which is composed of several groups differing in mean value. Two simple algorithms, K-means and EM, were chosen as clustering methods. The Rand index was used to measure similarity. After simple meta-clustering, a regression model was built for each subgroup. The final model was a sum of the subgroup models. The quality of the obtained model was compared with that of a regression model built using the same explanatory variables but without clustering of the data. Results were compared using the coefficient of determination (R2), the mean absolute percentage error (MAPE) as a measure of prediction accuracy, and comparison on a line chart. Preliminary results allow us to foresee the potential of the presented technique.
Keywords: clustering, data analysis, data mining, predictive models
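The following sketch illustrates the meta-cluster regression idea on synthetic data: K-means splits the series into subgroups, a separate regression is fitted per subgroup, and MAPE scores the combined prediction; the data and cluster count are assumptions.

```python
# Minimal sketch of meta-cluster regression: cluster the series with K-means, fit a separate
# regression per cluster, and score with MAPE. The data here is synthetic, not plant inflow data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(300, 3))                      # explanatory variables
y = 5 + 2 * X[:, 0] + rng.normal(0, 1, 300)                # inflow-like response
y[150:] += 20                                              # second group with a higher mean

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(y.reshape(-1, 1))

y_pred = np.empty_like(y)
for c in np.unique(labels):
    mask = labels == c
    model = LinearRegression().fit(X[mask], y[mask])       # one regression per subgroup
    y_pred[mask] = model.predict(X[mask])

print("MAPE:", mean_absolute_percentage_error(y, y_pred))
```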
Procedia PDF Downloads 466
1116 Automated Detection of Women Dehumanization in English Text
Authors: Maha Wiss, Wael Khreich
Abstract:
Animals, objects, foods, plants, and other non-human terms are commonly used as a source of metaphors to describe females in formal and slang language. Comparing women to non-human items not only reflects cultural views that might conceptualize women as subordinates or in a lower position than humans, but also conveys this degradation to the listeners. Moreover, the dehumanizing representation of females in language normalizes the derogation and even encourages sexism and aggressiveness against women. Although dehumanization has been a popular research topic for decades, to the best of our knowledge, no studies have linked women's dehumanizing language to the machine learning field. Therefore, we introduce our research work as one of the first attempts to create a tool for the automated detection of the dehumanizing depiction of females in English texts. We also present the first labeled dataset on this topic, which is used for training supervised machine learning algorithms to build an accurate classification model. The importance of this work is that it accomplishes the first step toward mitigating dehumanizing language against females.
Keywords: gender bias, machine learning, NLP, women dehumanization
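A hedged sketch of the kind of supervised text-classification pipeline such a tool could use (TF-IDF features plus a linear classifier); the toy sentences and labels are invented placeholders, not items from the authors' dataset.

```python
# Illustrative supervised text-classification setup: a TF-IDF representation feeding a
# linear classifier. The sentences and labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["she is such a snake", "she presented the quarterly results",
         "they treated her like an object", "she chaired the meeting"]
labels = [1, 0, 1, 0]          # 1 = dehumanizing depiction, 0 = neutral (hypothetical)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["she acted like a robot", "she wrote the report"]))
```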
Procedia PDF Downloads 80
1115 Sentence Structure for Free Word Order Languages in Context with Anaphora Resolution: A Case Study of Hindi
Authors: Pardeep Singh, Kamlesh Dutta
Abstract:
Many languages have a fixed sentence structure, while others have free word order. The accuracy of syntax-based anaphora resolution algorithms depends on the structure of the sentence, so it is important to analyze the structure of a language before implementing these algorithms. In this study, we analyzed sentence structure by exploiting case markers in Hindi as well as special tags for subject and object. We also investigated the word order of Hindi. Word order typology refers to the study of the order of the syntactic constituents of a language. We analyzed 165 news items of Ranchi Express from the EMILEE corpus of plain text, consisting of 1745 sentences. Eight dialogue-based files from the same corpus, containing 1521 sentences, were also analyzed. The percentages of subject-object-verb (SOV) and object-subject-verb (OSV) structures are 66.90 and 33.10, respectively.
Keywords: anaphora resolution, free word order languages, SOV, OSV
Procedia PDF Downloads 473
1114 Cotton Crops Vegetative Indices Based Assessment Using Multispectral Images
Authors: Muhammad Shahzad Shifa, Amna Shifa, Muhammad Omar, Aamir Shahzad, Rahmat Ali Khan
Abstract:
Many applications of remote sensing to vegetation and crop response depend on the spectral properties of individual leaves and plants. Vegetation indices are usually determined to estimate crop biophysical parameters, such as crop canopies and crop leaf area indices, with the help of remote sensing. Cotton crop assessment is performed with the help of vegetative indices. Remotely sensed images from an optical multispectral radiometer (MSR5) are used in this study. The interpretation is based on the fact that different materials reflect and absorb light differently at different wavelengths. Non-normalized and normalized forms of these datasets are analyzed using two complementary data mining algorithms: K-means and K-nearest neighbor (KNN). Our analysis shows that the use of normalized reflectance data and vegetative indices is suitable for automated assessment and decision making.
Keywords: cotton, condition assessment, KNN algorithm, clustering, MSR5, vegetation indices
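The sketch below illustrates the index-then-classify workflow: an NDVI-style vegetation index is computed from red and near-infrared reflectance and passed to a KNN classifier; the reflectance values and condition labels are hypothetical.

```python
# Compute a vegetation index (NDVI) from red and near-infrared reflectance and feed it to a
# KNN classifier. Reflectance values and labels are hypothetical, not MSR5 measurements.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

red = np.array([0.12, 0.30, 0.10, 0.28, 0.11, 0.27])       # red-band reflectance
nir = np.array([0.55, 0.35, 0.60, 0.33, 0.58, 0.36])       # near-infrared reflectance
ndvi = (nir - red) / (nir + red)                            # normalized difference vegetation index

labels = np.array([1, 0, 1, 0, 1, 0])                       # 1 = healthy canopy, 0 = stressed (assumed)
knn = KNeighborsClassifier(n_neighbors=3).fit(ndvi.reshape(-1, 1), labels)
print(knn.predict(np.array([[0.6], [0.1]])))                # classify two new NDVI values
```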
Procedia PDF Downloads 333
1113 Improving the Efficiency of a High Pressure Turbine by Using Non-Axisymmetric Endwall: A Comparison of Two Optimization Algorithms
Authors: Abdul Rehman, Bo Liu
Abstract:
Axial flow turbines are commonly designed with high loads that generate strong secondary flows and result in high secondary losses. These losses account for almost 30% to 50% of the total losses. Non-axisymmetric endwall profiling is one of the passive control techniques for reducing secondary flow loss. In this paper, the non-axisymmetric endwall profile construction and optimization for the stator endwalls are presented to improve the efficiency of a high pressure turbine. The commercial code NUMECA Fine/Design3D coupled with Fine/Turbo was used for the numerical investigation, design of experiments, and optimization. All flow simulations were conducted using steady RANS with the Spalart-Allmaras turbulence model. The non-axisymmetric endwalls of the stator hub and shroud were created using a perturbation law based on Bezier curves. Each cut, having multiple control points, was created along virtual streamlines in the blade channel. For the design of experiments, each sample was arbitrarily generated based on values automatically chosen for the control points defined during parameterization. The optimization was carried out using two algorithms, i.e., a stochastic algorithm and a gradient-based algorithm. For the stochastic case, a genetic algorithm coupled with an artificial neural network was used as the optimization method in order to approach the global optimum. The evaluation of the successive design iterations was performed using the artificial neural network prior to the flow solver. For the second case, the conjugate gradient algorithm with a three-dimensional CFD flow solver was used to systematically vary a free-form parameterization of the endwall. This method is efficient and less time-consuming, as it uses derivative information of the objective function. The objective function was to maximize the isentropic efficiency of the turbine while keeping the mass flow rate constant. The performance was quantified using a multi-objective function. In addition to this classification into two optimization methods, there were four optimization cases: the hub only, the shroud only, the combination of hub and shroud, and a fourth case in which the shroud endwall was optimized using the already optimized hub endwall geometry. The hub optimization resulted in an increase in efficiency due to more homogeneous inlet conditions for the rotor. The adverse pressure gradient was reduced, but the total pressure loss in the vicinity of the hub was increased. The shroud optimization resulted in an increase in efficiency, while total pressure loss and entropy were reduced. The combination of hub and shroud did not match the results achieved in the individual hub and shroud cases, possibly because there were too many control variables. The fourth case showed the best result because the optimized hub was used as the initial geometry for optimizing the shroud. The efficiency increased more than in the individual optimization cases, with a mass flow rate equal to that of the baseline turbine design. The results of the artificial neural network and the conjugate gradient method were compared.
Keywords: artificial neural network, axial turbine, conjugate gradient method, non-axisymmetric endwall, optimization
Procedia PDF Downloads 225
1112 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning
Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana
Abstract:
Due to the growing popularity of social media platforms at present, there are various concerns, mostly cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores, to the best of our knowledge, the most suitable algorithms for detecting the concerns mentioned. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, Gradient Boosting Classifier, etc., were examined, and the best results were taken into the development of the risk score system. For cyberbullying, the Logistic Regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model gained 98.02% accuracy. The Random Forest algorithm identifying bot accounts obtained 91.06% accuracy, and 84% accuracy was achieved for fake news detection using SVM.
Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning
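As an illustration of how the individual detector outputs might feed a single risk score, here is a hedged sketch; the aggregation weights and probability values are assumptions, since the exact formula is not spelled out here.

```python
# Hedged sketch of combining per-concern detector outputs into one risk score.
# The weighting scheme and probability values are illustrative assumptions.
detector_probs = {            # probability that the profile exhibits each behavior
    "cyberbullying": 0.72,    # e.g. from the logistic regression model
    "spam": 0.10,             # e.g. from the MLP model
    "bot": 0.35,              # e.g. from the random forest model
    "fake_news": 0.55,        # e.g. from the SVM model
}
weights = {"cyberbullying": 0.3, "spam": 0.2, "bot": 0.2, "fake_news": 0.3}  # assumed importance

risk_score = 100 * sum(weights[k] * detector_probs[k] for k in detector_probs)
print(f"profile risk score: {risk_score:.1f} / 100")
```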
Procedia PDF Downloads 36
1111 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques becomes challenging in the presence of too many explanatory variables or features. The presence of too many features is known not only to slow algorithms down, but it can also lead to a decrease in model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios; their impact on model accuracy is captured using the coefficient of determination (R-squared) and the root mean square error (RMSE).
Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
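A hedged sketch of the Boruta-then-random-forest workflow on synthetic regression data, assuming the third-party boruta (BorutaPy) package is installed; package versions and the partitioning ratios shown are illustrative, not the study's exact setup.

```python
# Boruta feature selection followed by random forest models over several train/test ratios,
# on synthetic data standing in for the 79-feature housing dataset. Assumes the `boruta`
# package; exact versions and compatible numpy releases may differ.
import numpy as np
from boruta import BorutaPy
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=400, n_features=30, n_informative=8, noise=10, random_state=1)

# Wrapper-based feature selection built around a random forest.
rf_fs = RandomForestRegressor(n_jobs=-1, max_depth=5, random_state=1)
selector = BorutaPy(rf_fs, n_estimators="auto", random_state=1)
selector.fit(X, y)
X_sel = X[:, selector.support_]                     # keep only confirmed features

# Try several train/test partitioning ratios, as in the study.
for test_size in (0.2, 0.25, 0.3, 0.35, 0.4):
    X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=test_size, random_state=1)
    model = RandomForestRegressor(n_estimators=300, random_state=1).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"test_size={test_size}: R2={r2_score(y_te, pred):.3f}, RMSE={rmse:.1f}")
```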
Procedia PDF Downloads 323
1110 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies
Authors: Yuanjin Liu
Abstract:
Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. The ruin probability is obtained by simulation based on the 2019 period life table for Social Security, the IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and inflation rates. Several popular machine learning models are built: the generalized additive model, random forest, support vector machine, extreme gradient boosting, and artificial neural network. Model validation and selection are based on test errors, using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability. The optimal withdrawal strategy can then be obtained from the optimal predictive model.
Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model
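The Monte Carlo core of a ruin-probability estimate can be sketched as follows; the return, volatility, inflation, and 30-year horizon figures are illustrative assumptions rather than the historical data and life-table horizon used in the paper.

```python
# Monte Carlo sketch of ruin probability: simulate a retirement portfolio with a fixed real
# withdrawal and count how often the balance hits zero within a horizon. Parameters are assumed.
import numpy as np

rng = np.random.default_rng(7)
n_paths, horizon = 10_000, 30
initial_balance, withdrawal_rate = 1_000_000, 0.04
mean_return, vol, inflation = 0.06, 0.12, 0.025

ruined = 0
for _ in range(n_paths):
    balance = initial_balance
    withdrawal = withdrawal_rate * initial_balance
    for year in range(horizon):
        balance = (balance - withdrawal) * (1 + rng.normal(mean_return, vol))
        withdrawal *= 1 + inflation                 # keep purchasing power roughly constant
        if balance <= 0:
            ruined += 1
            break

print(f"estimated ruin probability: {ruined / n_paths:.3f}")
```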
Procedia PDF Downloads 74
1109 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue
Authors: U.V. Suryawanshi, S.S. Chowhan, U.V Kulkarni
Abstract:
The accuracy of segmentation methods is of great importance in brain image analysis. Tissue classification in magnetic resonance brain images (MRI) is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques used on brain MRI. A large variety of algorithms for the segmentation of brain MRI has been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain using Fuzzy c-means (FCM), Kernel-based Fuzzy c-means clustering (KFCM), Spatial Fuzzy c-means (SFCM), and Improved Fuzzy c-means (IFCM). The review covers imaging modalities, MRI, methods for noise reduction, and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise; the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of the trend of future research in brain segmentation and of modifications to IFCM for better results.
Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM
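For reference, a compact implementation of standard FCM, the baseline that KFCM, SFCM, and IFCM extend; the fuzzifier m = 2, the iteration count, and the synthetic intensity data are conventional or assumed choices.

```python
# Compact sketch of standard fuzzy c-means (FCM) on 1-D pixel intensities. The three synthetic
# intensity modes stand in for tissue classes; m=2 and 100 iterations are conventional choices.
import numpy as np

def fcm(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                                   # fuzzy memberships sum to 1 per pixel
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1)              # membership-weighted cluster centers
        d = np.abs(X[None, :] - centers[:, None]) + 1e-9 # pixel-to-center distances
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=0)                               # normalized membership update
    return centers, U

pixels = np.concatenate([np.random.normal(mu, 5, 200) for mu in (30, 100, 170)])  # 3 tissue-like modes
centers, memberships = fcm(pixels.astype(float))
print("estimated cluster centers:", np.sort(centers))
```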
Procedia PDF Downloads 331
1108 An Investigation Into an Essential Property of Creativity, Which Is the First-Person Experience
Authors: Ukpaka Paschal
Abstract:
Margaret Boden argues that a creative product is one that is new, surprising, and valuable as a result of the combination, exploration, or transformation involved in producing it. Boden uses examples of artificial intelligence systems that fit all of these criteria and argues that real creativity involves autonomy, intentionality, valuation, emotion, and consciousness. This paper provides an analysis of all these elements in order to understand whether they are sufficient to account for creativity, especially human creativity. The paper focuses on Generative Adversarial Networks (GANs), a class of artificial intelligence algorithms that are said to have disproved the common perception that creativity is something only humans possess. It then argues that Boden's listed properties of creativity, which capture the creativity exhibited by GANs, are not sufficient to account for human creativity, and it further identifies “first-person phenomenological experience” as an essential property of human creativity. The rationale behind the proposed essential property is that if creativity involves comprehending our experience of the world around us into a form of self-expression, then our experience of the world really matters with regard to creativity.
Keywords: artificial intelligence, creativity, GANs, first-person experience
Procedia PDF Downloads 135
1107 Exploring the Applications of Modular Forms in Cryptography
Authors: Berhane Tewelday Weldhiwot
Abstract:
This research investigates the pivotal role of modular forms in modern cryptographic systems, particularly focusing on their applications in secure communications and data integrity. Modular forms, which are complex analytic functions with rich arithmetic properties, have gained prominence due to their connections to number theory and algebraic geometry. This study begins by outlining the fundamental concepts of modular forms and their historical development, followed by a detailed examination of their applications in cryptographic protocols such as elliptic curve cryptography and zero-knowledge proofs. By employing techniques from analytic number theory, the research delves into how modular forms can enhance the efficiency and security of cryptographic algorithms. The findings suggest that leveraging modular forms not only improves computational performance but also fortifies security measures against emerging threats in digital communication. This work aims to contribute to the ongoing discourse on integrating advanced mathematical theories into practical applications, ultimately fostering innovation in cryptographic methodologies.
Keywords: modular forms, cryptography, elliptic curves, applications, mathematical theory
Procedia PDF Downloads 16
1106 Generalized π-Armendariz Authentication Cryptosystem
Authors: Areej M. Abduldaim, Nadia M. G. Al-Saidi
Abstract:
Algebra is one of the important fields of mathematics. It concerns the study and manipulation of mathematical symbols, as well as the study of abstractions such as groups, rings, and fields. With the development of these abstractions, it has been extended to other structures, such as vectors, matrices, and polynomials, which are non-numerical objects. Computer algebra is the implementation of algebraic methods as algorithms and computer programs. Recently, many algebraic cryptosystem protocols based on non-commutative algebraic structures have been adopted for authentication, key exchange, and encryption-decryption processes. Cryptography is the science aimed at sending information through public channels in such a way that only an authorized recipient can read it. Ring theory is the most attractive category of algebra in the area of cryptography. In this paper, we employ the algebraic structure called skew π-Armendariz rings to design a neoteric algorithm for zero-knowledge proof. The proposed protocol is established and illustrated through a numerical example, and its soundness and completeness are proved.
Keywords: cryptosystem, identification, skew π-Armendariz rings, skew polynomial rings, zero knowledge protocol
Procedia PDF Downloads 217
1105 Detecting and Disabling Digital Cameras Using D3CIP Algorithm Based on Image Processing
Authors: S. Vignesh, K. S. Rangasamy
Abstract:
The paper deals with a device capable of detecting and disabling digital cameras. The system locates the camera and then neutralizes it. Every digital camera has an image sensor known as a CCD, which is retro-reflective and sends light back directly to its original source at the same angle. The device shines infrared LED light, which is invisible to the human eye, at a distance of about 20 feet. It then collects video of these reflections with a camcorder. The video of the reflections is transferred to a computer connected to the device, where it is sent through image processing algorithms that pick out the infrared light bouncing back. Once the camera is detected, the device projects an invisible infrared laser into the camera's lens, thereby overexposing the photo and rendering it useless. Low levels of infrared laser neutralize digital cameras but pose neither a health danger to humans nor physical damage to cameras. We also discuss a simplified design of the above device that can be used in theatres to prevent piracy. The domains covered here are optics and image processing.
Keywords: CCD, optics, image processing, D3CIP
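The reflection-detection step can be illustrated with a simple threshold-and-label sketch; the synthetic frame, threshold, and blob-size limits are assumptions and do not reproduce the D3CIP algorithm itself.

```python
# Find small saturated bright spots in an infrared frame by thresholding and
# connected-component labeling. The frame, threshold, and size limits are assumed.
import numpy as np
from scipy import ndimage

frame = np.random.randint(0, 60, size=(240, 320)).astype(np.uint8)   # dim IR background
frame[100:104, 150:154] = 250                                        # bright retro-reflection glint

mask = frame > 200                                    # keep only near-saturated pixels
labels, n = ndimage.label(mask)                       # group them into connected blobs
for blob_id in range(1, n + 1):
    ys, xs = np.nonzero(labels == blob_id)
    if 4 <= ys.size <= 400:                           # ignore single-pixel noise and large glare
        print(f"possible camera lens at (row={ys.mean():.0f}, col={xs.mean():.0f})")
```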
Procedia PDF Downloads 357
1104 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems
Authors: Belkacem Laimouche
Abstract:
With the field of artificial intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing the performance of AI systems in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
Keywords: artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, interlaboratory comparison, data analysis, data reliability, measurement of bias impact on predictions, improvement of model accuracy and reliability
Procedia PDF Downloads 105
1103 Scientific Recommender Systems Based on Neural Topic Model
Authors: Smail Boussaadi, Hassina Aliane
Abstract:
With the rapid growth of scientific literature, it is becoming increasingly challenging for researchers to keep up with the latest findings in their fields. Academic and professional networks play an essential role in connecting researchers and disseminating knowledge. To improve the user experience within these networks, we need effective article recommendation systems that provide personalized content. Current recommendation systems often rely on collaborative filtering or content-based techniques. However, these methods have limitations, such as the cold start problem and difficulty in capturing semantic relationships between articles. To overcome these challenges, we propose a new approach that combines BERTopic (Bidirectional Encoder Representations from Transformers), a state-of-the-art topic modeling technique, with community detection algorithms in an academic and professional network. Experiments confirm our performance expectations, showing good relevance and objectivity in the results.
Keywords: scientific articles, community detection, academic social network, recommender systems, neural topic model
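To illustrate the topic-based recommendation idea without the transformer dependency, the sketch below swaps in a simple TF-IDF + NMF topic model from scikit-learn in place of BERTopic and recommends articles sharing the query's dominant topic; the abstracts are invented.

```python
# Illustrative stand-in for topic-based recommendation: a TF-IDF + NMF topic model
# (instead of BERTopic) assigns each abstract a dominant topic; articles sharing the query
# article's topic are recommended. The abstracts are invented placeholders.
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

abstracts = [
    "Deep learning methods for medical image segmentation",
    "Convolutional networks applied to MRI brain tissue analysis",
    "Community detection in large social networks",
    "Modularity based clustering of citation graphs",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
doc_topics = NMF(n_components=2, init="nndsvda", random_state=0).fit_transform(tfidf)
dominant = doc_topics.argmax(axis=1)                   # dominant topic per abstract

query = 0                                              # the article the user is reading
recs = [i for i in range(len(abstracts)) if i != query and dominant[i] == dominant[query]]
print("recommend articles:", recs)
```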
Procedia PDF Downloads 97
1102 Hyper Tuned RBF SVM: Approach for the Prediction of the Breast Cancer
Authors: Surita Maini, Sanjay Dhanka
Abstract:
Machine learning (ML) involves developing algorithms and statistical models that enable computers to learn and make predictions or decisions based on data without being explicitly programmed. Because of its broad capabilities, ML is gaining popularity in medical sectors: medical imaging, electronic health records, genomic data analysis, wearable devices, disease outbreak prediction, disease diagnosis, etc. In the last few decades, many researchers have tried to diagnose breast cancer (BC) using ML, because early detection of any disease can save millions of lives. Working in this direction, the authors have proposed a hyper-tuned RBF SVM technique to predict BC at an earlier stage. The proposed method is implemented on the Breast Cancer UCI ML dataset with 569 instances and 32 attributes. The authors recorded the performance metrics of the proposed model: accuracy 98.24%, sensitivity 98.67%, specificity 97.43%, F1 score 98.67%, precision 98.67%, and run time 0.044769 seconds. The proposed method is validated by K-fold cross-validation.
Keywords: breast cancer, support vector classifier, machine learning, hyperparameter tuning
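A hedged sketch of an RBF-kernel SVM with grid-searched hyperparameters and K-fold validation on the UCI breast cancer data bundled with scikit-learn; the C/gamma grid and five folds are illustrative choices, not the paper's exact search space.

```python
# RBF-kernel SVM with hyperparameter tuning and K-fold validation on the UCI breast cancer
# dataset as bundled with scikit-learn (30 numeric features). Grid and fold count are illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)            # 569 instances

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = GridSearchCV(
    pipe,
    param_grid={"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.001]},
    cv=5, scoring="accuracy",
)
grid.fit(X, y)
scores = cross_val_score(grid.best_estimator_, X, y, cv=5)
print("best params:", grid.best_params_, "| 5-fold accuracy:", round(float(scores.mean()), 4))
```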
Procedia PDF Downloads 67
1101 Tree Species Classification Using Effective Features of Polarimetric SAR and Hyperspectral Images
Authors: Milad Vahidi, Mahmod R. Sahebi, Mehrnoosh Omati, Reza Mohammadi
Abstract:
Forest management organizations need information to perform their work effectively. Remote sensing is an effective method of acquiring information about the Earth. Two datasets of remote sensing images were used to classify forested regions. First, all extractable features from the hyperspectral and PolSAR images were extracted. The optical features were spectral indices related to chemical and water content, structural indices, effective bands, and absorption features. The PolSAR features were the original data, target decomposition components, and SAR discriminator features. Second, particle swarm optimization (PSO) and genetic algorithms (GA) were applied to select optimal features. Furthermore, the support vector machine (SVM) classifier was used to classify the image. The results showed that the combination of PSO and SVM had higher overall accuracy than the other cases, providing an overall accuracy of about 90.56%. The effective features were the spectral indices, the bands in the shortwave infrared (SWIR) and visible ranges, and certain PolSAR features.
Keywords: hyperspectral, PolSAR, feature selection, SVM
Procedia PDF Downloads 416
1100 Methods for Distinction of Cattle Using Supervised Learning
Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl
Abstract:
Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction, based on models derived from existing data. The data can present identification patterns, which are used to classify individuals into groups. The result of the analysis is a pattern that can be used to identify a data set without the need for the input data used to create that pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In case of missing pedigree information, other methods can be used to trace an animal's origin. The genetic diversity written in genetic data holds relatively useful information for identifying animals originating from individual countries. We can conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning
Procedia PDF Downloads 550
1099 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms
Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee
Abstract:
Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques to assess the condition of the composites and prevent the continual growth of fiber damage are required. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by detecting damage or defects from static or dynamic responses induced by external loading. A variety of techniques based on detecting changes in the static or dynamic behavior of isotropic structures has been developed in the last two decades. These methods, based on analytical approaches, are limited in their capabilities for dealing with complex systems, primarily because of their limitations in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristic techniques and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA) methods, and neural networks (NN), and have promisingly applied these methods to the field of structural identification. Among them, GAs attract our attention because they do not require a considerable amount of data in advance when dealing with complex problems and they make a global solution search possible, as opposed to classical gradient-based optimization techniques. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect the fiber property variation of laminated composite plates from the micromechanical point of view. The finite element model is used to study free vibrations of laminated composite plates with fiber stiffness degradation. In order to solve the inverse problem using the combined method, this study uses only the first mode shapes of a structure as the measured frequency data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences
Procedia PDF Downloads 274
1098 Multi-Subpopulation Genetic Algorithm with Estimation of Distribution Algorithm for Textile Batch Dyeing Scheduling Problem
Authors: Nhat-To Huynh, Chen-Fu Chien
Abstract:
The textile batch dyeing scheduling problem is complicated: it includes batch formation, batch assignment to machines, and batch sequencing with sequence-dependent setup times. Most manufacturers schedule their orders manually, which is time-consuming and inefficient, so more powerful methods are needed to improve the solution. Motivated by these real needs, this study proposes approaches in which a genetic algorithm is developed with multiple subpopulations and hybridised with an estimation of distribution algorithm to solve the constructed problem for minimising the makespan. A heuristic algorithm is designed and embedded into the proposed algorithms to improve their ability to escape local optima. In addition, an empirical study is conducted in a textile company in Taiwan to validate the proposed approaches. The results show that the proposed approaches are more efficient than the simulated annealing algorithm.
Keywords: estimation of distribution algorithm, genetic algorithm, multi-subpopulation, scheduling, textile dyeing
Procedia PDF Downloads 299
1097 The Optimization Process of Aortic Heart Valve Stent Geometry
Authors: Arkadiusz Mezyk, Wojciech Klein, Mariusz Pawlak, Jacek Gnilka
Abstract:
Aortic heart valve stents should fulfill many criteria. These criteria have a strong impact on the geometrical shape of the stent. Usually, the final construction of a stent is the result of many years of experience and knowledge. Depending on patent claims, different stent shapes are produced by different companies. This causes difficulties for biomechanics engineers by narrowing the domain of feasible solutions. The paper presents an optimization method for stent geometry defined by a specific analytical equation based on various mathematical functions. This formula was implemented in the APDL scripting language in the ANSYS finite element environment. For the purpose of simulation tests, a few parameters were separated from the developed equation. The application of genetic algorithms allows finding the best solution for the selected objective function. The obtained solution takes into account parameters such as radial force, compression ratio, and the coefficient of expansion along the transverse axis.
Keywords: aortic stent, optimization process, geometry, finite element method
Procedia PDF Downloads 281
1096 Discovering New Organic Materials through Computational Methods
Authors: Lucas Viani, Benedetta Mennucci, Soo Young Park, Johannes Gierschner
Abstract:
Organic semiconductors have attracted the attention of the scientific community in the past decades due to their unique physicochemical properties, allowing new designs and alternative device fabrication methods. To date, organic electronic devices have been largely based on conjugated polymers, mainly due to their easy processability. In recent years, due to moderate energy transport (ET) and charge transport (CT) efficiencies and the ill-defined nature of polymeric systems, the focus has been shifting to small conjugated molecules with well-defined chemical structures, easier control of intermolecular packing, and enhanced CT and ET properties. This has led to the synthesis of new small molecules, followed by the growth of their crystalline structures and ultimately by device preparation. This workflow is commonly followed without a clear knowledge of the ET and CT properties, related mainly to the macroscopic systems, which may lead to financial and time losses, since not all materials will deliver the properties and efficiencies demanded by current standards. In this work, we present a theoretical workflow designed to predict the key ET properties of these new materials prior to synthesis, thus speeding up the discovery of new promising materials. It is based on quantum mechanical, hybrid, and classical methodologies, starting from a single-molecule structure, proceeding to the prediction of its packing structure, and finishing with the prediction of properties of interest such as static and averaged excitonic couplings and the exciton diffusion length.
Keywords: organic semiconductor, organic crystals, energy transport, excitonic couplings
Procedia PDF Downloads 253
1095 Using A Blockchain-Based, End-to-End Encrypted Communication System Between Mobile Terminals to Improve Organizational Privacy
Authors: Andrei Bogdan Stanescu, Robert Stana
Abstract:
Creating private and secure communication channels between employees has become critical to ensuring organizational integrity and avoiding leaks of sensitive information. With the widespread use of modern methods of disrupting communication between users, real use cases of advanced encryption mechanisms have emerged to thwart cyber-attackers willing to intercept private conversations between critical employees in an organization. This paper presents a custom implementation of a messaging application named “Whisper” that uses end-to-end encryption (E2EE) mechanisms and blockchain-related components to protect sensitive conversations and mitigate the risks of information breaches inside organizations. The results of this research aim to expand the areas of applicability of E2EE algorithms and their integration with private blockchains in chat applications as a viable method of enhancing intra-organizational communication privacy.
Keywords: end-to-end encryption, mobile communication, cryptography, communication security, data privacy
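To make the E2EE building block concrete, here is a hedged sketch using an authenticated public-key box from the PyNaCl library; it illustrates end-to-end encryption in general, not the actual Whisper implementation or its blockchain layer.

```python
# Generic E2EE primitive: an authenticated public-key box (X25519 + XSalsa20-Poly1305) from
# PyNaCl. Keys stay on the endpoints; a relay server only ever sees ciphertext.
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()            # each employee keeps a private key on-device
bob_sk = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sending_box.encrypt(b"Q3 payroll file is ready")   # nonce is generated and prepended

# Only Bob (holding bob_sk) can open it.
receiving_box = Box(bob_sk, alice_sk.public_key)
print(receiving_box.decrypt(ciphertext))
```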
Procedia PDF Downloads 89
1094 The Role of Artificial Intelligence Algorithms in Decision-Making Policies
Authors: Marisa Almeida AraúJo
Abstract:
Artificial intelligence (AI) tools are being used (including in the criminal justice system) and becoming increasingly popular. The neuralgic center of the many questions that these (future) super-beings pose is rooted in the (old) problematic relationship between rationality and morality. For instance, if we follow a Kantian perspective in which morality derives from rationality, AI will also surpass man in ethical and moral standards, questioning the nature of mind, the consciousness of self and others, and morality. The recognition of superior intelligence in a non-human being puts us in the contingency of having to recognize a peer in a new form of coexistence and social relationship. Just think of the humanoid robot Sophia, capable of reasoning and conversation (and who has been granted Saudi citizenship, a fact that symbolically demonstrates our empathy with the being). Machines may have a more intelligent mind and even, eventually, higher ethical standards, to which, under the aforementioned categorical imperative, we would have to subject ourselves under penalty of contradicting the universal Kantian law. Recognizing the complex ethical and legal issues and the significant impact on human rights and the functioning of democracy itself is the goal of our work.
Keywords: ethics, artificial intelligence, legal rules, principles, philosophy
Procedia PDF Downloads 198
1093 Exhaustive Study of Essential Constraint Satisfaction Problem Techniques Based on N-Queens Problem
Authors: Md. Ahsan Ayub, Kazi A. Kalpoma, Humaira Tasnim Proma, Syed Mehrab Kabir, Rakib Ibna Hamid Chowdhury
Abstract:
The Constraint Satisfaction Problem (CSP) is observed in various applications, e.g., scheduling problems, timetabling problems, assignment problems, etc. Researchers adopt a CSP technique to tackle a certain problem; however, each technique follows a different approach and way of solving a problem network. In our exhaustive study, it has been possible to visualize the processes of essential CSP algorithms using a very concrete constraint satisfaction example, the N-Queens Problem, in order to gain a deep understanding of how a particular constraint satisfaction problem is dealt with by the studied and implemented techniques. Besides, benchmark results (time vs. value of N in N-Queens) have been generated from our implemented approaches, which help in understanding at what scale each algorithm produces solutions, especially for the N-Queens puzzle. Thus, informed decisions can be made when instantiating a real-life problem within the CSP framework.
Keywords: arc consistency (AC), backjumping algorithm (BJ), backtracking algorithm (BT), constraint satisfaction problem (CSP), forward checking (FC), least constrained values (LCV), maintaining arc consistency (MAC), minimum remaining values (MRV), N-Queens problem
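As a reference point, here is a minimal backtracking (BT) solver for N-Queens with column/diagonal pruning; the surveyed techniques (FC, MAC, MRV, LCV, BJ) build on this same search skeleton.

```python
# Minimal backtracking (BT) solver for N-Queens: one queen per row, with column and
# diagonal sets used to prune inconsistent values before recursing.
def solve_n_queens(n):
    cols, diag1, diag2 = set(), set(), set()
    placement = []

    def backtrack(row):
        if row == n:
            return True
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue                              # constraint violated, prune this value
            cols.add(col); diag1.add(row - col); diag2.add(row + col)
            placement.append(col)
            if backtrack(row + 1):
                return True
            cols.remove(col); diag1.remove(row - col); diag2.remove(row + col)
            placement.pop()                           # undo assignment and backtrack
        return False

    return placement if backtrack(0) else None

print(solve_n_queens(8))    # one valid column index per row, e.g. [0, 4, 7, 5, 2, 6, 1, 3]
```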
Procedia PDF Downloads 364
1092 Housing Price Prediction Using Machine Learning Algorithms: The Case of Melbourne City, Australia
Authors: The Danh Phan
Abstract:
House price forecasting is a major topic in real estate market research. Effective house price prediction models could not only allow home buyers and real estate agents to make better data-driven decisions but may also be beneficial for the property policymaking process. This study investigates the housing market by using machine learning techniques to analyze real historical house sale transactions in Australia. It seeks useful models which could be deployed as an application for house buyers and sellers. Data analytics show a high discrepancy between house prices in the most expensive suburbs and the most affordable suburbs in the city of Melbourne. In addition, experiments demonstrate that the combination of Stepwise and Support Vector Machine (SVM), based on the Mean Squared Error (MSE) measurement, consistently outperforms other models in terms of prediction accuracy.
Keywords: house price prediction, regression trees, neural network, support vector machine, stepwise
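A hedged sketch of the stepwise-then-SVM idea evaluated with MSE; scikit-learn's SequentialFeatureSelector stands in for the stepwise step, and synthetic data replaces the Melbourne sales transactions.

```python
# Stepwise-style feature selection followed by an SVM regressor, scored by MSE on a held-out
# test set. SequentialFeatureSelector is a stand-in for stepwise selection; data is synthetic.
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = make_regression(n_samples=500, n_features=15, n_informative=6, noise=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Forward stepwise-style selection of a feature subset, then an SVR fitted on it.
selector = SequentialFeatureSelector(LinearRegression(), n_features_to_select=6, direction="forward")
selector.fit(X_tr, y_tr)
X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

svm = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100)).fit(X_tr_sel, y_tr)
print(f"test MSE: {mean_squared_error(y_te, svm.predict(X_te_sel)):.1f}")
```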
Procedia PDF Downloads 231