Search results for: gp-closed sets
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1227

717 Integrated Approach of Quality Function Deployment, Sensitivity Analysis and Multi-Objective Linear Programming for Business and Supply Chain Programs Selection

Authors: T. T. Tham

Abstract:

The aim of this study is to propose an integrated approach for determining the most suitable programs, based on Quality Function Deployment (QFD), Sensitivity Analysis (SA), and a Multi-Objective Linear Programming (MOLP) model. First, QFD is used to determine business requirements and transform them into business and supply chain programs; the QFD also yields technical scores for all programs. All programs are then evaluated against five criteria (productivity, quality, cost, technical score, and feasibility). Sets of weights for these criteria are built using sensitivity analysis. The MOLP model is then applied to select suitable programs according to multiple conflicting objectives under a budget constraint. A case study from the Sai Gon-Mien Tay Beer Company illustrates the proposed methodology. The outcome of the study gives companies a comprehensive picture for selecting suitable programs and obtaining an optimal solution according to their preferences.
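To make the final selection step concrete, here is a minimal sketch of a weighted-sum scalarization of the MOLP with a budget constraint, solved as an LP relaxation with scipy. All scores, costs, weights, and the budget are invented for illustration; the paper's actual model and data are not reproduced.

```python
# Hypothetical sketch: weighted-sum scalarization of a program-selection MOLP,
# solved as an LP relaxation under a budget constraint. All numbers are toy data.
import numpy as np
from scipy.optimize import linprog

# five candidate programs scored on (productivity, quality, cost, technical, feasibility)
scores = np.array([
    [0.7, 0.8, 0.5, 0.9, 0.6],
    [0.6, 0.7, 0.8, 0.5, 0.9],
    [0.9, 0.5, 0.6, 0.7, 0.7],
    [0.5, 0.9, 0.7, 0.6, 0.8],
    [0.8, 0.6, 0.9, 0.8, 0.5],
])
weights = np.array([0.25, 0.25, 0.15, 0.20, 0.15])  # one weight set from the SA
cost = np.array([40.0, 25.0, 35.0, 20.0, 30.0])      # program costs
budget = 80.0

c = -(scores @ weights)  # maximize total weighted score -> minimize its negative
res = linprog(c, A_ub=cost[None, :], b_ub=[budget], bounds=[(0, 1)] * 5)
print("selection levels:", res.x.round(2))  # values near 1 indicate selected programs
```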

Keywords: business program, multi-objective linear programming model, quality function deployment, sensitivity analysis, supply chain management

Procedia PDF Downloads 104
716 Evaluation of Environmental Impact Assessment of Dam Using GIS/Remote Sensing-Review

Authors: Ntungamili Kenosi, Moatlhodi W. Letshwenyo

Abstract:

Negative environmental impacts from the construction of large projects such as dams have become an important aspect of land degradation. This paper reviews previous research in this area from other parts of the world. After a dam has been constructed, the actual environmental impacts can be investigated and compared with the predictions of the Environmental Impact Assessment (EIA) carried out beforehand. GIS and remote sensing play an important role in generating automated spatial data sets and in establishing spatial relationships. Results from other sources show that normalized difference vegetation index (NDVI) analysis has been used to detect spatial and temporal changes in vegetation biomass, indicating that natural vegetation biomass is declining, mainly due to the expansion of agricultural land and the proliferation of man-made structures. Urgent environmental conservation is therefore necessary around project sites. Few studies evaluating the EIA of dams have been conducted in Botswana; such a study is needed so that its results can be compared with others around the world.
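The NDVI analysis the abstract refers to is a simple band ratio: NDVI = (NIR - Red) / (NIR + Red). A minimal sketch on toy reflectance values follows; real workflows would read the bands from satellite imagery.

```python
# Minimal NDVI computation on synthetic reflectance bands.
import numpy as np

nir = np.array([[0.45, 0.50], [0.30, 0.20]])  # near-infrared band (toy values)
red = np.array([[0.10, 0.12], [0.25, 0.18]])  # red band (toy values)

ndvi = (nir - red) / (nir + red + 1e-9)       # small epsilon avoids division by zero
print(ndvi)  # values near +1 indicate dense vegetation; near 0 or below, bare ground
```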

Keywords: Botswana, dam, environmental impact assessment, GIS, normalized difference vegetation index (NDVI), remote sensing

Procedia PDF Downloads 389
715 On Estimating the Low Income Proportion with Several Auxiliary Variables

Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández

Abstract:

Poverty measurement is an important topic in many social science studies. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of a population classified as poor. This indicator is generally unknown and is therefore estimated from survey data obtained by official surveys carried out by statistical agencies such as Eurostat. Such survey data typically contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain additional variables, known as auxiliary variables, that are related to the variable of interest; when available, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. The simulation study uses real data sets obtained from the 2011 European Union Survey on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
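A hedged Monte Carlo sketch of the comparison described above, on synthetic data rather than the EU-SILC file: the naive low-income-proportion estimator versus a regression-type estimator that exploits one auxiliary variable with known population mean. The data-generating process, sample sizes, and the single auxiliary variable are all assumptions.

```python
# Synthetic Monte Carlo comparison: naive vs. regression estimator of the
# low income proportion (poverty line = 60% of median income).
import numpy as np

rng = np.random.default_rng(0)
N, n, R = 20_000, 500, 1_000             # population, sample size, replications

x = rng.gamma(2.0, 10.0, N)              # auxiliary variable (e.g., register income)
y = 0.8 * x + rng.normal(0, 5, N)        # variable of interest (survey income)
t = 0.6 * np.median(y)                   # poverty line
z = (y < t).astype(float)                # poverty indicator
P = z.mean()                             # true low income proportion

err_naive, err_reg = [], []
for _ in range(R):
    s = rng.choice(N, n, replace=False)
    p_naive = z[s].mean()
    # regression estimator: adjust using the known population mean of x
    b = np.cov(z[s], x[s])[0, 1] / x[s].var(ddof=1)
    p_reg = p_naive + b * (x.mean() - x[s].mean())
    err_naive.append((p_naive - P) ** 2)
    err_reg.append((p_reg - P) ** 2)

print("MSE naive:     ", np.mean(err_naive))
print("MSE regression:", np.mean(err_reg))  # typically smaller, as the paper reports
```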

Keywords: inclusion probability, poverty, poverty line, survey sampling

Procedia PDF Downloads 431
714 Improving Forecasting Demand for Maintenance Spare Parts: Case Study

Authors: Abdulaziz Afandi

Abstract:

Minimizing inventory cost, optimizing inventory quantities, and increasing system operational availability are the main motivations for enhancing the forecasting of spare parts demand at a major power utility company in Medina. This paper reports on an effort to optimize order quantities of spare parts by improving the demand forecasting method. The study focuses on equipment that has frequent spare parts purchase orders with uncertain demand. The demand follows a lumpy pattern, which makes conventional forecasting methods less effective. Various forecasting methods were benchmarked against experts' criteria to select the most suitable method for the case study. Three actual data sets were used to make the forecasts. Two neural network (NN) approaches were utilized and compared, namely long short-term memory (LSTM) and multilayer perceptron (MLP). As expected, the results showed that the NN models gave better results than the traditional (judgmental) forecasting method. In addition, the LSTM model had higher predictive accuracy than the MLP model.
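A minimal sketch of the LSTM-versus-MLP comparison, on a synthetic lumpy demand series (mostly zeros, occasional spikes). The window length, architectures, and hyperparameters are invented; the paper's actual data sets and tuning are not reproduced.

```python
# Toy LSTM vs. MLP one-step demand forecast on a synthetic lumpy series.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(1)
demand = rng.poisson(0.3, 600) * rng.integers(1, 20, 600)  # lumpy demand

W = 12                                                     # lag window length
X = np.stack([demand[i:i + W] for i in range(len(demand) - W)]).astype("float32")
y = demand[W:].astype("float32")
split = 480
Xtr, Xte, ytr, yte = X[:split], X[split:], y[:split], y[split:]

lstm = tf.keras.Sequential([
    tf.keras.Input(shape=(W, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
mlp = tf.keras.Sequential([
    tf.keras.Input(shape=(W,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
for name, model, xtr, xte in [
    ("LSTM", lstm, Xtr[..., None], Xte[..., None]),
    ("MLP", mlp, Xtr, Xte),
]:
    model.compile(optimizer="adam", loss="mse")
    model.fit(xtr, ytr, epochs=20, verbose=0)
    rmse = float(np.sqrt(np.mean((model.predict(xte, verbose=0).ravel() - yte) ** 2)))
    print(name, "RMSE:", round(rmse, 2))
```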

Keywords: neural network, LSTM, MLP, forecasting demand, inventory management

Procedia PDF Downloads 105
713 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network

Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

Consider a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. Models learned from these data that can predict future traffic speed would benefit applications such as car navigation systems, but building a predictive model for every link becomes a nontrivial job when the number of links in the network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building. On the other hand, k-NN takes a long time to make a prediction because it must search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that it can pay to look only at recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different feature sets, namely the current and past traffic speeds of the target link and of the neighboring links up- and down-stream. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
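A rough sketch of the pruning idea on synthetic data: k-NN regression over lagged speeds of the target link and two neighbor links, searching only the most recent records. The shapes, lag horizon, and window sizes are assumptions, not the paper's setup.

```python
# k-NN speed prediction with a shrinking "recent data only" search window.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)
T = 50_000                                  # five-minute records in the database
speeds = rng.normal(40, 8, (T, 3))          # target link + one up-/one down-stream link

L = 4                                       # number of lags used as features
X = np.hstack([speeds[i:T - L + i] for i in range(L)])  # (T-L, 12) lagged features
y = speeds[L:, 0]                           # next-interval speed of the target link

for window in [len(X), 10_000, 2_000]:      # full database vs. recent-only search
    Xw, yw = X[-window:], y[-window:]
    knn = KNeighborsRegressor(n_neighbors=10).fit(Xw[:-100], yw[:-100])
    mae = np.abs(knn.predict(Xw[-100:]) - yw[-100:]).mean()
    print(f"window={window:>6}  MAE={mae:.2f}")
```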

Keywords: big data, k-NN, machine learning, traffic speed prediction

Procedia PDF Downloads 336
712 The Principles of Democracy and Development: The Political and Philosophical Foundations of Development-Democracy in Africa

Authors: Fadeke Olu-Owolabi, Fayomi Oluyemi

Abstract:

The political and societal orders face the daunting task of overcoming the difficulties that lead to growing tensions and conflicts in Africa. At the core of the analysis is the question: how stable and adaptable are established democracies, new democracies, and political and societal actors? The idea of development-democracy, implying a strong linkage between economic development and political democracy, appropriately describes the distinguishing characteristic of this new demand for democracy in Africa. This theoretical study examines the political and philosophical foundations of the idea of development-democracy and the arguments presented to support its adoption in Africa today. The paper critically examines the polemic between the advocates of developmental dictatorship and of developmental democracy, and argues for the adoption of the latter in Africa. It sets out to expound the political and philosophical foundations of developmental democracy, maintaining that only democracy can facilitate development. This argument is supported further by the claim that democracy and development are two sides of the same coin, in the sense that both are ethical concepts. The paper also maintains that democracy is only worthwhile when it is developmental. Finally, the paper affirms that since democracy and development are like Siamese twins, the way out of Africa's present crisis of development is to wholeheartedly embrace democracy. It posits that when genuine democracy is adopted, genuine and sustainable development can be attained.

Keywords: democracy, development, polemic, principles

Procedia PDF Downloads 488
710 A General Framework for Knowledge Discovery from Echocardiographic and Natural Images

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques employed include an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks such as clustering and classification with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: active contour, Bayesian, echocardiographic image, feature vector

Procedia PDF Downloads 422
709 Investment Projects Selection Problem under Hesitant Fuzzy Environment

Authors: Irina Khutsishvili

Abstract:

In the present research, a decision support methodology for the multi-attribute group decision-making (MAGDM) problem is developed, namely for the selection of investment projects. The objective of the investment project selection problem is to choose the best project among the set of projects seeking investment, or to rank all projects in descending order. The selection is made considering a set of weighted attributes, which are evaluated using expert assessments. In the proposed methodology, linguistic terms given by the experts are used as initial attribute evaluations, since they are the most natural and convenient representation of expert judgments. These linguistic evaluations are then converted into trapezoidal fuzzy numbers, and an aggregate trapezoidal hesitant fuzzy decision matrix is built. The case is considered in which information on the attribute weights is completely unknown; the weights are identified based on the De Luca and Termini information entropy concept, formulated in the context of hesitant fuzzy sets. Decisions are made using the extended Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method under a hesitant fuzzy environment. Hence, the methodology is based on a trapezoidal-valued hesitant fuzzy TOPSIS decision-making model with entropy weights. The ranking of alternatives is performed by the proximity of their distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS), using the weighted hesitant Hamming distance. An example of investment decision-making is given that clearly explains the procedure of the proposed methodology.
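A simplified, crisp illustration of the two building blocks named above: entropy-based attribute weights followed by classical TOPSIS ranking. The decision matrix is invented, and the paper's trapezoidal hesitant fuzzy extension and Hamming distance are deliberately omitted for brevity.

```python
# Crisp entropy-weighted TOPSIS sketch (toy data; the hesitant fuzzy machinery
# of the paper is not reproduced here).
import numpy as np

X = np.array([[7., 9., 8.],      # decision matrix: 4 projects x 3 benefit attributes
              [8., 7., 6.],
              [9., 6., 9.],
              [6., 8., 7.]])

# entropy weights: attributes whose scores differ more get larger weights
p = X / X.sum(axis=0)
E = -(p * np.log(p)).sum(axis=0) / np.log(len(X))
w = (1 - E) / (1 - E).sum()

# TOPSIS: closeness to the ideal and anti-ideal solutions
V = w * X / np.linalg.norm(X, axis=0)
d_pos = np.linalg.norm(V - V.max(axis=0), axis=1)   # distance to the positive ideal
d_neg = np.linalg.norm(V - V.min(axis=0), axis=1)   # distance to the negative ideal
closeness = d_neg / (d_pos + d_neg)
print("ranking (best first):", np.argsort(-closeness))
```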

Keywords: hesitant fuzzy sets, multi-attribute group decision-making, entropy weights, TOPSIS, investment projects

Procedia PDF Downloads 101
708 Influence of Season, Temperature, and Photoperiod on Growth of the Land Snail Helix aperta

Authors: S. Benbellil-Tafoughalt, J. M. Koene

Abstract:

Growth strategies are often plastic and influenced by environmental conditions. Terrestrial gastropods are particularly affected by seasonal and climatic variables, and growth rate and size at maturity are key traits in their life history. We therefore investigated juvenile growth of Helix aperta snails under four combinations of temperature and photoperiod, using two sets of young snails born in the laboratory from adults collected in either autumn (aestivating snails) or spring (active snails). Parental snails were collected from Bakaro (northeastern Algeria). Higher temperature increased adult size and reduced time to reproduction. A long-day photoperiod also increased final body weight but had no effect on the length of the growth period. The season of birth had significant effects on the length of the growth period and the weight of hatchlings, although the weight difference disappeared by adulthood. Spring snails took less time to develop and reached a similar adult body weight to autumn snails. These differences may be due to differences in egg size or quality between seasons. More rapid growth in spring snails results in larger snails entering aestivation, a period with size-related mortality in this species.

Keywords: growth, Helix aperta, photoperiod, temperature

Procedia PDF Downloads 318
707 Tracking Filtering Algorithm Based on ConvLSTM

Authors: Ailing Yang, Penghan Song, Aihua Cai

Abstract:

Nonlinear maneuvering target tracking is essentially a state estimation problem when the target motion model is uncertain. Traditional solutions include Kalman filtering, based on the Bayesian filtering framework, and extended Kalman filtering. However, these methods need prior knowledge such as a kinematic model and the state distributions, and their performance is poor for state estimation of complex dynamic systems without such priors. In view of these problems, a convolutional LSTM target state estimation algorithm based on self-attention memory (SAConvLSTM-SE) is proposed, which learns the historical motion state of the target and the error distribution of the measurements at the current time. Measured track-point data from airborne radar are processed into data sets; after supervised training, the data-driven deep neural network based on SAConvLSTM can directly output the target state at the next moment. Through experiments on two different maneuvering targets, we find that the network has stronger robustness and better tracking accuracy than existing tracking methods.
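For reference, here is a minimal version of the traditional Kalman baseline the abstract contrasts with: a constant-velocity filter on noisy position measurements. Matrices, noise levels, and the measurement stream are illustrative only, not the paper's radar setup.

```python
# Minimal constant-velocity Kalman filter (the classical baseline).
import numpy as np

dt = 1.0
F = np.array([[1, dt], [0, 1]])        # state transition for [position, velocity]
H = np.array([[1.0, 0.0]])             # we measure position only
Q = 0.01 * np.eye(2)                   # process noise covariance (assumed)
R = np.array([[4.0]])                  # measurement noise covariance (assumed)

x = np.array([[0.0], [1.0]])           # initial state estimate
P = np.eye(2)

rng = np.random.default_rng(3)
for k in range(20):
    z = np.array([[k * 1.0 + rng.normal(0, 2)]])   # noisy position measurement
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
print("final estimate [pos, vel]:", x.ravel().round(2))
```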

Keywords: maneuvering target, state estimation, Kalman filter, LSTM, self-attention

Procedia PDF Downloads 133
706 Flood-Induced River Disruption: Geomorphic Imprints and Topographic Effects in Kelantan River Catchment from Kemubu to Kuala Besar, Kelantan, Malaysia

Authors: Mohamad Muqtada Ali Khan, Nor Ashikin Shaari, Donny Adriansyah bin Nazaruddin, Hafzan Eva Bt Mansoor

Abstract:

Floods play a key role in the landform evolution of an area and are likely to alter the topography of the earth's surface. The present study area, Kota Bharu, which is very prone to floods, extends from upstream of the Kelantan River near Kemubu to the downstream area near Kuala Besar. The flood events that occur every year in the study area have a strong bearing on the river's morphological set-up. In the present study, three satellite imageries from different time periods were used to document post-flood landform changes. Pre-processing of the images, including subsetting, geometric corrections, and atmospheric corrections, was carried out using ENVI 4.5, followed by the analysis. Twenty sets of cross sections were plotted for all three images using ERDAS 9.2 and ArcGIS 10. The results show a significant change in the length of the cross sections, suggesting that geomorphological processes play a key role in carving and shaping the river banks during floods.

Keywords: flood induced, geomorphic imprints, Kelantan river, Malaysia

Procedia PDF Downloads 528
705 Supervised/Unsupervised Mahalanobis Algorithm for Improving Performance for Cyberattack Detection over Communications Networks

Authors: Radhika Ranjan Roy

Abstract:

Deployment of machine learning (ML) and deep learning (DL) algorithms for cyberattack detection in operational communications networks (wireless and/or wire-line) is being delayed because of low performance (e.g., recall, precision, and f₁-score). When datasets become imbalanced, which is the usual case for communications networks, performance tends to become worse. The complexity of reducing the dimensionality of feature sets to increase performance is also a major problem. Mahalanobis algorithms have been widely applied in scientific research because Mahalanobis distance metric learning is a successful framework. In this paper, we investigate a Mahalanobis binary classifier algorithm for increasing cyberattack detection performance over communications networks as a proof of concept. We also find that high-dimensional information in intermediate features, which is under-utilized for classification tasks in ML/DL algorithms, is the main contributor to the improved performance of the Mahalanobis method, even for imbalanced and sparse datasets. With no feature reduction, the Mahalanobis distance (MD) classifier offers uniform results for precision, recall, and f₁-score on the unbalanced and sparse NSL-KDD datasets.
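A hedged sketch of a Mahalanobis binary classifier on synthetic data (not the NSL-KDD file): fit the mean and covariance of the majority (benign) class and flag points whose squared Mahalanobis distance exceeds a chi-square quantile, a thresholding choice suggested by the paper's keywords. The class geometry and threshold percentile are assumptions.

```python
# Mahalanobis-distance anomaly classifier on an imbalanced synthetic dataset.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
normal = rng.multivariate_normal([0, 0, 0], np.eye(3), 2_000)  # benign traffic
attack = rng.multivariate_normal([3, 3, 0], np.eye(3), 40)     # sparse attack class

mu = normal.mean(axis=0)
S_inv = np.linalg.inv(np.cov(normal, rowvar=False))
threshold = chi2.ppf(0.99, df=3)      # 99th percentile of the chi-square(3) law

def d2(X):
    """Squared Mahalanobis distance of each row of X to the benign mean."""
    diff = X - mu
    return np.einsum("ij,jk,ik->i", diff, S_inv, diff)

X = np.vstack([normal, attack])
y = np.r_[np.zeros(len(normal)), np.ones(len(attack))]
pred = (d2(X) > threshold).astype(int)

tp = ((pred == 1) & (y == 1)).sum()
recall = tp / (y == 1).sum()
precision = tp / max(pred.sum(), 1)
print(f"precision={precision:.2f} recall={recall:.2f}")
```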

Keywords: Mahalanobis distance, machine learning, deep learning, NSL-KDD, local intrinsic dimensionality, chi-square, positive semi-definite, area under the curve

Procedia PDF Downloads 58
704 Wage Differentiation Patterns of Households Revisited for Turkey in Same Industry Employment: A Pseudo-Panel Approach

Authors: Yasin Kutuk, Bengi Yanik Ilhan

Abstract:

Previous studies investigate wage differentials among regions in Turkey between couples who work in the same industry and those who work in different industries, using models appropriate for cross-sectional data. Since no genuine panel data are available for this investigation in Turkey, pseudo-panels built from the repeated cross-section data sets of the Household Labor Force Surveys 2004-2014 are employed, opening a new way to examine wage differentiation patterns. For this purpose, household heads are separated into groups with respect to household composition, with group membership defined by characteristics assumed fixed over time, such as age group, education, gender, and NUTS1 region (12 regions). The average behavior of each group can then be tracked over time, just as in panel data. Estimates based on pseudo-panel data would be consistent with estimates based on genuine panel data on individuals if the samples are representative of a population with fixed composition and characteristics. Controlling for socioeconomic factors, we examine how the wage differentiation of household income was affected by the social, cultural, and economic changes that followed the global economic crisis that emerged in the US, and whether wage differentiation changes across birth cohorts.
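A toy illustration of the pseudo-panel construction described above, with synthetic records standing in for the survey files: pool the repeated cross-sections, define cohorts by fixed traits, and average within cohort and survey year. The column names and cohort definition are placeholders.

```python
# Pseudo-panel construction: cohort-by-wave cell means from repeated cross-sections.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 10_000
df = pd.DataFrame({
    "year": rng.choice(range(2004, 2015), n),                 # survey wave
    "birth_year": rng.choice(range(1950, 1990), n),
    "education": rng.choice(["low", "mid", "high"], n),
    "region": rng.choice([f"TR{i}" for i in range(1, 13)], n),  # NUTS1-style codes
    "wage": rng.lognormal(7, 0.5, n),
})
df["cohort"] = (df["birth_year"] // 10) * 10                  # birth-decade cohorts

# each (cohort, education, region) group becomes one "individual" tracked over waves
pseudo = (df.groupby(["cohort", "education", "region", "year"], as_index=False)
            .agg(mean_wage=("wage", "mean"), cell_size=("wage", "size")))
print(pseudo.head())  # this table can now be analyzed with panel-data estimators
```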

Keywords: wage income, same industry, pseudo panel, panel data econometrics

Procedia PDF Downloads 378
703 Formation of Academia-Industry Collaborative Model to Improve the Quality of Teaching-Learning Process

Authors: M. Dakshayini, P. Jayarekha

Abstract:

In the traditional output-based education system, classroom lectures and laboratories are the usual delivery methods during a course, and written and laboratory examinations have been the conventional tools for evaluating student performance. There are therefore apprehensions that the traditional education system may not efficiently prepare students for a competent professional life. This has led to the change from traditional output-based education to Outcome-Based Education (OBE). OBE first sets the ideal programme learning outcomes, ordered by increasing degree of complexity, that students are expected to master. The core curriculum, teaching methodologies, and assessment tools are then designed to achieve the proposed outcomes, focusing on what students can actually attain after they are taught. In this paper, we discuss a promising application-based learning and evaluation component involving industry collaboration to improve the quality of the teaching and student learning process. Incorporating this component improves the quality of student learning in engineering education, helps students attain the competencies described by the graduate attributes, and may also reduce the industry-academia gap.

Keywords: outcome-based education, programme learning outcome, teaching-learning process, evaluation, industry collaboration

Procedia PDF Downloads 430
702 Research on Optimization Strategies for the Negative Space of Urban Rail Transit Based on Urban Public Art Planning

Authors: Kexin Chen

Abstract:

As an important means of transportation for resolving the contradiction between demand and supply generated by rapid urbanization, urban rail transit systems have developed rapidly in China over the past ten years. During this rapid development, urban rail transit spaces have encountered many problems, such as spatial monotony, dull sensory experience, and poor regional identity. This paper focuses on the negative space of subway stations and on spatial softening, comparing and learning from foreign cases. The article sorts out cases at home and abroad, makes a comparative study of them, analyzes more diversified settings of public art, and sets forth propositions on the types of public art suitable for domestic urban rail transit spaces, showing the relationship between the spatial attributes of urban rail transit space and public art forms. On this foundation, it characterizes more diverse ways of setting public art; suggests three public art forms with corresponding properties, namely the static presenting mode, the dynamic image mode, and the spatial softening mode; and identifies methods by which urban public art can optimize negative space.

Keywords: diversification, negative space, optimization strategy, public art planning

Procedia PDF Downloads 189
701 Global City Typologies: 300 Cities and Over 100 Datasets

Authors: M. Novak, E. Munoz, A. Jana, M. Nelemans

Abstract:

Cities and local governments the world over are interested in employing circular strategies as a means to bring about food security, create employment, and increase resilience. The selection and implementation of circular strategies is facilitated by modeling the effects of strategies locally and by understanding the impacts such strategies have had in other (comparable) cities and how those would translate locally. Urban areas are heterogeneous in their geographic, economic, and social characteristics, governance, and culture. To better understand the effect of circular strategies on urban systems, we create a dataset for over 300 cities around the world designed to facilitate circular strategy scenario modeling. This new dataset integrates data from over 20 prominent global, national, and urban data sources, such as the Global Human Settlements layer and the International Labour Organisation, and incorporates employment data from over 150 cities collected bottom-up from local departments and data providers. The dataset is built to be reproducible. Various clustering techniques are explored in the paper. The result is a set of clusters of cities that can be used for further research and analysis and to support comparative, regional, and national policy-making on circular cities.

Keywords: data integration, urban innovation, cluster analysis, circular economy, city profiles, scenario modelling

Procedia PDF Downloads 166
700 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Active geological exploration and development of the subsoil of the Kaliningrad region's shelf is currently underway. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from discovered deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study, but also to determine, as accurately as possible, the oil quality, its viscosity, density, and fractional composition. For the work considered here, gas chromatography is one of the most productive methods, allowing the rapid generation of a significant amount of initial data. This article examines the application of gas chromatography to determining the chemical characteristics of hydrocarbons from the Kaliningrad shelf fields, together with a correlation-regression analysis of these parameters in comparison with the previously obtained chemical characteristics of hydrocarbon deposits located onshore in the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets were applied, making it possible to evaluate the similarity of the deposits, refine the reserve estimates, and make a number of assumptions about the genesis of the hydrocarbons under analysis.

Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, method of gas chromatography

Procedia PDF Downloads 139
699 Analyzing Extended Reality Technologies for Human Space Exploration

Authors: Morgan Kuligowski, Marientina Gotsis

Abstract:

Extended reality (XR) technologies share an intertwined history with spaceflight and innovation. New advancements in XR technologies offer expanding possibilities to advance the future of human space exploration with increased crew autonomy. This paper seeks to identify implementation gaps between existing and proposed XR space applications to inform future mission planning. A review of virtual reality, augmented reality, and mixed reality technologies implemented aboard the International Space Station revealed a total of 16 flown investigations. A secondary set of ground-tested XR human spaceflight applications was systematically retrieved from literature sources. The two sets of XR technologies, those flown and those in the literature, were analyzed to characterize application domains and device types. Comparisons between these groups revealed untapped application areas for XR to support crew psychological health, in-flight training, and extravehicular operations on future flights. To fill these roles, integrating XR technologies with advancements in biometric sensors and machine learning tools is expected to transform crew capabilities.

Keywords: augmented reality, extended reality, international space station, mixed reality, virtual reality

Procedia PDF Downloads 197
698 Minimum-Fuel Optimal Trajectory for Reusable First-Stage Rocket Landing Using Particle Swarm Optimization

Authors: Kevin Spencer G. Anglim, Zhenyu Zhang, Qingbin Gao

Abstract:

Reusable launch vehicles (RLVs) present a more environmentally-friendly approach to accessing space when compared to traditional launch vehicles that are discarded after each flight. This paper studies the recyclable nature of RLVs by presenting a solution method for determining minimum-fuel optimal trajectories using principles from optimal control theory and particle swarm optimization (PSO). This problem is formulated as a minimum-landing error powered descent problem where it is desired to move the RLV from a fixed set of initial conditions to three different sets of terminal conditions. However, unlike other powered descent studies, this paper considers the highly nonlinear effects caused by atmospheric drag, which are often ignored for studies on the Moon or on Mars. Rather than optimizing the controls directly, the throttle control is assumed to be bang-off-bang with a predetermined thrust direction for each phase of flight. The PSO method is verified in a one-dimensional comparison study, and it is then applied to the two-dimensional cases, the results of which are illustrated.

Keywords: minimum-fuel optimal trajectory, particle swarm optimization, reusable rocket, SpaceX

Procedia PDF Downloads 258
697 Heuristic Classification of Hydrophone Recordings

Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas

Abstract:

An unsupervised machine listening system is constructed and applied to a dataset of 17,195 thirty-second marine hydrophone recordings, then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each audio file: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way, both time- and frequency-domain information is contained in the features passed to the clustering algorithm. Classification is performed using the k-means algorithm followed by a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. The hypothesized class labels are 'primarily anthrophony' and 'primarily biophony'; the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
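A compressed sketch of that feature-and-clustering pipeline under stated assumptions: the file names are placeholders for the full recording set, and the per-band RMS is approximated with a crude STFT band split rather than a true 10-octave filter bank.

```python
# Feature extraction + k-means clustering + k-NN labeling for hydrophone clips.
import numpy as np
import librosa
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def features(path):
    y, sr = librosa.load(path, sr=None, duration=30.0)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    # crude stand-in for the 10-octave filter-bank RMS values
    bands = np.array_split(np.abs(librosa.stft(y)), 10)
    rms = np.array([b.mean() for b in bands])
    return np.hstack([[centroid], rms, mfcc])

paths = ["rec_0001.wav", "rec_0002.wav"]   # placeholders for the 17,195 recordings
X = np.vstack([features(p) for p in paths])

clusters = KMeans(n_clusters=2, n_init=10).fit_predict(X)   # hypothesized classes
knn = KNeighborsClassifier(n_neighbors=1).fit(X, clusters)  # labels later recordings
```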

Keywords: anthrophony, hydrophone, k-means, machine learning

Procedia PDF Downloads 148
696 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network

Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang

Abstract:

As a branch of artificial neural networks, deep learning is widely used in image recognition, but a shortage of training data leads to imperfect model learning. Analyzing the data-scale requirements of deep learning with a view to GUI generation, we find that collecting a GUI dataset is a time-consuming and labor-intensive project that struggles to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on an original small-scale dataset to produce a large number of reliable data sets. By combining a recurrent neural network with a generative adversarial network, the recurrent network can learn the sequential relationships and characteristics of the data, enabling the adversarial network to generate reasonable data and thereby expand the Rico dataset. Relying on this network structure, the characteristics of the collected data can be analyzed well, and a large amount of reasonable data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.

Keywords: GUI, deep learning, GAN, data augmentation

Procedia PDF Downloads 164
694 Energy Efficient Clustering with Adaptive Particle Swarm Optimization

Authors: Kumar Shashvat, Arshpreet Kaur, Rajesh Kumar, Raman Chadha

Abstract:

Wireless sensor networks (WSNs) have the principal characteristic of restricted energy, with the limitation that the energy of the nodes cannot be replenished. To increase the network lifetime under these constraints, routes for data transmission are chosen such that the energy used along the selected route is minimal. Such an energy-efficient network needs a sound infrastructure, because the infrastructure affects the network lifespan. Clustering is a technique in which nodes are grouped into disjoint, non-overlapping sets, with data collected at a cluster head. In this paper, an Adaptive-PSO algorithm is proposed that forms energy-aware clusters by minimizing the cost of locating the cluster head. The main concern is the suitability of the swarms, addressed by adjusting the learning parameters of PSO. Particle swarm optimization converges quickly at the beginning of the search, but over time it becomes stable and may be trapped in local optima. In the suggested network model, the swarms are given the intelligence of spiders, which makes them capable of avoiding early convergence and helps them escape from local optima. Comparative analysis with traditional PSO shows that the new algorithm considerably enhances performance when multi-dimensional functions are taken into consideration.
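A hedged sketch of PSO-based cluster-head placement, with all parameters invented: particles encode K head coordinates, and the fitness is a simple energy proxy summing node-to-nearest-head and head-to-sink distances. The paper's spider-inspired adaptivity is only approximated here by a linearly decaying inertia weight.

```python
# Basic PSO for energy-aware cluster-head placement (toy energy proxy).
import numpy as np

rng = np.random.default_rng(6)
nodes = rng.uniform(0, 100, (60, 2))            # sensor field
sink = np.array([50.0, 120.0])
K, P, iters = 4, 20, 100                         # heads, particles, iterations

def fitness(flat):
    heads = flat.reshape(K, 2)
    d_nodes = np.linalg.norm(nodes[:, None] - heads[None], axis=2).min(axis=1).sum()
    d_sink = np.linalg.norm(heads - sink, axis=1).sum()
    return d_nodes + d_sink                      # proxy for transmission energy

pos = rng.uniform(0, 100, (P, K * 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
g = pbest[pbest_f.argmin()].copy()

for t in range(iters):
    w = 0.9 - 0.5 * t / iters                    # adaptive (decaying) inertia weight
    r1, r2 = rng.random((2, P, K * 2))
    vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (g - pos)
    pos = np.clip(pos + vel, 0, 100)
    f = np.array([fitness(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    g = pbest[pbest_f.argmin()].copy()

print(f"best energy proxy: {float(fitness(g)):.1f}")
```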

Keywords: particle swarm optimization, adaptive PSO, comparison between PSO and A-PSO, energy-efficient clustering

Procedia PDF Downloads 226
693 Investigating Safe Operation Condition for Iterative Learning Control under Load Disturbances Effect in Singular Values

Authors: Muhammad A. Alsubaie

Abstract:

An iterative learning control (ILC) framework designed in a state feedback structure can lack a proper treatment of load disturbances. The presented work revisits a previously designed controller, highlights the disturbance problem, and derives new conditions, using the singular value principle, that assure safe operation with error convergence and reference tracking under load disturbances. It is known that periodic disturbances can be represented by a delay model in a positive feedback loop acting on the system input. This model can be manipulated by isolating the delay model and designing a controller for the overall system around it, so that the periodic disturbances are remedied via the small gain theorem. This overall system is the basis for the control design and the load disturbance investigation. The major finding of this work is a load disturbance condition that clearly sets out safe operation under load disturbances, such that the error tends to nearly zero as the system keeps operating trial after trial.
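For orientation, the textbook lifted-system form of these relations is sketched below; the paper's exact notation and conditions may differ.

```latex
% Lifted-system ILC sketch (standard form, stated as an assumption about the
% setup): plant matrix G, learning gain L, update u_{k+1} = u_k + L e_k,
% trial-k load disturbance d_k entering at the output. The error recursion is
e_{k+1} = (I - GL)\,e_k + (d_k - d_{k+1}),
% so for a trial-invariant load disturbance (d_k = d) the disturbance term
% cancels, and monotonic convergence of \|e_k\| is assured whenever the
% largest singular value satisfies
\bar{\sigma}(I - GL) < 1.
```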

Keywords: iterative learning control, singular values, state feedback, load disturbance

Procedia PDF Downloads 147
692 Left to Right-Right Most Parsing Algorithm with Lookahead

Authors: Jamil Ahmed

Abstract:

The 'Left to Right-Right Most' (LR) parsing algorithm is a widely used algorithm for syntax analysis. It relies on a parsing table, which is extracted from the grammar and specifies the actions to be taken during parsing. The algorithm requires that the parsing table hold no action conflicts for the same input symbol, a requirement that restricts the class of grammars over which LR algorithms work. However, there are grammars whose parsing tables do hold action conflicts; in such cases, the algorithm needs the capability of scanning (looking ahead at) input symbols beyond the current one. In this paper, an LR parsing algorithm with lookahead capability is introduced; this lookahead capability is the major contribution of the paper. The practicality of the proposed algorithm is substantiated by a parser implementation for the context-free grammar (CFG) of an already proposed programming language, 'State Controlled Object Oriented Programming' (SCOOP). SCOOP's grammar has 125 productions and 192 item sets, and the proposed algorithm parses SCOOP even though the grammar requires looking ahead at input symbols due to action conflicts in its parsing table. The proposed algorithm can be viewed as an optimization of the 'Simple LR' (SLR) parsing algorithm.
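To illustrate how an LR ACTION/GOTO table consumes lookahead tokens, here is a toy table-driven shift-reduce driver for the grammar S -> S + n | n (not SCOOP's grammar; the table was built by hand from the grammar's item sets).

```python
# Toy LR shift-reduce driver. Table entries: ('s', state) shift,
# ('r', lhs, rhs_len) reduce, ('acc',) accept. '$' marks end of input.
ACTION = {
    (0, 'n'): ('s', 1),
    (1, '+'): ('r', 'S', 1), (1, '$'): ('r', 'S', 1),
    (2, '+'): ('s', 3),      (2, '$'): ('acc',),
    (3, 'n'): ('s', 4),
    (4, '+'): ('r', 'S', 3), (4, '$'): ('r', 'S', 3),
}
GOTO = {(0, 'S'): 2}

def parse(tokens):
    stack, i = [0], 0
    while True:
        act = ACTION.get((stack[-1], tokens[i]))  # indexed by the lookahead token
        if act is None:
            return False                          # syntax error
        if act[0] == 'acc':
            return True
        if act[0] == 's':                         # shift: consume the lookahead
            stack.append(act[1])
            i += 1
        else:                                     # reduce: pop |rhs| states, push GOTO
            _, lhs, rhs_len = act
            del stack[len(stack) - rhs_len:]
            stack.append(GOTO[(stack[-1], lhs)])

print(parse(['n', '+', 'n', '+', 'n', '$']))  # True
print(parse(['n', '+', '+', '$']))            # False
```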

Keywords: left to right-right most parsing, syntax analysis, bottom-up parsing algorithm

Procedia PDF Downloads 98
691 Effect of Threshold Corrections on Proton Lifetime and Emergence of Topological Defects in Grand Unified Theories

Authors: Rinku Maji, Joydeep Chakrabortty, Stephen F. King

Abstract:

Grand unified theories (GUTs) rationalize the arbitrariness of the standard model (SM) and explain many enigmas of nature within a single gauge group. GUTs predict proton decay, and the spontaneous symmetry breaking (SSB) of the higher symmetry group may lead to the formation of topological defects, which are indispensable in the context of cosmological observations. The Super-Kamiokande (Super-K) experiment sets stringent bounds on the partial lifetime (τ) of the proton for different decay channels, e.g., τ(p → e⁺π⁰) > 1.6×10³⁴ years, the most relevant channel for testing the viability of non-supersymmetric GUTs. GUTs based on the gauge groups SO(10) and E(6) are broken to the SM spontaneously through one or two intermediate gauge symmetries, with left-right symmetry manifest at least at one intermediate stage, and the proton lifetime has been computed for these breaking chains. Threshold corrections, which arise from integrating out the heavy fields at each breaking scale, alter the running of the gauge couplings and are found to push many GUTs outside the Super-K bound. The possible topological defects arising in the course of SSB at the different breaking scales have been studied for all breaking chains.
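For context, the standard one-loop relations behind this analysis are sketched below; the coefficients b_i and λ_i depend on the specific breaking chain and convention, which the abstract does not spell out.

```latex
% One-loop running between scales \mu_1 and \mu_2, and threshold matching at a
% breaking scale M (standard convention; the paper's coefficients may differ):
\alpha_i^{-1}(\mu_2) = \alpha_i^{-1}(\mu_1) - \frac{b_i}{2\pi}\ln\frac{\mu_2}{\mu_1},
\qquad
\alpha_i^{-1}(M^-) = \alpha_i^{-1}(M^+) - \frac{\lambda_i}{12\pi},
% where \lambda_i collects the contributions of the heavy fields integrated out
% at M. The proton partial lifetime scales roughly as
\tau(p \to e^+ \pi^0) \sim \frac{M_X^4}{\alpha_G^2\, m_p^5},
% so threshold-induced shifts of the unification scale M_X move \tau by the
% fourth power, which is why thresholds can push a model across the Super-K bound.
```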

Keywords: grand unified theories, proton decay, threshold correction, topological defects

Procedia PDF Downloads 152
690 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement

Authors: Rhadinia Tayag-Relanes, Felina C. Young

Abstract:

This study analyzes the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the PDCA approach and record review to gather data for August to October of calendar year 2019 on the noodle products miki, canton, and misua. Causal-comparative research was used to establish cause-effect relationships among the variables, and descriptive statistics and correlation were used to analyze the gathered data. The study found that miki, canton, and misua production have different cycle times for each production run, different production outputs in every set of the production process, and different amounts of wastage. The company has not yet established an allowable rejection/wastage rate; this paper therefore used a 1% wastage limit. The researcher recommends the following: the machines used in each step of noodle production should be consistently maintained and monitored; all production operators should be assessed by checking their performance statistically against output and machine performance; root cause analyses should be conducted to find solutions; and the recording system for the inputs and outputs of the noodle production process should be improved to eliminate poor recording of data.

Keywords: production, continuous improvement, process, operations, PDCA

Procedia PDF Downloads 41
689 Damage Identification Using Experimental Modal Analysis

Authors: Niladri Sekhar Barma, Satish Dhandole

Abstract:

Damage identification, in the context of safety, has become a fundamental research interest in the fields of mechanical, civil, and aerospace engineering structures. The present research aims to identify damage in a mechanical beam structure, quantify the severity or extent of the damage in terms of loss of stiffness, and obtain an updated analytical finite element (FE) model. An FE model is used for the analysis, and the location of damage for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data were acquired with an accelerometer; a Fast Fourier Transform (FFT) algorithm was applied to the measured signal, and post-processing was done in MEscopeVes software. The two sets of data, from the numerical FE model and the experimental results, are compared to locate the damage accurately. The extent of the damage is identified via modal frequencies using a mixed numerical-experimental technique, and mode shape comparison is performed with the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to real-life structures such as plate and GARTEUR structures.
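The MAC used for the mode shape comparison is a simple normalized correlation between two mode shape vectors; a minimal helper on toy data follows (the vectors here are invented, and complex mode shapes would additionally require conjugation).

```python
# Modal Assurance Criterion: MAC near 1 means the analytical and experimental
# mode shapes are consistent; values well below 1 flag a mismatch.
import numpy as np

def mac(phi_a, phi_e):
    """MAC = |phi_a^T phi_e|^2 / ((phi_a^T phi_a)(phi_e^T phi_e))."""
    num = np.abs(phi_a @ phi_e) ** 2
    return num / ((phi_a @ phi_a) * (phi_e @ phi_e))

phi_fe = np.array([0.31, 0.59, 0.81, 0.95])    # analytical (FE) mode shape (toy)
phi_exp = np.array([0.30, 0.61, 0.79, 0.97])   # measured mode shape (toy)
print(round(float(mac(phi_fe, phi_exp)), 4))   # ~1.0: well-correlated modes
```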

Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification

Procedia PDF Downloads 91
688 Theoretical Study of Acetylation of P-Methylaniline Catalyzed by Cu²⁺ Ions

Authors: Silvana Caglieri

Abstract:

A theoretical study of the acetylation of p-methylaniline catalyzed by Cu²⁺ ions was carried out through analysis of the reaction intermediate. The acetylation of amines is of great interest because of the utility of its products; it is one of the most frequently used transformations in organic synthesis, providing an efficient and inexpensive means of protecting amino groups in multistep syntheses. Acetylation of an amine is a nucleophilic substitution reaction, which can be catalyzed by a Lewis acid such as a metal ion. In the reaction mechanism, the metal ion forms a complex with the carbonyl oxygen of acetic anhydride, facilitating its polarization and the subsequent addition of the amine at the carbonyl carbon to form a tetrahedral intermediate, which is the rate-determining step of the reaction. Experimental work agrees that this reaction takes place with the formation of a tetrahedral intermediate. In the present theoretical work, the structure and energy of the tetrahedral intermediate of the Cu²⁺-catalyzed reaction were investigated. Geometries of all species involved in the acetylation were built and identified. All geometry optimizations were performed at the DFT/B3LYP level of theory and with the MP2 method, adopting the 6-31+G* basis set. Energies were calculated using the Mechanics-UFF method. Following the same procedure, the geometric parameters and energy of the reaction intermediate were identified. The calculations give an energy of 61.35 kcal/mol for the tetrahedral intermediate, and the activation energy for the reaction is 15.55 kcal/mol.

Keywords: amides, amines, DFT, MP2

Procedia PDF Downloads 259