Search results for: rough sets
739 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network
Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
We consider a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. While models learned from this data to predict future traffic speed would be beneficial for applications such as car navigation systems, building a predictive model for every link becomes a nontrivial job if the number of links in a given network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building; on the other hand, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale behind this is that it may suffice to look at only the recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighboring links up- and down-stream. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
Keywords: big data, k-NN, machine learning, traffic speed prediction
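The core idea of restricting the k-NN search to a recent window of historical records can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the feature layout, window length, and use of scikit-learn's KNeighborsRegressor are assumptions for illustration.

```python
# Illustrative sketch: k-NN speed prediction restricted to a recent window of
# historical records. Feature layout and window size are assumptions, not the paper's.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def predict_speed(history, query, k=5, recent_rows=2000):
    """history: array of shape (n, d+1); the first d columns are features
    (current/past speeds of the target link and its up/down-stream neighbors)
    and the last column is the future speed to predict."""
    recent = history[-recent_rows:]              # search only the most recent data
    X, y = recent[:, :-1], recent[:, -1]
    model = KNeighborsRegressor(n_neighbors=k)   # no explicit model building beyond indexing
    model.fit(X, y)
    return model.predict(query.reshape(1, -1))[0]

# Toy example: 10,000 historical rows with 6 features each
rng = np.random.default_rng(0)
hist = rng.uniform(10, 80, size=(10_000, 7))
print(predict_speed(hist, hist[-1, :-1]))
```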
Procedia PDF Downloads 363
738 Enhancing Knowledge Graph Convolutional Networks with Structural Adaptive Receptive Fields for Improved Node Representation and Information Aggregation
Authors: Zheng Zhihao
Abstract:
Recently, the Knowledge Graph Convolutional Network (KGCN) has developed powerful capabilities in knowledge representation and reasoning tasks. However, traditional KGCNs often use a fixed weighting mechanism when aggregating information, failing to make full use of rich structural information; this limits the expressiveness of node representations and easily causes over-smoothing problems. To address these challenges, the paper proposes a new graph neural network model called KGCN-STAR (Knowledge Graph Convolutional Network with Structural Adaptive Receptive Fields). This model dynamically adjusts each node's receptive field range by introducing structural adaptive receptive fields, and a subgraph aggregator is designed to capture local structural information more effectively. Experimental results show that KGCN-STAR yields significant performance improvements on multiple knowledge graph data sets, especially in the task of representation learning for complex structures.
Keywords: knowledge graph, graph neural networks, structural adaptive receptive fields, information aggregation
Procedia PDF Downloads 33
737 The Principles of Democracy and Development: The Political and Philosophical Foundations of Development-Democracy in Africa
Authors: Fadeke Olu-Owolabi, Fayomi Oluyemi
Abstract:
The political and societal orders face the awesome task of overcoming the difficulties which lead to growing tensions and conflicts in Africa. At the core of the analysis is the question of how stable and adaptable established democracies, new democracies, and political and societal actors are. The idea of development-democracy, implying a strong linkage between economic development and political democracy, appropriately describes the distinguishing characteristic of this new demand for democracy in Africa. This theoretical study examines the political and philosophical foundation of the idea of development-democracy and the arguments presented to support the need for its adoption in Africa today. The paper critically examines the polemic between the advocates of developmental dictatorship and developmental democracy and argues for the adoption of the latter in Africa. It sets out to expose the political and philosophical foundation of developmental democracy, maintaining that only democracy can facilitate development. This argument is supported further by the claim that democracy and development are two sides of the same coin in the sense that both are ethical concepts. The paper also maintains that democracy is only worthwhile when it is developmental. Finally, the paper affirms that since the concepts of democracy and development are like Siamese twins, the way out of Africa's present crisis of development is to wholeheartedly embrace democracy. It posits that when genuine democracy is adopted, genuine and sustainable development can then be attained.
Keywords: democracy, development, polemic, principles
Procedia PDF Downloads 528
736 The Principles of Democracy and Development: The Political and Philosophical Foundations of Development-Democracy in Africa
Authors: Fadeke E. Olu-Owolabi, Fayomi Oluyemi
Abstract:
The political and societal orders face the awesome task of overcoming the difficulties which lead to growing tensions and conflicts in Africa. At the core of the analysis is the question of how stable and adaptable established democracies, new democracies, and political and societal actors are. The idea of development-democracy, implying a strong linkage between economic development and political democracy, appropriately describes the distinguishing characteristic of this new demand for democracy in Africa. This theoretical study examines the political and philosophical foundation of the idea of development-democracy and the arguments presented to support the need for its adoption in Africa today. The paper critically examines the polemic between the advocates of developmental dictatorship and developmental democracy and argues for the adoption of the latter in Africa. It sets out to expose the political and philosophical foundation of developmental democracy, maintaining that only democracy can facilitate development. This argument is supported further by the claim that democracy and development are two sides of the same coin in the sense that both are ethical concepts. The paper also maintains that democracy is only worthwhile when it is developmental. Finally, the paper affirms that since the concepts of democracy and development are like Siamese twins, the way out of Africa's present crisis of development is to wholeheartedly embrace democracy. It posits that when genuine democracy is adopted, genuine and sustainable development can then be attained.
Keywords: democracy, development, polemic, principles
Procedia PDF Downloads 435
735 A General Framework for Knowledge Discovery from Echocardiographic and Natural Images
Authors: S. Nandagopalan, N. Pradeep
Abstract:
The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as the active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, the Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.
Keywords: active contour, Bayesian, echocardiographic image, feature vector
Procedia PDF Downloads 445
734 Investment Projects Selection Problem under Hesitant Fuzzy Environment
Authors: Irina Khutsishvili
Abstract:
In the present research, a decision support methodology for the multi-attribute group decision-making (MAGDM) problem is developed, namely for the selection of investment projects. The objective of the investment project selection problem is to choose the best project among the set of projects seeking investment, or to rank all projects in descending order. The project selection is made considering a set of weighted attributes. To evaluate the attributes in our approach, expert assessments are used. In the proposed methodology, lingual expressions (linguistic terms) given by all experts are used as initial attribute evaluations, since they are the most natural and convenient representation of experts' evaluations. Then lingual evaluations are converted into trapezoidal fuzzy numbers, and the aggregate trapezoidal hesitant fuzzy decision matrix is built. The case is considered when information on the attribute weights is completely unknown. The attribute weights are identified based on the De Luca and Termini information entropy concept, determined in the context of hesitant fuzzy sets. The decisions are made using the extended Technique for Order Performance by Similarity to Ideal Solution (TOPSIS) method under a hesitant fuzzy environment. Hence, the methodology is based on a trapezoidal valued hesitant fuzzy TOPSIS decision-making model with entropy weights. The ranking of alternatives is performed by the proximity of their distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS). For this purpose, the weighted hesitant Hamming distance is used. An example of investment decision-making is shown that clearly explains the procedure of the proposed methodology.
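For readers unfamiliar with entropy-weighted TOPSIS, the following sketch shows the crisp version of the procedure only: the paper's trapezoidal hesitant fuzzy numbers and weighted hesitant Hamming distance are deliberately simplified to crisp scores and Euclidean distances, and the decision matrix is made up for illustration.

```python
# Illustrative crisp TOPSIS with entropy weights (benefit-type attributes assumed);
# a simplification of the hesitant fuzzy model described above, not its implementation.
import numpy as np

D = np.array([[7., 5., 8.],      # rows: projects, columns: attributes (assumed scores)
              [6., 9., 4.],
              [8., 6., 7.]])

P = D / D.sum(axis=0)                               # column-wise proportions
entropy = -(P * np.log(P)).sum(axis=0) / np.log(len(D))
weights = (1 - entropy) / (1 - entropy).sum()       # De Luca-Termini style entropy weights

V = weights * (D / np.linalg.norm(D, axis=0))       # weighted normalized matrix
fpis, fnis = V.max(axis=0), V.min(axis=0)           # positive/negative ideal solutions
d_pos = np.linalg.norm(V - fpis, axis=1)
d_neg = np.linalg.norm(V - fnis, axis=1)
closeness = d_neg / (d_pos + d_neg)                 # rank projects by descending closeness
print(np.argsort(-closeness))
```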
Procedia PDF Downloads 117
733 Influence of Season, Temperature, and Photoperiod on Growth of the Land Snail Helix aperta
Authors: S. Benbellil-Tafoughalt, J. M. Koene
Abstract:
Growth strategies are often plastic and influenced by environmental conditions. Terrestrial gastropods are particularly affected by seasonal and climatic variables, and growth rate and size at maturity are key traits in their life history. Therefore, we investigated juvenile growth of Helix aperta snails under four combinations of temperature and photoperiod, using two sets of young snails born in the laboratory from adults collected in either the autumn (aestivating snails) or spring (active snails). Parental snails were collected from Bakaro (Northeastern Algeria). Higher temperature increased adult size and reduced time to reproduction. A long-day photoperiod also increased the final body weight but had no effect on the length of the growth period. The season of birth had significant effects on the length of the growth period and the weight of hatchlings, whereas this weight difference disappeared by adulthood. The spring snails took less time to develop and reached a similar adult body weight to the autumn snails. These differences may be due to differences in egg size or quality between the snails from different seasons. More rapid growth in spring snails results in larger snails entering aestivation, a period with size-related mortality in this species.
Keywords: growth, Helix aperta, photoperiod, temperature
Procedia PDF Downloads 336
732 Tracking Filtering Algorithm Based on ConvLSTM
Authors: Ailing Yang, Penghan Song, Aihua Cai
Abstract:
The nonlinear maneuvering target tracking problem is mainly a state estimation problem when the target motion model is uncertain. Traditional solutions include Kalman filtering based on the Bayesian filtering framework and extended Kalman filtering. However, these methods need prior knowledge such as the kinematics model and state system distribution, and their performance is poor in state estimation of non-prior complex dynamic systems. Therefore, in view of the problems existing in traditional algorithms, a convolution LSTM target state estimation (SAConvLSTM-SE) algorithm based on Self-Attention memory (SAM) is proposed to learn the historical motion state of the target and the error distribution information measured at the current time. The measured track point data of airborne radar are processed into data sets. After supervised training, the data-driven deep neural network based on SAConvLSTM can directly obtain the target state at the next moment. Through experiments on two different maneuvering targets, we find that the network has stronger robustness and better tracking accuracy than the existing tracking methods.
Keywords: maneuvering target, state estimation, Kalman filter, LSTM, self-attention
Procedia PDF Downloads 176
731 Flood-Induced River Disruption: Geomorphic Imprints and Topographic Effects in Kelantan River Catchment from Kemubu to Kuala Besar, Kelantan, Malaysia
Authors: Mohamad Muqtada Ali Khan, Nor Ashikin Shaari, Donny Adriansyah bin Nazaruddin, Hafzan Eva Bt Mansoor
Abstract:
Floods play a key role in the landform evolution of an area. This process is likely to alter the topography of the earth's surface. The present study area, Kota Bharu, which is very prone to floods, extends from the upstream of the Kelantan River near Kemubu to the downstream area near Kuala Besar. These flood events, which occur every year in the study area, have a strong bearing on the river's morphological set-up. In the present study, three satellite images from different time periods have been used to reveal the post-flood landform changes. Pre-processing of the images, such as subsetting, geometric corrections, and atmospheric corrections, was carried out using ENVI 4.5, followed by the analysis processes. Twenty sets of cross sections were plotted for all three images using ERDAS 9.2 and ArcGIS 10. The results show a significant change in the length of the cross sections, which suggests that geomorphological processes play a key role in carving and shaping the river banks during floods.
Keywords: flood induced, geomorphic imprints, Kelantan river, Malaysia
Procedia PDF Downloads 545
730 Supervised/Unsupervised Mahalanobis Algorithm for Improving Performance for Cyberattack Detection over Communications Networks
Authors: Radhika Ranjan Roy
Abstract:
Deployment of machine learning (ML)/deep learning (DL) algorithms for cyberattack detection in operational communications networks (wireless and/or wire-line) is being delayed because of low performance parameters (e.g., recall, precision, and f₁-score). If datasets become imbalanced, which is the usual case for communications networks, the performance tends to become worse. Complexities in reducing the dimensions of the feature sets to increase performance are also a huge problem. Mahalanobis algorithms have been widely applied in scientific research because Mahalanobis distance metric learning is a successful framework. In this paper, we have investigated the Mahalanobis binary classifier algorithm for increasing cyberattack detection performance over communications networks as a proof of concept. We have also found that the high-dimensional information in intermediate features, which is not utilized as much for classification tasks in ML/DL algorithms, is the main contributor to the improved performance of the Mahalanobis method, even for imbalanced and sparse datasets. With no feature reduction, MD offers uniform results for precision, recall, and f₁-score for unbalanced and sparse NSL-KDD datasets.
Keywords: Mahalanobis distance, machine learning, deep learning, NSL-KDD, local intrinsic dimensionality, chi-square, positive semi-definite, area under the curve
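A minimal sketch of a Mahalanobis-distance binary classifier of the kind described (fit on benign traffic, flag samples whose distance exceeds a threshold). The synthetic data and the 99th-percentile threshold are assumptions for illustration, not the paper's NSL-KDD setup.

```python
# Illustrative Mahalanobis-distance classifier: score each sample by its distance to
# the benign-traffic distribution and flag large distances as attacks.
import numpy as np

def fit_mahalanobis(X_benign, eps=1e-6):
    mu = X_benign.mean(axis=0)
    cov = np.cov(X_benign, rowvar=False) + eps * np.eye(X_benign.shape[1])  # keep positive definite
    return mu, np.linalg.inv(cov)

def mahalanobis_distance(X, mu, cov_inv):
    diff = X - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))  # per-row quadratic form

rng = np.random.default_rng(1)
benign = rng.normal(0, 1, size=(500, 10))
attack = rng.normal(3, 1, size=(50, 10))        # imbalanced toy data
mu, cov_inv = fit_mahalanobis(benign)
threshold = np.percentile(mahalanobis_distance(benign, mu, cov_inv), 99)
pred_attack = mahalanobis_distance(attack, mu, cov_inv) > threshold
print(pred_attack.mean())                        # fraction of attack samples detected
```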
Procedia PDF Downloads 78
729 Wage Differentiation Patterns of Households Revisited for Turkey in Same Industry Employment: A Pseudo-Panel Approach
Authors: Yasin Kutuk, Bengi Yanik Ilhan
Abstract:
Previous studies investigate wage differentiations among regions in Turkey between couples who work in the same industry and those who work in different industries by using models that are appropriate for cross-sectional data. However, since no panel data are available for this investigation in Turkey, pseudo panels using the repeated cross-section data sets of the Household Labor Force Surveys 2004-2014 are employed in order to open a new way to examine wage differentiation patterns. For this purpose, household heads are separated into groups with respect to their household composition, age group, education, gender, and NUTS1 level (12 regions), and group membership is assumed to be fixed over time. The average behavior of these groups can then be tracked over time, just as in panel data. Estimates using the pseudo panel data would be consistent with estimates using genuine panel data on individuals if the samples are representative of a population with fixed composition and characteristics. Controlling for socioeconomic factors, wage differentiation of household income is shown to be affected by the social, cultural, and economic changes that followed the global economic crisis that emerged in the US. It is also revealed whether wage differentiation is changing among the birth cohorts.
Keywords: wage income, same industry, pseudo panel, panel data econometrics
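A minimal sketch of the pseudo-panel construction step in pandas: repeated cross-sections are collapsed into cohort cells whose means are tracked across survey years. The column names, cohort width, and toy data are assumptions for illustration, not the Household Labor Force Survey layout.

```python
# Illustrative pseudo-panel construction: collapse repeated cross-sections into
# cohort cells (birth-year group x education x gender x region) tracked over survey years.
import pandas as pd

def build_pseudo_panel(df):
    df = df.copy()
    df["birth_cohort"] = (df["birth_year"] // 5) * 5          # 5-year birth cohorts
    keys = ["birth_cohort", "education", "gender", "region", "survey_year"]
    return (df.groupby(keys, as_index=False)
              .agg(mean_wage=("wage", "mean"), n_obs=("wage", "size")))

toy = pd.DataFrame({"birth_year": [1970, 1971, 1980], "education": ["hs", "hs", "uni"],
                    "gender": ["m", "f", "m"], "region": ["TR1", "TR1", "TR2"],
                    "survey_year": [2004, 2004, 2005], "wage": [100., 120., 150.]})
print(build_pseudo_panel(toy))   # cohort-level means, treated like panel observations
```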
Procedia PDF Downloads 397
728 Formation of Academia-Industry Collaborative Model to Improve the Quality of Teaching-Learning Process
Authors: M. Dakshayini, P. Jayarekha
Abstract:
In the traditional output-based education system, classroom lectures and laboratory sessions are the delivery methods used during a course. Written examinations and lab examinations have been used as conventional tools for evaluating students' performance. Hence, there are certain apprehensions that the traditional education system may not efficiently prepare students for a competent professional life. This has led to the change from traditional output-based education to Outcome-Based Education (OBE). OBE first sets the ideal programme learning outcomes, ordered by increasing degree of complexity, that students are expected to master. The core curriculum, teaching methodologies, and assessment tools are then designed to achieve the proposed outcomes, mainly focusing on what students can actually attain after they are taught. In this paper, we discuss a promising application-based learning and evaluation component involving industry collaboration to improve the quality of teaching and the student learning process. Incorporation of this component improves the quality of student learning in engineering education and helps the student attain competency as per the graduate attributes. This may also reduce the industry-academia gap.
Keywords: outcome-based education, programme learning outcome, teaching-learning process, evaluation, industry collaboration
Procedia PDF Downloads 449
727 Research on Optimization Strategies for the Negative Space of Urban Rail Transit Based on Urban Public Art Planning
Authors: Kexin Chen
Abstract:
As an important mode of transportation for resolving the demand-supply contradiction generated by rapid urbanization, the urban rail transit system has developed rapidly in China over the past ten years. During this rapid development, urban rail transit space has encountered many problems, such as spatial simplification, dull sensory experience, and poor regional identity. This paper focuses on the negative space of subway stations and on spatial softening, comparing and learning from foreign cases. The article sorts out cases at home and abroad, makes a comparative study of them, analyzes more diversified settings of public art, and sets forth propositions on domestic types of public art in urban rail transit space for reference; it then shows the relationship between the spatial attributes of urban rail transit space and public art forms. On this foundation, it aims to characterize more diverse ways of setting public art; it then suggests three public art forms with corresponding properties, namely the static presentation mode, the dynamic image mode, and the spatial softening mode, and identifies how urban public art can optimize negative space.
Keywords: diversification, negative space, optimization strategy, public art planning
Procedia PDF Downloads 207
726 Global City Typologies: 300 Cities and Over 100 Datasets
Authors: M. Novak, E. Munoz, A. Jana, M. Nelemans
Abstract:
Cities and local governments the world over are interested in employing circular strategies as a means to bring about food security, create employment, and increase resilience. The selection and implementation of circular strategies are facilitated by modeling the effects of strategies locally and by understanding the impacts such strategies have had in other (comparable) cities and how those would translate locally. Urban areas are heterogeneous because of their geographic, economic, and social characteristics, governance, and culture. In order to better understand the effect of circular strategies on urban systems, we create a dataset for over 300 cities around the world designed to facilitate circular strategy scenario modeling. This new dataset integrates data from over 20 prominent global, national, and urban data sources, such as the Global Human Settlements layer and the International Labour Organisation, as well as incorporating employment data from over 150 cities collected bottom-up from local departments and data providers. The dataset is made to be reproducible. Various clustering techniques are explored in the paper. The result is sets of clusters of cities, which can be used for further research and analysis and to support comparative, regional, and national policy making on circular cities.
Keywords: data integration, urban innovation, cluster analysis, circular economy, city profiles, scenario modelling
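A minimal sketch of the clustering step only, not the paper's pipeline: city indicators are standardized and grouped with k-means. The indicator names, toy values, and choice of k are assumptions for illustration.

```python
# Illustrative sketch: standardize city indicators and group cities with k-means.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def cluster_cities(df, k=6):
    features = df.drop(columns=["city"])
    X = StandardScaler().fit_transform(features)     # indicators on very different scales
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    return df.assign(cluster=labels)

toy = pd.DataFrame({"city": ["A", "B", "C", "D", "E", "F"],
                    "population": [1.2e6, 8.0e5, 3.5e6, 2.1e5, 5.6e6, 9.3e5],
                    "employment_share_services": [0.62, 0.55, 0.71, 0.48, 0.69, 0.58],
                    "gdp_per_capita": [21000, 15000, 34000, 9000, 41000, 18000]})
print(cluster_cities(toy, k=2))   # city profiles grouped into comparable typologies
```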
Procedia PDF Downloads 180
725 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea
Authors: Pavel Shcherban, Vlad Golovanov
Abstract:
Currently, there is active geological exploration and development of the shelf subsoil of the Kaliningrad region. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from open deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study, but also to determine the oil quality, its viscosity, density, and fractional composition as accurately as possible. For the work considered here, gas chromatography is one of the most productive methods, allowing a significant amount of initial data to be generated rapidly. The article examines aspects of the application of the gas chromatography method for determining the chemical characteristics of the hydrocarbons of the Kaliningrad shelf fields, as well as the correlation-regression analysis of these parameters in comparison with the previously obtained chemical characteristics of hydrocarbon deposits located onshore in the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets have been applied, which makes it possible to evaluate the identity of the deposits, to specify the amount of reserves, and to make a number of assumptions about the genesis of the hydrocarbons under analysis.
Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, method of gas chromatography
Procedia PDF Downloads 157
724 Analyzing Extended Reality Technologies for Human Space Exploration
Authors: Morgan Kuligowski, Marientina Gotsis
Abstract:
Extended reality (XR) technologies share an intertwined history with spaceflight and innovation. New advancements in XR technologies offer expanding possibilities to advance the future of human space exploration with increased crew autonomy. This paper seeks to identify implementation gaps between existing and proposed XR space applications to inform future mission planning. A review of virtual reality, augmented reality, and mixed reality technologies implemented aboard the International Space Station revealed a total of 16 flown investigations. A secondary set of ground-tested XR human spaceflight applications was systematically retrieved from literature sources. The two sets of XR technologies, those flown and those existing in the literature, were analyzed to characterize application domains and device types. Comparisons between these groups revealed untapped application areas for XR to support crew psychological health, in-flight training, and extravehicular operations on future flights. To fill these roles, integrating XR technologies with advancements in biometric sensors and machine learning tools is expected to transform crew capabilities.
Keywords: augmented reality, extended reality, international space station, mixed reality, virtual reality
Procedia PDF Downloads 216
723 Minimum-Fuel Optimal Trajectory for Reusable First-Stage Rocket Landing Using Particle Swarm Optimization
Authors: Kevin Spencer G. Anglim, Zhenyu Zhang, Qingbin Gao
Abstract:
Reusable launch vehicles (RLVs) present a more environmentally-friendly approach to accessing space when compared to traditional launch vehicles that are discarded after each flight. This paper studies the recyclable nature of RLVs by presenting a solution method for determining minimum-fuel optimal trajectories using principles from optimal control theory and particle swarm optimization (PSO). This problem is formulated as a minimum-landing error powered descent problem where it is desired to move the RLV from a fixed set of initial conditions to three different sets of terminal conditions. However, unlike other powered descent studies, this paper considers the highly nonlinear effects caused by atmospheric drag, which are often ignored for studies on the Moon or on Mars. Rather than optimizing the controls directly, the throttle control is assumed to be bang-off-bang with a predetermined thrust direction for each phase of flight. The PSO method is verified in a one-dimensional comparison study, and it is then applied to the two-dimensional cases, the results of which are illustrated.
Keywords: minimum-fuel optimal trajectory, particle swarm optimization, reusable rocket, SpaceX
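A minimal sketch of a vanilla PSO minimizer of the kind used to search over the bang-off-bang switch times; the two-variable cost surrogate below is a stand-in for illustration, not the paper's powered-descent dynamics or drag model.

```python
# Illustrative vanilla particle swarm optimization over a bounded search space.
import numpy as np

def pso(cost, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))   # particle positions
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

surrogate = lambda s: (s[0] - 12.0) ** 2 + (s[1] - 30.0) ** 2   # toy switch-time cost
print(pso(surrogate, bounds=np.array([[0., 60.], [0., 60.]])))
```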
Procedia PDF Downloads 277
722 Heuristic Classification of Hydrophone Recordings
Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas
Abstract:
An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way both time- and frequency-domain information are contained in the features to be passed to a clustering algorithm. Classification is performed using the k-means algorithm and then a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. Hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
Keywords: anthrophony, hydrophone, k-means, machine learning
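A minimal sketch of the feature-extraction and clustering pipeline; the 10-octave filter-bank RMS features and 5-second framing are simplified here to MFCC means plus the spectral centroid, and synthetic tones stand in for hydrophone clips.

```python
# Illustrative pipeline: extract spectral features per clip, then cluster with k-means.
import numpy as np
import librosa
from sklearn.cluster import KMeans

def features(y, sr):
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    return np.concatenate([[centroid], mfcc])

sr = 22050
clips = [np.sin(2 * np.pi * f * np.arange(sr * 2) / sr) for f in (100, 120, 3000, 3200)]
X = np.vstack([features(y, sr) for y in clips])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # two clusters, to be heuristically labeled, e.g., biophony vs. anthrophony
```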
Procedia PDF Downloads 170
721 Mechanical Characterization and CNC Rotary Ultrasonic Grinding of Crystal Glass
Authors: Ricardo Torcato, Helder Morais
Abstract:
The manufacture of crystal glass parts is based on obtaining the rough geometry by blowing and/or injection, generally followed by a set of manual finishing operations using cutting and grinding tools. The forming techniques used do not allow parts with complex shapes to be obtained repeatably, and the finishing operations rely on intensive specialized labor, resulting in high cycle times and production costs. This work aims to explore the digital manufacture of crystal glass parts by investigating new subtractive techniques for the automated, flexible finishing of these parts. Finishing operations are essential to respond to customer demands in terms of crystal feel and shine. It is intended to investigate the applicability of different computerized finishing technologies, namely milling and grinding in a CNC machining center with or without ultrasonic assistance, to crystal processing. Research in the field of grinding hard and brittle materials, despite not being extensive, has increased in recent years, and scientific knowledge about the machinability of crystal glass is still very limited. However, it can be said that the unique properties of glass, such as high hardness and very low toughness, make any glass machining technology a very challenging process. This work will measure the performance improvement brought about by the use of ultrasound compared to conventional crystal grinding. This presentation is focused on the mechanical characterization and the analysis of cutting forces in CNC machining of superior crystal glass (Pb ≥ 30%). For the mechanical characterization, the Vickers hardness test provides an estimate of the material hardness (Hv) and of the fracture toughness based on the cracks that appear in the indentation. The mechanical impulse excitation test estimates the Young's modulus, shear modulus, and Poisson's ratio of the material. For the cutting forces, a dynamometer was used to measure the forces in the face grinding process. The tests were designed based on the Taguchi method to correlate the input parameters (feed rate, tool rotation speed, and depth of cut) with the output parameters (surface roughness and cutting forces) and to optimize the process using ANOVA, seeking better roughness at cutting forces that do not compromise the material structure or the tool life. This study was conducted for conventional grinding and for the ultrasonic grinding process with the same cutting tools. It was possible to determine the optimum cutting parameters for minimum cutting forces and for minimum surface roughness in both grinding processes. Ultrasonic-assisted grinding provides a better surface roughness than conventional grinding.
Keywords: CNC machining, crystal glass, cutting forces, hardness
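A minimal sketch of the Taguchi-style analysis step: regress surface roughness on the three cutting parameters and run ANOVA to see which parameter dominates. The toy two-level full factorial and the response values are placeholders, not the measured grinding data.

```python
# Illustrative ANOVA on cutting parameters; design and responses are made up for the sketch.
from itertools import product
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

runs = list(product([0.5, 1.5], [3000, 6000], [0.1, 0.3]))          # feed, speed, depth levels
df = pd.DataFrame(runs, columns=["feed", "speed", "depth"])
df["roughness"] = 0.4 + 0.3 * df["feed"] + 0.00002 * df["speed"] + 0.8 * df["depth"]
df["roughness"] += [0.01, -0.02, 0.015, -0.01, 0.02, -0.015, 0.005, -0.005]  # measurement noise

model = smf.ols("roughness ~ C(feed) + C(speed) + C(depth)", data=df).fit()
print(anova_lm(model))     # which cutting parameters dominate surface roughness
```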
Procedia PDF Downloads 153
720 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network
Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang
Abstract:
As a branch of artificial neural networks, deep learning is widely used in the field of image recognition, but the lack of a sufficient dataset leads to imperfect model learning. By analysing the data scale requirements of deep learning and aiming at the application in GUI generation, it is found that the collection of a GUI dataset is a time-consuming and labor-intensive project, which makes it difficult to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on the original small-scale datasets to produce a large number of reliable data sets. By combining a recurrent neural network with a generative adversarial network, the recurrent neural network can learn the sequential relationships and characteristics of the data and make the generative adversarial network generate reasonable data, which then expands the Rico dataset. Relying on this network structure, the characteristics of the collected data can be well analysed, and a large number of reasonable data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.
Keywords: GUI, deep learning, GAN, data augmentation
Procedia PDF Downloads 184
719 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms
Authors: S. Nandagopalan, N. Pradeep
Abstract:
The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as the active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, the Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.
Keywords: active contour, Bayesian, echocardiographic image, feature vector
Procedia PDF Downloads 420
718 Energy Efficient Clustering with Adaptive Particle Swarm Optimization
Authors: Kumar Shashvat, Arshpreet Kaur, Rajesh Kumar, Raman Chadha
Abstract:
Wireless sensor networks (WSNs) have the principal characteristic of restricted energy, with the limitation that the energy of the nodes cannot be replenished. To increase the lifetime in this scenario, the WSN route for data transmission is chosen such that the energy consumed along the selected route is minimal. For such an energy-efficient network, a sound infrastructure is needed because it affects the network lifespan. Clustering is a technique in which nodes are grouped into disjoint and non-overlapping sets. In this technique, data is collected at the cluster head. In this paper, an Adaptive-PSO algorithm is proposed that forms energy-aware clusters by minimizing the cost of locating the cluster head. The main concern is the suitability of the swarms, addressed by adjusting the learning parameters of PSO. Particle Swarm Optimization converges quickly at the beginning stage of the search, but over the course of time it becomes stable and may be trapped in local optima. In the suggested network model, swarms are given the intelligence of spiders, which makes them capable of avoiding premature convergence and also helps them escape from local optima. Comparative analysis with traditional PSO shows that the new algorithm considerably enhances performance when multi-dimensional functions are taken into consideration.
Keywords: particle swarm optimization, adaptive-PSO, comparison between PSO and A-PSO, energy efficient clustering
Procedia PDF Downloads 246
717 Investigating Safe Operation Condition for Iterative Learning Control under Load Disturbances Effect in Singular Values
Authors: Muhammad A. Alsubaie
Abstract:
An iterative learning control framework designed in a state feedback structure lacks an investigation of load disturbance considerations. The presented work discusses the previously designed controller, highlights the disturbance problem, and finds new conditions using the singular value principle to assure safe operation with error convergence and reference tracking under the influence of load disturbance. It is known that periodic disturbances can be represented by a delay model in a positive feedback loop acting on the system input. This model can be manipulated by isolating the delay model and finding a controller for the overall system around the delay model to remedy the periodic disturbances using the small gain theorem. The overall system is the basis for the control design and the load disturbance investigation. The major finding of this work is the load disturbance condition, which clearly sets a safe operation condition under the influence of load disturbances such that the error tends to nearly zero as the system keeps operating trial after trial.
Keywords: iterative learning control, singular values, state feedback, load disturbance
Procedia PDF Downloads 158
716 Left to Right-Right Most Parsing Algorithm with Lookahead
Authors: Jamil Ahmed
Abstract:
The 'Left to Right'-'Right Most' (LR) parsing algorithm is a widely used algorithm for syntax analysis. It is driven by a parsing table, which is extracted from the grammar. The parsing table specifies the actions to be taken during parsing and must have no action conflicts for the same input symbol. This requirement imposes a condition on the class of grammars over which the LR algorithm works. However, there are grammars for which the parsing tables hold action conflicts. In such cases, the algorithm needs the capability to scan (look ahead at) input symbols beyond the current input symbol. In this paper, a 'Left to Right'-'Right Most' parsing algorithm with lookahead capability is introduced. The 'look-ahead' capability in the LR parsing algorithm is the major contribution of this paper. The practicality of the proposed algorithm is substantiated by a parser implementation of the Context Free Grammar (CFG) of an already proposed programming language, 'State Controlled Object Oriented Programming' (SCOOP). SCOOP's Context Free Grammar has 125 productions and 192 item sets. This algorithm parses SCOOP even though the grammar requires looking ahead at input symbols due to action conflicts in its parsing table. The proposed LR parsing algorithm with lookahead capability can be viewed as an optimization of the 'Simple Left to Right'-'Right Most' (SLR) parsing algorithm.
Keywords: left to right-right most parsing, syntax analysis, bottom-up parsing algorithm
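A minimal sketch of a table-driven LR driver for the toy grammar E -> E + n | n; the ACTION/GOTO tables are hand-built for this toy grammar, and the paper's contribution (consulting further input symbols when an ACTION cell holds a conflict) would hook in at the action-selection step marked below.

```python
# Illustrative table-driven LR driver for the toy grammar E -> E + n | n.
ACTION = {0: {'n': ('s', 2)},
          1: {'+': ('s', 3), '$': ('acc',)},
          2: {'+': ('r', 2), '$': ('r', 2)},
          3: {'n': ('s', 4)},
          4: {'+': ('r', 1), '$': ('r', 1)}}
GOTO = {(0, 'E'): 1}
PRODUCTIONS = {1: ('E', 3), 2: ('E', 1)}      # lhs and rhs length for E -> E + n | n

def parse(tokens):
    stack, i = [0], 0                          # stack of LR states
    while True:
        act = ACTION[stack[-1]].get(tokens[i]) # <- lookahead-based conflict resolution goes here
        if act is None:
            return False                       # syntax error
        if act[0] == 's':                      # shift
            stack.append(act[1]); i += 1
        elif act[0] == 'r':                    # reduce by production act[1]
            lhs, length = PRODUCTIONS[act[1]]
            del stack[len(stack) - length:]
            stack.append(GOTO[(stack[-1], lhs)])
        else:
            return True                        # accept

print(parse(['n', '+', 'n', '$']))   # True
print(parse(['n', '+', '+', '$']))   # False
```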
Procedia PDF Downloads 126
715 Effect of Threshold Corrections on Proton Lifetime and Emergence of Topological Defects in Grand Unified Theories
Authors: Rinku Maji, Joydeep Chakrabortty, Stephen F. King
Abstract:
Grand unified theories (GUTs) rationalize the arbitrariness of the standard model (SM) and explain many enigmas of nature starting from a single gauge group. GUTs predict proton decay, and the spontaneous symmetry breaking (SSB) of the higher symmetry group may lead to the formation of topological defects, which are indispensable in the context of cosmological observations. The Super-Kamiokande (Super-K) experiment sets stringent bounds on the partial lifetime (τ) of proton decay for different channels, e.g., τ(p → e⁺π⁰) > 1.6×10³⁴ years, which is the most relevant channel to test the viability of nonsupersymmetric GUTs. The GUTs based on the gauge groups SO(10) and E(6) are broken to the SM spontaneously through one or two intermediate gauge symmetries, with the manifestation of left-right symmetry at least at a single intermediate stage, and the proton lifetime for these breaking chains has been computed. The threshold corrections that arise from integrating out the heavy fields at the breaking scales alter the running of the gauge couplings and are eventually found to keep many GUTs off the Super-K bound. The possible topological defects arising in the course of SSB at different breaking scales for all breaking chains have been studied.
Keywords: grand unified theories, proton decay, threshold correction, topological defects
Procedia PDF Downloads 175
714 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement
Authors: Rhadinia Tayag-Relanes, Felina C. Young
Abstract:
This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the PDCA approach and record review in gathering data for August to October of the calendar year 2019 on the noodle products miki, canton, and misua. Causal-comparative research was used in this study to establish cause-and-effect relationships among the variables; descriptive statistics and correlation were both used to analyze the data gathered. The study found that miki, canton, and misua production each has different cycle times, different production outputs in every set of its production process, and a different amount of wastage. The company has not yet established its allowable rejection/wastage rate; instead, this paper used a 1% wastage limit. The researcher recommended the following: the machines used for each noodle production process must be consistently maintained and monitored; all production operators should be assessed by checking their performance statistically based on output and machine performance; a root cause analysis should be conducted to find solutions; and the recording system for the inputs and outputs of the noodle production process should be improved to eliminate poor recording of data.
Keywords: continuous improvement, process, operations, PDCA
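A minimal sketch of the descriptive-statistics and correlation step against the 1% wastage limit; the product names follow the abstract, but the cycle times, outputs, and wastage figures are placeholders, not Yan Hu production records.

```python
# Illustrative descriptive statistics and correlation on per-set production records.
import pandas as pd

df = pd.DataFrame({"product": ["miki"] * 3 + ["canton"] * 3 + ["misua"] * 3,
                   "cycle_time_min": [42, 45, 44, 55, 53, 57, 38, 40, 39],
                   "output_kg": [410, 395, 402, 520, 540, 515, 300, 310, 305],
                   "wastage_kg": [3.8, 5.1, 4.2, 5.6, 4.9, 6.0, 2.7, 3.3, 3.0]})

print(df.groupby("product")[["cycle_time_min", "output_kg", "wastage_kg"]].describe())
df["wastage_rate"] = df["wastage_kg"] / df["output_kg"]            # compare against the 1% limit
print((df["wastage_rate"] > 0.01).groupby(df["product"]).mean())   # share of sets over the limit
print(df[["cycle_time_min", "output_kg", "wastage_kg"]].corr())
```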
Procedia PDF Downloads 72
713 Damage Identification Using Experimental Modal Analysis
Authors: Niladri Sekhar Barma, Satish Dhandole
Abstract:
Damage identification in the context of safety has nowadays become a fundamental research interest in the field of mechanical, civil, and aerospace engineering structures. The following research aims to identify damage in a mechanical beam structure, quantify the severity or extent of the damage in terms of loss of stiffness, and obtain an updated analytical Finite Element (FE) model. An FE model is used for analysis, and the location of damage for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data have been acquired with the help of an accelerometer. The Fast Fourier Transform (FFT) algorithm is applied to the measured signal, and subsequently, post-processing is done in MEscopeVes software. The two sets of data, the numerical FE model and the experimental results, are compared to locate the damage accurately. The extent of the damage is identified via modal frequencies using a mixed numerical-experimental technique. Mode shape comparison is performed using the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to some real-life structures such as plate and GARTEUR structures.
Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification
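A minimal sketch of the Modal Assurance Criterion (MAC) comparison between analytical (FE) and experimental mode shapes; the two small mode-shape matrices are placeholders, not the measured beam data.

```python
# Illustrative MAC computation between FE and experimental mode-shape matrices.
import numpy as np

def mac(phi_a, phi_e):
    """phi_a, phi_e: (n_dof x n_modes) mode-shape matrices."""
    num = np.abs(phi_a.T @ phi_e) ** 2
    den = np.outer(np.einsum('ij,ij->j', phi_a, phi_a),   # squared norms of analytical modes
                   np.einsum('ij,ij->j', phi_e, phi_e))   # squared norms of experimental modes
    return num / den

phi_fe = np.array([[0.2, 0.9], [0.6, 0.3], [1.0, -0.8]])
phi_exp = np.array([[0.22, 0.85], [0.58, 0.35], [0.98, -0.82]])
print(np.round(mac(phi_fe, phi_exp), 3))   # values near 1 on the diagonal indicate well-paired modes
```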
Procedia PDF Downloads 116
712 Theoretical Study of Acetylation of P-Methylaniline Catalyzed by Cu²⁺ Ions
Authors: Silvana Caglieri
Abstract:
A theoretical study of the acetylation of p-methylaniline catalyzed by Cu²⁺ ions was carried out through analysis of the reaction intermediate. The study of the acetylation of amines is of great interest because of the utility of its reaction products; it is one of the most frequently used transformations in organic synthesis, as it provides an efficient and inexpensive means of protecting amino groups in a multistep synthetic process. Acetylation of an amine is a nucleophilic substitution reaction that can be catalyzed by a Lewis acid such as a metallic ion. In the reaction mechanism, the metallic ion forms a complex with the carbonyl oxygen of acetic anhydride, facilitating its polarization and the subsequent addition of the amine at that position to form a tetrahedral intermediate, the rate-determining step of the reaction. Experimental work agrees that this reaction takes place with the formation of a tetrahedral intermediate. In the present theoretical work, the structure and energy of the tetrahedral intermediate of the reaction catalyzed by Cu²⁺ ions were investigated. Geometries of all species involved in the acetylation were built and identified. All of the geometry optimizations were performed at the DFT/B3LYP level of theory and with the MP2 method. The 6-31+G* basis sets were adopted. Energies were calculated using the Mechanics-UFF method. Following the same procedure, the geometric parameters and energy of the reaction intermediate were identified. The calculations give an energy of 61.35 kcal/mol for the tetrahedral intermediate, and the activation energy for the reaction was 15.55 kcal/mol.
Keywords: amides, amines, DFT, MP2
Procedia PDF Downloads 283
711 Experiences of Online Opportunities and Risks: Examining Internet Use and Digital Literacy of Young People in Nigeria
Authors: Isah Yahaya Aliyu
Abstract:
Research on Internet use has often approached beneficial uses of the Internet (online opportunities) as separate from young people's risky encounters online (online risks). However, empirical evidence from diverse contexts appears to increasingly support the fusion of the two sets of online activities. Hence, the current research investigates the correlation of Internet use (IU) and digital literacy (DL) with online opportunities (OP) and risks (OR), using data from a Nigerian context, where there appears to be a paucity of research and literature integrating opportunities and risks in the same study. A web-based data collection method was used to administer a survey to 335 undergraduate students in Northeastern Nigeria. Underpinned by the Livingstone and Helsper model, the findings are largely consistent with the existing literature; IU and DL influence OP (R² = 0.791, SE = 0.265, F = 626.566, p < .001), and IU and DL likewise influence OR (R² = 0.343, SE = 0.465, F = 86.671, p < .001). OP and OR were found to correlate strongly and positively (r = .667, n = 335, p < 0.01). This study has provided buttressing evidence from a Nigerian context of the fusion of the benefits and risks of the Internet among young people. It has also upheld the argument for improved literacy as a strategy for minimizing risks/harm rather than restricting use. Other theoretical and policy implications of the findings are discussed in line with local and global debates about the Internet and its attendant effects.
Keywords: digital, internet, literacy, opportunities, risks
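A minimal sketch of the regression reported above (OP regressed on IU and DL) using statsmodels; the simulated responses are placeholders and do not reproduce the study's R², F-statistics, or correlations.

```python
# Illustrative multiple regression of online opportunities on Internet use and digital literacy.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 335                                                 # sample size reported in the abstract
df = pd.DataFrame({"IU": rng.normal(3.5, 0.8, n), "DL": rng.normal(3.2, 0.7, n)})
df["OP"] = 0.5 + 0.6 * df["IU"] + 0.4 * df["DL"] + rng.normal(0, 0.3, n)  # simulated outcome

model = smf.ols("OP ~ IU + DL", data=df).fit()
print(model.summary().tables[1])          # coefficients for Internet use and digital literacy
print(df[["OP", "IU", "DL"]].corr())      # pairwise correlations among the simulated variables
```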
Procedia PDF Downloads 87
710 Examining the Concept of Sustainability in the Scenery Architecture of Naqsh-e-Jahan Square
Authors: Mahmood Naghizadeh, Maryam Memarian, Hourshad Irvash
Abstract:
Following the rise in the world population and the growth of urbanization, the design, planning, and management of site scenery for the purpose of presenting and expanding sustainable site scenery has become a major concern for experts. Since the fundamental principles of site scenery change more or less haphazardly over time, sustainable site scenery can be viewed as an ideal goal: both sustainability and dynamism come into view in urban site scenery, and it cannot be designed according to a set of pre-determined principles. Sustainable site scenery, as the ongoing interaction of idealism and pragmatism with sustainability factors, is a dynamic phenomenon created by bringing cultural, historical, social, and natural scenery together. Such an interaction is not meant to subdue other factors but to reinforce them. Sustainable site scenery is a persistent phenomenon that has not attenuated over time but has gained strength. The sustainability of a site scenery or an event over time depends on its site identity, which grows out of its continuous association with the past and is intertwined with the identity of the place from past to present. This past history supports the present and future of the scene, and the result of such a supportive role is the sustainability of the site scenery. Isfahan's Naqsh-e-Jahan Square is one of the most outstanding squares in the world and the best embodiment of Iranian site scenery architecture. This square is an arena that brings people together and a dynamic city center comprising various urban and religious complexes, spaces, and facilities, and it is considered one of the most favorable traditional urban spaces of Iran. Such a place can illustrate many factors related to sustainable site scenery. On the other hand, there are still no specific principles concerning sustainability in the architecture of site scenery. Meanwhile, sustainability is recognized as a rather modern view in architecture. The purpose of this research is to identify the factors involved in sustainability in general and to examine their effects on site scenery architecture in particular. Finally, these factors are studied taking Naqsh-e-Jahan Square into account. This research adopts an analytic-descriptive approach that has benefited from a review of the literature available in library studies and of documents related to sustainability and site scenery architecture. The statistical population used for this research includes squares constructed during the Safavid dynasty, and Naqsh-e-Jahan Square was picked as the case study. The purpose of this paper is to come up with a rough definition of sustainable site scenery and to demonstrate this concept by analyzing it and recognizing the social, economic, and ecological aspects of this project.
Keywords: Naqsh-e-Jahan Square, site scenery architecture, sustainability, sustainable site scenery
Procedia PDF Downloads 312