Search results for: type-2 fuzzy sets
760 A General Framework for Knowledge Discovery from Echocardiographic and Natural Images
Authors: S. Nandagopalan, N. Pradeep
Abstract:
The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.
Keywords: active contour, Bayesian, echocardiographic image, feature vector
Procedia PDF Downloads 446
759 Influence of Season, Temperature, and Photoperiod on Growth of the Land Snail Helix aperta
Authors: S. Benbellil-Tafoughalt, J. M. Koene
Abstract:
Growth strategies are often plastic and influenced by environmental conditions. Terrestrial gastropods are particularly affected by seasonal and climatic variables, and growth rate and size at maturity are key traits in their life history. Therefore, we investigated juvenile growth of Helix aperta snails under four combinations of temperature and photoperiod using two sets of young snails, born in the laboratory from adults collected in either the autumn (aestivating snails) or the spring (active snails). Parental snails were collected from Bakaro (Northeastern Algeria). Higher temperature increased adult size and reduced time to reproduction. A long-day photoperiod also increased the final body weight but had no effect on the length of the growth period. The season of birth had significant effects on the length of the growth period and the weight of hatchlings, whereas this weight difference disappeared by adulthood. The spring snails took less time to develop and reached a similar adult body weight as the autumn snails. These differences may be due to differences in egg size or quality between the snails from different seasons. More rapid growth in spring snails results in larger snails entering aestivation, a period with size-related mortality in this species.
Keywords: growth, Helix aperta, photoperiod, temperature
Procedia PDF Downloads 338
758 Tracking Filtering Algorithm Based on ConvLSTM
Authors: Ailing Yang, Penghan Song, Aihua Cai
Abstract:
The nonlinear maneuvering target tracking problem is mainly a state estimation problem when the target motion model is uncertain. Traditional solutions include Kalman filtering, based on the Bayesian filtering framework, and extended Kalman filtering. However, these methods require prior knowledge, such as the kinematic model and the state system distribution, and their performance is poor for state estimation of complex dynamic systems without such priors. Therefore, in view of the problems with traditional algorithms, a convolutional LSTM target state estimation (SAConvLSTM-SE) algorithm based on self-attention memory (SAM) is proposed to learn the historical motion state of the target and the error distribution of the measurements at the current time. The measured track point data of airborne radar are processed into data sets. After supervised training, the data-driven deep neural network based on SAConvLSTM can directly produce the target state at the next moment. Through experiments on two different maneuvering targets, we find that the network has stronger robustness and better tracking accuracy than existing tracking methods.
Keywords: maneuvering target, state estimation, Kalman filter, LSTM, self-attention
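For context, the classical baseline the abstract contrasts with learned filters can be sketched as a linear Kalman filter. The matrices, noise levels, and the 1-D constant-velocity target below are illustrative assumptions, not the paper's airborne-radar setup.

```python
import numpy as np

# Minimal linear Kalman filter for a 1-D constant-velocity target.
# State is (position, velocity); only position is measured.

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # observation: position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.array([measurements[0], 0.0])    # initial state guess
    P = np.eye(2)                           # initial state covariance
    estimates = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

rng = np.random.default_rng(0)
true_pos = 0.5 * np.arange(50)              # target moving at 0.5 units/step
zs = true_pos + rng.normal(0, 0.5, size=50) # noisy position measurements
est = kalman_track(zs)
```

The learned SAConvLSTM estimator replaces exactly the prior knowledge this sketch hard-codes (F, H, Q, R) with a data-driven mapping from track history to the next state.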
Procedia PDF Downloads 180
757 Flood-Induced River Disruption: Geomorphic Imprints and Topographic Effects in Kelantan River Catchment from Kemubu to Kuala Besar, Kelantan, Malaysia
Authors: Mohamad Muqtada Ali Khan, Nor Ashikin Shaari, Donny Adriansyah bin Nazaruddin, Hafzan Eva Bt Mansoor
Abstract:
Floods play a key role in the landform evolution of an area. This process is likely to alter the topography of the earth’s surface. The present study area, Kota Bharu, which is very prone to floods, extends from the upstream area of the Kelantan River near Kemubu to the downstream area near Kuala Besar. These flood events, which occur every year in the study area, have a strong bearing on the river's morphological set-up. In the present study, three satellite imageries of different time periods have been used to manifest the post-flood landform changes. Pre-processing of the images, such as subsetting, geometric corrections, and atmospheric corrections, was carried out using ENVI 4.5, followed by the analysis processes. Twenty sets of cross sections were plotted using ERDAS 9.2 and ArcGIS 10 for all three images. The results show a significant change in the length of the cross sections, which suggests that geomorphological processes play a key role in carving and shaping the river banks during the floods.
Keywords: flood induced, geomorphic imprints, Kelantan river, Malaysia
Procedia PDF Downloads 545
756 Supervised/Unsupervised Mahalanobis Algorithm for Improving Performance for Cyberattack Detection over Communications Networks
Authors: Radhika Ranjan Roy
Abstract:
Deployment of machine learning (ML)/deep learning (DL) algorithms for cyberattack detection in operational communications networks (wireless and/or wire-line) is being delayed because of low performance parameters (e.g., recall, precision, and f₁-score). If datasets are imbalanced, which is the usual case for communications networks, performance tends to become worse. The complexity of reducing the dimensionality of the feature sets to increase performance is also a major problem. Mahalanobis algorithms have been widely applied in scientific research because Mahalanobis distance metric learning is a successful framework. In this paper, we have investigated the Mahalanobis distance (MD) binary classifier algorithm for increasing cyberattack detection performance over communications networks as a proof of concept. We have also found that high-dimensional information in intermediate features, which is underutilized for classification tasks in ML/DL algorithms, is the main contributor to the improved performance of the Mahalanobis method, even for imbalanced and sparse datasets. With no feature reduction, MD offers uniform results for precision, recall, and f₁-score for the unbalanced and sparse NSL-KDD datasets.
Keywords: Mahalanobis distance, machine learning, deep learning, NSL-KDD, local intrinsic dimensionality, chi-square, positive semi-definite, area under the curve
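A minimal sketch of a Mahalanobis-distance binary classifier of the kind the abstract investigates: fit a mean and covariance on "normal" traffic only, then flag test points whose Mahalanobis distance exceeds a threshold. The synthetic features and the 99th-percentile threshold below are illustrative assumptions, not the authors' NSL-KDD configuration.

```python
import numpy as np

# Fit the Mahalanobis model (mean + inverse covariance) on one class.
def fit_mahalanobis(X):
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(X.shape[1]))  # regularized
    return mu, cov_inv

# Mahalanobis distance of each row of X from the fitted distribution.
def mahalanobis(X, mu, cov_inv):
    d = X - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))

rng = np.random.default_rng(42)
normal = rng.normal(0, 1, size=(500, 4))   # stand-in for benign traffic
attack = rng.normal(4, 1, size=(50, 4))    # shifted "attack" cluster

mu, cov_inv = fit_mahalanobis(normal)
# Threshold at the 99th percentile of benign distances (~1% false positives).
threshold = np.quantile(mahalanobis(normal, mu, cov_inv), 0.99)

pred_attack = mahalanobis(attack, mu, cov_inv) > threshold
```

Because the covariance is fit on the majority class alone, the scheme is naturally insensitive to class imbalance, which is the property the abstract emphasizes.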
Procedia PDF Downloads 79
755 Changing Roles for Academic Leaders: A Comparative Study between Sweden and South Africa
Authors: Åse Nygren, Linda du Plessis
Abstract:
Academic leadership has traditionally been associated with collegiality, consensus, and a limited term of office. These factors alone have resulted in a complex and fuzzy leadership culture in academia, combined with a strong sense of autonomy among researchers and teachers. A more competitive educational market has resulted in increased auditing, as well as recent autonomy reforms with higher demands on effectiveness, cost awareness, and accountability in higher education. In recent years, with the introduction of new public management, academic leadership has been in a state of transition, moving from collegiality towards managerialism. University reforms and changes, which have gradually taken place in most western countries in the past decade, including Sweden and South Africa, have contributed to the notion that collegial academic leadership is being questioned. Academic leadership is traditionally associated with vice-chancellors, deans, and heads of departments. This paper will focus on the "outer circle" of academic leaders, consisting of, for example, program directors, directors of disciplines, course coordinators, and research leaders. We investigate the meaning of collegiality for these groups of academic leaders in Sweden and South Africa. The paper rests on a comparative study of universities in both Sweden and South Africa. The aim of the comparison is to achieve a wider scope and to investigate perspectives from both inside and outside of the Bologna area.
Keywords: academic leadership, new public management, collegiality, consensus
Procedia PDF Downloads 416
754 Wage Differentiation Patterns of Households Revisited for Turkey in Same Industry Employment: A Pseudo-Panel Approach
Authors: Yasin Kutuk, Bengi Yanik Ilhan
Abstract:
Previous studies investigate wage differentiation among regions in Turkey between couples who work in the same industry and those who work in different industries using models appropriate for cross-sectional data. However, since no panel data are available for this investigation in Turkey, pseudo panels built from the repeated cross-section data sets of the Household Labor Force Surveys 2004-2014 are employed, opening a new way to examine wage differentiation patterns. For this purpose, household heads are separated into groups with respect to their household composition. Membership in these groups, defined by characteristics such as age group, education, gender, and NUTS1 region (12 regions), is assumed to be fixed over time, so the average behavior of each group can be tracked over time just as in panel data. Estimates using pseudo panel data would be consistent with estimates using genuine panel data on individuals if the samples are representative of a population with fixed composition and characteristics. Controlling for socioeconomic factors, the wage differentiation of household income is affected by the social, cultural, and economic changes that followed the global economic crisis that emerged in the US. It is also revealed whether wage differentiation changes across birth cohorts.
Keywords: wage income, same industry, pseudo panel, panel data econometrics
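The pseudo-panel construction the abstract describes can be sketched as follows: individuals from repeated cross-sections are assigned to cohorts with time-invariant membership, and cohort-level means are then tracked over survey years like panel units. The toy micro-data and the birth-decade-by-gender cohort definition below are illustrative assumptions, not the Household Labor Force Survey.

```python
import pandas as pd

# Toy repeated cross-sections: different individuals in each survey year.
micro = pd.DataFrame({
    "year":         [2004, 2004, 2004, 2014, 2014, 2014],
    "birth_decade": [1960, 1960, 1970, 1960, 1970, 1970],
    "gender":       ["m", "f", "m", "m", "m", "f"],
    "wage":         [100.0, 90.0, 80.0, 140.0, 120.0, 110.0],
})

# Collapse the cross-sections into cohort-year cell means; each
# (birth_decade, gender) cell behaves like one pseudo-panel unit,
# observed once per survey year.
pseudo = (micro
          .groupby(["birth_decade", "gender", "year"], as_index=False)
          .agg(mean_wage=("wage", "mean"), n_obs=("wage", "size")))
```

In practice, the cell counts (`n_obs`) matter: small cells make the cohort means noisy measures of the underlying cohort, which is the main caveat of the pseudo-panel approach.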
Procedia PDF Downloads 399
753 Formation of Academia-Industry Collaborative Model to Improve the Quality of Teaching-Learning Process
Authors: M. Dakshayini, P. Jayarekha
Abstract:
In the traditional output-based education system, classroom lectures and laboratories are the traditional delivery methods used during a course. Written examinations and lab examinations have been used as conventional tools for evaluating students' performance. Hence, there are certain apprehensions that the traditional education system may not efficiently prepare students for competent professional life. This has led to the change from traditional output-based education to Outcome-Based Education (OBE). OBE first sets the ideal programme learning outcomes, ordered by the increasing degree of complexity that students are expected to master. The core curriculum, teaching methodologies, and assessment tools are then designed to achieve the proposed outcomes, focusing mainly on what students can actually attain after they are taught. In this paper, we discuss a promising applications-based learning and evaluation component involving industry collaboration to improve the quality of the teaching and student learning process. Incorporation of this component improves the quality of student learning in engineering education and helps students attain competency as per the graduate attributes. It may also reduce the industry-academia gap.
Keywords: outcome-based education, programme learning outcome, teaching-learning process, evaluation, industry collaboration
Procedia PDF Downloads 449
752 Research on Optimization Strategies for the Negative Space of Urban Rail Transit Based on Urban Public Art Planning
Authors: Kexin Chen
Abstract:
As an important means of transportation for resolving the contradiction between demand and supply generated by rapid urbanization, the urban rail transit system has developed rapidly in China over the past ten years. During this rapid development, the space of urban rail transit has encountered many problems, such as spatial simplification, dull sensory experience, and poor regional identification. This paper focuses on the negative space of subway stations and on spatial softening, comparing and learning from cases at home and abroad. Through a comparative study of these cases, it analyzes more diversified settings for public art, sets forth propositions on domestic types of public art in urban rail transit space for reference, and shows the relationship between the spatial attributes of urban rail transit space and public art forms. On this foundation, it characterizes more diverse ways of setting public art; suggests three public art forms with corresponding properties, namely the static presenting mode, the dynamic image mode, and the spatial softening mode; and identifies methods by which urban public art can optimize negative space.
Keywords: diversification, negative space, optimization strategy, public art planning
Procedia PDF Downloads 208
751 The Use of Rule-Based Cellular Automata to Track and Forecast the Dispersal of Classical Biocontrol Agents at Scale, with an Application to the Fopius arisanus Fruit Fly Parasitoid
Authors: Agboka Komi Mensah, John Odindi, Elfatih M. Abdel-Rahman, Onisimo Mutanga, Henri Ez Tonnang
Abstract:
Ecosystems are networks of organisms and populations that form a community of various species interacting within their habitats. Such habitats are defined by abiotic and biotic conditions that establish the initial limits to a population's growth, development, and reproduction. The habitat’s conditions explain the context in which species interact to access resources such as food, water, space, shelter, and mates, allowing for feeding, dispersal, and reproduction. Dispersal is an essential life-history strategy that affects gene flow, resource competition, population dynamics, and species distributions. Despite the importance of dispersal in population dynamics and survival, understanding the mechanisms underpinning the dispersal of organisms remains challenging. For instance, when an organism moves into an ecosystem for survival and resource competition, its progression is highly influenced by factors such as its physiological state, climatic variables, and its ability to evade predation. Therefore, greater spatial detail is necessary to understand organism dispersal dynamics. Organism dispersal can be addressed using empirical and mechanistic modelling approaches, with the adopted approach depending on the study's purpose. Cellular automata (CA) are an example of these approaches and have been successfully used in biological studies to analyze the dispersal of living organisms. A cellular automaton can be briefly described as a grid of cells, each of which may be occupied by an individual, that evolves according to a set of rules based on the states of neighbouring cells. However, for modelling the dispersal of individual organisms at the landscape scale, we lack user-friendly tools that do not require expertise in mathematical modelling and computing, such as a visual analytics framework for tracking and forecasting the dispersal behaviour of organisms.
The term "visual analytics" (VA) describes a semi-automated approach to electronic data processing that is guided by users who can interact with the data via an interface. Essentially, VA converts large amounts of quantitative or qualitative data into graphical formats that can be customized based on the operator's needs. Additionally, this approach can be used to enhance the ability of users from various backgrounds to understand data, communicate results, and disseminate information across a wide range of disciplines. To support effective analysis of the dispersal of organisms at the landscape scale, we therefore designed Pydisp, a free visual data analytics tool for spatiotemporal dispersal modeling built in Python. Its user interface allows users to perform quick, interactive spatiotemporal analyses of species dispersal using bioecological and climatic data. Pydisp enables reuse and upgrading through the use of simple principles such as fuzzy cellular automata algorithms. The potential of dispersal modeling is demonstrated in a case study predicting the dispersal of Fopius arisanus (Sonan), an endoparasitoid used to control Bactrocera dorsalis (Hendel) (Diptera: Tephritidae), in Kenya. The results obtained from our example clearly illustrate the parasitoid's dispersal process at the landscape level and confirm that dynamic processes in an agroecosystem are better understood when designed using mechanistic modelling approaches. Furthermore, as demonstrated in the example, the software is highly effective in portraying the dispersal of organisms despite the unavailability of detailed data on the species' dispersal mechanisms.
Keywords: cellular automata, fuzzy logic, landscape, spatiotemporal
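The rule-based dispersal mechanism described above can be sketched as a simple probabilistic cellular automaton: an occupied cell colonizes each empty neighbour with a probability scaled by that cell's habitat suitability. The grid size, suitability surface, and colonization rate below are illustrative assumptions, not the Pydisp implementation.

```python
import numpy as np

def disperse(occupied, suitability, rate, rng):
    """One CA time step: each occupied cell seeds its 8-cell neighbourhood."""
    rows, cols = occupied.shape
    new = occupied.copy()
    for r, c in zip(*np.nonzero(occupied)):
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and not occupied[rr, cc]:
                    # colonization probability scaled by habitat suitability
                    if rng.random() < rate * suitability[rr, cc]:
                        new[rr, cc] = True
    return new

rng = np.random.default_rng(7)
grid = np.zeros((40, 40), dtype=bool)
grid[20, 20] = True                              # single release point
suitability = rng.uniform(0.3, 1.0, grid.shape)  # habitat quality in [0, 1]

for _ in range(15):                              # simulate 15 generations
    grid = disperse(grid, suitability, rate=0.4, rng=rng)
```

A fuzzy variant, as the abstract hints, would replace the Boolean occupancy with a membership degree in [0, 1] updated by fuzzy rules over the neighbourhood.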
Procedia PDF Downloads 79
750 Global City Typologies: 300 Cities and Over 100 Datasets
Authors: M. Novak, E. Munoz, A. Jana, M. Nelemans
Abstract:
Cities and local governments the world over are interested in employing circular strategies as a means to bring about food security, create employment, and increase resilience. The selection and implementation of circular strategies are facilitated by modeling the effects of strategies locally and by understanding the impacts such strategies have had in other (comparable) cities and how that would translate locally. Urban areas are heterogeneous because of their geographic, economic, and social characteristics, governance, and culture. In order to better understand the effect of circular strategies on urban systems, we create a dataset for over 300 cities around the world designed to facilitate circular strategy scenario modeling. This new dataset integrates data from over 20 prominent global, national, and urban data sources, such as the Global Human Settlements layer and the International Labour Organisation, as well as incorporating employment data from over 150 cities collected bottom-up from local departments and data providers. The dataset is made to be reproducible. Various clustering techniques are explored in the paper. The result is sets of clusters of cities, which can be used for further research and analysis and to support comparative, regional, and national policy making on circular cities.
Keywords: data integration, urban innovation, cluster analysis, circular economy, city profiles, scenario modelling
Procedia PDF Downloads 182
749 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea
Authors: Pavel Shcherban, Vlad Golovanov
Abstract:
Currently, there is active geological exploration and development of the subsoil of the Kaliningrad region's shelf. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from open deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study but also to determine the oil's quality, viscosity, density, and fractional composition as accurately as possible. For the work considered here, gas chromatography is one of the most productive methods, allowing the rapid generation of a significant amount of initial data. The article examines aspects of applying the gas chromatography method to determine the chemical characteristics of the hydrocarbons of the Kaliningrad shelf fields, together with a correlation-regression analysis of these parameters in comparison with the previously obtained chemical characteristics of hydrocarbon deposits located onshore in the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets have been applied, which makes it possible to evaluate the identity of the deposits, to refine the amount of reserves, and to make a number of assumptions about the genesis of the hydrocarbons under analysis.
Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, method of gas chromatography
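The correlation-regression step mentioned above can be sketched as correlating one chromatography-derived parameter against another and fitting a least-squares line. The sample values below are invented for illustration, not measurements from the Kaliningrad shelf fields.

```python
import numpy as np

# Hypothetical paired oil parameters for six samples.
density = np.array([0.815, 0.822, 0.830, 0.841, 0.848, 0.856])  # g/cm^3
viscosity = np.array([2.1, 2.4, 2.9, 3.5, 3.9, 4.6])            # mm^2/s

r = np.corrcoef(density, viscosity)[0, 1]             # Pearson correlation
slope, intercept = np.polyfit(density, viscosity, 1)  # least-squares fit

# Interpolate viscosity for a new sample at 0.835 g/cm^3.
predicted = slope * 0.835 + intercept
```

Comparing such fitted relationships between offshore and onshore sample sets is one simple way to assess the "identity of the deposits" the abstract refers to.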
Procedia PDF Downloads 157
748 Analyzing Extended Reality Technologies for Human Space Exploration
Authors: Morgan Kuligowski, Marientina Gotsis
Abstract:
Extended reality (XR) technologies share an intertwined history with spaceflight and innovation. New advancements in XR technologies offer expanding possibilities to advance the future of human space exploration with increased crew autonomy. This paper seeks to identify implementation gaps between existing and proposed XR space applications to inform future mission planning. A review of virtual reality, augmented reality, and mixed reality technologies implemented aboard the International Space Station revealed a total of 16 flown investigations. A secondary set of ground-tested XR human spaceflight applications was systematically retrieved from literature sources. The two sets of XR technologies, those flown and those existing in the literature, were analyzed to characterize application domains and device types. Comparisons between these groups revealed untapped application areas in which XR could support crew psychological health, in-flight training, and extravehicular operations on future flights. To fill these roles, integrating XR technologies with advancements in biometric sensors and machine learning tools is expected to transform crew capabilities.
Keywords: augmented reality, extended reality, international space station, mixed reality, virtual reality
Procedia PDF Downloads 216
747 Minimum-Fuel Optimal Trajectory for Reusable First-Stage Rocket Landing Using Particle Swarm Optimization
Authors: Kevin Spencer G. Anglim, Zhenyu Zhang, Qingbin Gao
Abstract:
Reusable launch vehicles (RLVs) present a more environmentally friendly approach to accessing space compared to traditional launch vehicles that are discarded after each flight. This paper studies the recyclable nature of RLVs by presenting a solution method for determining minimum-fuel optimal trajectories using principles from optimal control theory and particle swarm optimization (PSO). The problem is formulated as a minimum-landing-error powered descent problem in which it is desired to move the RLV from a fixed set of initial conditions to three different sets of terminal conditions. However, unlike other powered descent studies, this paper considers the highly nonlinear effects caused by atmospheric drag, which are often ignored in studies of the Moon or Mars. Rather than optimizing the controls directly, the throttle control is assumed to be bang-off-bang with a predetermined thrust direction for each phase of flight. The PSO method is verified in a one-dimensional comparison study and is then applied to the two-dimensional cases, the results of which are illustrated.
Keywords: minimum-fuel optimal trajectory, particle swarm optimization, reusable rocket, SpaceX
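The optimization engine the abstract relies on can be sketched as a generic particle swarm: particles search a low-dimensional parameter space (for a bang-off-bang profile, typically the switching times), and the swarm contracts around the best fuel cost found. The coefficients, bounds, and the toy quadratic objective below are illustrative assumptions, not the paper's tuning.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, seed=1):
    rng = np.random.default_rng(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social
    x = rng.uniform(-5, 5, (n_particles, dim))  # particle positions
    v = np.zeros_like(x)                        # particle velocities
    pbest = x.copy()                            # personal bests
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()    # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy objective with a minimum at (1, -2); in the trajectory problem the
# objective would instead simulate the descent and return fuel consumed
# plus a landing-error penalty.
best, best_val = pso(lambda p: (p[0] - 1)**2 + (p[1] + 2)**2, dim=2)
```

Because only switching times are searched rather than the full control history, the dimension of the search space stays small, which is what makes PSO practical here.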
Procedia PDF Downloads 278
746 Heuristic Classification of Hydrophone Recordings
Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas
Abstract:
An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-square values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way, both time- and frequency-domain information is contained in the features passed to the clustering algorithm. Classification is performed using the k-means algorithm followed by a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. The hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
Keywords: anthrophony, hydrophone, k-means, machine learning
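The clustering stage described above can be sketched with a generic Lloyd's-algorithm k-means: feature vectors are partitioned into k clusters by alternating assignment and centroid updates. The random three-dimensional feature clouds below stand in for the spectral-centroid/band-RMS/MFCC features; this is not the authors' exact pipeline.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # init from data points
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centers; keep the old center if a cluster empties
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

rng = np.random.default_rng(3)
# Two well-separated synthetic clouds standing in for the two classes.
anthro = rng.normal(0.0, 0.5, size=(60, 3))
bio = rng.normal(3.0, 0.5, size=(60, 3))
X = np.vstack([anthro, bio])

labels, centers = kmeans(X, k=2)
```

In the paper's pipeline a k-nearest-neighbors search then refines the cluster assignments; here the two synthetic clouds are separated cleanly by k-means alone.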
Procedia PDF Downloads 170
745 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network
Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang
Abstract:
As a branch of artificial neural networks, deep learning is widely used in the field of image recognition, but a lack of data leads to imperfect model learning. By analysing the data scale requirements of deep learning and aiming at the application to GUI generation, it is found that collecting a GUI dataset is a time-consuming and labor-intensive project, which makes it difficult to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on an original small-scale dataset to produce a large number of reliable data. By combining a recurrent neural network with a generative adversarial network, the recurrent neural network can learn the sequential relationships and characteristics of the data, enabling the generative adversarial network to generate reasonable data and thereby expand the Rico dataset. Relying on this network structure, the characteristics of the collected data can be analysed well, and a large number of reasonable data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.
Keywords: GUI, deep learning, GAN, data augmentation
Procedia PDF Downloads 185
744 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms
Authors: S. Nandagopalan, N. Pradeep
Abstract:
The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.
Keywords: active contour, Bayesian, echocardiographic image, feature vector
Procedia PDF Downloads 421
743 Energy Efficient Clustering with Adaptive Particle Swarm Optimization
Authors: Kumar Shashvat, Arshpreet Kaur, Rajesh Kumar, Raman Chadha
Abstract:
Wireless sensor networks have the principal characteristic of restricted energy, with the limitation that the energy of the nodes cannot be replenished. To increase the network lifetime in this scenario, the WSN route for data transmission is chosen such that the energy utilized along the selected route is minimal. Such an energy-efficient network needs a sound infrastructure, because the infrastructure affects the network lifespan. Clustering is a technique in which nodes are grouped into disjoint, non-overlapping sets, and data is collected at the cluster head. In this paper, an Adaptive-PSO algorithm is proposed which forms energy-aware clusters by minimizing the cost of locating the cluster head. The main concern is the suitability of the swarms, addressed by adjusting the learning parameters of PSO. Particle Swarm Optimization converges quickly at the beginning stage of the search, but over the course of time it becomes stable and may be trapped in local optima. In the suggested network model, the swarms are given the intelligence of spiders, which makes them capable of avoiding premature convergence and also helps them escape from local optima. Comparative analysis with traditional PSO shows that the new algorithm considerably enhances performance when multi-dimensional functions are taken into consideration.
Keywords: particle swarm optimization, adaptive PSO, comparison between PSO and A-PSO, energy efficient clustering
Procedia PDF Downloads 249
742 Investigating Safe Operation Condition for Iterative Learning Control under Load Disturbances Effect in Singular Values
Authors: Muhammad A. Alsubaie
Abstract:
An iterative learning control framework designed in a state feedback structure lacks an investigation of load disturbance considerations. The presented work discusses the previously designed controller, highlights the disturbance problem, and finds new conditions, using the singular value principle, to assure safe operating conditions with error convergence and reference tracking under the influence of load disturbance. It is known that periodic disturbances can be represented by a delay model in a positive feedback loop acting on the system input. This model can be manipulated by isolating the delay model and finding a controller for the overall system around the delay model to remedy the periodic disturbances using the small gain theorem. The overall system is the basis for the control design and the load disturbance investigation. The major finding of this work is the load disturbance condition, which clearly sets a safe operating condition under the influence of load disturbances such that the error tends to nearly zero as the system keeps operating trial after trial.
Keywords: iterative learning control, singular values, state feedback, load disturbance
Procedia PDF Downloads 158
741 Left to Right-Right Most Parsing Algorithm with Lookahead
Authors: Jamil Ahmed
Abstract:
The Left to Right-Right Most (LR) parsing algorithm is a widely used algorithm for syntax analysis. It is driven by a parsing table, which is extracted from the grammar. The parsing table specifies the actions to be taken during parsing and requires that there be no action conflicts for the same input symbol. This requirement imposes a condition on the class of grammars over which LR algorithms work. However, there are grammars whose parsing tables contain action conflicts. In such cases, the algorithm needs the capability of scanning (looking ahead at) input symbols beyond the current input symbol. In this paper, a 'Left to Right'-'Right Most' parsing algorithm with lookahead capability is introduced. This lookahead capability in the LR parsing algorithm is the major contribution of this paper. The practicality of the proposed algorithm is substantiated by a parser implementation for the context-free grammar (CFG) of the previously proposed programming language 'State Controlled Object Oriented Programming' (SCOOP). SCOOP's context-free grammar has 125 productions and 192 item sets. The algorithm parses SCOOP even though the grammar requires looking ahead at input symbols due to action conflicts in its parsing table. The proposed LR parsing algorithm with lookahead capability can be viewed as an optimization of the 'Simple Left to Right'-'Right Most' (SLR) parsing algorithm.
Keywords: left to right-right most parsing, syntax analysis, bottom-up parsing algorithm
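The table-driven shift/reduce machinery described above can be sketched for a toy grammar, S → S + n | n. The ACTION/GOTO tables below were built by hand for this grammar; they are not the SCOOP tables, and the current input symbol serves as the one-symbol lookahead that selects each action.

```python
# Hand-built SLR tables for the toy grammar  S -> S + n | n.
ACTION = {
    (0, 'n'): ('shift', 2),
    (1, '+'): ('shift', 3),
    (1, '$'): ('accept', None),
    (2, '+'): ('reduce', ('S', 1)),   # S -> n
    (2, '$'): ('reduce', ('S', 1)),
    (3, 'n'): ('shift', 4),
    (4, '+'): ('reduce', ('S', 3)),   # S -> S + n
    (4, '$'): ('reduce', ('S', 3)),
}
GOTO = {(0, 'S'): 1}

def lr_parse(tokens):
    """Return True if the token list derives from S, else False."""
    stack = [0]                       # stack of LR states
    tokens = tokens + ['$']           # end-of-input marker
    i = 0
    while True:
        state, lookahead = stack[-1], tokens[i]
        entry = ACTION.get((state, lookahead))
        if entry is None:
            return False              # no action: syntax error
        op, arg = entry
        if op == 'shift':
            stack.append(arg)
            i += 1
        elif op == 'reduce':
            lhs, rhs_len = arg
            del stack[-rhs_len:]      # pop one state per grammar symbol
            stack.append(GOTO[(stack[-1], lhs)])
        else:                         # accept
            return True
```

This grammar is conflict-free, so one symbol of lookahead suffices; the paper's contribution is handling tables where a single cell holds conflicting actions by scanning further ahead before committing.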
Procedia PDF Downloads 126740 Effect of Threshold Corrections on Proton Lifetime and Emergence of Topological Defects in Grand Unified Theories
Authors: Rinku Maji, Joydeep Chakrabortty, Stephen F. King
Abstract:
The grand unified theory (GUT) rationalizes the arbitrariness of the standard model (SM) and explains many enigmas of nature within a single gauge group. GUTs predict proton decay, and the spontaneous symmetry breaking (SSB) of the higher symmetry group may lead to the formation of topological defects, which are indispensable in the context of cosmological observations. The Super-Kamiokande (Super-K) experiment sets stringent bounds on the partial lifetime (τ) of proton decay for different channels, e.g., τ(p → e+ π0) > 1.6×10³⁴ years, which is the most relevant channel for testing the viability of nonsupersymmetric GUTs. GUTs based on the gauge groups SO(10) and E(6) are broken to the SM spontaneously through one or two intermediate gauge symmetries, with the manifestation of left-right symmetry at least at one intermediate stage, and the proton lifetime has been computed for these breaking chains. The threshold corrections that arise from integrating out the heavy fields at the breaking scales alter the running of the gauge couplings and, eventually, are found to keep many GUTs clear of the Super-K bound. The possible topological defects arising in the course of SSB at the different breaking scales have been studied for all breaking chains.Keywords: grand unified theories, proton decay, threshold correction, topological defects
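As a rough guide to why the unification scale controls testability, the dimensional estimate commonly quoted for the gauge-boson-mediated channel is the following (the numbers are illustrative orders of magnitude, not the paper's computed values):

```latex
% Dimensional estimate for gauge-boson-mediated proton decay
\tau(p \to e^{+}\pi^{0}) \;\sim\; \frac{M_X^{4}}{\alpha_{\mathrm{GUT}}^{2}\, m_p^{5}}
% With M_X \sim 10^{16}\,\mathrm{GeV} and \alpha_{\mathrm{GUT}} \sim 1/40,
% this gives \tau \sim 10^{35\text{--}36}\,\mathrm{yr}, to be compared with
% the Super-K bound \tau > 1.6 \times 10^{34}\,\mathrm{yr}.
```

Because τ scales as the fourth power of the heavy gauge-boson mass M_X, even modest threshold-correction shifts in the unification scale move the predicted lifetime by orders of magnitude, which is why the corrections studied here can decide whether a breaking chain survives the Super-K bound.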
Procedia PDF Downloads 177739 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement
Authors: Rhadinia Tayag-Relanes, Felina C. Young
Abstract:
This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the PDCA approach and record review to gather data for the calendar year 2019, from August to October, on the noodle products miki, canton, and misua. Causal-comparative research was used to establish cause-effect relationships among the variables; descriptive statistics and correlation were both used to analyze the data gathered. The study found that miki, canton, and misua production have different cycle times, different production outputs, and different numbers of wastages in each set of their production processes. The company has not yet established its allowable rejection/wastage rate; instead, this paper used a 1% wastage limit. The researcher recommended the following: the machines used in each process of noodle production must be consistently maintained and monitored; all production operators should be assessed by checking their performance statistically against output and machine performance; a root cause analysis must be conducted to find solutions; and an improved recording system for the inputs and outputs of the noodle production process should be established to eliminate poor recording of data.Keywords: continuous improvement, process, operations, PDCA
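The 1% wastage limit assumed in the study can be applied as a simple per-product check on input/output records; the batch figures below are hypothetical:

```python
# Flag production runs exceeding the assumed 1% wastage limit.
# Product names match the paper; the (input_kg, waste_kg) figures are invented.
batches = {
    "miki":   (1200.0, 9.5),
    "canton": (1500.0, 18.3),
    "misua":  (900.0, 7.2),
}
LIMIT = 0.01   # 1% allowable wastage, as assumed in the study

def over_limit(records, limit=LIMIT):
    """Return products whose wastage rate exceeds the limit."""
    flagged = {}
    for product, (input_kg, waste_kg) in records.items():
        rate = waste_kg / input_kg
        if rate > limit:
            flagged[product] = round(rate, 4)
    return flagged

print(over_limit(batches))   # {'canton': 0.0122}
```

A check like this presupposes exactly the improved input/output recording system the paper recommends.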
Procedia PDF Downloads 75738 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference
Authors: Hussein Alahmer, Amr Ahmed
Abstract:
Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images, and Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease, reducing the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using a novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue, and the difference between the features of the two areas is used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.Keywords: CAD system, difference of feature, fuzzy c means, lesion detection, liver segmentation
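The contrasting feature-difference idea, describing a lesion by how its features differ from those of the surrounding normal tissue, can be sketched as follows; the simulated regions and the three stand-in features are simplified assumptions, not the paper's feature set:

```python
# Sketch of the contrasting feature-difference descriptor: features of the
# lesion minus features of its surrounding normal tissue. Regions are
# simulated; the three features are simple intensity/texture stand-ins.
import numpy as np

rng = np.random.default_rng(0)
lesion = rng.normal(90, 25, size=(32, 32))      # simulated hypodense lesion patch
surround = rng.normal(120, 10, size=(32, 32))   # simulated normal-tissue ring

def features(region):
    # mean intensity, intensity spread, and gradient energy as a texture proxy
    gy, gx = np.gradient(region)
    return np.array([region.mean(), region.std(), np.mean(gx**2 + gy**2)])

descriptor = features(lesion) - features(surround)
print(descriptor)  # relative contrasts, not absolute intensities, feed the classifier
```

Because the descriptor is a difference, a global brightness shift between scanners moves both terms equally and largely cancels, which is the robustness property the abstract claims.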
Procedia PDF Downloads 326737 Damage Identification Using Experimental Modal Analysis
Authors: Niladri Sekhar Barma, Satish Dhandole
Abstract:
Damage identification, in the context of safety, has become a fundamental research area in the fields of mechanical, civil, and aerospace engineering structures. This research aims to identify damage in a mechanical beam structure, quantify the severity or extent of the damage in terms of loss of stiffness, and obtain an updated analytical Finite Element (FE) model. An FE model is used for analysis, and the location of damage for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data have been acquired with the help of an accelerometer; the Fast Fourier Transform (FFT) algorithm is applied to the measured signal, and post-processing is subsequently done in MEscopeVes software. The two sets of data, from the numerical FE model and the experimental results, are compared to locate the damage accurately. The extent of the damage is identified via the modal frequencies using a mixed numerical-experimental technique, and mode shape comparison is performed with the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to real-life structures such as plate and GARTEUR structures.Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification
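The mode shape comparison step rests on the Modal Assurance Criterion, which is straightforward to compute for a pair of mode-shape vectors; the two vectors below are illustrative, not the beam's measured shapes:

```python
# Modal Assurance Criterion between two real mode-shape vectors:
#   MAC = |phi_a^T phi_b|^2 / ((phi_a^T phi_a) (phi_b^T phi_b))
import numpy as np

def mac(phi_a, phi_b):
    num = np.abs(phi_a @ phi_b) ** 2
    return num / ((phi_a @ phi_a) * (phi_b @ phi_b))

phi_fe = np.array([0.31, 0.59, 0.81, 0.95, 1.00])   # analytical (FE) mode shape
phi_ex = np.array([0.30, 0.61, 0.79, 0.97, 0.99])   # measured mode shape
print(round(float(mac(phi_fe, phi_ex)), 4))          # close to 1 -> well-correlated modes
```

MAC is insensitive to mode-shape scaling, so analytical and experimental shapes need not share a normalization; values near 1 indicate correlated modes, and a drop in MAC between healthy and damaged states helps localize damage.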
Procedia PDF Downloads 117736 Soft Computing Approach for Diagnosis of Lassa Fever
Authors: Roseline Oghogho Osaseri, Osaseri E. I.
Abstract:
Lassa fever is an epidemic hemorrhagic fever caused by the Lassa virus, an extremely virulent arenavirus. This highly fatal disorder kills 10% to 50% of its victims, but those who survive its early stages usually recover and acquire immunity to secondary attacks. One of the major challenges in giving proper treatment is the lack of fast and accurate diagnosis: the multiplicity of symptoms associated with the disease can resemble other clinical conditions, making early diagnosis difficult. This paper proposes an Adaptive Neuro-Fuzzy Inference System (ANFIS) for the prediction of Lassa fever. In the design of the diagnostic system, four attributes were considered as input parameters and one as the output parameter. The input parameters are Temperature on Admission (TA), White Blood Count (WBC), Proteinuria (P), and Abdominal Pain (AP). Sixty-one percent of the datasets were used in training the system, while fifty-nine percent were used in testing. Experimental results from this study gave a reliable and accurate prediction of Lassa fever when compared with clinically confirmed cases. The proposed Lassa fever diagnostic system is intended to aid surgeons and medical healthcare practitioners in facilities that do not have ready access to Polymerase Chain Reaction (PCR) diagnosis, to predict possible Lassa fever infection.Keywords: anfis, lassa fever, medical diagnosis, soft computing
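While the paper's ANFIS learns its membership functions from data, the underlying fuzzy inference step can be sketched with a toy zero-order Sugeno model over two of the inputs; the membership parameters, rules, and output levels below are invented for illustration and have no clinical meaning:

```python
# Toy zero-order Sugeno fuzzy inference over two inputs (TA, WBC).
# Membership parameters and rule outputs are invented, not clinical values.
import math

def gauss(x, mean, sigma):
    """Gaussian membership degree of x in a fuzzy set."""
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))

def risk(ta_celsius, wbc):
    # rule 1: IF TA is high AND WBC is low   THEN risk = 0.9
    w1 = gauss(ta_celsius, 39.5, 1.0) * gauss(wbc, 3.0, 1.5)
    # rule 2: IF TA is normal AND WBC is normal THEN risk = 0.1
    w2 = gauss(ta_celsius, 37.0, 0.7) * gauss(wbc, 7.0, 2.0)
    # Sugeno defuzzification: firing-strength-weighted average of rule outputs
    return (0.9 * w1 + 0.1 * w2) / (w1 + w2)

print(round(risk(39.8, 2.5), 2))   # high fever, low WBC -> high risk
print(round(risk(37.0, 7.0), 2))   # normal values -> low risk
```

In an actual ANFIS, the Gaussian parameters and the rule consequents are the trainable weights fitted to the clinical dataset rather than fixed by hand.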
Procedia PDF Downloads 271735 A New Optimization Algorithm for Operation of a Microgrid
Authors: Sirus Mohammadi, Rohala Moghimi
Abstract:
The main advantages of microgrids are high energy efficiency through the application of Combined Heat and Power (CHP), high quality and reliability of the delivered electric energy, and environmental and economic benefits. This study presents an energy management system (EMS) to optimize the operation of a microgrid (MG). An Adaptive Modified Firefly Algorithm (AMFA) is presented for the optimal operation of a typical MG with renewable energy sources (RESs), accompanied by a back-up Micro-Turbine/Fuel Cell/Battery hybrid power source to level the power mismatch or to store surplus energy when needed. The problem is formulated as a nonlinear constrained problem to minimize the total operating cost. The management of the energy storage system (ESS), economic load dispatch, and operation optimization of distributed generation (DG) are combined into a single-objective optimization problem in the EMS. The proposed algorithm is tested on a typical grid-connected MG including WT/PV/Micro-Turbine/Fuel Cell and Energy Storage Devices (ESDs), and its superior performance is demonstrated in comparison with other evolutionary algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Fuzzy Self-Adaptive PSO (FSAPSO), Chaotic Particle PSO (CPSO), Adaptive Modified PSO (AMPSO), and the Firefly Algorithm (FA).Keywords: microgrid, operation management, optimization, firefly algorithm (AMFA)
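The core attraction-and-randomization loop of a firefly algorithm can be sketched on a toy cost function; the population size, γ, β₀, α, and the quadratic cost below are illustrative choices standing in for the AMFA's tuned parameters and the microgrid dispatch cost:

```python
# Bare-bones firefly algorithm minimizing a toy quadratic cost, as a
# stand-in for the microgrid operating-cost objective. All parameters
# (population size, beta0, gamma, alpha) are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    return float(np.sum(x ** 2))    # toy operating-cost surrogate

n, dim, iters = 15, 4, 60
beta0, gamma, alpha = 1.0, 0.01, 0.1
pop = rng.uniform(-5, 5, size=(n, dim))

for _ in range(iters):
    f = np.array([cost(x) for x in pop])
    for i in range(n):
        for j in range(n):
            if f[j] < f[i]:         # move firefly i toward brighter firefly j
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)    # attractiveness decays with distance
                pop[i] += beta * (pop[j] - pop[i]) + alpha * rng.normal(size=dim)
                f[i] = cost(pop[i])
    alpha *= 0.95                   # cool the random-walk component

best = pop[np.argmin([cost(x) for x in pop])]
print(cost(best))                   # approaches the minimum at the origin
```

The "adaptive modified" variants in the paper typically replace the fixed α-decay and β schedule with self-tuning rules; the attraction kernel β₀e^(−γr²) shown here is the part all firefly variants share.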
Procedia PDF Downloads 341734 Theoretical Study of Acetylation of P-Methylaniline Catalyzed by Cu²⁺ Ions
Authors: Silvana Caglieri
Abstract:
A theoretical study of the acetylation of p-methylaniline catalyzed by Cu²⁺ ions was carried out based on analysis of the reaction intermediate. The study of amine acetylation is of great interest owing to the utility of its reaction products; it is one of the most frequently used transformations in organic synthesis, as it provides an efficient and inexpensive means of protecting amino groups in a multistep synthetic process. Acetylation of an amine is a nucleophilic substitution reaction, which can be catalyzed by a Lewis acid or a metallic ion. In the reaction mechanism, the metallic ion forms a complex with the oxygen of the acetic anhydride carbonyl, facilitating its polarization and the subsequent addition of the amine at that position to form a tetrahedral intermediate, the rate-determining step of the reaction. Experimental work agrees that this reaction takes place with the formation of a tetrahedral intermediate. In the present theoretical work, the structure and energy of the tetrahedral intermediate of the reaction catalyzed by Cu²⁺ ions were investigated. Geometries of all species involved in the acetylation were constructed and identified, and all geometry optimizations were performed at the DFT/B3LYP level of theory and with the MP2 method, adopting the 6-31+G* basis set. Energies were calculated using the Mechanics-UFF method. Following the same procedure, the geometric parameters and energy of the reaction intermediate were identified. The calculations give an energy of 61.35 kcal/mol for the tetrahedral intermediate, and the activation energy for the reaction is 15.55 kcal/mol.Keywords: amides, amines, DFT, MP2
Procedia PDF Downloads 285733 Experiences of Online Opportunities and Risks: Examining Internet Use and Digital Literacy of Young People in Nigeria
Authors: Isah Yahaya Aliyu
Abstract:
Research on Internet use has often approached beneficial uses of the Internet (online opportunities) separately from young people's risky encounters online (online risks). However, empirical evidence from diverse contexts increasingly supports treating the two sets of online activities together. Hence, the current research investigates the correlations of Internet use (IU) and digital literacy (DL) with online opportunities (OP) and risks (OR), using data from a Nigerian context, where there is a paucity of research and literature integrating opportunities and risks in the same study. A web-based survey was administered to 335 undergraduate students in Northeastern Nigeria. Underpinned by the Livingstone and Helsper model, the findings are largely consistent with the existing literature: IU and DL influence OP (R² = 0.791, SE = 0.265, F-stat = 626.566, p < .001), and IU and DL likewise influence OR (R² = 0.343, SE = 0.465, F-stat = 86.671, p < .001). OP and OR were found to correlate strongly and positively (r = .667, n = 335, p < 0.01). This study provides buttressing evidence from a Nigerian context for the fusion of the benefits and risks of the Internet among young people. It also upholds the argument for improved literacy as a strategy for minimizing risk and harm, rather than restricting use. Other theoretical and policy implications of the findings are discussed in line with local and global debates about the Internet and its attendant effects.Keywords: digital, internet, literacy, opportunities, risks
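The kind of two-predictor model reported here (IU and DL predicting OP, summarized by an R² statistic) can be sketched with ordinary least squares; all scores below are synthetic, not the study's survey data:

```python
# OLS regression of online opportunities (OP) on Internet use (IU) and
# digital literacy (DL), mirroring the paper's two-predictor model.
# The data are synthetic; coefficients 0.9 and 0.7 are invented.
import numpy as np

rng = np.random.default_rng(7)
n = 335                                         # same sample size as the survey
iu = rng.normal(3.5, 0.8, n)                    # Internet-use score
dl = rng.normal(3.0, 0.7, n)                    # digital-literacy score
op = 0.9 * iu + 0.7 * dl + rng.normal(0, 0.5, n)   # opportunities score

X = np.column_stack([np.ones(n), iu, dl])       # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, op, rcond=None)   # least-squares fit
pred = X @ beta
r2 = 1 - np.sum((op - pred) ** 2) / np.sum((op - np.mean(op)) ** 2)
print(beta.round(2), round(float(r2), 3))       # coefficients and R-squared
```

R² here plays the same role as the reported 0.791: the share of variance in OP jointly explained by IU and DL.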
Procedia PDF Downloads 88732 The Impact of Artificial Intelligence on Spare Parts Technology
Authors: Amir Andria Gad Shehata
Abstract:
Minimizing inventory cost, optimizing inventory quantities, and increasing system operational availability are the main motivations for enhancing the forecasting of spare parts demand at a major power utility company in Medina. This paper reports on an effort to optimize the order quantities of spare parts by improving the method of forecasting demand. The study focuses on equipment that has frequent spare parts purchase orders with uncertain demand. The demand follows a lumpy pattern, which makes conventional forecasting methods less effective. A comparison was made by benchmarking various forecasting methods against experts' criteria to select the most suitable method for the case study, and three actual data sets were used to make the forecasts. Two neural network (NN) approaches were utilized and compared, namely long short-term memory (LSTM) and the multilayer perceptron (MLP). The results, as expected, showed that the NN models gave better results than the traditional (judgmental) forecasting method. In addition, the LSTM model had higher predictive accuracy than the MLP model.Keywords: spare part, spare part inventory, inventory model, optimization, maintenance, neural network, LSTM, MLP, forecasting demand, inventory management
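Before either network can be trained, the lumpy demand history has to be framed as supervised (window, next-value) pairs; the demand series and window length below are illustrative, not the utility's data:

```python
# Framing a lumpy spare-part demand history as supervised learning pairs,
# the preprocessing step shared by LSTM and MLP forecasters.
# The series and the window length (lag=3) are illustrative.
import numpy as np

demand = np.array([0, 0, 4, 0, 0, 0, 12, 0, 3, 0, 0, 7], dtype=float)

def make_windows(series, lag=3):
    """Slide a window of `lag` past values over the series."""
    X, y = [], []
    for t in range(lag, len(series)):
        X.append(series[t - lag:t])   # the last `lag` observations
        y.append(series[t])           # the value to forecast
    return np.array(X), np.array(y)

X, y = make_windows(demand, lag=3)
print(X.shape, y.shape)   # (9, 3) (9,) -> 9 training pairs

# An LSTM additionally expects a (samples, timesteps, features) tensor:
X_lstm = X.reshape(X.shape[0], X.shape[1], 1)
print(X_lstm.shape)       # (9, 3, 1)
```

The MLP consumes the flat `(samples, lag)` matrix directly, while the LSTM's recurrent cells read the reshaped tensor one timestep at a time, which is what lets it model the intermittent bursts in lumpy demand.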
Procedia PDF Downloads 65731 Pharmacophore-Based Modeling of a Series of Human Glutaminyl Cyclase Inhibitors to Identify Lead Molecules by Virtual Screening, Molecular Docking and Molecular Dynamics Simulation Study
Authors: Ankur Chaudhuri, Sibani Sen Chakraborty
Abstract:
In humans, glutaminyl cyclase activity is highly abundant in neuronal and secretory tissues and is preferentially restricted to the hypothalamus and pituitary. The N-terminal modification of β-amyloid (Aβ) peptides by the generation of pyroglutamyl (pGlu)-modified Aβs (pE-Aβs) is an important process in the initiation of the formation of neurotoxic plaques in Alzheimer’s disease (AD). This process is catalyzed by glutaminyl cyclase (QC). The expression of QC is characteristically up-regulated in the early stage of AD, and the hallmark of QC inhibition is the prevention of the formation of pE-Aβs and plaques. A computer-aided drug design (CADD) process was employed to guide the design of potentially active compounds and to understand their inhibitory potency against human glutaminyl cyclase (QC). This work elaborates ligand-based and structure-based pharmacophore exploration of QC using known inhibitors. Three-dimensional (3D) quantitative structure-activity relationship (QSAR) methods were applied to 154 compounds with known IC50 values. All the inhibitors were divided into two sets, a training set and a test set. The training set was used to build the quantitative pharmacophore model based on the principle of structural diversity, whereas the test set was employed to evaluate the predictive ability of the pharmacophore hypotheses. A chemical feature-based pharmacophore model was generated from the 92 known training-set compounds by the HypoGen module implemented in the Discovery Studio 2017 R2 software package. The best hypothesis (Hypo1) was selected based on the highest correlation coefficient (0.8906), the lowest total cost (463.72), and the lowest root mean square deviation (2.24 Å). A higher correlation coefficient indicates greater predictive ability of the hypothesis, whereas a lower root mean square deviation signifies a smaller deviation of experimental activity from the predicted one.
The best pharmacophore model (Hypo1) of the candidate inhibitors comprised four features: two hydrogen bond acceptors, one hydrogen bond donor, and one hydrophobic feature. Hypo1 was validated by several methods, including test-set activity prediction, cost analysis, Fischer's randomization test, the leave-one-out method, and a ligand-profiler heat map. The predicted features were then used for virtual screening of potential compounds from the NCI, ASINEX, Maybridge, and ChemBridge databases, covering more than seven million compounds. The hit compounds were filtered by drug-likeness and pharmacokinetic properties, and the selected hits were docked to the high-resolution three-dimensional structure of the target protein glutaminyl cyclase (PDB ID: 2AFU/2AFW) to filter them further. To validate the molecular docking results, the most active compound from the dataset was selected as a reference molecule. From a density functional theory (DFT) study, ten molecules were selected based on their highest HOMO (highest occupied molecular orbital) energies and lowest band-gap values. Molecular dynamics simulations with explicitly solvated systems of the final ten hit compounds revealed that a large number of non-covalent interactions were formed with the binding site of human glutaminyl cyclase. It is suggested that the hit compounds reported in this study could help in the future design of potent inhibitors as leads against human glutaminyl cyclase.Keywords: glutaminyl cyclase, hit lead, pharmacophore model, simulation
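The correlation coefficient used to rank hypotheses (e.g., Hypo1's 0.8906) is a Pearson correlation between experimental and pharmacophore-predicted activities, which can be computed directly; the activity values below are invented for illustration:

```python
# Pearson correlation between experimental and predicted activities,
# the statistic behind a pharmacophore hypothesis's correlation score.
# The pIC50-style values below are invented.
import math

experimental = [7.2, 6.8, 5.9, 8.1, 6.1, 7.7, 5.2, 6.5]
predicted    = [7.0, 6.9, 6.2, 7.8, 6.0, 7.5, 5.6, 6.3]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

print(round(pearson(experimental, predicted), 3))   # strong linear agreement
```

A value near 1 on the test set, together with cost analysis and Fischer randomization, is what distinguishes a genuinely predictive hypothesis from one fitted to chance correlations.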
Procedia PDF Downloads 131