Search results for: linear congruential algorithm
4598 Orbit Determination from Two Position Vectors Using Finite Difference Method
Authors: Akhilesh Kumar, Sathyanarayan G., Nirmala S.
Abstract:
An unusual approach is developed to determine the orbit of satellites/space objects. Orbit determination is treated as a boundary value problem and solved using the finite difference method (FDM). Only the positions of the satellites/space objects at two end times are known and are taken as boundary conditions. The finite difference technique is used to calculate the orbit between the end times. In this approach, the governing equation is the satellite's equation of motion with a perturbed acceleration. Using the finite difference method, the governing equations and boundary conditions are discretized, and the resulting system of algebraic equations is solved with the Tri-Diagonal Matrix Algorithm (TDMA) until convergence is achieved. The methodology was tested and evaluated using all GPS satellite orbits from the National Geospatial-Intelligence Agency (NGA) precise product for DOY 125, 2023. Twelve two-hour sets were considered, and only the positions at the end times of each of the twelve sets were used as boundary conditions. The algorithm was applied to all GPS satellites, and the results achieved using FDM were compared with the NGA precise orbits. The maximum RSS error for position is 0.48 [m] and for velocity is 0.43 [mm/sec]. The present algorithm was also applied to the IRNSS satellites for DOY 220, 2023; the maximum RSS error for position is 0.49 [m] and for velocity is 0.28 [mm/sec]. Next, a simulation was carried out for a highly elliptical orbit for DOY 63, 2023, over a duration of 6 hours. The RSS of the difference in position is 0.92 [m] and in velocity is 1.58 [mm/sec] for orbital speeds above 5 km/sec, whereas the RSS of the difference in position is 0.13 [m] and in velocity is 0.12 [mm/sec] for orbital speeds below 5 km/sec. The results show that the newly developed method is reliable and accurate. Further applications of the developed methodology include missile and spacecraft targeting, orbit design (mission planning), space rendezvous and interception, space debris correlation, and navigation solutions. Keywords: finite difference method, grid generation, NavIC system, orbit perturbation
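The solution step described above, repeatedly solving the tridiagonal system produced by the finite-difference discretization, can be illustrated with a minimal Thomas-algorithm (TDMA) sketch in Python. The toy coefficients below are placeholders and not the paper's actual discretization of the equation of motion.

```python
import numpy as np

def tdma(a, b, c, d):
    """Solve a tridiagonal system with the Thomas algorithm (TDMA).
    a: sub-diagonal (a[0] unused), b: main diagonal, c: super-diagonal
    (c[-1] unused), d: right-hand side; all of length n."""
    n = len(d)
    cp = np.zeros(n)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Toy boundary value problem u'' = f discretized on n interior nodes.
n, h = 50, 1.0 / 51
a = np.ones(n)           # sub-diagonal
b = -2.0 * np.ones(n)    # main diagonal
c = np.ones(n)           # super-diagonal
d = h**2 * np.ones(n)    # right-hand side (boundary values would be folded in here)
u = tdma(a, b, c, d)
```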
Procedia PDF Downloads 85
4597 Cooperative Sensing for Wireless Sensor Networks
Authors: Julien Romieux, Fabio Verdicchio
Abstract:
Wireless Sensor Networks (WSNs), which sense environmental data with battery-powered nodes, require multi-hop communication. This power-demanding task adds an extra workload that is unfairly distributed across the network. As a result, nodes run out of battery at different times: this requires an impractical individual node maintenance scheme. Therefore we investigate a new Cooperative Sensing approach that extends the WSN operational life and allows a more practical network maintenance scheme (where all nodes deplete their batteries almost at the same time). We propose a novel cooperative algorithm that derives a piecewise representation of the sensed signal while controlling approximation accuracy. Simulations show that our algorithm increases WSN operational life and spreads communication workload evenly. Results convey a counterintuitive conclusion: distributing workload fairly amongst nodes may not decrease the network power consumption and yet extend the WSN operational life. This is achieved as our cooperative approach decreases the workload of the most burdened cluster in the network.Keywords: cooperative signal processing, signal representation and approximation, power management, wireless sensor networks
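As a rough illustration of the piecewise signal representation with controlled approximation error described above, the sketch below greedily extends linear segments over a sampled signal until a user-set error bound is exceeded. It is not the cooperative, distributed algorithm from the paper, only the single-node approximation step under assumed parameters.

```python
import numpy as np

def piecewise_linear(signal, max_err):
    """Greedy piecewise-linear approximation: extend each segment until the
    maximum absolute deviation from the fitted line exceeds max_err."""
    breakpoints = [0]
    start = 0
    for end in range(2, len(signal) + 1):
        x = np.arange(start, end)
        slope, intercept = np.polyfit(x, signal[start:end], 1)
        err = np.max(np.abs(signal[start:end] - (slope * x + intercept)))
        if err > max_err:
            breakpoints.append(end - 1)   # close the segment one sample earlier
            start = end - 1
    if breakpoints[-1] != len(signal) - 1:
        breakpoints.append(len(signal) - 1)
    return breakpoints

t = np.linspace(0, 10, 500)
reading = 20 + 2 * np.sin(0.5 * t) + 0.05 * np.random.randn(t.size)  # synthetic sensor data
print(piecewise_linear(reading, max_err=0.2))
```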
Procedia PDF Downloads 390
4596 Path Planning for Multiple Unmanned Aerial Vehicles Based on Adaptive Probabilistic Sampling Algorithm
Authors: Long Cheng, Tong He, Iraj Mantegh, Wen-Fang Xie
Abstract:
Path planning is essential for UAVs (Unmanned Aerial Vehicle) with autonomous navigation in unknown environments. In this paper, an adaptive probabilistic sampling algorithm is proposed for the GPS-denied environment, which can be utilized for autonomous navigation system of multiple UAVs in a dynamically-changing structured environment. This method can be used for Unmanned Aircraft Systems Traffic Management (UTM) solutions and in autonomous urban aerial mobility, where a number of platforms are expected to share the airspace. A path network is initially built off line based on available environment map, and on-board sensors systems on the flying UAVs are used for continuous situational awareness and to inform the changes in the path network. Simulation results based on MATLAB and Gazebo in different scenarios and algorithms performance measurement show the high efficiency and accuracy of the proposed technique in unknown environments.Keywords: path planning, adaptive probabilistic sampling, obstacle avoidance, multiple unmanned aerial vehicles, unknown environments
Procedia PDF Downloads 156
4595 Study for an Optimal Cable Connection within an Inner Grid of an Offshore Wind Farm
Authors: Je-Seok Shin, Wook-Won Kim, Jin-O Kim
Abstract:
The offshore wind farm needs to be designed carefully considering economics and reliability aspects. Although there are many decision-making problems in designing an entire offshore wind farm, this paper focuses on the inner grid layout, i.e., the connections between wind turbines as well as between wind turbines and an offshore substation. The methodology proposed in this paper determines the connections and the cable type for each connection section using K-clustering, minimum spanning tree and cable selection algorithms. A cost evaluation is then performed in terms of investment, power loss and reliability. Through the cost evaluation, an optimal layout of the inner grid is determined so as to have the lowest total cost. In order to demonstrate the validity of the methodology, a case study is conducted on a 240 MW offshore wind farm, and the results show that the methodology is helpful for the optimal design of offshore wind farms. Keywords: offshore wind farm, optimal layout, k-clustering algorithm, minimum spanning algorithm, cable type selection, power loss cost, reliability cost
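A minimal sketch of the minimum-spanning-tree step, connecting turbine positions with the shortest total cable length, using SciPy. The coordinates below and the assumption that cable cost is proportional to Euclidean distance are illustrative, not the paper's cost model.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

# Hypothetical turbine coordinates in metres; node 0 plays the role of the offshore substation.
nodes = np.array([[0, 0], [800, 0], [1600, 0], [0, 900], [800, 900], [1600, 900]], dtype=float)

dist = cdist(nodes, nodes)                    # pairwise Euclidean distances
mst = minimum_spanning_tree(dist).toarray()   # upper-triangular matrix of chosen edges

for i, j in zip(*np.nonzero(mst)):
    print(f"cable {i} -> {j}: {mst[i, j]:.0f} m")
print(f"total cable length: {mst.sum():.0f} m")
```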
Procedia PDF Downloads 385
4594 Classifying Affective States in Virtual Reality Environments Using Physiological Signals
Authors: Apostolos Kalatzis, Ashish Teotia, Vishnunarayan Girishan Prabhu, Laura Stanley
Abstract:
Emotions are functional behaviors influenced by thoughts, stimuli, and other factors that induce neurophysiological changes in the human body. Understanding and classifying emotions are challenging as individuals have varying perceptions of their environments. Therefore, it is crucial that there are publicly available databases and virtual reality (VR) based environments that have been scientifically validated for assessing emotional classification. This study utilized two commercially available VR applications (Guided Meditation VR™ and Richie’s Plank Experience™) to induce acute stress and a calm state among participants. Subjective and objective measures were collected to create a validated multimodal dataset and classification scheme for affective state classification. Participants’ subjective measures included the Self-Assessment Manikin, emotional cards and a 9-point Visual Analogue Scale for perceived stress, collected using a Virtual Reality Assessment Tool developed by our team. Participants’ objective measures included electrocardiogram and respiration data collected from 25 participants (15 M, 10 F, Mean = 22.28 ± 4.92). The features extracted from these data included heart rate variability components and respiration rate, both of which were used to train two machine learning models. Subjective responses validated the efficacy of the VR applications in eliciting the two desired affective states; for classifying the affective states, a logistic regression (LR) and a support vector machine (SVM) with a linear kernel were developed. The LR outperformed the SVM and achieved 93.8%, 96.2%, and 93.8% leave-one-subject-out cross-validation accuracy, precision and recall, respectively. The VR assessment tool and data collected in this study are publicly available for other researchers. Keywords: affective computing, biosignals, machine learning, stress database
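A minimal scikit-learn sketch of the classification step, assuming the HRV and respiration-rate features are already extracted into a feature matrix and that a groups array holds participant IDs for leave-one-subject-out cross-validation. The synthetic data below only stands in for the study's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(250, 6))          # stand-in HRV + respiration-rate features
y = rng.integers(0, 2, size=250)       # 0 = calm, 1 = acute stress
groups = np.repeat(np.arange(25), 10)  # participant ID per sample

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"leave-one-subject-out accuracy: {scores.mean():.3f}")
```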
Procedia PDF Downloads 142
4593 Magnetohydrodynamics Flow and Heat Transfer in a Non-Newtonian Power-Law Fluid due to a Rotating Disk with Velocity Slip and Temperature Jump
Authors: Nur Dayana Khairunnisa Rosli, Seripah Awang Kechil
Abstract:
Swirling flows with velocity slip are important in nature and industrial processes. The present work considers the effects of velocity slip, temperature jump and suction/injection on the flow and heat transfer of power-law fluids due to a rotating disk in the presence of magnetic field. The system of the partial differential equations is highly non-linear. The number of independent variables is reduced by transforming the system into a system of coupled non-linear ordinary differential equations using similarity transformations. The effects of suction/injection, velocity slip and temperature jump on the flow rates are investigated for various cases of shear thinning and shear thickening power law fluids. The thermal and velocity jump strongly reduce the heat transfer rate and skin friction coefficient. Suction decreases the radial and tangential skin friction coefficient and the rate of heat transfer. It is also observed that the effects are more pronounced in the case of shear thinning fluids as compared to shear thickening fluids.Keywords: heat transfer, power-law fluids, rotating disk, suction or injection, temperature jump, velocity slip
Procedia PDF Downloads 268
4592 Gray Level Image Encryption
Authors: Roza Afarin, Saeed Mozaffari
Abstract:
The aim of this paper is image encryption using Genetic Algorithm (GA). The proposed encryption method consists of two phases. In modification phase, pixels locations are altered to reduce correlation among adjacent pixels. Then, pixels values are changed in the diffusion phase to encrypt the input image. Both phases are performed by GA with binary chromosomes. For modification phase, these binary patterns are generated by Local Binary Pattern (LBP) operator while for diffusion phase binary chromosomes are obtained by Bit Plane Slicing (BPS). Initial population in GA includes rows and columns of the input image. Instead of subjective selection of parents from this initial population, a random generator with predefined key is utilized. It is necessary to decrypt the coded image and reconstruct the initial input image. Fitness function is defined as average of transition from 0 to 1 in LBP image and histogram uniformity in modification and diffusion phases, respectively. Randomness of the encrypted image is measured by entropy, correlation coefficients and histogram analysis. Experimental results show that the proposed method is fast enough and can be used effectively for image encryption.Keywords: correlation coefficients, genetic algorithm, image encryption, image entropy
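A minimal sketch of the randomness metrics named above, Shannon entropy and adjacent-pixel correlation, which are commonly used to judge encrypted images. The test images are synthetic, and the GA/LBP/BPS encryption itself is not reproduced here.

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy of an 8-bit grayscale image."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def adjacent_correlation(img, axis=1):
    """Correlation coefficient between adjacent pixel pairs (axis=1: horizontal)."""
    a = img[:, :-1].ravel() if axis == 1 else img[:-1, :].ravel()
    b = img[:, 1:].ravel() if axis == 1 else img[1:, :].ravel()
    return np.corrcoef(a.astype(float), b.astype(float))[0, 1]

plain = np.tile(np.arange(256, dtype=np.uint8), (256, 1))  # highly correlated test image
cipher = np.random.default_rng(0).integers(0, 256, (256, 256), dtype=np.uint8)  # ideal-looking cipher

for name, im in [("plain", plain), ("cipher", cipher)]:
    print(f"{name}: entropy={image_entropy(im):.3f}, "
          f"adjacent corr={adjacent_correlation(im):.4f}")
```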
Procedia PDF Downloads 330
4591 Topology Optimization Design of Transmission Structure in Flapping-Wing Micro Aerial Vehicle via 3D Printing
Authors: Zuyong Chen, Jianghao Wu, Yanlai Zhang
Abstract:
Flapping-wing micro aerial vehicle (FMAV) is a new type of aircraft by mimicking the flying behavior to that of small birds or insects. Comparing to the traditional fixed wing or rotor-type aircraft, FMAV only needs to control the motion of flapping wings, by changing the size and direction of lift to control the flight attitude. Therefore, its transmission system should be designed very compact. Lightweight design can effectively extend its endurance time, while engineering experience alone is difficult to simultaneously meet the requirements of FMAV for structural strength and quality. Current researches still lack the guidance of considering nonlinear factors of 3D printing material when carrying out topology optimization, especially for the tiny FMAV transmission system. The coupling of non-linear material properties and non-linear contact behaviors of FMAV transmission system is a great challenge to the reliability of the topology optimization result. In this paper, topology optimization design based on FEA solver package Altair Optistruct for the transmission system of FMAV manufactured by 3D Printing was carried out. Firstly, the isotropic constitutive behavior of the Ultraviolet (UV) Cureable Resin used to fabricate the structure of FMAV was evaluated and confirmed through tensile test. Secondly, a numerical computation model describing the mechanical behavior of FMAV transmission structure was established and verified by experiments. Then topology optimization modeling method considering non-linear factors were presented, and optimization results were verified by dynamic simulation and experiments. Finally, detail discussions of different load status and constraints were carried out to explore the leading factors affecting the optimization results. The contributions drawn from this article helpful for guiding the lightweight design of FMAV are summarizing as follow; first, a dynamic simulation modeling method used to obtain the load status is presented. Second, verification method of optimized results considering non-linear factors is introduced. Third, based on or can achieve a better weight reduction effect and improve the computational efficiency rather than taking multi-states into account. Fourth, basing on makes for improving the ability to resist bending deformation. Fifth, constraint of displacement helps to improve the structural stiffness of optimized result. Results and engineering guidance in this paper may shed lights on the structural optimization and light-weight design for future advanced FMAV.Keywords: flapping-wing micro aerial vehicle, 3d printing, topology optimization, finite element analysis, experiment
Procedia PDF Downloads 170
4590 Design of a Graphical User Interface for Data Preprocessing and Image Segmentation Process in 2D MRI Images
Authors: Enver Kucukkulahli, Pakize Erdogmus, Kemal Polat
Abstract:
The 2D image segmentation is a significant process in finding a suitable region in medical images such as MRI, PET, CT etc. In this study, we have focused on 2D MRI images for image segmentation process. We have designed a GUI (graphical user interface) written in MATLABTM for 2D MRI images. In this program, there are two different interfaces including data pre-processing and image clustering or segmentation. In the data pre-processing section, there are median filter, average filter, unsharp mask filter, Wiener filter, and custom filter (a filter that is designed by user in MATLAB). As for the image clustering, there are seven different image segmentations for 2D MR images. These image segmentation algorithms are as follows: PSO (particle swarm optimization), GA (genetic algorithm), Lloyds algorithm, k-means, the combination of Lloyds and k-means, mean shift clustering, and finally BBO (Biogeography Based Optimization). To find the suitable cluster number in 2D MRI, we have designed the histogram based cluster estimation method and then applied to these numbers to image segmentation algorithms to cluster an image automatically. Also, we have selected the best hybrid method for each 2D MR images thanks to this GUI software.Keywords: image segmentation, clustering, GUI, 2D MRI
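A minimal sketch of one way a histogram-based cluster-number estimate could feed a k-means segmentation of a 2D slice, using SciPy and scikit-learn. The peak-counting heuristic is an assumption and not necessarily the estimation rule implemented in the GUI.

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import KMeans

def segment_slice(img):
    """Estimate a cluster count from intensity-histogram peaks, then run k-means."""
    hist, _ = np.histogram(img.ravel(), bins=64)
    smooth = np.convolve(hist, np.ones(5) / 5, mode="same")    # light smoothing
    peaks, _ = find_peaks(smooth, prominence=0.01 * smooth.max())
    k = max(2, len(peaks))                                      # at least two tissue classes
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(img.reshape(-1, 1))
    return labels.reshape(img.shape), k

# Synthetic slice with three intensity populations standing in for an MR image.
synthetic = np.clip(np.random.normal(loc=np.random.choice([40, 120, 200], (128, 128)),
                                     scale=10), 0, 255)
seg, k = segment_slice(synthetic)
print("estimated clusters:", k)
```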
Procedia PDF Downloads 377
4589 SEM Image Classification Using CNN Architectures
Authors: Güzin Tirkeş, Özge Tekin, Kerem Kurtuluş, Y. Yekta Yurtseven, Murat Baran
Abstract:
A scanning electron microscope (SEM) is a type of electron microscope mainly used in nanoscience and nanotechnology areas. Automatic image recognition and classification are among the general areas of application concerning SEM. In line with these usages, the present paper proposes a deep learning algorithm that classifies SEM images into nine categories by means of an online application to simplify the process. The NFFA-EUROPE - 100% SEM data set, containing approximately 21,000 images, was used to train and test the algorithm at 80% and 20%, respectively. Validation was carried out using a separate data set obtained from the Middle East Technical University (METU) in Turkey. To increase the accuracy in the results, the Inception ResNet-V2 model was used in view of the Fine-Tuning approach. By using a confusion matrix, it was observed that the coated-surface category has a negative effect on the accuracy of the results since it contains other categories in the data set, thereby confusing the model when detecting category-specific patterns. For this reason, the coated-surface category was removed from the train data set, hence increasing accuracy by up to 96.5%.Keywords: convolutional neural networks, deep learning, image classification, scanning electron microscope
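A minimal Keras sketch of the described fine-tuning setup: an ImageNet-pretrained Inception-ResNet-V2 backbone with a new classification head. The class count (after removing the coated-surface category), directory names, image size and hyperparameters are assumptions.

```python
import tensorflow as tf

NUM_CLASSES = 8          # the nine SEM categories minus the removed coated-surface class (assumed)
IMG_SIZE = (299, 299)

base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False   # first stage: train only the new head, unfreeze later for fine-tuning

model = tf.keras.Sequential([
    tf.keras.layers.Lambda(tf.keras.applications.inception_resnet_v2.preprocess_input),
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Placeholder directory layout: one sub-folder per SEM category.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "sem_images/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "sem_images/val", image_size=IMG_SIZE, batch_size=32)

model.fit(train_ds, validation_data=val_ds, epochs=5)
```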
Procedia PDF Downloads 125
4588 A Study of Using Multiple Subproblems in Dantzig-Wolfe Decomposition of Linear Programming
Authors: William Chung
Abstract:
This paper studies the use of multiple subproblems in Dantzig-Wolfe decomposition of linear programming (DW-LP). Traditionally, the decomposed LP consists of one LP master problem and one LP subproblem. The master problem and the subproblem are solved alternately by exchanging the dual prices of the master problem and the proposals of the subproblem until the LP is solved. It is well known that convergence is slow, with a long tail of near-optimal solutions (asymptotic convergence). Hence, the performance of DW-LP depends strongly on the number of decomposition steps. If the decomposition steps can be greatly reduced, the performance of DW-LP can be improved significantly. One way to reduce the number of decomposition steps is to increase the number of proposals passed from the subproblem to the master problem. To do so, we propose adding a quadratic approximation function to the LP subproblem in order to develop a set of approximate-LP subproblems (multiple subproblems). Consequently, in each decomposition step, multiple subproblems are solved to provide multiple proposals to the master problem, and the number of decomposition steps can be reduced greatly. Note that each approximate-LP subproblem is a nonlinear program, and solving the single LP subproblem is faster than solving the multiple nonlinear subproblems. Hence, using multiple subproblems in DW-LP involves a tradeoff between the number of approximate-LP subproblems being formed and the number of decomposition steps. In this paper, we derive the corresponding algorithms and provide some simple computational results. Some properties of the resulting algorithms are also given. Keywords: approximate subproblem, Dantzig-Wolfe decomposition, large-scale models, multiple subproblems
Procedia PDF Downloads 166
4587 Observer-Based Leader-Following Consensus of Nonlinear Fractional-Order Multi-Agent Systems
Authors: Ali Afaghi, Sehraneh Ghaemi
Abstract:
The coordination of multi-agent systems has been an interesting topic in recent years because of its potential applications in many branches of science and engineering, such as sensor networks, flocking, and underwater vehicles. In most of the related studies, it is assumed that the dynamics of the multi-agent systems are integer-order and linear, and multi-agent systems with fractional-order nonlinear dynamics are rarely considered. However, many phenomena in nature cannot be described by integer-order and linear characteristics. This paper investigates the leader-following consensus problem for a class of nonlinear fractional-order multi-agent systems based on observer-based cooperative control. In the system, the dynamics of each follower and of the leader are nonlinear. Firstly, for a multi-agent system with a fixed directed topology, an observer-based consensus protocol is proposed based on the relative observer states of neighboring agents. Secondly, based on the stability theory of fractional-order systems, some sufficient conditions are presented for the asymptotic stability of the observer-based fractional-order control systems. The proposed method is applied to a five-agent system with fractional-order nonlinear dynamics and unavailable states. The simulation example shows that the proposed scheme results in good performance and can be used in many practical applications. Keywords: fractional-order multi-agent systems, leader-following consensus, nonlinear dynamics, directed graphs
Procedia PDF Downloads 398
4586 Quality of Service Based Routing Algorithm for Real Time Applications in MANETs Using Ant Colony and Fuzzy Logic
Authors: Farahnaz Karami
Abstract:
Routing is an important, challenging task in mobile ad hoc networks due to node mobility, lack of central control, unstable links, and limited resources. Ant colony optimization has been found to be an attractive technique for routing in Mobile Ad Hoc Networks (MANETs). However, existing swarm-intelligence-based routing protocols find an optimal path by considering only one or two route selection metrics, without considering correlations among such parameters, which makes them unsuitable on their own for routing real-time applications. Fuzzy logic can combine multiple route selection parameters that contain uncertain or imprecise information, but it does not naturally provide the multipath routing needed for load balancing. The objective of this paper is to design a routing algorithm using fuzzy logic and an ant colony that can solve some of the routing problems in mobile ad hoc networks, such as optimizing node energy consumption to increase network lifetime, reducing the link failure rate to increase packet delivery reliability, and providing load balancing to optimize the available bandwidth. In the proposed algorithm, path information is given to the fuzzy inference system by the ants. Based on the available path information and the parameters required for quality of service (QoS), the fuzzy cost of each path is calculated and the optimal paths are selected. The NS-2.35 simulation tool is used for simulation, and the results are compared and evaluated against the newest QoS-based algorithms in MANETs according to the packet delivery ratio, end-to-end delay and routing overhead ratio criteria. The simulation results show significant improvement in the performance of these networks in terms of decreased end-to-end delay and routing overhead ratio, and increased packet delivery ratio. Keywords: mobile ad hoc networks, routing, quality of service, ant colony, fuzzy logic
Procedia PDF Downloads 64
4585 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis
Authors: Mehrnaz Mostafavi
Abstract:
The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans
Procedia PDF Downloads 101
4584 Quantitative Structure–Activity Relationship Analysis of Some Benzimidazole Derivatives by Linear Multivariate Method
Authors: Strahinja Z. Kovačević, Lidija R. Jevrić, Sanja O. Podunavac Kuzmanović
Abstract:
The relationship between antibacterial activity of eighteen different substituted benzimidazole derivatives and their molecular characteristics was studied using chemometric QSAR (Quantitative Structure–Activity Relationships) approach. QSAR analysis has been carried out on inhibitory activity towards Staphylococcus aureus, by using molecular descriptors, as well as minimal inhibitory activity (MIC). Molecular descriptors were calculated from the optimized structures. Principal component analysis (PCA) followed by hierarchical cluster analysis (HCA) and multiple linear regression (MLR) was performed in order to select molecular descriptors that best describe the antibacterial behavior of the compounds investigated, and to determine the similarities between molecules. The HCA grouped the molecules in separated clusters which have the similar inhibitory activity. PCA showed very similar classification of molecules as the HCA, and displayed which descriptors contribute to that classification. MLR equations, that represent MIC as a function of the in silico molecular descriptors were established. The statistical significance of the estimated models was confirmed by standard statistical measures and cross-validation parameters (SD = 0.0816, F = 46.27, R = 0.9791, R2CV = 0.8266, R2adj = 0.9379, PRESS = 0.1116). These parameters indicate the possibility of application of the established chemometric models in prediction of the antibacterial behaviour of studied derivatives and structurally very similar compounds.Keywords: antibacterial, benzimidazole, molecular descriptors, QSAR
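A minimal scikit-learn sketch of the chemometric workflow outlined above: standardizing descriptors, inspecting groupings with PCA and hierarchical clustering, then fitting a multiple linear regression for MIC. The random matrix stands in for the eighteen derivatives and their computed descriptors.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict, LeaveOneOut
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
X = rng.normal(size=(18, 5))                 # 18 derivatives x 5 in-silico descriptors (stand-in)
mic = X @ np.array([0.6, -0.3, 0.2, 0.0, 0.1]) + 0.05 * rng.normal(size=18)

Xs = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Xs)                      # PCA for visual grouping
clusters = AgglomerativeClustering(n_clusters=3).fit_predict(Xs)    # HCA-style grouping

mlr = LinearRegression().fit(Xs, mic)
pred_cv = cross_val_predict(mlr, Xs, mic, cv=LeaveOneOut())         # cross-validated predictions
print("R2 (fit):", round(mlr.score(Xs, mic), 3),
      " R2 (LOO-CV):", round(r2_score(mic, pred_cv), 3))
print("cluster labels:", clusters)
```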
Procedia PDF Downloads 364
4583 Hyperspectral Data Classification Algorithm Based on the Deep Belief and Self-Organizing Neural Network
Authors: Li Qingjian, Li Ke, He Chun, Huang Yong
Abstract:
In this paper, a method combining Pohl Seidman's deep belief network with a self-organizing neural network is proposed to classify the target. This method is mainly aimed at the high nonlinearity of hyperspectral images, the high sample dimension, and the difficulty of designing a classifier. The main features of the original data are extracted by the deep belief network; during feature extraction, samples with known labels are added to fine-tune the network and enrich the main characteristics. The extracted feature vectors are then classified by the self-organizing neural network. This method can effectively reduce the dimensionality of the data along the spectral dimension while preserving a large amount of the raw data information. It addresses the shortcomings of traditional clustering and the long training time of deep learning algorithms when few labeled samples are available, improving classification accuracy and robustness. The simulation results show that the proposed network structure achieves a higher classification precision when only a small number of labeled samples are known. Keywords: DBN, SOM, pattern classification, hyperspectral, data compression
Procedia PDF Downloads 341
4582 Semi-Supervised Hierarchical Clustering Given a Reference Tree of Labeled Documents
Authors: Ying Zhao, Xingyan Bin
Abstract:
Semi-supervised clustering algorithms have been shown to be effective in improving the clustering process even with limited supervision. However, semi-supervised hierarchical clustering remains challenging due to the complexity of expressing constraints for agglomerative clustering algorithms. This paper proposes novel semi-supervised agglomerative clustering algorithms that build a hierarchy based on a known reference tree. We prove that by enforcing distance constraints defined by a reference tree during the process of hierarchical clustering, the resultant tree is guaranteed to be consistent with the reference tree. We also propose a framework that allows the hierarchical tree generation to be aware of the levels of the agglomerative tree under creation, so that metric weights can be learned and adopted at each level in a recursive fashion. The experimental evaluation shows that the additional cost of our constraint-based semi-supervised hierarchical agglomerative clustering (HAC) algorithm is negligible, and our combined semi-supervised HAC algorithm outperforms the state-of-the-art algorithms on real-world datasets. The experiments also show that our proposed methods can improve clustering performance even with a small amount of unevenly distributed labeled data. Keywords: semi-supervised clustering, hierarchical agglomerative clustering, reference trees, distance constraints
Procedia PDF Downloads 547
4581 Hybrid Lateral-Directional Robust Flight Control with Propulsive Systems
Authors: Alexandra Monteiro, K. Bousson, Fernando J. O. Moreira, Ricardo Reis
Abstract:
Fixed-wing flying vehicles are usually controlled by means of control surfaces such as elevators, ailerons, and rudders. The failure of these systems may lead to severe or even fatal crashes. These failures resulted in increased popularity for research activities on propulsion control in the last decades. The present work deals with a hybrid control architecture in which the propulsion-controlled vehicle maintains its traditional control surfaces, addressing the issue of robust lateral-directional dynamics control. The challenges stem from the parameter uncertainties in the stability and control derivatives and some unknown terms in the flight dynamics model. Two approaches are implemented and tested: linear quadratic regulation with robustness characteristics and H∞ control. The problem is centered on roll-yaw controller design with full state-feedback, which is able to deal with a standalone propulsion control mode as well as a hybrid mode combining both propulsion control and conventional control surface concepts while maintaining the original flight maneuverability characteristics. The results for both controllers emphasized very good control performances; however, the H∞ controller showed higher stabilization rates and robustness albeit with a slightly higher control magnitude than using the linear quadratic regulator.Keywords: robust propulsion control, h-infinity control, lateral-directional flight dynamics, parameter uncertainties
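A minimal sketch of the linear-quadratic-regulation branch of the comparison, computing a full state-feedback gain from the continuous-time algebraic Riccati equation with SciPy. The two-state roll-yaw model below is a toy placeholder, not the aircraft model used in the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy lateral-directional state-space model: states ~ [roll rate, yaw rate],
# inputs ~ [aileron/propulsive roll command, rudder/propulsive yaw command].
A = np.array([[-1.0, 0.5],
              [0.2, -0.8]])
B = np.array([[1.0, 0.0],
              [0.0, 0.6]])
Q = np.diag([10.0, 10.0])   # state weighting
R = np.diag([1.0, 1.0])     # control-effort weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # full state-feedback gain, u = -K x

closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print("LQR gain K:\n", K)
print("closed-loop eigenvalues:", closed_loop_eigs)
```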
Procedia PDF Downloads 154
4580 A Fuzzy Multiobjective Model for Bed Allocation Optimized by Artificial Bee Colony Algorithm
Authors: Jalal Abdulkareem Sultan, Abdulhakeem Luqman Hasan
Abstract:
With the development of competition in health care systems, hospitals face more and more pressure. Meanwhile, resource allocation has a vital effect on achieving competitive advantages in hospitals. Selecting the appropriate number of beds is one of the most important decisions in hospital management. However, in real situations, bed allocation is a multi-objective problem involving different items with vagueness and randomness in the data, and it is very complex. Hence, research on the bed allocation problem that considers multiple departments, nursing hours, and stochastic information about patient arrivals and service is relatively scarce. In this paper, we develop a fuzzy multi-objective bed allocation model for handling uncertainty and multiple departments. Fuzzy objectives and weights are applied simultaneously to help managers select suitable bed numbers for the different departments. The proposed model is solved using the Artificial Bee Colony (ABC) algorithm, which is a very effective algorithm. The paper describes an application of the model dealing with a public hospital in Iraq. The results revealed that the fuzzy multi-objective model provides a suitable framework for bed allocation and optimal resource use. Keywords: bed allocation problem, fuzzy logic, artificial bee colony, multi-objective optimization
Procedia PDF Downloads 324
4579 Design and Field Programmable Gate Array Implementation of Radio Frequency Identification for Boosting up Tag Data Processing
Authors: G. Rajeshwari, V. D. M. Jabez Daniel
Abstract:
Radio Frequency Identification (RFID) systems are used for automated identification in various applications such as automobiles, health care and security; RFID is also called automated data collection technology. RFID readers are placed in an area to scan a large number of tags over a wide distance. The placement of the RFID elements may result in several types of collisions, and a major challenge in RFID systems is collision avoidance. In previous works, collisions were avoided by using algorithms such as ALOHA and the tree algorithm. This work proposes collision reduction and increased throughput through a reading enhancement method combined with the tree algorithm. The reading enhancement is achieved by improving the interrogation procedure and increasing the data handling capacity of the RFID reader with parallel processing. The work is simulated using Xilinx ISE 14.5 in the Verilog language. By implementing this in the RFID system, we are able to achieve high throughput and avoid collisions in the reader at the same instant of time, increasing the overall system efficiency. Keywords: antenna, anti-collision protocols, data management system, reader, reading enhancement, tag
Procedia PDF Downloads 306
4578 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, to heat or humidify inspired air, for thermoregulation, to impart resonance to the voice and others. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, shape and others. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation realized by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. From the statistical analyses for the comparison between both methods, the linear regression showed a strong association and low dispersion between variables. The Bland–Altman analyses showed no significant differences between the analyzed methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases. Keywords: maxillary sinus, support vector machine, region growing, volume quantification
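A minimal sketch of the region-growing step described above, starting from seed pixels and absorbing connected neighbours whose intensity stays within a tolerance of the seed mean. The tolerance, the 4-connectivity and the synthetic slice are assumptions, and the SVM seed-detection step is not reproduced.

```python
import numpy as np
from collections import deque

def region_grow(img, seeds, tol=15.0):
    """Grow a binary region from seed pixels on a 2D slice (4-connectivity)."""
    mask = np.zeros(img.shape, dtype=bool)
    mean = float(np.mean([img[s] for s in seeds]))
    queue = deque(seeds)
    for s in seeds:
        mask[s] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1] and not mask[rr, cc]:
                if abs(float(img[rr, cc]) - mean) <= tol:
                    mask[rr, cc] = True
                    queue.append((rr, cc))
    return mask

slice_ = np.full((64, 64), 200.0)
slice_[20:40, 25:45] = 60.0                     # darker, air-filled sinus-like region
grown = region_grow(slice_, seeds=[(30, 35)])   # seed inside the region (placeholder)
print("segmented pixels:", int(grown.sum()))    # ~ area of the 20x20 region
```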
Procedia PDF Downloads 504
4577 Optimisation of Intermodal Transport Chain of Supermarkets on Isle of Wight, UK
Authors: Jingya Liu, Yue Wu, Jiabin Luo
Abstract:
This work investigates an intermodal transportation system for delivering goods from a Regional Distribution Centre to supermarkets on the Isle of Wight (IOW) via the port of Southampton or Portsmouth in the UK. We consider this integrated logistics chain as a 3-echelon transportation system. In such a system, there are two types of transport methods used to deliver goods across the Solent Channel: one is accompanied transport, which is used by most supermarkets on the IOW, such as Spar, Lidl and Co-operative food; the other is unaccompanied transport, which is used by Aldi. Five transport scenarios are studied based on different transport modes and ferry routes. The aim is to determine an optimal delivery plan for supermarkets of different business scales on IOW, in order to minimise the total running cost, fuel consumptions and carbon emissions. The problem is modelled as a vehicle routing problem with time windows and solved by genetic algorithm. The computing results suggested that accompanied transport is more cost efficient for small and medium business-scale supermarket chains on IOW, while unaccompanied transport has the potential to improve the efficiency and effectiveness of large business scale supermarket chains.Keywords: genetic algorithm, intermodal transport system, Isle of Wight, optimization, supermarket
Procedia PDF Downloads 369
4576 2D Hexagonal Cellular Automata: The Complexity of Forms
Authors: Vural Erdogan
Abstract:
We created two-dimensional hexagonal cellular automata to obtain complexity by using simple rules, similar to Conway’s Game of Life. Considering the Game of Life rules, Wolfram's work on life-like structures, and John von Neumann's self-replication, self-maintenance and self-reproduction problems, we developed 2-state and 3-state hexagonal growing algorithms that reach large populations from random initial states. Unlike the Game of Life, we used cellular automata with six neighbourhoods instead of eight or four. The first simulations examined whether we are able to obtain oscillators, blinkers, and gliders. Inspired by the complexity of Wolfram's 1D cellular automata and life-like structures, we simulated 2D synchronous, discrete, deterministic cellular automata to reach life-like forms with 2-state cells. We explain how the life-like formations and the oscillators contribute to initiating self-maintenance together with self-reproduction and self-replication. After comparing simulation results, we decided to take the algorithm one step further: appending a new state to the same algorithm that we used for reaching life-like structures led us to experiment with new branching and fractal forms. All these studies tried to demonstrate that complex life forms might come from uncomplicated rules. Keywords: hexagonal cellular automata, self-replication, self-reproduction, self-maintenance
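A minimal sketch of a synchronous 2-state hexagonal automaton on an odd-r offset grid with six neighbours per cell. The birth and survival thresholds below are placeholders, not the rule set developed in the paper.

```python
import numpy as np

# Six neighbour offsets for an "odd-r" offset hexagonal grid, indexed by row parity.
HEX_NEIGHBOURS = {
    0: [(-1, -1), (-1, 0), (0, -1), (0, 1), (1, -1), (1, 0)],   # even rows
    1: [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, 0), (1, 1)],     # odd rows
}

def step(grid, birth={2}, survive={3, 4}):
    """One synchronous update of a 2-state hexagonal CA (placeholder thresholds)."""
    rows, cols = grid.shape
    nxt = np.zeros_like(grid)
    for r in range(rows):
        for c in range(cols):
            n = sum(grid[(r + dr) % rows, (c + dc) % cols]
                    for dr, dc in HEX_NEIGHBOURS[r % 2])
            alive = grid[r, c] == 1
            nxt[r, c] = int((not alive and n in birth) or (alive and n in survive))
    return nxt

rng = np.random.default_rng(1)
world = (rng.random((40, 40)) < 0.15).astype(int)   # random initial state
for _ in range(50):
    world = step(world)
print("live cells after 50 steps:", int(world.sum()))
```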
Procedia PDF Downloads 152
4575 Framework for Detecting External Plagiarism from Monolingual Documents: Use of Shallow NLP and N-Gram Frequency Comparison
Authors: Saugata Bose, Ritambhra Korpal
Abstract:
The internet has increased copy-paste scenarios amongst students as well as amongst researchers, leading to different levels of plagiarized documents. For this reason, much research is focused on detecting plagiarism automatically. In this paper, an initiative is discussed where Natural Language Processing (NLP) techniques as well as supervised machine learning algorithms have been combined to detect plagiarized texts. Here, the major emphasis is on constructing a framework which detects external plagiarism from monolingual texts successfully. For successfully detecting plagiarism, an n-gram frequency comparison approach has been implemented to construct the model framework. The framework is based on 120 characteristics which have been extracted during pre-processing of the documents using an NLP approach. Afterwards, filter metrics have been applied to select the most relevant characteristics, and then a supervised classification learning algorithm has been used to classify the documents into four levels of plagiarism. A confusion matrix was built to estimate the false positives and false negatives. Our plagiarism framework achieved a very high accuracy score. Keywords: lexical matching, shallow NLP, supervised machine learning algorithm, word n-gram
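A minimal sketch of the kind of word n-gram frequency comparison the framework builds on: Jaccard and containment overlap of word trigrams between a suspicious document and a source document. The 120 extracted characteristics and the supervised classifier are not reproduced here.

```python
from collections import Counter

def word_ngrams(text, n=3):
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def overlap_scores(suspicious, source, n=3):
    """Jaccard and containment of word n-grams between two documents."""
    a, b = word_ngrams(suspicious, n), word_ngrams(source, n)
    inter = sum((a & b).values())
    union = sum((a | b).values())
    jaccard = inter / union if union else 0.0
    containment = inter / sum(a.values()) if a else 0.0
    return jaccard, containment

src = "the quick brown fox jumps over the lazy dog near the river bank"
sus = "a quick brown fox jumps over the lazy dog by the river"
print(overlap_scores(sus, src))
```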
Procedia PDF Downloads 358
4574 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers
Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice
Abstract:
In recent years, the telecommunications sector in the global market has been changing and developing continuously. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. Customer churn is of utmost significance in this sector since it causes substantial business loss, and many companies carry out research in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed that aims to obtain the feature reducts for predicting customer churn. The framework is a cost-based, optional pre-processing stage that removes redundant features for churn management. This cost-based feature selection algorithm is applied to a telecommunication company in Turkey, and the results obtained with the algorithm are reported. Keywords: churn prediction, data mining, decision-theoretic rough set, feature selection
Procedia PDF Downloads 447
4573 Numerical Simulations on the Torsional Behavior of Multistory Concrete Masonry Buildings
Authors: Alvaro Jose Cordova, Hsuan Teh Hu
Abstract:
The use of concrete masonry construction in developing countries has become very frequent, especially for domestic purposes. Most of these buildings have asymmetric wall configurations in plan, resulting in significant torsional actions when subjected to seismic loads. The study consisted of finding a material model for hollow unreinforced concrete masonry and validating it with experimental data found in the literature. Numerical simulations were performed on 20 buildings with variations in wall distributions and heights. Results were analyzed by inspection and with a non-linear static method. The findings revealed that eccentricities as well as structural rigidities have a strong influence on the overall response of concrete masonry buildings. In addition, slab rotations conveyed more accurate information about the torsional behavior than maximum-versus-average displacement ratios. The failure modes in low buildings were characterized by high tensile strains in the first floor, whereas in tall buildings these strains were lowered significantly by higher compression stresses due to the higher self-weight; these tall buildings developed multiple plastic hinges along the height. Finally, the non-linear static analysis exposed a brittle response for all masonry assemblies. This type of behavior is undesirable in any construction, and the need for a material model for reinforced masonry is pointed out. Keywords: concrete damaged plasticity, concrete masonry, macro-modeling, nonlinear static analysis, torsional capacity
Procedia PDF Downloads 294
4572 Application of Random Forest Model in The Prediction of River Water Quality
Authors: Turuganti Venkateswarlu, Jagadeesh Anmala
Abstract:
Excessive runoff from various non-point source land uses and other point sources is rapidly contaminating the water quality of streams in the Upper Green River watershed, Kentucky, USA. It is essential to maintain the stream water quality as the river basin is one of the major freshwater sources in this region. It is also important to understand the water quality parameters (WQPs) quantitatively and qualitatively along with their important features, as stream water is sensitive to climatic events and land-use practices. In this paper, a model was developed using the Random Forest (RF) algorithm for predicting one of the significant WQPs, Fecal Coliform (FC), from precipitation, temperature, the urban land use factor (ULUF), the agricultural land use factor (ALUF), and the forest land-use factor (FLUF). The RF model, an ensemble learning algorithm, can also extract feature importance characteristics from the given model inputs for different combinations. The model's outcomes showed a good correlation between FC and the climate events and land use factors (R2 = 0.94), and precipitation and temperature were found to be the primary influencing factors for FC. Keywords: water quality, land use factors, random forest, fecal coliform
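A minimal scikit-learn sketch of the described setup: a random forest predicting fecal coliform from precipitation, temperature and the three land-use factors, with feature importances reported. Synthetic stand-in data replaces the Upper Green River dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 400
features = ["precipitation", "temperature", "ULUF", "ALUF", "FLUF"]
X = rng.random((n, len(features)))
# Synthetic target loosely dominated by precipitation and temperature.
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.standard_normal(n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

print("R2 on held-out data:", round(r2_score(y_te, rf.predict(X_te)), 3))
for name, imp in sorted(zip(features, rf.feature_importances_), key=lambda p: -p[1]):
    print(f"{name:14s} importance = {imp:.3f}")
```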
Procedia PDF Downloads 197
4571 Control Algorithm Design of Single-Phase Inverter For ZnO Breakdown Characteristics Tests
Authors: Kashif Habib, Zeeshan Ayyub
Abstract:
ZnO voltage-dependent resistors are widely used as components of electrical systems for over-voltage protection. They have broad application prospects in superconducting energy removal, generator de-excitation, and over-voltage protection of electrical and electronic equipment. At present, research on the application of ZnO voltage-dependent resistors has largely stopped at their nonlinear voltage-current characteristic and over-voltage protection applications. There is little further study of the over-voltage breakdown characteristics, such as the combustion phenomena, the measurement of the voltage/current at breakdown, and the effect on surrounding equipment; this remains a blind spot in their application. Therefore, when testing the characteristics of a ZnO voltage-dependent resistor, a suitable test power supply is needed that keeps the terminal voltage sinusoidal, simulating real power-frequency supply conditions. We put forward a solution using an inverter to generate a controllable power supply. The paper focuses on the breakdown-characteristic test power supply for nonlinear ZnO voltage-dependent resistors. Based on mature switching power supply technology, we propose a power control system with the inverter as the core. The supply realizes a sinusoidal voltage output from a three-phase power-frequency AC input, with three control modes (RMS, peak, average) for the output current. The TMS320F2812M is chosen as the control core of the hardware platform and is used to convert the power from three-phase AC to a controlled single-phase sinusoidal voltage through a rectifier, filter, and inverter. The designed controller produces SPWM to obtain the controlled voltage source via an appropriate multi-loop control strategy, while also executing data acquisition and display, system protection, start-up logic control, etc. The TMS320F2812M is able to complete the multi-loop control quickly and is well suited to the inverter output control. Keywords: ZnO, multi-loop control, SPWM, non-linear load
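A minimal numerical sketch of the SPWM generation mentioned above: a sinusoidal reference compared against a high-frequency triangular carrier yields the gating signal. The 50 Hz reference, 5 kHz carrier and 0.8 modulation index are assumed values, not the paper's design parameters.

```python
import numpy as np

f_ref, f_carrier, m = 50.0, 5000.0, 0.8       # reference/carrier frequency [Hz], modulation index
t = np.arange(0, 0.04, 1e-6)                  # two fundamental periods, 1 us steps

reference = m * np.sin(2 * np.pi * f_ref * t)
# Symmetric triangular carrier between -1 and 1.
carrier = 2.0 * np.abs(2.0 * (t * f_carrier - np.floor(t * f_carrier + 0.5))) - 1.0
gate = (reference > carrier).astype(int)      # 1 = upper switch on, 0 = lower switch on

duty = gate.mean()
print(f"average duty cycle over the window: {duty:.3f}")
```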
Procedia PDF Downloads 325
4570 Training of Future Computer Science Teachers Based on Machine Learning Methods
Authors: Meruert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova
Abstract:
The article highlights and describes the characteristic features of real-time face detection in images and videos using machine learning algorithms. The research work was reviewed by students of the educational programs "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education", and "8D01511-Computer Science" of L.N. Gumilyov Eurasian National University. As a result, the advantages and disadvantages of the Haar Cascade (Haar Cascade OpenCV), HoG SVM (Histogram of Oriented Gradients, Support Vector Machine), and MMOD CNN Dlib (Max-Margin Object Detection, convolutional neural network) detectors used for face detection were determined. Dlib is a general-purpose cross-platform software library written in the C++ programming language; it includes detectors used for face detection. The Haar Cascade OpenCV algorithm is efficient for fast face detection. The considered work forms the basis for the development of machine learning methods by future computer science teachers. Keywords: algorithm, artificial intelligence, education, machine learning
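A minimal OpenCV sketch of the Haar Cascade detector compared in the study, using the frontal-face cascade bundled with OpenCV; the image path is a placeholder.

```python
import cv2

# Load the frontal-face Haar cascade bundled with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("group_photo.jpg")            # placeholder input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                 minSize=(30, 30))
print(f"detected {len(faces)} face(s)")
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("group_photo_detected.jpg", img)
```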
Procedia PDF Downloads 73
4569 Sinusoidal Roughness Elements in a Square Cavity
Authors: Muhammad Yousaf, Shoaib Usman
Abstract:
Numerical studies were conducted using the Lattice Boltzmann Method (LBM) to study natural convection in a square cavity in the presence of roughness. An algorithm based on a single-relaxation-time Bhatnagar-Gross-Krook (BGK) model of the Lattice Boltzmann Method (LBM) was developed. Roughness was introduced on both the hot and cold walls in the form of sinusoidal roughness elements. The study was conducted for a Newtonian fluid of Prandtl number (Pr) 1.0. The Rayleigh number (Ra) range explored was from 10³ to 10⁶ in the laminar regime. The thermal and hydrodynamic behavior of the fluid was analyzed using a differentially heated square cavity with roughness elements present on both the hot and cold walls. Neumann boundary conditions were imposed on the horizontal walls, with the vertical walls isothermal; the roughness elements were given the same boundary condition as the corresponding walls. The computational algorithm was validated against previous benchmark studies performed with different numerical methods, and good agreement was found. Results indicate that the maximum reduction in the average heat transfer was 16.66 percent at Ra = 10⁵. Keywords: Lattice Boltzmann method, natural convection, nusselt number, rayleigh number, roughness
Procedia PDF Downloads 527