Search results for: Random Kernel Density
5414 Characterising Stable Model by Extended Labelled Dependency Graph
Authors: Asraful Islam
Abstract:
Extended dependency graph (EDG) is a state-of-the-art isomorphic graph representation of normal logic programs (NLPs) that can characterize the consistency of NLPs by graph analysis. To construct the vertices and arcs of an EDG, additional renamed atoms and rules beyond those provided by the given program are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare space complexity. Results show that ELDG uses less memory to store nodes, arcs, and cycles compared to EDG. To exhibit the desirability of ELDG, firstly, the stable models of the kernel form of an NLP are characterized by the admissible colouring of the ELDG; secondly, a relation between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG has been established; thirdly, to the best of our knowledge, for the first time an inverse transformation from a dependency graph to the represented NLP w.r.t. ELDG has been defined, which enables transferring analytical results from the graph to the program straightforwardly.
Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring
Procedia PDF Downloads 212
5413 Estimating 3D-Position of a Stationary Random Acoustic Source Using Bispectral Analysis of 4-Point Detected Signals
Authors: Katsumi Hirata
Abstract:
To develop a useful acoustic environmental recognition system, a method of estimating the 3D-position of a stationary random acoustic source using bispectral analysis of 4-point detected signals is proposed. The method uses information about amplitude attenuation and propagation delay extracted from the amplitude ratios and angles of the auto- and cross-bispectra of the detected signals. Bispectral analysis is expected to be less influenced by Gaussian noise than conventional power spectral analysis. In this paper, the basic principle of the method is described first, and its validity and features are considered from the results of fundamental experiments under assumed ideal conditions.
Keywords: 4-point detection, a stationary random acoustic source, auto- and cross-bispectra, estimation of 3D-position
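A minimal NumPy sketch of a direct (FFT-based) auto-bispectrum estimate of the kind this abstract relies on; the 4-point detector geometry and the position-solving step are not reproduced, and the segment-averaging scheme and function name are our own illustrative choices.

```python
import numpy as np

def auto_bispectrum(x, seg_len=128):
    """Estimate the auto-bispectrum B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)]
    by averaging FFT triple products over non-overlapping segments."""
    n_seg = len(x) // seg_len
    f = np.arange(seg_len)
    B = np.zeros((seg_len, seg_len), dtype=complex)
    for k in range(n_seg):
        X = np.fft.fft(x[k * seg_len:(k + 1) * seg_len])
        # Triple product; the (f1 + f2) index wraps around the FFT length.
        B += X[:, None] * X[None, :] * np.conj(X[(f[:, None] + f[None, :]) % seg_len])
    return B / n_seg
```

A phase-coupled triple of tones (f1, f2, f1+f2) produces a peak at bifrequency (f1, f2), which is the property that makes the bispectrum insensitive to additive Gaussian noise.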
Procedia PDF Downloads 359
5412 Historical Landscape Affects Present Tree Density in Paddy Field
Authors: Ha T. Pham, Shuichi Miyagawa
Abstract:
Ongoing landscape transformation is one of the major causes behind the disappearance of traditional landscapes, leading to species and resource loss. Trees in paddy fields in the northeast of Thailand are one of those traditional landscapes. Using three different historical time layers, we documented the severe deforestation and rapid urbanization that occurred in the region. Despite the general expectation that tree density would decline as a consequence, the heterogeneous trend of changes in total tree density across the three studied landscapes contradicted the hypothesis that the number of trees in paddy fields depends on the length of land use practice. Rather, because new trees are selectively planted on levees, the persistence of trees in paddy fields now relies on their value for human use. Besides, changes in land use and landscape structure had a significant impact on the decision of which tree density level is considered suitable for the landscape.
Keywords: aerial photographs, land use change, traditional landscape, tree in paddy fields
Procedia PDF Downloads 419
5411 Classification of Random Doppler-Radar Targets during the Surveillance Operations
Authors: G. C. Tikkiwal, Mukesh Upadhyay
Abstract:
During surveillance operations in war or peacetime, the radar operator gets a scatter of targets over the screen. A target may be a tracked vehicle like a tank (e.g., T72, BMP) or a wheeled vehicle like ALS, TATRA, 2.5 Tonne, Shaktiman, or moving army units, moving convoys, etc. The radar operator selects one of the promising targets into single target tracking (STT) mode. Once the target is locked, the operator hears a typical audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and solely dependent on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the fast Fourier transform (FFT) and principal component analysis (PCA), to identify random objects. The classification is based on transforming the audible signature of the target into music octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.
Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP
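A minimal sketch of the first stage described above: finding the dominant frequency of the audible signature via FFT and mapping it to a musical octave-note. The abstract does not specify the note-mapping rule, so this assumes 12-tone equal temperament relative to A4 = 440 Hz; the function names are our own.

```python
import numpy as np

NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def dominant_frequency(signal, fs):
    """Return the strongest frequency (Hz) in the signal via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal))
    spectrum[0] = 0.0  # ignore the DC component
    return np.argmax(spectrum) * fs / len(signal)

def to_octave_note(freq, ref=440.0):
    """Map a frequency to the nearest note name in 12-tone equal temperament."""
    semitones = int(round(12 * np.log2(freq / ref)))  # offset from A4
    midi = 69 + semitones                             # MIDI note number of A4 is 69
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)
```

In the paper's pipeline, sequences of such notes (rather than single tones) would then feed the PCA-based classifier.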
Procedia PDF Downloads 394
5410 Density Determination by Dilution for Extra Heavy Oil Residues Obtained Using Molecular Distillation and Supercritical Fluid Extraction as Upgrading and Refining Process
Authors: Oscar Corredor, Alexander Guzman, Adan Leon
Abstract:
Density is a bulk physical property that indicates the quality of a petroleum fraction. It is also a useful property for estimating various physicochemical properties of fractions and petroleum fluids; however, the determination of the density of extra heavy residual (EHR) fractions by standard methodologies (ASTM D70) shows limitations for samples with densities higher than 1.0879 g/cm³. For this reason, a dilution methodology was developed in order to determine the density of those particular fractions. 87 EHR fractions were obtained as products of the fractionation of typical Colombian vacuum distillation residual fractions using molecular distillation (MD) and extraction with solvent n-hexane under supercritical conditions (SFEF) in pilot plants. The proposed methodology showed reliable results, demonstrated by standard deviations of repeatability and reproducibility of 0.0031 and 0.0061 g/ml, respectively. In the same way, it was possible to determine densities of EHR fractions up to 1.1647 g/cm³, and the °API values obtained were ten times less than the water reference value.
Keywords: API, density, vacuum residual, molecular distillation, supercritical fluid extraction
Procedia PDF Downloads 266
5409 Solving Process Planning and Scheduling with Number of Operation Plus Processing Time Due-Date Assignment Concurrently Using a Genetic Search
Authors: Halil Ibrahim Demir, Alper Goksu, Onur Canpolat, Caner Erden, Melek Nur
Abstract:
Traditionally, process planning, scheduling, and due date assignment are performed sequentially and separately. The high interrelation between these functions makes integration very useful. Although there are numerous works on integrated process planning and scheduling and many works on scheduling with due date assignment, there are only a few works on the integration of all three functions. Here we tested different integration levels of these three functions and found the fully integrated version to be the best. We applied genetic search and random search, and genetic search performed better than random search. We penalized all earliness, tardiness, and due date related costs. Since all three terms are undesired, it is better to penalize all of them.
Keywords: process planning, scheduling, due-date assignment, genetic algorithm, random search
Procedia PDF Downloads 375
5408 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types
Authors: Chaghoub Soraya, Zhang Xiaoyan
Abstract:
This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. This problem was previously addressed with a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant approximation algorithm for p-median network design with multiple cable types. The addressed problem is a combination of two well-studied problems: the p-median problem and the network design problem. The introduced algorithm is a constant-factor random sampling approximation algorithm, conceived by using random sampling techniques from the literature. It is based on a redistribution lemma from the literature and uses a Steiner tree problem as a subproblem. The algorithm is simple, relying on the notions of random sampling and probability. The proposed approach gives an approximation solution with one constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.
Keywords: approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median
Procedia PDF Downloads 203
5407 A New Mathematical Method for Heart Attack Forecasting
Authors: Razi Khalafi
Abstract:
Myocardial infarction (MI), or acute myocardial infarction (AMI), commonly known as a heart attack, occurs when blood flow stops to part of the heart, causing damage to the heart muscle. An ECG can often show evidence of a previous heart attack or one that is in progress. The patterns on the ECG may indicate which part of the heart has been damaged, as well as the extent of the damage. In chaos theory, the correlation dimension is a measure of the dimensionality of the space occupied by a set of random points, often referred to as a type of fractal dimension. In this research, by considering the ECG signal as a random walk, we work on forecasting an oncoming heart attack by analysing ECG signals using the correlation dimension. In order to test the model, a set of ECG signals from patients before and after heart attack was used, and the model's strength in forecasting the behaviour of these signals was checked. Results show this methodology can forecast the ECG, and accordingly a heart attack, with high accuracy.
Keywords: heart attack, ECG, random walk, correlation dimension, forecasting
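The correlation dimension invoked above is commonly estimated with the Grassberger-Procaccia procedure: count the fraction of point pairs closer than r and take the slope of log C(r) against log r. A minimal NumPy sketch (the embedding of the ECG series into points and the choice of r values are left out and would follow the paper's own setup):

```python
import numpy as np

def correlation_dimension(points, r_vals):
    """Grassberger-Procaccia estimate: slope of log C(r) vs log r,
    where C(r) is the fraction of point pairs closer than r."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    iu = np.triu_indices(len(points), k=1)   # each pair counted once
    pair_d = dists[iu]
    C = np.array([(pair_d < r).mean() for r in r_vals])
    slope, _ = np.polyfit(np.log(r_vals), np.log(C), 1)
    return slope
```

As a sanity check, points scattered uniformly in a square give a slope near 2, and points on a line give a slope near 1.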
Procedia PDF Downloads 506
5406 Land Cover Classification Using Sentinel-2 Image Data and Random Forest Algorithm
Authors: Thanh Noi Phan, Martin Kappas, Jan Degener
Abstract:
The recently launched Sentinel-2 (S2) satellite (June 2015) brings great potential and opportunities for land use/cover mapping applications, due to its fine spatial resolution, multispectral coverage, and high temporal resolution. So far, there are only a handful of studies using real S2 data for land cover classification. In northern Vietnam especially, to the best of our knowledge, there exist no studies using S2 data for land cover mapping. The aim of this study is to provide a preliminary result of land cover classification using Sentinel-2 data with a rising state-of-the-art classifier, random forest. A case study with heterogeneous land use/cover in the east of Hanoi Capital, Vietnam, was chosen. All 10 spectral bands of 10 and 20 m pixel size of the S2 images were used, with the 10 m bands resampled to 20 m. Among several classification algorithms, the supervised random forest (RF) classifier was applied because it has been reported as one of the most accurate methods for satellite image classification. The results showed that the red-edge and shortwave infrared (SWIR) bands play an important role in the land cover classification results. A very high overall accuracy above 90% was achieved.
Keywords: classify algorithm, classification, land cover, random forest, sentinel 2, Vietnam
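A minimal scikit-learn sketch of the workflow described above, on synthetic stand-in data rather than real S2 imagery: 10 "band" features per pixel, with the classes separated mainly in two bands standing in for the red-edge and SWIR bands. The band indices and separations are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic pixel samples: 10 band reflectances each, two land cover classes.
rng = np.random.default_rng(42)
n = 1000
X = rng.normal(0.3, 0.05, size=(n, 10))
y = rng.integers(0, 2, size=n)
X[y == 1, 6] += 0.1   # hypothetical red-edge band
X[y == 1, 9] += 0.1   # hypothetical SWIR band

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
importances = clf.feature_importances_   # which bands drive the classification
```

The `feature_importances_` attribute is how a finding like "red-edge and SWIR bands play an important role" would typically be read off an RF model.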
Procedia PDF Downloads 387
5405 Using Machine Learning to Enhance Win Ratio for College Ice Hockey Teams
Authors: Sadixa Sanjel, Ahmed Sadek, Naseef Mansoor, Zelalem Denekew
Abstract:
Collegiate ice hockey (NCAA) sports analytics differs from national-level hockey (NHL) analytics. We apply and compare multiple machine learning models such as linear regression, random forest, and neural networks to predict the win ratio for a team based on its statistics. Data exploration helps determine which statistics are most useful in increasing the win ratio, which would be beneficial to coaches and team managers. We ran experiments to select the best model and chose random forest as the best performing. We conclude with how to bridge the gap between the college and national levels of sports analytics and how machine learning can enhance team performance despite not having many metrics or a budget for automatic tracking.
Keywords: NCAA, NHL, sports analytics, random forest, regression, neural networks, game predictions
Procedia PDF Downloads 114
5404 Random Vertical Seismic Vibrations of the Long Span Cantilever Beams
Authors: Sergo Esadze
Abstract:
Seismic resistance norms require calculation of cantilevers for the vertical components of the base seismic acceleration. Long span cantilevers, as a rule, must be calculated as a separate construction element. Depending on the architectural-planning solution, functional purpose, and environmental conditions of the buildings/structures being designed, long span cantilever constructions may be of very different types: both by main bearing element (beam, truss, slab) and by material (reinforced concrete, steel). The choice among these is always linked with the bearing construction system of the building. Research into the vertical seismic vibration of these constructions requires an individual approach for each type (which is not specified in the norms) in correlation with the model of the seismic load. The latter may be given either as a deterministic load or as a random process. A loading model given as a random process is more adequate for this problem. In the presented paper, two types of long span (from 6 m up to 12 m) reinforced concrete cantilever beams have been considered: a) cantilevers whose bearing elements, i.e., the elements in which they are fixed, have cross-sections with large sizes, and the cantilevers are made with a haunch; b) cantilever beams with a load-bearing rod element. Calculation models are suggested separately for types a) and b). They are presented as systems with a finite number of degrees of freedom (concentrated masses). The conditions for fixing the ends correspond to these types. Vertical acceleration and the vertical component of the angular acceleration act on the masses. The model is based on the assumption of translational-rotational motion of the building in the vertical plane, caused by vertical seismic acceleration. Seismic accelerations are considered as random processes and presented as the product of a deterministic envelope function and a stationary random process. The problem is solved within the framework of the correlation theory of random processes. Solved numerical examples are given. The method is effective for solving these specific problems.
Keywords: cantilever, random process, seismic load, vertical acceleration
Procedia PDF Downloads 188
5403 The Effect of Hydrogen on the Magnetic Properties of ZnO: A Density Functional Tight Binding Study
Authors: M. A. Lahmer, K. Guergouri
Abstract:
The ferromagnetic properties of carbon-doped ZnO (ZnO:CO) and hydrogenated carbon-doped ZnO (ZnO:CO+H) are investigated using the density functional tight binding (DFTB) method. Our results reveal that CO-doped ZnO is a ferromagnetic material with a magnetic moment of 1.3 μB per carbon atom. The presence of hydrogen in the material in the form of a CO-H complex decreases the total magnetism of the material without suppressing ferromagnetism. However, the system in this case quickly becomes antiferromagnetic when the C-C separation distance is increased.
Keywords: ZnO, carbon, hydrogen, ferromagnetism, density functional tight binding
Procedia PDF Downloads 285
5402 Predictive Analytics of Student Performance Determinants
Authors: Mahtab Davari, Charles Edward Okon, Somayeh Aghanavesi
Abstract:
Every institute of learning is usually interested in the performance of its enrolled students. The level of these performances determines the approach an institute of study may adopt in rendering academic services. The focus of this paper is to evaluate students' academic performance in given courses of study using machine learning methods. This study evaluated various supervised machine learning classification algorithms such as logistic regression (LR), support vector machine (SVM), random forest, decision tree, k-nearest neighbors, linear discriminant analysis (LDA), and quadratic discriminant analysis, using selected features to predict study performance. The accuracy, precision, recall, and F1 score obtained from 5-fold cross-validation were used to determine the best classification algorithm for predicting students' performances. SVM (using a linear kernel), LDA, and LR were identified as the best-performing machine learning methods. Also, using the LR model, this study identified students' educational habits such as reading and paying attention in class as strong determinants of a student having above-average performance. Other important features include the academic history of the student and work. Demographic factors such as age, gender, high school graduation, etc., had no significant effect on a student's performance.
Keywords: student performance, supervised machine learning, classification, cross-validation, prediction
Procedia PDF Downloads 126
5401 Using Combination of Sets of Features of Molecules for Aqueous Solubility Prediction: A Random Forest Model
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Generally, absorption and bioavailability increase if solubility increases; therefore, it is crucial to predict solubility in drug discovery applications. Molecular descriptors and molecular properties are traditionally used for the prediction of water solubility. Various key descriptors are used for this purpose, namely Dragon descriptors, Morgan descriptors, MACCS keys, etc., and each has different prediction capabilities, with success varying between data sets. Another source for the prediction of solubility is structural features, which are also commonly used. However, there are few to no studies that combine three or more properties or descriptors to produce a more powerful prediction model. Unlike available models, we used a combination of those features in a random forest machine learning model for improved solubility prediction, to better predict and, therefore, contribute to drug discovery systems.
Keywords: solubility, random forest, molecular descriptors, maccs keys
Procedia PDF Downloads 46
5400 Exploring the Intrinsic Ecology and Suitable Density of Historic Districts Through a Comparative Analysis of Ancient and Modern Ecological Smart Practices
Authors: Hu Changjuan, Gong Cong, Long Hao
Abstract:
Although urban ecological policies and the public's aspiration for livable environments have expedited the pace of ecological revitalization, historic districts that evolved through natural ecological processes often become obsolete and less habitable amid rapid urbanization. This raises a critical question: are historic districts inherently incapable of being ecological and livable? The thriving concept of 'intrinsic ecology', characterized by its ability to transform city-district systems into healthy ecosystems with diverse environments, stable functions, and rapid restoration capabilities, holds potential for guiding the integration of ancient and modern ecological wisdom while supporting the dynamic involvement of cultures. This study explores the intrinsic ecology of historic districts from three aspects: 1) Population density: by comparing population density before urban population expansion with the present day, determine a reasonable population density for historic districts. 2) Building density: using the 'Space-mate' tool for comparative analysis, form a spatial matrix to explore the intrinsic ecology of building density in Chinese historic districts. 3) Green capacity ratio: using ecological districts as control samples, conduct dual comparative analyses (related comparison and upgraded comparison) to determine the intrinsic ecological advantages of two-dimensional and three-dimensional green volume in historic districts. The study informs a density optimization strategy that supports cultural, social, natural, and economic ecology, contributing to the creation of eco-historic districts.
Keywords: eco-historic districts, intrinsic ecology, suitable density, green capacity ratio
Procedia PDF Downloads 23
5399 Optimization of Machine Learning Regression Results: An Application on Health Expenditures
Authors: Songul Cinaroglu
Abstract:
Machine learning regression methods are recommended as an alternative to classical regression methods in the presence of variables that are difficult to model. Health expenditure data are typically non-normal and heavily skewed. This study aims to compare machine learning regression methods with hyperparameter tuning to predict health expenditure per capita. A multiple regression model was conducted, and the performance results of lasso regression, random forest regression, and support vector machine regression were recorded when different hyperparameters were assigned. The lambda (λ) value for lasso regression, the number of trees for random forest regression, and the epsilon (ε) value for support vector regression were the tuned hyperparameters. Study results, obtained using k-fold cross-validation with k varied from 5 to 50, indicate differences between the machine learning regression results in terms of R², RMSE, and MAE values that are statistically significant (p < 0.001). The results reveal that random forest regression (R² > 0.7500, RMSE ≤ 0.6000, and MAE ≤ 0.4000) outperforms the other machine learning regression methods. It is highly advisable to use machine learning regression methods for modelling health expenditures.
Keywords: machine learning, lasso regression, random forest regression, support vector regression, hyperparameter tuning, health expenditure
Procedia PDF Downloads 226
5398 Quantitative Assessment of Soft Tissues by Statistical Analysis of Ultrasound Backscattered Signals
Authors: Da-Ming Huang, Ya-Ting Tsai, Shyh-Hau Wang
Abstract:
Ultrasound signals backscattered from soft tissues depend mainly on the size, density, distribution, and other elastic properties of the scatterers in the interrogated sample volume. Quantitative analysis of ultrasonic backscattering is frequently implemented using a statistical approach, because backscattered signals tend to have the nature of random variables. Thus, statistical analyses such as Nakagami statistics have been applied to characterize the density and distribution of the scatterers in a sample. Yet the accuracy of statistical analysis can readily be affected by the received signals, which depend on the nature of the incident ultrasound wave and the acoustic properties of the samples. Thus, in the present study, efforts were made to explore the effects of the ultrasound operational modes and the attenuation of biological tissue on the estimation of the corresponding Nakagami statistical parameter (m parameter). In vitro measurements were performed on healthy and pathological fibrosis porcine livers using different single-element ultrasound transducers and duty cycles of the incident tone burst, ranging respectively from 3.5 to 7.5 MHz and from 10 to 50%. Results demonstrated that the estimated m parameter tends to be sensitively affected by the ultrasound operational modes as well as the tissue attenuation. The healthy and pathological tissues may be characterized quantitatively by the m parameter under fixed measurement conditions and proper calibration.
Keywords: ultrasound backscattering, statistical analysis, operational mode, attenuation
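The m parameter discussed above is commonly estimated from the envelope with the moment-based (inverse normalized variance) estimator m = E[R²]² / Var(R²). A minimal NumPy sketch, with a simulator exploiting the fact that the square of a Nakagami(m, Ω) envelope is gamma-distributed; the function names and simulation settings are our own, and the paper's acquisition-mode and attenuation effects are not modelled.

```python
import numpy as np

def estimate_m(envelope):
    """Moment-based estimate of the Nakagami m parameter:
    m = E[R^2]^2 / Var(R^2), computed from envelope samples R."""
    p = envelope ** 2
    return p.mean() ** 2 / p.var()

def nakagami_samples(m, omega, size, rng):
    # If R ~ Nakagami(m, omega), then R^2 ~ Gamma(shape=m, scale=omega/m).
    return np.sqrt(rng.gamma(m, omega / m, size))
```

Smaller m (toward 0.5) indicates sparser or more variable scatterer populations, while m near 1 corresponds to Rayleigh (fully developed speckle) conditions.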
Procedia PDF Downloads 323
5397 Application of Principle Component Analysis for Classification of Random Doppler-Radar Targets during the Surveillance Operations
Authors: G. C. Tikkiwal, Mukesh Upadhyay
Abstract:
During surveillance operations in war or peacetime, the radar operator gets a scatter of targets over the screen. A target may be a tracked vehicle like a tank (e.g., T72, BMP) or a wheeled vehicle like ALS, TATRA, 2.5 Tonne, Shaktiman, or moving army units, moving convoys, etc. The radar operator selects one of the promising targets into Single Target Tracking (STT) mode. Once the target is locked, the operator hears a typical audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and solely dependent on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA), to identify random objects. The classification is based on transforming the audible signature of the target into music octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.
Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP
Procedia PDF Downloads 346
5396 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity
Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Saifur Rahman Sabuj
Abstract:
This paper examines relationships between solar activity and earthquakes by applying machine learning techniques: k-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, and the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger at intermediate and deep depths.
Keywords: k-nearest neighbour, support vector regression, random forest regression, long short-term memory network, earthquakes, solar activity, sunspot number, solar wind, solar flares
Procedia PDF Downloads 73
5395 Designing Stochastic Non-Invasively Applied DC Pulses to Suppress Tremors in Multiple Sclerosis by Computational Modeling
Authors: Aamna Lawrence, Ashutosh Mishra
Abstract:
Tremors occur in 60% of patients who have multiple sclerosis (MS), the most common demyelinating disease affecting the central and peripheral nervous system, and are the primary cause of disability in young adults. While pharmacological agents provide minimal benefits, surgical interventions like deep brain stimulation and thalamotomy are riddled with dangerous complications, which makes non-invasive electrical stimulation an appealing treatment of choice for dealing with tremors. Hence, we hypothesized that if the non-invasive electrical stimulation parameters (mainly frequency) can be computed by mathematically modeling the nerve fibre to take into consideration the minutest details of the axon morphologies, tremors due to demyelination can be optimally alleviated. In this computational study, we modeled the random demyelination pattern that typically manifests in MS using the high-density Hodgkin-Huxley model with suitable modifications to account for the myelin. The internode of the nerve fibre in our model could have up to ten demyelinated regions, each with random length and myelin thickness. The arrival time of action potentials traveling between two fixed points in space along the demyelinated and the normally myelinated nerve fibres was noted, and its relationship with the nerve fibre radius, ranging from 5 µm to 12 µm, was analyzed. It was interesting to note that there were no overlaps between the arrival times of action potentials traversing the demyelinated and normally myelinated nerve fibres, even when only a single internode of the nerve fibre was demyelinated. The study gave us an opportunity to design DC pulses whose frequency of application would be a function of the random demyelination pattern, to block only the delayed tremor-causing action potentials. The DC pulses could be delivered to the peripheral nervous system non-invasively by an electrode bracelet that would suppress any shakiness beyond it, thus paving the way for wearable neuro-rehabilitative technologies.
Keywords: demyelination, Hodgkin-Huxley model, non-invasive electrical stimulation, tremor
Procedia PDF Downloads 128
5394 Random Access in IoT Using Naïve Bayes Classification
Authors: Alhusein Almahjoub, Dongyu Qiu
Abstract:
This paper deals with the random access procedure in next-generation networks and presents a solution to reduce the total service time (TST), one of the most important performance metrics in current and future internet of things (IoT) based networks. The proposed solution focuses on the calculation of the optimal transmission probability, which maximizes the success probability and reduces the TST. It uses the information of several idle preambles in every time slot and, based on it, estimates the number of backlogged IoT devices using Naïve Bayes estimation, a type of supervised learning in the machine learning domain. The estimation of backlogged devices is necessary since the optimal transmission probability depends on it and the eNodeB does not have this information. Simulations carried out in MATLAB verify that the proposed solution gives excellent performance.
Keywords: random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation
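A simplified sketch of the two steps described above: estimating the backlog from the idle-preamble count, then setting the transmission probability. This is not the paper's Naïve Bayes classifier; it is a stand-in Bayesian point estimate that naively assumes each of the M preambles is idle independently with probability (1 - p/M)^N, and the function names, prior, and grid limit are our own.

```python
import numpy as np
from math import comb

def estimate_backlog(idle, M, p, n_max=200):
    """Point estimate of the number of backlogged devices N, given that
    `idle` of the M preambles were unused when each device transmitted
    with probability p and picked a preamble uniformly at random."""
    N = np.arange(1, n_max + 1)
    q = (1 - p / M) ** N                              # per-preamble idle probability
    like = comb(M, idle) * q ** idle * (1 - q) ** (M - idle)
    post = like / like.sum()                          # uniform prior over N
    return int(round((N * post).sum()))               # posterior mean

def optimal_tx_probability(N_hat, M):
    # Expected successes N*p*(1 - p/M)^(N-1) are maximized at p = M/N:
    # about one contending device per preamble.
    return min(1.0, M / N_hat)
```

With 54 contention preambles and roughly 100 backlogged devices transmitting at p = 0.5, about 21 preambles are expected to stay idle, which the estimator maps back to a backlog near 100.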
Procedia PDF Downloads 145
5393 Use of Sentiel-2 Data to Monitor Plant Density and Establishment Rate of Winter Wheat Fields
Authors: Bing-Bing E. Goh
Abstract:
Plant counting is a labour-intensive and time-consuming task for farmers. However, it is an important indicator for farmers to make decisions on subsequent field management. This study evaluates the potential of Sentinel-2 images, using statistical analysis, to retrieve information on plant density for monitoring, especially during the critical period at the beginning of March. The model was calibrated with in-situ data from 19 winter wheat fields in the Republic of Ireland during the 2019-2020 crop growing season. The model for plant density resulted in R² = 0.77, RMSECV = 103, and NRMSE = 14%. This study has shown the potential of using Sentinel-2 to estimate plant density and quantify plant establishment, to effectively monitor crop progress and ensure proper field management.
Keywords: winter wheat, remote sensing, crop monitoring, multivariate analysis
Procedia PDF Downloads 161
5392 Particle Filter Supported with the Neural Network for Aircraft Tracking Based on Kernel and Active Contour
Authors: Mohammad Izadkhah, Mojtaba Hoseini, Alireza Khalili Tehrani
Abstract:
In this paper, we present a new method for tracking flying targets in color video sequences based on contour and kernel information. The aim of this work is to overcome the problem of losing the target under changing light, large displacements, changing speed, and occlusion. The proposed method proceeds in three steps: estimate the target location with a particle filter, segment the target region using a neural network, and find the exact contours with a greedy snake algorithm. The method uses both region and contour information to create the target candidate model, which is dynamically updated during tracking. To avoid the accumulation of errors during updating, the target region is passed to a perceptron neural network that separates the target from the background; its output is then used to compute the exact size and center of the target and serves as the initial contour for the greedy snake algorithm, which finds the exact target edge. The proposed algorithm has been tested on a database containing many challenges, such as the high speed and agility of aircraft, background clutter, occlusions, camera movement, and so on. The experimental results show that the use of the neural network increases the accuracy of tracking and segmentation.
Keywords: video tracking, particle filter, greedy snake, neural network
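The first of the three steps can be sketched as one predict-update-resample cycle of a bootstrap particle filter over a 2D target position. This is a generic textbook sketch under assumed Gaussian motion and measurement models, not the paper's kernel-based likelihood; the standard-deviation parameters are hypothetical:

```python
import math
import random

def particle_filter_step(particles, measurement, motion_std=5.0, meas_std=10.0):
    # One predict-update-resample cycle of a bootstrap particle filter
    # estimating a 2D target location.
    # Predict: random-walk motion model.
    moved = [(x + random.gauss(0, motion_std), y + random.gauss(0, motion_std))
             for x, y in particles]
    # Update: Gaussian likelihood of the observed position measurement.
    mx, my = measurement
    weights = [math.exp(-((x - mx) ** 2 + (y - my) ** 2) / (2 * meas_std ** 2))
               for x, y in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a new equally-weighted particle set.
    return random.choices(moved, weights=weights, k=len(moved))

def estimate(particles):
    # Posterior mean as the point estimate of the target location.
    n = len(particles)
    return (sum(x for x, _ in particles) / n, sum(y for _, y in particles) / n)
```

In the paper's pipeline, this location estimate seeds the neural-network segmentation, whose output in turn initialises the greedy snake.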
Procedia PDF Downloads 341
5391 Random Walks and Option Pricing for European and American Options
Authors: Guillaume Leduc
Abstract:
In this paper, we describe a broad setting under which the error of random walk approximations to option prices can be quantified and controlled, and in which convergence occurs at a speed of n⁻¹ for European and American options. We describe how knowledge of the error allows for arbitrarily fast acceleration of the convergence.
Keywords: random walk approximation, European and American options, rate of convergence, option pricing
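A standard instance of such a random walk approximation is the Cox-Ross-Rubinstein binomial lattice, whose European-option pricing error is known to shrink on the order of 1/n in smooth settings. The sketch below is textbook CRR, not the paper's generalised framework:

```python
import math

def crr_european_call(S, K, r, sigma, T, n):
    # Cox-Ross-Rubinstein binomial (random-walk) price of a European call.
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1 / u                             # down factor
    q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)              # one-step discount factor
    # Payoffs at maturity, then backward induction through the lattice.
    values = [max(S * u ** j * d ** (n - j) - K, 0.0) for j in range(n + 1)]
    for _ in range(n):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]
```

Taking the early-exercise maximum at each backward step yields the American counterpart, which is why lattice rate-of-convergence results cover both option styles.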
Procedia PDF Downloads 463
5390 Cell Biomass and Lipid Productivities of Meyerella planktonica under Autotrophic and Heterotrophic Growth Conditions
Authors: Rory Anthony Hutagalung, Leonardus Widjaja
Abstract:
The microalga Meyerella planktonica is a potential biofuel source because it can grow in bulk under either autotrophic or heterotrophic conditions. However, the quantitative growth of this alga is still low, as it tends to precipitate to the bottom. Besides, the lipid concentration is low when it is grown autotrophically; in contrast, heterotrophic conditions can enhance the lipid concentration. A combination of autotrophic conditions and agitation was used to increase the density of the culture, while heterotrophic conditions were set up to raise lipid production. A two-stage experiment was applied: increase the density in the first stage, then increase the lipid concentration in the next. The autotrophic condition resulted in higher density but lower lipid concentration than the heterotrophic one, and agitation produced higher density under both conditions. The two-stage experiment succeeded in enhancing the density during the autotrophic stage and the lipid concentration during the heterotrophic stage. The highest yield, 2.9±0.016 x 10⁶ cells w/w, was achieved using 0.4% v/v glycerol as the carbon source and was attained 7 days after the heterotrophic stage began. The lipid concentration was stable from day 7.
Keywords: agitation, glycerol, heterotrophic, lipid productivity, Meyerella planktonica
Procedia PDF Downloads 337
5389 Living at Density: Resident Perceptions in Auckland, New Zealand
Authors: Errol J. Haarhoff
Abstract:
Housing in New Zealand, particularly in Auckland, is dominated by low-density suburbs. Over the past 20 years, housing intensification policies have been implemented, aimed at curbing outward low-density sprawl and concentrating development within an urban boundary. This requires the greater deployment of attached housing typologies such as apartments, duplexes and terrace housing. There has been a strong market response and uptake of higher density development, with the share of building approvals received by the Auckland Council for attached housing units increasing from around 15 percent in 2012/13 to 54 percent in 2017/18. A key question about intensification and strong market uptake in a city where lower density has been the norm is whether higher density neighborhoods will deliver the necessary housing satisfaction. This paper reports the findings of a questionnaire survey and focus group discussions probing resident perceptions of living at higher density in relation to their dwellings, their neighborhood and their sense of community. The findings reveal strong overall housing satisfaction, including on key aspects such as privacy, noise and living in close proximity to neighbors. However, when residents are differentiated by length of tenure, age or whether they are bringing up children, greater variation in satisfaction is detected. For example, residents in the 65-plus age cohort express much higher levels of satisfaction than the 18-44 year cohorts, who are more likely to be bringing up children. This suggests the need for greater design sensitivity to better accommodate the range of household types. Those who have lived in the area longer express greater satisfaction than those with shorter tenure, indicating that adaptation to living at higher density takes time. The findings strongly underpin the instrumental role that public amenities play in overall housing satisfaction and in the emergence of a strong sense of community.
This underscores the necessity for appropriate investment in the public amenities often lacking in market-led higher density housing development. We conclude with an evaluation of the PPP model and its part in delivering housing satisfaction. The findings should be of interest to cities, housing developers and built environment professionals pursuing housing policies that promote intensification and higher density.
Keywords: medium density, housing satisfaction, neighborhoods, sense of community
Procedia PDF Downloads 137
5388 Seven Years Assessment on the Suitability of Cocoa Clones Cultivation in High-Density Planting and Its Management in Malaysia
Authors: O. Rozita, N. M. Nik Aziz
Abstract:
High-density planting is usually recommended for small planting areas in order to increase production. The normal planting distance for cocoa (Theobroma cacao L.) in Malaysia is 3 m x 3 m. The study was conducted at the Cocoa Research and Development Centre, Malaysian Cocoa Board, Jengka, Pahang, with the objectives of evaluating the suitability of seven cocoa clones under four different planting densities and studying the interaction between cocoa clones and planting densities. The study was arranged as a split plot in a randomized complete block design with three replicates: cocoa clone was assigned to the main plot and planting density to the subplot. The clones used in this study were PBC 123, PBC 112, MCBC 4, MCBC 5, QH 1003, QH 22, and BAL 244. The planting distances were 3 m x 3 m (1000 stands/ha), 3 m x 1.5 m (2000 stands/ha), 3 m x 1 m (3000 stands/ha) and (1.5 m x 1.5 m) x 3 m (3333 stands/ha). Yield performance was evaluated over seven years. The clones PBC 123, QH 1003, and QH 22 obtained the highest yields, while MCBC 4, MCBC 5, and BAL 244 obtained the lowest. In general, high-density planting can increase cocoa production under good management practices. Among these practices, the selection of suitable clones, with small branching habits and moderate vigour, together with proper pruning, were the most important factors in high-density planting.
Keywords: clones, management, planting density, Theobroma cacao, yield
Procedia PDF Downloads 375
5387 1H-NMR Spectra of Diesel-Biodiesel Blends to Evaluate the Quality and Determine the Adulteration of Biodiesel with Vegetable Oil
Authors: Luis F. Bianchessi, Gustavo G. Shimamoto, Matthieu Tubino
Abstract:
The use of biodiesel has spread in Brazil and all over the world through the trading of pure biodiesel (B100). In Brazil, the diesel oil currently sold is a blend containing 7% biodiesel (B7). In this context, it is necessary to develop methods capable of identifying the composition of such blends, especially regarding the quality of the biodiesel used to make them. In this study, hydrogen nuclear magnetic resonance spectra (1H-NMR) are proposed as a way of identifying and confirming the quality of type B10 blends (10% biodiesel and 90% diesel). Furthermore, the presence of vegetable oils, which may come from fuel adulteration or be evidence of a low degree of transesterification conversion during the synthesis of B100, may also be identified. Mixtures of diesel, vegetable oils and their respective biodiesels were prepared, using soybean oil and macauba kernel oil as raw materials. The diesel proportion remained fixed at 90%; the remaining 10% was varied between vegetable oil and biodiesel. The 1H-NMR spectrum was obtained for each mixture in order to find a correlation between the spectra and the amount of biodiesel, as well as the amount of residual vegetable oil. The ratio of the integral of the methylenic hydrogen H-2 of glycerol (exclusive to vegetable oil) to the integral of the olefinic hydrogens (present in both vegetable oil and biodiesel) was obtained, and these ratios were correlated with the percentage of vegetable oil in each mixture, from 0% to 10%. The correlation could be described by linear relationships, with R² of 0.9929 for soybean biodiesel and 0.9982 for macauba kernel biodiesel. Preliminary results show that the technique can be used to monitor biodiesel quality in commercial diesel-biodiesel blends, besides indicating possible adulteration.
Keywords: biodiesel, diesel, biodiesel quality, adulteration
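The workflow described amounts to a linear calibration: fit integral ratio against known vegetable-oil percentage, then invert the line to estimate the residual oil in an unknown blend. A minimal sketch, using hypothetical calibration numbers rather than the study's measured integrals:

```python
def calibrate(oil_pcts, ratios):
    # Least-squares line: ratio = slope * oil_pct + intercept,
    # mirroring the linear correlations reported in the abstract.
    n = len(oil_pcts)
    mx, my = sum(oil_pcts) / n, sum(ratios) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(oil_pcts, ratios))
             / sum((x - mx) ** 2 for x in oil_pcts))
    return my - slope * mx, slope

def oil_content(ratio, intercept, slope):
    # Invert the calibration: estimate % residual vegetable oil in a blend
    # from the measured H-2 glycerol / olefinic integral ratio.
    return (ratio - intercept) / slope
```

An estimated oil content well above zero in a nominal B10 blend would then point to adulteration or incomplete transesterification, as the abstract suggests.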
Procedia PDF Downloads 623
5386 A Creative Strategy to Functionalize TiN/CNC Composites as Cathode for High-Energy Zinc Ion Capacitors
Authors: Ye Ling, Jiang Yuting, Ruan Haihui
Abstract:
Zinc ion capacitors (ZICs) have recently garnered tremendous interest from researchers because they integrate the merits of batteries and supercapacitors (SCs). However, ZICs still face two major challenges, one of which is low specific capacitance caused by the limited capacity of capacitive cathode materials. In this work, TiN/CNC composites were obtained by a creative method consisting of simple mixing and calcination of tetrabutyl titanate (TBOT) and ZIF-8. The TiN particles formed are ultra-small and distributed uniformly on the nanoporous carbon matrix, which enhances the conductivity of the composites, while the micropores created by the evaporation of zinc during calcination can serve as electrolyte reservoirs; both are beneficial to zinc ion storage. When the composite was used as the cathode in a ZIC device, with zinc metal as the anode and 2 M ZnSO₄ as the electrolyte, the assembled device delivered a maximum energy density as high as 153 Wh kg⁻¹ at a power density of 269.4 W kg⁻¹, superior to many reported ZICs. It also maintained an energy density of 83.7 Wh kg⁻¹ at a peak power density of 8.6 kW kg⁻¹, exhibiting good rate performance. Moreover, after 5000 charge/discharge cycles at a current density of 5 A g⁻¹, it retained 85.8% of its initial capacity with a Coulombic efficiency (CE) of nearly 100%.
Keywords: zinc ion capacitor, metal nitride, ZIF-8, supercapacitor
Procedia PDF Downloads 44
5385 A GIS-Based Study on Geographical Divisions of Sustainable Human Settlements in China
Authors: Wu Yiqun, Weng Jiantao
Abstract:
The human settlements of China were extracted from the land-use vector map by interpreting the Thematic Map of 2014. This paper establishes an evaluation system and a division model for the geographical division of sustainable human settlements using GIS. The results show that the density of human residential areas in China varies, with the density of sustainable settlement areas higher in the east and lower in the west. The regional differences in sustainable human settlements are obvious: the north has more than the south, the plain regions more than the hilly regions, and the economically developed regions more than the less developed regions. The geographical distribution of sustainable human settlements is measured by the degree of porosity, which correlates with sustainable settlement density: where the density is high, the porosity is low, the distribution is even, and the gaps between settlements are small.
Keywords: GIS, geographical division, sustainable human settlements, China
Procedia PDF Downloads 599