Search results for: random number
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11809

11539 An Efficient Acquisition Algorithm for Long Pseudo-Random Sequence

Authors: Wan-Hsin Hsieh, Chieh-Fu Chang, Ming-Seng Kao

Abstract:

In this paper, a novel method termed Phase Coherence Acquisition (PCA) is proposed for pseudo-random (PN) sequence acquisition. By employing complex phasors, the PCA requires only complex additions on the order of N, the length of the sequence, whereas the conventional method utilizing the fast Fourier transform (FFT) requires complex multiplications and additions, both on the order of N log2 N. In order to combat noise, the input and local sequences are partitioned and mapped into complex phasors in the PCA. The phase differences between pairs of input and local phasors are utilized for acquisition, and thus complex multiplications are avoided. For greater noise robustness, a multi-layer PCA is developed to extract the code phase step by step. The significant reduction of computational load makes the PCA an attractive method, especially when the sequence length is extremely large, which becomes intractable for FFT-based acquisition.
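
The abstract only characterises the complexity of the two approaches; for reference, the conventional FFT-based acquisition it is compared against amounts to a circular cross-correlation computed via the convolution theorem. A minimal sketch follows, with the PN sequence, noise level and code phase all assumed purely for illustration:

```python
import numpy as np

def fft_acquisition(received, local_pn):
    """Conventional FFT-based PN acquisition (the baseline the PCA is compared to).

    Circular cross-correlation via the convolution theorem:
    corr = IFFT( FFT(received) * conj(FFT(local_pn)) ).
    The index of the correlation peak is the estimated code phase.
    """
    R = np.fft.fft(received)
    L = np.fft.fft(local_pn)
    corr = np.fft.ifft(R * np.conj(L))
    return int(np.argmax(np.abs(corr)))

# Toy example (assumed parameters, not from the paper):
rng = np.random.default_rng(0)
N = 1024                                  # sequence length
pn = rng.choice([-1.0, 1.0], size=N)      # stand-in for a PN sequence
true_phase = 317
rx = np.roll(pn, true_phase) + 0.5 * rng.standard_normal(N)  # noisy, shifted copy

print(fft_acquisition(rx, pn))            # expected: 317
```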

Keywords: FFT, PCA, PN sequence, convolution theory

Procedia PDF Downloads 478
11538 Spatial Rank-Based High-Dimensional Monitoring through Random Projection

Authors: Chen Zhang, Nan Chen

Abstract:

High-dimensional process monitoring is increasingly important in many application domains, where the process distribution is usually unknown and far more complicated than the normal distribution, and the between-stream correlation cannot be neglected. However, since the process dimension is generally much larger than the reference sample size, most traditional nonparametric multivariate control charts fail in high-dimensional cases due to the curse of dimensionality. Furthermore, when the process goes out of control, the affected variables are quite sparse compared with the whole dimension, which increases the detection difficulty. Targeting these issues, this paper proposes a new nonparametric monitoring scheme for high-dimensional processes. The scheme first projects the high-dimensional process into several subprocesses using random projections for dimension reduction. Then, for every subprocess, whose dimension is much smaller than the reference sample size, a local nonparametric control chart is constructed based on the spatial rank test to detect changes in that subprocess. Finally, the results of all the local charts are fused together for decision making. Furthermore, after an out-of-control (OC) alarm is triggered, a diagnostic framework based on the square-root LASSO is proposed. Numerical studies demonstrate that the chart has satisfactory detection power for sparse OC changes and robust performance for non-normally distributed data, and that the diagnostic framework is effective in identifying the truly changed variables. Finally, a real-data example is presented to demonstrate the application of the proposed method.
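
The projection step described above can be illustrated with a small sketch; the Gaussian projection matrices, subprocess dimension and number of subprocesses below are assumptions chosen for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
p, k, d = 500, 20, 5          # process dimension, number of subprocesses, subprocess dimension

# One Gaussian random projection matrix per subprocess (assumed choice of distribution).
projections = [rng.standard_normal((d, p)) / np.sqrt(d) for _ in range(k)]

def to_subprocesses(x):
    """Map one p-dimensional observation to k low-dimensional subprocess observations,
    each of which would feed its own spatial-rank-based local chart."""
    return [A @ x for A in projections]

x = rng.standard_normal(p)    # one in-control observation
subs = to_subprocesses(x)
print(len(subs), subs[0].shape)   # 20 subprocesses, each of dimension 5
```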

Keywords: random projection, high-dimensional process control, spatial rank, sequential change detection

Procedia PDF Downloads 299
11537 Stock Price Prediction with 'Earnings' Conference Call Sentiment

Authors: Sungzoon Cho, Hye Jin Lee, Sungwhan Jeon, Dongyoung Min, Sungwon Lyu

Abstract:

Major public corporations worldwide use conference calls to report their quarterly earnings. These 'earnings' conference calls allow for questions from stock analysts. We investigated whether it is possible to identify sentiment from the call script and use it to predict stock price movement. We analyzed call scripts from six companies, two each from Korea, China and Indonesia, over the six years 2011Q1 – 2017Q2. A random forest with frequency-based sentiment scores from the Loughran-McDonald dictionary outperformed a control model using only financial indicators. When the stock price went up 20 days after the earnings release, our model predicted correctly 77% of the time. When the model predicted 'up,' the actual stock price went up 65% of the time. This preliminary result encourages us to investigate advanced sentiment scoring methodologies such as topic modeling, auto-encoders, and word2vec variants.
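
A hedged sketch of the frequency-based-sentiment-plus-random-forest pipeline is given below; the word lists merely stand in for the Loughran-McDonald dictionary, and the features and labels are synthetic placeholders, not the study's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder word lists standing in for the Loughran-McDonald dictionary.
POSITIVE = {"growth", "improve", "strong", "exceed"}
NEGATIVE = {"decline", "weak", "loss", "risk"}

def sentiment_score(script: str) -> float:
    """Frequency-based score: (positive - negative) word counts, normalised by length."""
    words = script.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / max(len(words), 1)

# Toy data: one sentiment score plus two illustrative financial indicators per call;
# label = 1 if the stock went up 20 days after the earnings release.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 3))
y = (X[:, 0] + 0.3 * rng.standard_normal(120) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:100], y[:100])
print(clf.score(X[100:], y[100:]))
```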

Keywords: earnings call script, random forest, sentiment analysis, stock price prediction

Procedia PDF Downloads 294
11536 Implementation and Challenges of Assessment Methods in the Case of Physical Education Class in Some Selected Preparatory Schools of Kirkos Sub-City

Authors: Kibreab Alene Fenite

Abstract:

The purpose of this study is to investigate the implementation and challenges of different assessment methods for physical education classes in some selected preparatory schools of Kirkos sub-city. The participants in this study are teachers, students, department heads and school principals from 4 selected schools. Of the 8 schools offering the subject in Kirkos sub-city, 4 schools (Dandi Boru, Abiyot Kirse, Assay, and Adey Ababa) were selected using simple random sampling, and from these schools all (100%) of the teachers, department heads and school principals were taken as a sample, as their number is manageable. From the total of 2,520 students, 252 (10%) were selected using simple random sampling. Accordingly, 13 teachers, 252 students, 4 department heads and 4 school principals were taken as a sample from the 4 selected schools. Questionnaires and interviews were employed as data-gathering tools. Both quantitative and qualitative methods were used to analyze the collected data. The results of the study revealed that assessment in physical education is not implemented properly: lack of sufficient materials, inadequate time allotment, large class size, lack of collaboration among teachers in assessing the performance of students, absence of guidelines for assessing the physical education subject, and the absence of alternative assessment methods for students with disabilities in line with their special needs were found to be the major challenges in implementing the current assessment method of physical education. To overcome these problems, the following recommendations have been forwarded: the necessary facilities and equipment should be made available; in order to make assessment reliable, accurate, objective and relevant, physical education teachers should be familiarized with different assessment techniques; physical education assessment guidelines should be prepared and should include different types of assessment methods; qualified teachers should be employed; and dedicated teaching rooms must be built.

Keywords: assessment, challenges, equipment, guidelines, implementation, performance

Procedia PDF Downloads 282
11535 A Study of Cost and Revenue Earned from Tourist Walking Street Activities in Songkhla City Municipality, Thailand

Authors: Weerawan Marangkun

Abstract:

This study is a survey intended to investigate the cost, revenue and factors affecting changes in revenue, and to provide guidelines for improving the factors affecting changes in revenue from tourist walking street activities in Songkhla City Municipality. The instruments used were structured interviews, with the sample size determined using Yamane's (1973) table at a random sampling error of ±10%. The sample, consisting of 83 entrepreneurs, was drawn from a total population of 272 using the purposive sampling method. Data were collected during the 6-month period from December 2011 until May 2012. The findings indicate that the cost paid by an entrepreneur in connection with his/her services for tourists is mainly for travel, at about 290 Baht per day. Each entrepreneur earns about 3,850 Baht per day, or about 400,000 Baht per year. The combined total revenue from walking street tourist activities is about 108.8 million Baht per year. Such activities add economic value to tourist facilities through expenditures by tourists and provide the entrepreneurs with considerable income. Factors affecting changes in revenue from tourist walking street activities are: the increase in the number of entrepreneurs; the holding of trade fairs, events or interesting shows in the vicinity; and weather conditions (e.g. abundant rainfall, which can contribute to a decrease in the number of tourists). Suggested measures to improve the factors affecting changes in income are: the addition or creation of new activities; regulation of the operation of the stalls and parking area; and greater publicity through social networks.

Keywords: cost, revenue, tourist, walking street

Procedia PDF Downloads 362
11534 First Digit Lucas, Fibonacci and Benford Number in Financial Statement

Authors: Teguh Sugiarto, Amir Mohamadian Amiri

Abstract:

Background: This study aims to explore whether there is fraud in a company's financial report distribution, using the first-digit Lucas, Fibonacci and Benford numbers. Research methods: The author uses the first-digit distributions of the Lucas, Fibonacci and Benford models, distinguishing between observed digit frequencies that differ from these reference distributions by more than or less than 5%. If there is a significant difference above 5%, follow-up and detection of possible fraud in the financial statements can be carried out. Findings: From the research conducted, it can be concluded that the first-digit frequency distributions in the annual financial statements of PT Bank BRI Tbk give consistent results across the Lucas, Fibonacci and Benford models.
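
The test procedure is not spelled out in the abstract; as a rough illustration, a first-digit check against the Benford expectation P(d) = log10(1 + 1/d) with the 5% threshold mentioned above could look as follows. The Lucas and Fibonacci variants would only swap in a different expected distribution, and the data here are synthetic, not the bank's figures:

```python
import numpy as np

def first_digit(x: float) -> int:
    s = f"{abs(x):.15e}"      # scientific notation: the first character is the leading digit
    return int(s[0])

def benford_deviation(values):
    """Observed minus expected (Benford) first-digit frequencies, for digits 1..9."""
    digits = np.array([first_digit(v) for v in values if v != 0])
    observed = np.array([(digits == d).mean() for d in range(1, 10)])
    expected = np.log10(1 + 1 / np.arange(1, 10))
    return observed - expected

# Toy ledger amounts spanning exactly four decades (log-uniform data follows Benford closely).
rng = np.random.default_rng(0)
amounts = np.exp(rng.uniform(0, 4 * np.log(10), size=2000))
dev = benford_deviation(amounts)
flagged = np.abs(dev) > 0.05              # 5% threshold mentioned in the abstract
print(np.round(dev, 3), flagged.any())
```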

Keywords: Lucas, Fibonacci, Benford, first digit

Procedia PDF Downloads 274
11533 Identification of Candidate Congenital Heart Defects Biomarkers by Applying a Random Forest Approach on DNA Methylation Data

Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang

Abstract:

Background and Significance of the Study: Congenital heart defects (CHDs) are the most common malformation at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications, such as DNA methylation, are thought to contribute to the pathogenesis of CHDs. At present, no DNA methylation biomarkers are used for early detection of CHDs. Existing CHD diagnostic techniques are time-consuming and costly and can only be used to diagnose CHDs after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs, with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome-wide DNA methylation profiles of 24 infants diagnosed with CHDs and 24 healthy infants without CHDs. Primary pre-processing was conducted using the RnBeads and limma packages. The methylation levels of the top 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both the training and test sample sets. The functions of the selected genes with high sensitivity and specificity were then assessed in terms of molecular processes. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both the training and validation data by random forests, with an average sensitivity and specificity of 85% and 95%. GO analyses of the top 600 genes showed that these putative differentially methylated genes were primarily associated with regulation of lipid metabolic processes, protein-containing complex localization, and the Notch signalling pathway. The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
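
The classification and ROC steps are standard; a minimal scikit-learn sketch with synthetic methylation values in place of the EPIC BeadChip data is shown below (sample and gene counts mirror the abstract, everything else is assumed):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_genes = 48, 600            # 24 CHD + 24 controls, top 600 genes (as in the abstract)
X = rng.uniform(0, 1, size=(n_samples, n_genes))    # synthetic methylation beta values
y = np.repeat([1, 0], 24)                            # 1 = CHD, 0 = healthy
X[y == 1, :3] += 0.3                                 # pretend three genes are differentially methylated

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

proba = clf.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, proba))
fpr, tpr, thresholds = roc_curve(y_te, proba)        # points of the ROC curve
top = np.argsort(clf.feature_importances_)[::-1][:3]
print("top candidate features:", top)
```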

Keywords: biomarker, congenital heart defects, DNA methylation, random forest

Procedia PDF Downloads 159
11532 Nonlinear Vibration of FGM Plates Subjected to Acoustic Load in Thermal Environment Using Finite Element Modal Reduction Method

Authors: Hassan Parandvar, Mehrdad Farid

Abstract:

In this paper, a finite element model is presented for the large-amplitude vibration of functionally graded material (FGM) plates subjected to combined random pressure and thermal load. The material properties of the plates are assumed to vary continuously in the thickness direction according to a simple power-law distribution in terms of the volume fractions of the constituents. The material properties depend on the temperature, whose distribution along the thickness can be expressed explicitly. The von Kármán large-deflection strain-displacement relations and the extended Hamilton's principle are used to obtain the governing system of equations of motion in structural nodal degrees of freedom (DOF) using the finite element method. A three-node triangular Mindlin plate element with a shear correction factor is used. The nonlinear equations of motion in structural degrees of freedom are reduced using the modal reduction method. The reduced equations of motion are solved numerically by a 4th-order Runge-Kutta scheme. In this study, the random pressure is generated using the Monte Carlo method. The modeling is verified, and the nonlinear dynamic response of FGM plates is studied for various values of volume fraction and sound pressure level under different thermal loads. Snap-through type behavior of FGM plates is studied as well.
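
The reduced modal equations and the load model are not given in the abstract; purely as an illustration of the numerical workflow, a single-mode reduced equation with a cubic (von Kármán type) stiffness term, driven by Monte Carlo random pressure samples and integrated with a 4th-order Runge-Kutta scheme, might look like this (all parameter values assumed):

```python
import numpy as np

# Assumed single-mode reduced model: q'' + 2*zeta*w*q' + w^2*q + beta*q^3 = p(t)
zeta, w, beta = 0.02, 2 * np.pi * 50.0, 1.0e6

rng = np.random.default_rng(0)
dt, n_steps = 1e-4, 20000
p = 50.0 * rng.standard_normal(n_steps + 1)       # Monte Carlo random pressure samples

def f(state, pk):
    q, v = state
    return np.array([v, pk - 2 * zeta * w * v - w**2 * q - beta * q**3])

state = np.array([0.0, 0.0])
history = np.empty(n_steps)
for k in range(n_steps):                          # classical 4th-order Runge-Kutta step
    pk = p[k]
    k1 = f(state, pk)
    k2 = f(state + 0.5 * dt * k1, pk)
    k3 = f(state + 0.5 * dt * k2, pk)
    k4 = f(state + dt * k3, pk)
    state = state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    history[k] = state[0]

print("RMS modal displacement:", np.sqrt(np.mean(history**2)))
```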

Keywords: nonlinear vibration, finite element method, functionally graded material (FGM) plates, snap-through, random vibration, thermal effect

Procedia PDF Downloads 263
11531 Fast and Robust Long-term Tracking with Effective Searching Model

Authors: Thang V. Kieu, Long P. Nguyen

Abstract:

Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, the algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion, or going out of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning, obtained by analyzing the response map of each frame, and a classification algorithm based on random ferns for a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the proposed algorithm achieved precision and success rates 2.92 and 2.61 times higher, respectively, than those of the original KCF algorithm. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
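
The abstract does not specify how the response map is analysed; one common confidence measure for correlation-filter response maps is the peak-to-sidelobe ratio, so a target-loss warning could be sketched as follows (the exclusion window and threshold are assumed tuning values, not the paper's):

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    """Peak-to-sidelobe ratio of a KCF-style response map.

    The peak and a small window around it are excluded; the remaining
    'sidelobe' pixels give the mean and std used for normalisation.
    """
    py, px = np.unravel_index(np.argmax(response), response.shape)
    peak = response[py, px]
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1, max(0, px - exclude):px + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def target_lost(response, threshold=7.0):     # threshold is an assumed tuning value
    return peak_to_sidelobe_ratio(response) < threshold

# Toy response map with a clear peak -> tracking considered reliable.
resp = 0.05 * np.random.default_rng(0).random((64, 64))
resp[30, 40] = 1.0
print(peak_to_sidelobe_ratio(resp), target_lost(resp))
```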

Keywords: correlation filter, long-term tracking, random fern, real-time tracking

Procedia PDF Downloads 139
11530 The Effect of Spatial Variability on Axial Pile Design of Closed Ended Piles in Sand

Authors: Cormac Reale, Luke J. Prendergast, Kenneth Gavin

Abstract:

While significant improvements have been made in axial pile design methods over recent years, the influence of the soil's natural variability has not been adequately accounted for within them. Soil variability is a crucial parameter to consider, as it can account for large variations in pile capacity across the same site. This paper seeks to address this knowledge deficit by demonstrating how soil spatial variability can be accommodated in existing cone penetration test (CPT) based pile design methods, in the form of layered non-homogeneous random fields. These random fields model the scope of a given property's variance and define how it varies spatially. A Monte Carlo analysis of the pile will be performed, taking into account parameter uncertainty and spatial variability described using the measured scales of fluctuation. The results will be discussed in light of Eurocode 7, and the effect of spatial averaging on design capacities will be analysed.
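
As a rough illustration of the ingredients described above, the sketch below generates one-dimensional lognormal random fields of CPT cone resistance with an exponential correlation structure and runs them through a Monte Carlo loop; the correlation model, scale of fluctuation and capacity expressions are simplified assumptions, not the paper's design method:

```python
import numpy as np

rng = np.random.default_rng(0)

depth = np.arange(0.0, 20.0, 0.25)         # pile depth discretisation [m]
theta = 1.0                                 # assumed vertical scale of fluctuation [m]
mu_ln, sigma_ln = np.log(10.0), 0.3         # assumed lognormal q_c parameters [MPa]

# Exponential (Markov) correlation structure and its Cholesky factor.
C = np.exp(-2 * np.abs(depth[:, None] - depth[None, :]) / theta)
Lc = np.linalg.cholesky(C + 1e-10 * np.eye(len(depth)))

def simulate_capacity():
    """One random-field realisation of q_c and a toy CPT-based axial capacity [MN]."""
    qc = np.exp(mu_ln + sigma_ln * (Lc @ rng.standard_normal(len(depth))))
    shaft = np.sum(0.01 * qc * np.pi * 0.5 * 0.25)    # toy shaft resistance, D = 0.5 m, dz = 0.25 m
    base = 0.6 * qc[-1] * np.pi * 0.5**2 / 4          # toy base resistance from q_c at the tip
    return shaft + base

capacities = np.array([simulate_capacity() for _ in range(5000)])
print("mean capacity [MN]:", capacities.mean(), " 5th percentile:", np.percentile(capacities, 5))
```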

Keywords: pile axial design, reliability, spatial variability, CPT

Procedia PDF Downloads 246
11529 CFD Investigation of Turbulent Mixed Convection Heat Transfer in a Closed Lid-Driven Cavity

Authors: A. Khaleel, S. Gao

Abstract:

Both steady and unsteady turbulent mixed convection heat transfer in a 3D lid-driven enclosure, which has a constant heat flux on the middle of the bottom wall and isothermal moving sidewalls, is reported in this paper for a working fluid with Prandtl number Pr = 0.71. The other walls are adiabatic and stationary. The dimensionless parameters used in this research are the Reynolds number, Re = 5000, 10000 and 15000, and the Richardson number, Ri = 1 and 10. The simulations have been carried out using different turbulence modelling approaches, namely RANS, URANS, and LES. The effects of using different k-ε models, such as the standard, RNG and realizable k-ε models, are investigated. Interesting behaviours of the thermal and flow fields with changing Re or Ri numbers are observed. Isotherm and turbulent kinetic energy distributions and the variation of the local Nusselt number at the hot bottom wall are studied as well. The local Nusselt number is found to increase with increasing either the Re or the Ri number. In addition, the turbulent kinetic energy is discernibly affected by increasing the Re number. Moreover, the LES results have shown a good ability of this method in predicting more detailed flow structures in the cavity.

Keywords: mixed convection, lid-driven cavity, turbulent flow, RANS model, large Eddy simulation

Procedia PDF Downloads 211
11528 A Remote Sensing Approach to Calculate Population Using Roads Network Data in Lebanon

Authors: Kamel Allaw, Jocelyne Adjizian Gerard, Makram Chehayeb, Nada Badaro Saliba

Abstract:

In developing countries such as Lebanon, demographic data are hardly available due to the absence of a mechanized population registration system. The aim of this study is to evaluate, using only remote sensing data, the correlations between population and the characteristics of the road network (length of primary roads, length of secondary roads, total length of roads, density and percentage of roads, and the number of intersections). In order to find the influence of the different factors on the demographic data, we studied the degree of correlation between each factor and the population. The results of this study show a strong correlation between population and both the density of roads and the number of intersections.

Keywords: population, road network, statistical correlations, remote sensing

Procedia PDF Downloads 163
11527 Location-Domination on Join of Two Graphs and Their Complements

Authors: Analen Malnegro, Gina Malacas

Abstract:

Dominating sets and related topics have been studied extensively in the past few decades. A dominating set of a graph G is a subset D of V such that every vertex not in D is adjacent to at least one member of D. The domination number γ(G) is the number of vertices in a smallest dominating set for G. Some problems involving detection devices can be modeled with graphs. Finding the minimum number of devices needed, according to the type of devices and the necessity of locating the object, gives rise to locating-dominating sets. A subset S of vertices of a graph G is called a locating-dominating set, LD-set for short, if it is a dominating set and if every vertex v not in S is uniquely determined by the set of neighbors of v belonging to S. The location-domination number λ(G) is the minimum cardinality of an LD-set for G. The complement of a graph G is a graph Ḡ on the same vertex set such that two distinct vertices of Ḡ are adjacent if and only if they are not adjacent in G. An LD-set of a graph G is global if it is an LD-set of both G and its complement Ḡ. The global location-domination number λg(G) is defined as the minimum cardinality of a global LD-set of G. In this paper, global LD-sets on the join of two graphs are characterized. Global location-domination numbers of these graphs are also determined.
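
For small graphs, the definitions above translate directly into a brute-force search; the sketch below (using networkx, purely for illustration) computes λ(G), λ(Ḡ) and λg(G) by testing candidate sets in increasing size:

```python
from itertools import combinations
import networkx as nx

def is_ld_set(G, S):
    """S is locating-dominating if every vertex outside S has a non-empty,
    pairwise-distinct set of neighbours inside S."""
    S = set(S)
    signatures = []
    for v in set(G) - S:
        sig = frozenset(G[v]) & S
        if not sig:                                   # v is not dominated
            return False
        signatures.append(sig)
    return len(signatures) == len(set(signatures))    # all neighbourhood traces distinct

def location_domination_number(G):
    for k in range(1, G.number_of_nodes() + 1):
        for S in combinations(G, k):
            if is_ld_set(G, S):
                return k, set(S)

def global_location_domination_number(G):
    Gc = nx.complement(G)
    for k in range(1, G.number_of_nodes() + 1):
        for S in combinations(G, k):
            if is_ld_set(G, S) and is_ld_set(Gc, S):  # LD-set of both G and its complement
                return k, set(S)

def graph_join(G, H):
    """Join G + H: disjoint union plus every edge between the two vertex sets."""
    J = nx.disjoint_union(G, H)                       # relabels nodes 0..|G|-1 and |G|..|G|+|H|-1
    J.add_edges_from((u, v) for u in range(len(G)) for v in range(len(G), len(G) + len(H)))
    return J

G = graph_join(nx.path_graph(3), nx.path_graph(2))    # example: P3 + P2
print(location_domination_number(G))
print(location_domination_number(nx.complement(G)))
print(global_location_domination_number(G))
```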

Keywords: dominating set, global locating-dominating set, global location-domination number, locating-dominating set, location-domination number

Procedia PDF Downloads 184
11526 Voxel Models as Input for Heat Transfer Simulations with Siemens NX Based on X-Ray Microtomography Images of Random Fibre Reinforced Composites

Authors: Steven Latré, Frederik Desplentere, Ilya Straumit, Stepan V. Lomov

Abstract:

A method is proposed to create a three-dimensional finite element model representing fibre-reinforced insulation materials for the simulation software Siemens NX. The VoxTex software, a tool for the quantification of µCT images of fibrous materials, is used for the transformation of microtomography images of random fibre-reinforced composites into finite element models. An automatic tool was developed to import the models into the thermal solver module of Siemens NX. The paper describes the numerical tools used for the image quantification and the transformation, and illustrates them with several thermal simulations of fibre-reinforced insulation blankets filled with fillers of low thermal conductivity. The calculated thermal conductivity is validated by comparison with experimental data.

Keywords: analysis, modelling, thermal, voxel

Procedia PDF Downloads 287
11525 Reduced Power Consumption by Randomization for DSI3

Authors: David Levy

Abstract:

The newly released Distributed System Interface 3 (DSI3) Bus Standard specification defines 3 modulation levels from which 16 valid symbols are coded. This structure creates power consumption variations of a factor of more than 2 between minimum and maximum, depending on the transmitted data. The power generation unit therefore has to be designed for the worst-case maximum consumption at all times. This paper proposes a method to reduce both the average and the worst-case current consumption. The transmitter randomizes the data using several pseudo-random sequences. It then estimates the energy consumption of the generated frames and selects for transmission the one that consumes the least. The transmitter also prepends the index of the pseudo-random sequence, which is not randomized, to allow the receiver to recover the original data using the correct sequence. We show that, when the frame occupies most of the DSI3 synchronization period, the average power consumption is reduced by up to 13% and the worst-case power consumption by 17.7%.
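
The DSI3 symbol coding and frame energy model are not reproduced here; the selection mechanism itself can be sketched as follows, with the per-level current costs and the scrambling operation treated as assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
N_SEQ = 8                                   # number of candidate pseudo-random sequences
SYMBOL_COST = np.array([1.0, 1.6, 2.3])     # assumed relative current per modulation level

# Fixed, publicly known scrambling sequences (both transmitter and receiver know them).
scramblers = [rng.integers(0, 3, size=64) for _ in range(N_SEQ)]

def frame_energy(symbols):
    return SYMBOL_COST[symbols].sum()

def encode(payload_symbols):
    """Pick the scrambler giving the cheapest frame; prepend its (unscrambled) index."""
    candidates = [(payload_symbols + s) % 3 for s in scramblers]    # assumed scrambling operation
    best = min(range(N_SEQ), key=lambda i: frame_energy(candidates[i]))
    return best, candidates[best]

def decode(index, scrambled):
    return (scrambled - scramblers[index]) % 3

payload = rng.integers(0, 3, size=64)
idx, tx = encode(payload)
assert np.array_equal(decode(idx, tx), payload)
print("selected sequence:", idx, " energy:", frame_energy(tx), "vs original:", frame_energy(payload))
```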

Keywords: DSI3, energy, power consumption, randomization

Procedia PDF Downloads 538
11524 The Norm, Singular Value and Condition Number Analysis for the Hadamard Matrices

Authors: Emine Tuğba Akyüz

Abstract:

In this study, Hadamard matrices, a special type of matrix, are analysed under three headings: norms, singular values, and condition number. Six norm types were applied to Hadamard matrices, and the relationship between the results and the size of the matrix was studied. As a result of the investigation, when the 2-norm was used on the problem Hx = f, the relation ‖x‖_2 = ‖f‖_2/√n was shown (H is the n-dimensional Hadamard matrix). In connection with this, the relationship between the singular values of H, the 2-norm and the eigenvalues was shown. Then, the condition number for Hx = f was evaluated.
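
Because HᵀH = nI for an n×n Hadamard matrix, all singular values equal √n, the 2-norm condition number is 1, and the solution of Hx = f indeed satisfies ‖x‖_2 = ‖f‖_2/√n; this can be checked numerically:

```python
import numpy as np
from scipy.linalg import hadamard

n = 16
H = hadamard(n)                          # Sylvester-type Hadamard matrix (n a power of two)

f = np.random.default_rng(0).standard_normal(n)
x = np.linalg.solve(H, f)

print(np.linalg.svd(H, compute_uv=False))                   # all singular values equal sqrt(n) = 4
print(np.linalg.cond(H, 2))                                  # 2-norm condition number = 1
print(np.linalg.norm(x), np.linalg.norm(f) / np.sqrt(n))     # ||x||_2 == ||f||_2 / sqrt(n)
```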

Keywords: condition number, Hadamard matrix, norm, singular value

Procedia PDF Downloads 343
11523 Dental Ethics versus Malpractice, as Phenomenon with a Growing Trend

Authors: Saimir Heta, Kers Kapaj, Rialda Xhizdari, Ilma Robo

Abstract:

Dealing with emerging cases of dental malpractice that are justified by appeal to the clear rules of dental ethics is a phenomenon with an increasing trend in today's dental practice. Dentists should clearly understand where the limit of malpractice lies, with or without minimal or major consequences for the affected patient, and when harm can legitimately be explained as a complication of dental treatment supported by the rules of dental ethics in the dental office. Indeed, malpractice can occur in cases of a lack of professionalism, but it can also arise as a consequence of anatomical and physiological limitations in the implementation of the dental protocols predetermined and indicated for the patient in the treatment plan of his or her personal record. This study is a review of the latest findings published in the literature on this problem. Keywords were combined so as to give the necessary scope for collecting the relevant information from publication databases in this field, always from the point of view of the dentist rather than that of the lawyer or jurist. The findings included in this article show that the approaches to the phenomenon differ between countries, depending on their legal frameworks. There are few articles that touch on this topic, and those articles present only a limited amount of data. Conclusions: Dental malpractice should not be hidden under the guise of various dental complications justified by the strict rules of ethics for patients treated in the dental chair. Individual experiences of dental malpractice should be published so that they can serve as a source of experience for future generations of dentists.

Keywords: dental ethics, malpractice, professional protocol, random deviation

Procedia PDF Downloads 97
11522 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays

Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal

Abstract:

Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have been widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since the design's circuit is stored in configuration memory in SRAM-based FPGAs, they are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on electronics used in space are much greater than on Earth. Thus, developing fault-tolerance techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault-tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is required for the trade-off between the overhead introduced by fault-tolerance techniques and system robustness. We study applications in which the exact final output value is not necessarily always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose an Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating the configuration memory vulnerability to SEUs. We assess the vulnerability of the configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, in our proposed method a threshold margin is considered depending on the user's application. Given the proposed threshold margin, a failure occurs only when the difference between the erroneous output value and the expected output value is greater than this margin. The ACMVF is subsequently calculated as the ratio of failures to the total number of SEU injections. A test bench for emulating SEUs and calculating the ACMVF is implemented on the Zynq-7000 FPGA platform. This system makes use of the Single Event Mitigation (SEM) IP core to inject SEUs into the configuration memory bits of the target design implemented on the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is tolerated, the number of counted failures is reduced by 41% to 59% compared with the number of failures counted by the conventional vulnerability factor calculation. This means that the estimation accuracy of the configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
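
The ACMVF itself reduces to a thresholded failure count over the injection campaign; a minimal sketch is shown below, with synthetic injection outcomes rather than data from the Zynq-7000 test bench:

```python
import numpy as np

def acmvf(outputs_with_seu, expected, margin_fraction):
    """Approximate-based Configuration Memory Vulnerability Factor.

    An SEU injection counts as a failure only if the output deviates from the
    expected value by more than the user-defined margin; margin_fraction = 0
    reproduces the conventional vulnerability factor.
    """
    outputs_with_seu = np.asarray(outputs_with_seu, dtype=float)
    margin = margin_fraction * abs(expected)
    failures = np.abs(outputs_with_seu - expected) > margin
    return failures.mean()

# Synthetic example: 10,000 injections on an adder whose correct sum is 1,000,000.
rng = np.random.default_rng(0)
expected = 1_000_000
flips = rng.choice([0, 1, -1], size=10_000, p=[0.9, 0.05, 0.05])
outs = expected + flips * rng.integers(1, 200_000, size=10_000)

print("conventional VF  :", acmvf(outs, expected, 0.00))   # any deviation is a failure
print("ACMVF (10% margin):", acmvf(outs, expected, 0.10))  # deviations within 10% tolerated
```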

Keywords: fault tolerance, FPGA, single event upset, approximate computing

Procedia PDF Downloads 199
11521 Setting the Acceleration Test Conditions for Establishing the Expiration Date of Probiotics

Authors: Myoyeon Kim

Abstract:

The number of probiotic bacteria varies from product to product. A product must contain at least as many viable bacteria as it claims, because this greatly affects consumers' choices. It is very difficult to determine the number of viable bacteria in tests carried out during the product development stage, because the shelf life of lactic acid bacteria is mostly 18 to 24 months and product development proceeds much faster than this. To predict the shelf life, a method of checking the number of viable bacteria over a shortened time was studied. The experiment was conducted on a total of 7 products, including our own. An ongoing test with storage at room temperature and accelerated tests with storage at 30°C and 40°C were performed, and the number of bacteria was measured every two weeks. The number of viable bacteria after 12 weeks of storage at 30°C was similar to that of the ongoing test when the shelf life was imminent. If the test took more than 12 weeks, the product development schedule had to be postponed, so further acceleration had no meaning. Products stored at 40°C were found to be unsuitable for the acceleration test temperature because the bacteria were almost completely killed within 4 to 8 weeks.

Keywords: probiotics, shelf-life, acceleration test, lactobacillus

Procedia PDF Downloads 38
11520 Markov Random Field-Based Segmentation Algorithm for Detection of Land Cover Changes Using Uninhabited Aerial Vehicle Synthetic Aperture Radar Polarimetric Images

Authors: Mehrnoosh Omati, Mahmod Reza Sahebi

Abstract:

Information on land use/land cover change plays an essential role in environmental assessment, planning and management in regional development. Remotely sensed imagery is widely used to provide information in many change detection applications. Polarimetric synthetic aperture radar (PolSAR) imagery, with its capability to discriminate between different scattering mechanisms, is a powerful tool for environmental monitoring applications. This paper proposes a new boundary-based segmentation algorithm as a fundamental step for land cover change detection. In this method, two PolSAR images are first segmented using an integration of the marker-controlled watershed algorithm and a coupled Markov random field (MRF). Then, object-based classification is performed to determine changed/unchanged image objects. Compared with a pixel-based support vector machine (SVM) classifier, this novel segmentation algorithm significantly reduces the speckle effect in PolSAR images and improves the accuracy of binary classification at the object-based level. The experimental results on Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) polarimetric images show a 3% and 6% improvement in overall accuracy and kappa coefficient, respectively. In addition, the proposed method can correctly distinguish homogeneous image parcels.

Keywords: coupled Markov random field (MRF), environment, object-based analysis, polarimetric SAR (PolSAR) images

Procedia PDF Downloads 219
11519 DeepNIC: A Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL become the universal tool for data classification? Current solutions consist in repositioning the variables in a 2x2 matrix using their correlation proximity, thereby obtaining an image whose pixels are the variables. We implement a technology, DeepNIC, that offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in 3 dimensions: performance, complexity and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 super-parameters used in the Neurops. By varying these 2 super-parameters, we obtain a 2x2 matrix of probabilities for each NIC. We can combine these 10 NICs with the functions AND, OR, and XOR. The total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels. The intensity of the pixels is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison to several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
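
The ROP/Neurop/NIC pipeline is specific to the authors and is not reproduced here; only the final step, a basic CNN classifying one grayscale image per variable, is generic. A hedged sketch of that stage is given below, with the image size reduced and the data synthetic for illustration:

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-ins for per-variable NIC images (the real ones are ~1166x1167 pixels; reduced here).
rng = np.random.default_rng(0)
X = rng.random((200, 64, 64, 1)).astype("float32")
y = (X[:, :32, :32, 0].mean(axis=(1, 2)) > 0.5).astype("float32")   # toy labelling rule

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X[:160], y[:160], epochs=3, verbose=0)
print(model.evaluate(X[160:], y[160:], verbose=0))   # [loss, accuracy] on the held-out images
```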

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 128
11518 Numerical Study of Rayleigh Number and Eccentricity Effect on Free Convection Fluid Flow and Heat Transfer of an Annulus

Authors: Ali Reza Tahavvor‚ Saeed Hosseini, Behnam Amiri

Abstract:

Concentric and eccentric annuli are used frequently in technical and industrial applications such as nuclear reactors, thermal storage systems, etc. In this paper, computational fluid dynamics (CFD) is used to investigate two-dimensional laminar free convection in an annulus with isothermal cylinder surfaces, the inner surface being the cooler one. The problem was studied for thirty different cases. Due to natural convection, the continuity and momentum equations are coupled and must be solved simultaneously. The finite volume method is used for solving the governing equations. The purpose was to obtain the effect of eccentricity on the Nusselt number at different Rayleigh numbers, so the streamlines and temperature fields had to be determined. The results show that the highest Nusselt number values occur at an upward eccentricity of 0.5 for the inner cylinder and an upward eccentricity of 0.3 for the outer cylinder. Side eccentricity reduces the outer cylinder Nusselt number but increases the inner cylinder Nusselt number. The trend in the variation of the Nusselt number with respect to eccentricity remains similar at different Rayleigh numbers. Correlations are included to calculate the Nusselt numbers of the cylinders.

Keywords: natural convection, concentric, eccentric, Nusselt number, annulus

Procedia PDF Downloads 373
11517 Structural Reliability Analysis Using Extreme Learning Machine

Authors: Mehul Srivastava, Sharma Tushar Ravikant, Mridul Krishn Mishra

Abstract:

In structural design, the evaluation of the safety and probability of failure of a structure is of significant importance, mainly when the variables are random. For real structures, structural reliability can be evaluated by obtaining an implicit limit state function. The structural reliability limit state function is obtained in terms of statistically independent variables. In this reliability analysis, the statistically independent random variables considered are the applied load intensity and the depth (height) of the beam member. There are many approaches to structural reliability problems. In this paper, the Extreme Learning Machine technique and the First Order Second Moment Method are used to determine the reliability indices for the same set of variables. The reliability index obtained using ELM is compared with the reliability index obtained using FOSM. The higher the reliability index, the more feasible the method for determining the reliability.
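
For reference, the mean-value First Order Second Moment index used as the baseline above is beta = g(mu) / sqrt(sum_i (dg/dx_i * sigma_i)^2) for independent variables. A small sketch with an illustrative beam-bending limit state in the two variables mentioned (load intensity and beam depth) is given below; all numbers and the limit state form are assumptions for illustration, not the paper's model:

```python
import numpy as np
from math import erf, sqrt

def fosm_beta(g, mu, sigma, eps=1e-6):
    """Mean-value FOSM reliability index for independent random variables;
    gradients are taken by forward finite differences at the mean point."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    g0 = g(mu)
    grad = np.empty(len(mu))
    for i in range(len(mu)):
        x = mu.copy()
        x[i] += eps
        grad[i] = (g(x) - g0) / eps
    return g0 / np.sqrt(np.sum((grad * sigma) ** 2))

# Illustrative implicit limit state: bending capacity of a rectangular beam minus the
# midspan moment from a uniformly distributed load (all values assumed).
b, L, f_b = 0.25, 5.0, 25_000.0            # width [m], span [m], bending strength [kPa]
def limit_state(x):
    w, d = x                                # load intensity [kN/m], beam depth [m]
    return f_b * b * d**2 / 6.0 - w * L**2 / 8.0

beta = fosm_beta(limit_state, mu=[30.0, 0.40], sigma=[8.0, 0.02])
print("FOSM beta:", round(beta, 2), " Pf ~", 0.5 * (1 - erf(beta / sqrt(2))))
```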

Keywords: reliability, reliability index, statistically independent, extreme learning machine

Procedia PDF Downloads 684
11516 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks

Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev

Abstract:

One possible approach to maintaining the security of communication systems relies on physical layer security mechanisms. However, in wireless time division duplex systems, where the uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, whose aim is to have an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each belonging to a constellation shifted from the original N-PSK symbols by a certain angle. In this paper, the legitimate pilots' offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than on their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the pilot contamination attack (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations of the shifted constellations makes such an attack difficult to mount successfully. For this reason, the number of optimal shift-value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from signals in other cells should also be taken into account. Therefore, the impact of inter-cell interference on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of the Shifted 2-N-PSK method decreases inversely with the signal-to-interference-plus-noise ratio.
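
The shifted constellations themselves are straightforward to generate; a small sketch producing the two legitimate pilot constellations from a pair of offset angles is given below (the angles are placeholders, since the paper's point is precisely how their relation affects detection):

```python
import numpy as np

def shifted_npsk(N, offset_rad):
    """N-PSK constellation rotated by a given offset angle."""
    return np.exp(1j * (2 * np.pi * np.arange(N) / N + offset_rad))

rng = np.random.default_rng(0)
N = 8
theta1, theta2 = np.deg2rad(10.0), np.deg2rad(37.0)   # assumed pair of legitimate shift angles

const1 = shifted_npsk(N, theta1)
const2 = shifted_npsk(N, theta2)

# Two random legitimate pilots for the training phase, one from each shifted constellation.
p1 = const1[rng.integers(N)]
p2 = const2[rng.integers(N)]
print(np.angle(p1, deg=True), np.angle(p2, deg=True))
```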

Keywords: channel estimation, inter-cell interference, pilot contamination attacks, wireless communications

Procedia PDF Downloads 217
11515 Breast Cancer Detection Using Machine Learning Algorithms

Authors: Jiwan Kumar, Pooja, Sandeep Negi, Anjum Rouf, Amit Kumar, Naveen Lakra

Abstract:

In modern times, where health issues are increasing day by day, breast cancer is one of the most crucial, and it is really important to detect it in the early stages. Doctors can use this model to tell their patients whether a tumour is not harmful (benign) or harmful (malignant). We used machine learning to produce the model, employing algorithms such as logistic regression, random forest, support vector classifier, Bayesian network and radial basis function. We used the data of the most informative features and present the results graphically in order to make them easier for doctors to interpret. By doing this, we are making machine learning better at finding breast cancer, which can lead to more lives saved and better health care.
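
A minimal, reproducible counterpart using the Wisconsin breast cancer dataset bundled with scikit-learn is sketched below; the study's own data, preprocessing and exact model settings are not specified in the abstract, so this only illustrates the general workflow:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)          # 0 = malignant, 1 = benign
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000)),
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "support vector classifier": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```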

Keywords: Bayesian network, radial basis function, ensemble learning, understandable, data making better, random forest, logistic regression, breast cancer

Procedia PDF Downloads 54
11514 Empirical Study on Factors Influencing SEO

Authors: Pakinee Aimmanee, Phoom Chokratsamesiri

Abstract:

Search engines have become an essential tool nowadays for people to search for the information they need on the internet. In this work, we evaluate search engine performance with respect to three factors: the keyword frequency, the number of inbound links, and the difficulty of the keyword. The evaluations are based on the ranking position and the number of days that Google has seen or detected the webpage. We find that the keyword frequency and the difficulty of the keyword do not affect the Google ranking, whereas the number of inbound links gives a remarkable improvement in the ranking position. The optimal number of inbound links found in the experiment is 10.

Keywords: SEO, information retrieval, web search, knowledge technologies

Procedia PDF Downloads 283
11513 Crack Growth Life Prediction of a Fighter Aircraft Wing Splice Joint Under Spectrum Loading Using Random Forest Regression and Artificial Neural Networks with Hyperparameter Optimization

Authors: Zafer Yüce, Paşa Yayla, Alev Taşkın

Abstract:

There are many analytical methods for estimating the crack growth life of a component. Soft computing methods are increasingly used for fatigue life prediction; their ability to model complex relationships and to handle large amounts of data motivates researchers and industry professionals to employ them for challenging problems. This study focuses on soft computing methods, especially random forest regressors and artificial neural networks with hyperparameter optimization algorithms such as grid search and random grid search, to estimate the crack growth life of an aircraft wing splice joint under variable amplitude loading. The TensorFlow and Scikit-learn libraries for Python are used to build the machine learning models in this study. The material considered in this work is 7050-T7451 aluminum, which is commonly preferred as a structural element in the aerospace industry, and a corner crack is considered as the crack type. A finite element model is built for the joint to calculate fastener loads and stresses on the structure. After the finite element model results are validated against analytical calculations, the findings of the finite element model are fed to the AFGROW software to calculate analytical crack growth lives. Based on the Fighter Aircraft Loading Standard for Fatigue (FALSTAFF), 90 unique fatigue loading spectra are developed for various load levels, and these spectra are then used as inputs to the artificial neural network and random forest regression models for predicting crack growth life. Finally, the crack growth life predictions of the machine learning models are compared with the analytical calculations. According to the findings, a good correlation is observed between the analytical and predicted crack growth lives.
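
The regression-plus-hyperparameter-search workflow can be sketched with scikit-learn; the spectrum features and crack growth lives below are synthetic stand-ins for the FALSTAFF-based spectra and AFGROW results, and the parameter grid is an assumption:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in: 90 spectra described by a few load-level features, target = log crack growth life.
X = rng.uniform(0.5, 1.5, size=(90, 4))
y = 6.0 - 2.0 * X[:, 0] - 0.5 * X[:, 1] * X[:, 2] + 0.05 * rng.standard_normal(90)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10], "min_samples_leaf": [1, 2, 4]}

grid = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=5, scoring="r2")
grid.fit(X_tr, y_tr)

rand = RandomizedSearchCV(RandomForestRegressor(random_state=0), param_grid,
                          n_iter=8, cv=5, scoring="r2", random_state=0)
rand.fit(X_tr, y_tr)

print("grid search  :", grid.best_params_, round(grid.score(X_te, y_te), 3))
print("random search:", rand.best_params_, round(rand.score(X_te, y_te), 3))
```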

Keywords: aircraft, fatigue, joint, life, optimization, prediction

Procedia PDF Downloads 178
11512 Effect of Prandtl Number on Flow and Heat Transfer Across a Confined Equilateral Triangular Cylinder

Authors: Tanveer Rasool, A. K. Dhiman

Abstract:

This paper reports a 2-D numerical study investigating the effect of changing the working fluid, with Prandtl numbers of 0.71, 10 and 50, on the flow and convective heat transfer across an equilateral triangular cylinder placed in a horizontal channel with its apex facing the flow. Numerical results have been generated for a fixed blockage ratio of 50% and for three Reynolds numbers of 50, 75, and 100 for each Prandtl number. The study shows that, for the above range of Reynolds numbers, the overall drag coefficient is insensitive to changes in the Prandtl number, whereas the heat transfer characteristics change drastically with the Prandtl number of the working fluid. The results generated are in complete agreement with the previous literature available.

Keywords: Prandtl number, Reynolds number, drag coefficient, flow and isothermal patterns

Procedia PDF Downloads 399
11511 Performance and Emission Prediction in a Biodiesel Engine Fuelled with Honge Methyl Ester Using RBF Neural Networks

Authors: Shiva Kumar, G. S. Vijay, Srinivas Pai P., Shrinivasa Rao B. R.

Abstract:

In the present study, RBF neural networks were used for predicting the performance and emission parameters of a biodiesel engine. Engine experiments were carried out in a 4-stroke diesel engine using blends of diesel and Honge methyl ester as the fuel. Performance parameters like BTE, BSEC and Tech, as well as emissions from the engine, were measured. These experimental results were used for ANN modeling. RBF center initialization was done by random selection and by using clustering techniques. The network was trained using fixed and varying widths for the RBF units. It was observed that the RBF results were in good agreement with the experimental results. Networks trained using the clustering technique gave better results than those using random selection of centers, in terms of reduced MRE and increased prediction accuracy. The average MRE for the performance parameters was 3.25% with a prediction accuracy of 98%, and for the emissions it was 10.4% with a prediction accuracy of 80%.
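
A compact illustration of the two center initialization strategies compared above, random selection versus clustering, for a Gaussian RBF network with a linear output layer fitted by least squares is given below; k-means is used as a readily available stand-in for the clustering step (the keywords mention fuzzy c-means), the widths are fixed, and the data are synthetic:

```python
import numpy as np
from sklearn.cluster import KMeans

def rbf_design(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width**2))

def fit_rbf(X, y, centers, width):
    """Fit linear output weights (plus bias) of a Gaussian RBF network by least squares."""
    Phi = np.column_stack([rbf_design(X, centers, width), np.ones(len(X))])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return lambda Xq: np.column_stack([rbf_design(Xq, centers, width), np.ones(len(Xq))]) @ w

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))                   # e.g. normalised blend ratio and load
y = 28 + 4 * np.sin(4 * X[:, 0]) + 2 * X[:, 1] + 0.2 * rng.standard_normal(200)  # toy BTE-like response

n_centers, width = 15, 0.2
random_centers = X[rng.choice(len(X), n_centers, replace=False)]
kmeans_centers = KMeans(n_clusters=n_centers, n_init=10, random_state=0).fit(X).cluster_centers_

for name, c in [("random centers", random_centers), ("clustered centers", kmeans_centers)]:
    model = fit_rbf(X[:150], y[:150], c, width)
    mre = np.mean(np.abs((model(X[150:]) - y[150:]) / y[150:])) * 100
    print(f"{name}: mean relative error = {mre:.2f}%")
```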

Keywords: radial basis function networks, emissions, performance parameters, fuzzy c means

Procedia PDF Downloads 560
11510 Magneto-Solutal Convection in Newtonian Fluid Layer with Modulated Gravity

Authors: Om Prakash Keshri, Anand Kumar, Vinod K. Gupta

Abstract:

In the present study, the effect of gravity modulation on the onset of convection in a viscous fluid layer under the influence of an induced magnetic field, salted from above at the boundaries, has been investigated. Linear and nonlinear stability analyses have been performed. The linear stability analysis shows that gravity modulation can significantly affect the stability limits of the system. A method based on the small amplitude of the modulation is used to compute the critical values of the Rayleigh number and wave number. The effects of the Schmidt number, solutal Rayleigh number and magnetic Prandtl number on the stability of the system are investigated.

Keywords: viscous fluid, induced magnetic field, gravity modulation, solutal convection

Procedia PDF Downloads 191