Search results for: random number
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11539

11269 Numerical Study of Rayleigh Number and Eccentricity Effect on Free Convection Fluid Flow and Heat Transfer of an Annulus

Authors: Ali Reza Tahavvor, Saeed Hosseini, Behnam Amiri

Abstract:

Concentric and eccentric annuli are used frequently in technical and industrial applications such as nuclear reactors and thermal storage systems. In this paper, computational fluid dynamics (CFD) is used to investigate two-dimensional free convection of laminar flow in an annulus with isothermal cylinder surfaces and a cooler inner surface. The problem is studied in thirty different cases. Because of natural convection, the continuity and momentum equations are coupled and must be solved simultaneously. The finite volume method is used to solve the governing equations. The purpose was to obtain the effect of eccentricity on the Nusselt number at different Rayleigh numbers, so streamlines and temperature fields must be determined. Results show that the highest Nusselt number values occur at an eccentricity of 0.5 upward for the inner cylinder and 0.3 upward for the outer cylinder. Side eccentricity reduces the outer cylinder Nusselt number but increases the inner cylinder Nusselt number. The trend in the variation of the Nusselt number with respect to eccentricity remains similar at different Rayleigh numbers. Correlations are included to calculate the Nusselt number of the cylinders.
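
The paper's fitted correlations are not reproduced in the abstract, but as a rough sketch of how such a Nusselt correlation might be evaluated, a generic power-law form can be used; the coefficients a, b, c below are placeholders, not the paper's values:

```python
import numpy as np

def nusselt_correlation(ra, eccentricity, a=0.3, b=0.25, c=0.1):
    """Hypothetical power-law correlation of the kind reported in such studies:
    Nu = a * Ra**b * (1 + c * eccentricity). Coefficients are placeholders."""
    return a * ra**b * (1.0 + c * eccentricity)

# Evaluate over Rayleigh numbers and eccentricities typical of such a study
for ra in (1e3, 1e4, 1e5):
    for ecc in (0.0, 0.3, 0.5):
        print(f"Ra={ra:.0e}, e={ecc}: Nu ~ {nusselt_correlation(ra, ecc):.2f}")
```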

Keywords: natural convection, concentric, eccentric, Nusselt number, annulus

Procedia PDF Downloads 344
11268 Implementation and Challenges of Assessment Methods in the Case of Physical Education Class in Some Selected Preparatory Schools of Kirkos Sub-City

Authors: Kibreab Alene Fenite

Abstract:

The purpose of this study is to investigate the implementation and challenges of different assessment methods for physical education classes in some selected preparatory schools of Kirkos sub-city. The participants in this study are teachers, students, department heads and school principals from 4 selected schools. Of the total 8 schools in Kirkos sub-city, 4 schools (Dandi Boru, Abiyot Kirse, Assay, and Adey Ababa) are selected using simple random sampling techniques, and from these schools all (100%) of the teachers, department heads and school principals are taken as a sample, as their number is manageable. From the total of 2520 students, 252 (10%) are selected using simple random sampling. Accordingly, 13 teachers, 252 students, 4 department heads and 4 school principals are taken purposively as a sample from the 4 selected schools. Questionnaires and interviews are employed as data gathering tools. To analyze the collected data, both quantitative and qualitative methods are used. The results of the study reveal that assessment in physical education is not implemented properly: lack of sufficient materials, inadequate time allotment, large class sizes, lack of collaboration among teachers in assessing the performance of students, absence of guidelines for assessing the physical education subject, and the absence of assessment methods adapted to students with disabilities in line with their special needs are found to be the major challenges in implementing the current assessment method of physical education. To overcome these problems, the following recommendations are forwarded: the necessary facilities and equipment should be made available; in order to make assessment reliable, accurate, objective and relevant, physical education teachers should be familiarized with different assessment techniques; physical education assessment guidelines should be prepared and should include different types of assessment methods; qualified teachers should be employed; and adequate teaching rooms must be built.

Keywords: assessment, challenges, equipment, guidelines, implementation, performance

Procedia PDF Downloads 263
11267 An Efficient Acquisition Algorithm for Long Pseudo-Random Sequence

Authors: Wan-Hsin Hsieh, Chieh-Fu Chang, Ming-Seng Kao

Abstract:

In this paper, a novel method termed Phase Coherence Acquisition (PCA) is proposed for pseudo-random (PN) sequence acquisition. By employing complex phasors, the PCA requires only complex additions in the order of N, the length of the sequence, whereas the conventional method utilizing the fast Fourier transform (FFT) requires complex multiplications and additions both in the order of N log2 N. In order to combat noise, the input and local sequences are partitioned and mapped into complex phasors in the PCA. The phase differences between pairs of input and local phasors are utilized for acquisition, and thus complex multiplications are avoided. For more noise-robustness capability, the multi-layer PCA is developed to extract the code phase step by step. The significant reduction of computational load makes the PCA an attractive method, especially when the sequence length N is extremely large, which becomes intractable for FFT-based acquisition.
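
A minimal sketch of the acquisition idea follows, under a simplified reading of the method: ±1 chips are descrambled against each candidate code phase and accumulated segment by segment, so each trial phase needs only additions. The segment count, noise level and combining rule below are illustrative, not the paper's parameters:

```python
import numpy as np

def acquire(rx, pn, n_seg=8):
    """Toy acquisition sketch: for each candidate code phase, descramble the
    received chips, sum coherently within segments and combine the segment
    magnitudes noncoherently. Per trial phase this needs only additions."""
    best_phase, best_metric = 0, -np.inf
    for shift in range(len(pn)):
        seg = np.array_split(rx * np.roll(pn, shift), n_seg)
        metric = sum(abs(s.sum()) for s in seg)
        if metric > best_metric:
            best_phase, best_metric = shift, metric
    return best_phase

rng = np.random.default_rng(0)
pn = rng.choice([-1.0, 1.0], size=256)                   # PN sequence of +/-1 chips
rx = np.roll(pn, 37) + 0.5 * rng.standard_normal(256)    # delayed, noisy copy
print(acquire(rx, pn))                                   # expect 37
```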

Keywords: FFT, PCA, PN sequence, convolution theory

Procedia PDF Downloads 461
11266 Empirical Study on Factors Influencing SEO

Authors: Pakinee Aimmanee, Phoom Chokratsamesiri

Abstract:

Search engines have become an essential tool nowadays for people to search for the information they need on the internet. In this work, we evaluate search engine performance with respect to three factors: the keyword frequency, the number of inbound links, and the difficulty of the keyword. The evaluations are based on the ranking position and the number of days that Google has seen or detected the webpage. We find that the keyword frequency and the difficulty of the keyword do not affect the Google ranking, whereas the number of inbound links gives a remarkable improvement in the ranking position. The optimal number of inbound links found in the experiment is 10.

Keywords: SEO, information retrieval, web search, knowledge technologies

Procedia PDF Downloads 265
11265 Effect of Prandtl Number on Flow and Heat Transfer Across a Confined Equilateral Triangular Cylinder

Authors: Tanveer Rasool, A. K. Dhiman

Abstract:

The paper reports a 2-D numerical study used to investigate the effect of changing the working fluid, with Prandtl numbers of 0.71, 10 and 50, on the flow and convective heat transfer across an equilateral triangular cylinder placed in a horizontal channel with its apex facing the flow. Numerical results have been generated for a fixed blockage ratio of 50% and for three Reynolds numbers of 50, 75, and 100 for each Prandtl number. The studies show that, for the above range of Reynolds numbers, the overall drag coefficient is insensitive to changes in the Prandtl number, while the heat transfer characteristics change drastically with the Prandtl number of the working fluid. The results generated are in complete agreement with the previously available literature.

Keywords: Prandtl number, Reynolds number, drag coefficient, flow and isothermal patterns

Procedia PDF Downloads 371
11264 Dental Ethics versus Malpractice, as a Phenomenon with a Growing Trend

Authors: Saimir Heta, Kers Kapaj, Rialda Xhizdari, Ilma Robo

Abstract:

Dealing with emerging cases of dental malpractice that are justified by appeal to the clear rules of dental ethics is a phenomenon with an increasing trend in today's dental practice. Dentists should clearly understand where the limit of malpractice lies, with or without minimal or major consequences for the affected patient, and what can be justified as a complication of dental treatment under the rules of dental ethics in the dental office. Indeed, malpractice can occur in cases of lack of professionalism, but it can also arise as a consequence of anatomical and physiological limitations in the implementation of the dental protocols predetermined and indicated for the patient in the treatment plan section of his or her personal record. This study is a review of the latest findings published in the literature on this problem. Keywords were combined so as to collect the relevant information from publication databases in this field, always from the point of view of the dentist and not from that of the lawyer or jurist. The findings included in this article show that approaches to the phenomenon differ between countries, depending on their legal frameworks. Few articles touch on this topic, and those that do present a limited amount of data. Conclusions: Dental malpractice should not be hidden under the guise of various dental complications that are justified by the strict rules of ethics for patients treated in the dental chair. Individual experiences of dental malpractice should be published with the aim of serving as a source of experience for future generations of dentists.

Keywords: dental ethics, malpractice, professional protocol, random deviation

Procedia PDF Downloads 69
11263 Spatial Rank-Based High-Dimensional Monitoring through Random Projection

Authors: Chen Zhang, Nan Chen

Abstract:

High-dimensional process monitoring becomes increasingly important in many application domains, where usually the process distribution is unknown and much more complicated than the normal distribution, and the between-stream correlation cannot be neglected. However, since the process dimension is generally much bigger than the reference sample size, most traditional nonparametric multivariate control charts fail in high-dimensional cases due to the curse of dimensionality. Furthermore, when the process goes out of control, the influenced variables are quite sparse compared with the whole dimension, which increases the detection difficulty. Targeting these issues, this paper proposes a new nonparametric monitoring scheme for high-dimensional processes. This scheme first projects the high-dimensional process into several subprocesses using random projections for dimension reduction. Then, for every subprocess, whose dimension is much smaller than the reference sample size, a local nonparametric control chart is constructed based on the spatial rank test to detect changes in this subprocess. Finally, the results of all the local charts are fused together for a decision. Furthermore, after an out-of-control (OC) alarm is triggered, a diagnostic framework based on the square-root LASSO is proposed. Numerical studies demonstrate that the chart has satisfactory detection power for sparse OC changes and robust performance for non-normally distributed data. The diagnostic framework is also effective in identifying the truly changed variables. Finally, a real-data example is presented to demonstrate the application of the proposed method.
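
A minimal sketch of the charting scheme, assuming Gaussian random projection matrices, a max-fused monitoring statistic and an empirically calibrated control limit (the paper's chart design and limit calibration are more refined):

```python
import numpy as np

rng = np.random.default_rng(1)
p, d, k = 200, 5, 20                       # process dim, subspace dim, #projections

def spatial_rank(x, ref):
    """Spatial rank of x w.r.t. a reference sample: norm of the mean unit
    vector pointing from each reference point to x (a nonparametric depth)."""
    diff = x - ref
    unit = diff / np.clip(np.linalg.norm(diff, axis=1, keepdims=True), 1e-12, None)
    return np.linalg.norm(unit.mean(axis=0))

projs = [rng.standard_normal((p, d)) / np.sqrt(d) for _ in range(k)]
ref = rng.standard_normal((100, p))        # in-control reference sample
ref_proj = [ref @ P for P in projs]

def fused_stat(x):
    return max(spatial_rank(x @ P, R) for P, R in zip(projs, ref_proj))

# Calibrate a control limit on in-control data (illustrative shortcut, not a
# formal ARL-based design as in the paper).
calib = rng.standard_normal((500, p))
limit = np.quantile([fused_stat(x) for x in calib], 0.995)

stream = rng.standard_normal((100, p))
stream[50:, :3] += 8.0                     # sparse out-of-control shift at t = 50
alarms = [t for t, x in enumerate(stream) if fused_stat(x) > limit]
print(alarms[:5])
```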

Keywords: random projection, high-dimensional process control, spatial rank, sequential change detection

Procedia PDF Downloads 280
11262 Stock Price Prediction with 'Earnings' Conference Call Sentiment

Authors: Sungzoon Cho, Hye Jin Lee, Sungwhan Jeon, Dongyoung Min, Sungwon Lyu

Abstract:

Major public corporations worldwide use conference calls to report their quarterly earnings. These 'earnings' conference calls allow for questions from stock analysts. We investigated whether it is possible to identify sentiment from the call script and use it to predict stock price movement. We analyzed call scripts from six companies, two each from Korea, China and Indonesia, during the period 2011Q1–2017Q2. A random forest with frequency-based sentiment scores using the Loughran-McDonald dictionary did better than a control model with only financial indicators. When stock prices went up 20 days after the earnings release, our model predicted correctly 77% of the time. When the model predicted 'up', actual stock prices went up 65% of the time. This preliminary result encourages us to investigate advanced sentiment scoring methodologies such as topic modeling, auto-encoders, and word2vec variants.
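
A toy sketch of this pipeline follows; the positive/negative word lists are invented stand-ins for the Loughran-McDonald dictionary, and the corpus and labels are synthetic:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical word lists standing in for the Loughran-McDonald finance
# dictionary (not the real dictionary).
POS = {"growth", "improve", "record", "strong"}
NEG = {"decline", "loss", "impairment", "weak"}

def sentiment_features(script):
    """Frequency-based sentiment scores: positive rate, negative rate, net."""
    words = script.lower().split()
    n = max(len(words), 1)
    pos = sum(w in POS for w in words) / n
    neg = sum(w in NEG for w in words) / n
    return [pos, neg, pos - neg]

# Toy corpus: call scripts + whether the stock rose 20 days after release
scripts = ["strong growth and record margins", "impairment and weak demand",
           "growth despite weak quarter", "loss and decline in revenue"] * 25
labels = [1, 0, 1, 0] * 25

X = np.array([sentiment_features(s) for s in scripts])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```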

Keywords: earnings call script, random forest, sentiment analysis, stock price prediction

Procedia PDF Downloads 274
11261 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays

Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal

Abstract:

Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have been widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since the design's circuit is stored in configuration memory in SRAM-based FPGAs, they are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on the electronics used in space are much greater than on Earth. Thus, developing fault-tolerance techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault-tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is highly required for the compromise between the overhead introduced by fault-tolerance techniques and system robustness. We study applications in which the exact final output value is not necessarily always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating the configuration memory vulnerability to SEUs. We assess the vulnerability of configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, our proposed method considers a threshold margin that depends on the use-case application. Given the proposed threshold margin in our model, a failure occurs only when the difference between the erroneous output value and the expected output value is more than this margin. The ACMVF is subsequently calculated as the ratio of failures to the total number of SEU injections. A test-bench for emulating SEUs and calculating the ACMVF is implemented on the Zynq-7000 FPGA platform. This system makes use of the Soft Error Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented in the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is considered, the number of counted failures is reduced by 41% to 59% compared with the number of failures counted by conventional vulnerability factor calculation. This means that the estimation accuracy of the configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
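
A small sketch of the ACMVF computation under this thresholding idea, with a toy single-bit-flip fault model for the adder outputs (the margin handling and fault model are illustrative, not the paper's emulation flow):

```python
import numpy as np

def acmvf(expected, observed_under_seu, margin=0.10):
    """Approximate-based Configuration Memory Vulnerability Factor (sketch):
    an SEU injection counts as a failure only when the output deviates from
    the expected value by more than the user-defined relative margin."""
    expected = np.asarray(expected, dtype=float)
    observed = np.asarray(observed_under_seu, dtype=float)
    rel_dev = np.abs(observed - expected) / np.maximum(np.abs(expected), 1)
    failures = np.count_nonzero(rel_dev > margin)
    return failures / len(observed)

rng = np.random.default_rng(0)
a = rng.integers(0, 2**16, 1000)
b = rng.integers(0, 2**16, 1000)
expected = a + b                                      # adder golden outputs
faulty = expected ^ (1 << rng.integers(0, 17, 1000))  # toy single-bit upsets
print(f"conventional (0% margin): {acmvf(expected, faulty, 0.0):.2f}")
print(f"ACMVF (10% margin):       {acmvf(expected, faulty, 0.10):.2f}")
```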

Keywords: fault tolerance, FPGA, single event upset, approximate computing

Procedia PDF Downloads 170
11260 Identification of Candidate Congenital Heart Defects Biomarkers by Applying a Random Forest Approach on DNA Methylation Data

Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang

Abstract:

Background and Significance of the Study: Congenital Heart Defects (CHDs) are the most common malformation at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications, such as DNA methylation, are thought to contribute to the pathogenesis of congenital heart defects. At present, no existing DNA methylation biomarkers are used for early detection of CHDs. Existing CHD diagnostic techniques are time-consuming and costly and can only be used to diagnose CHDs after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs, with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome-wide DNA methylation profiles of 24 infants diagnosed with congenital heart defects and 24 healthy infants without congenital heart defects. Primary pre-processing was conducted using the RnBeads and limma packages. The methylation levels of the top 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both the training and test sample sets. The functionalities of selected genes with high sensitivity and specificity were then assessed in molecular processes. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both training and validation data by random forests, with an average sensitivity and specificity of 85% and 95%. GO analyses for the top 600 genes showed that these putative differentially methylated genes were primarily associated with regulation of lipid metabolic process, protein-containing complex localization, and the Notch signalling pathway. The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
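
A sketch of this kind of select-then-classify workflow on synthetic data (sklearn's univariate filter stands in for the RnBeads/limma pre-processing; the sample sizes mirror the study, but the signal is invented):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for genome-wide methylation beta values (48 infants)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(48, 5000))
y = np.array([1] * 24 + [0] * 24)          # CHD vs healthy
X[y == 1, :3] += 0.15                      # 3 informative loci (toy signal)

# Keep the top-k probes by univariate p-value, then fit a random forest
selector = SelectKBest(f_classif, k=600).fit(X, y)
X_sel = selector.transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1]))
```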

Keywords: biomarker, congenital heart defects, DNA methylation, random forest

Procedia PDF Downloads 139
11259 Nonlinear Vibration of FGM Plates Subjected to Acoustic Load in Thermal Environment Using Finite Element Modal Reduction Method

Authors: Hassan Parandvar, Mehrdad Farid

Abstract:

In this paper, a finite element model is presented for large-amplitude vibration of functionally graded material (FGM) plates subjected to combined random pressure and thermal load. The material properties of the plates are assumed to vary continuously in the thickness direction according to a simple power-law distribution in terms of the volume fractions of the constituents. The material properties depend on the temperature, whose distribution along the thickness can be expressed explicitly. The von Kármán large-deflection strain-displacement relations and the extended Hamilton's principle are used to obtain the governing system of equations of motion in structural node degrees of freedom (DOF) using the finite element method. A three-node triangular Mindlin plate element with a shear correction factor is used. The nonlinear equations of motion in structural degrees of freedom are reduced using the modal reduction method. The reduced equations of motion are solved numerically by a 4th-order Runge-Kutta scheme. In this study, the random pressure is generated using the Monte Carlo method. The modeling is verified, and the nonlinear dynamic response of the FGM plates is studied for various values of volume fraction and sound pressure level under different thermal loads. Snap-through behavior of the FGM plates is studied as well.
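
As a sketch of the final solution step, a single reduced modal equation of Duffing type (a common outcome of von Kármán modal reduction; the coefficients and forcing level below are invented, not the paper's FGM values) can be integrated with classical RK4 under one Monte Carlo pressure sample:

```python
import numpy as np

# Single reduced modal equation: q'' + 2*zeta*w*q' + w^2*q + beta*q^3 = f(t).
zeta, w, beta = 0.02, 2 * np.pi * 50, 1e6      # illustrative parameters

rng = np.random.default_rng(0)
dt, n = 1e-4, 20000
f = 500.0 * rng.standard_normal(n)             # Monte Carlo random pressure

def rhs(t_idx, y):
    q, qd = y
    return np.array([qd, f[t_idx] - 2*zeta*w*qd - w**2*q - beta*q**3])

y = np.zeros(2)
hist = np.empty(n - 1)
for i in range(n - 1):                         # classical 4th-order Runge-Kutta;
    k1 = rhs(i, y)                             # forcing held piecewise constant
    k2 = rhs(i, y + 0.5*dt*k1)                 # over the substeps
    k3 = rhs(i, y + 0.5*dt*k2)
    k4 = rhs(i + 1, y + dt*k3)
    y = y + dt/6 * (k1 + 2*k2 + 2*k3 + k4)
    hist[i] = y[0]
print("RMS modal amplitude:", np.sqrt(np.mean(hist**2)))
```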

Keywords: nonlinear vibration, finite element method, functionally graded material (FGM) plates, snap-through, random vibration, thermal effect

Procedia PDF Downloads 245
11258 Fast and Robust Long-term Tracking with Effective Searching Model

Authors: Thang V. Kieu, Long P. Nguyen

Abstract:

Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, this algorithm is not robust in cases where the object is lost through a sudden change of direction, being obscured, or going out of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning by analyzing the response map of each frame, and a classification algorithm for a reliable target re-locating mechanism using random ferns. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the proposed algorithm achieved precision and success rates 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
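
One standard way to flag target loss from a correlation response map is the peak-to-sidelobe ratio (PSR); a sketch follows, with the caveat that the paper's anomaly criterion may differ in detail:

```python
import numpy as np

def psr(response, exclude=5):
    """Peak-to-sidelobe ratio of a correlation response map; a low PSR is a
    common signal that the target is lost (occlusion, out of view)."""
    peak_idx = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_idx]
    mask = np.ones_like(response, dtype=bool)
    r, c = peak_idx
    mask[max(r - exclude, 0):r + exclude + 1,
         max(c - exclude, 0):c + exclude + 1] = False   # exclude the peak area
    side = response[mask]
    return (peak - side.mean()) / (side.std() + 1e-12)

rng = np.random.default_rng(0)
good = rng.standard_normal((64, 64)) * 0.1
good[32, 32] = 3.0                                 # sharp peak: target tracked
lost = rng.standard_normal((64, 64)) * 0.1         # flat map: target lost
print(f"PSR tracked: {psr(good):.1f}, PSR lost: {psr(lost):.1f}")
# if psr < threshold: trigger re-detection (e.g., the random-fern classifier)
```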

Keywords: correlation filter, long-term tracking, random fern, real-time tracking

Procedia PDF Downloads 119
11257 Magneto-Solutal Convection in Newtonian Fluid Layer with Modulated Gravity

Authors: Om Prakash Keshri, Anand Kumar, Vinod K. Gupta

Abstract:

In the present study, the effect of gravity modulation on the onset of convection in a viscous fluid layer under the influence of an induced magnetic field, salted from above at the boundaries, has been investigated. Linear and nonlinear stability analyses have been performed. The linear stability analysis shows that gravity modulation can significantly affect the stability limits of the system. A method based on a small amplitude of the modulation is used to compute the critical values of the Rayleigh number and wave number. The effect of the Schmidt number, solutal Rayleigh number and magnetic Prandtl number on the stability of the system is investigated.

Keywords: viscous fluid, induced magnetic field, gravity modulation, solutal convection

Procedia PDF Downloads 169
11256 The Effect of Spatial Variability on Axial Pile Design of Closed Ended Piles in Sand

Authors: Cormac Reale, Luke J. Prendergast, Kenneth Gavin

Abstract:

While significant improvements have been made in axial pile design methods over recent years, the influence of soil's natural variability has not been adequately accounted for within them. Soil variability is a crucial parameter to consider, as it can account for large variations in pile capacity across the same site. This paper seeks to address this knowledge deficit by demonstrating how soil spatial variability can be accommodated into existing cone penetration test (CPT) based pile design methods, in the form of layered non-homogeneous random fields. These random fields model the scope of a given property's variance and define how it varies spatially. A Monte Carlo analysis of the pile will be performed, taking into account parameter uncertainty and spatial variability described using the measured scales of fluctuation. The results will be discussed in light of Eurocode 7, and the effect of spatial averaging on design capacities will be analysed.
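
A toy version of such an analysis, assuming a single-layer lognormal field with an exponential correlation structure and a simple CPT-style shaft-resistance rule (the field parameters, alpha factor and pile geometry are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
depth = np.linspace(0.0, 20.0, 201)          # pile embedment discretisation [m]

def lognormal_field(mean=10.0, cov=0.3, theta=1.0):
    """1-D stationary lognormal random field of CPT cone resistance q_c [MPa]
    with an exponential (Markov) correlation and scale of fluctuation theta [m];
    a single-layer stand-in for the layered non-homogeneous fields of the paper."""
    corr = np.exp(-2.0 * np.abs(depth[:, None] - depth[None, :]) / theta)
    L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(depth)))
    sln = np.sqrt(np.log(1 + cov**2))
    mln = np.log(mean) - 0.5 * sln**2
    return np.exp(mln + sln * (L @ rng.standard_normal(len(depth))))

def shaft_capacity(qc, alpha=0.01, diameter=0.4):
    """Illustrative CPT rule: unit shaft friction = alpha * q_c, integrated
    over the shaft area; returns capacity in kN."""
    dz = depth[1] - depth[0]
    return np.pi * diameter * np.sum(alpha * qc * 1e3 * dz)

caps = np.array([shaft_capacity(lognormal_field()) for _ in range(500)])
print(f"mean = {caps.mean():.0f} kN, COV = {caps.std()/caps.mean():.2f}, "
      f"5% characteristic = {np.quantile(caps, 0.05):.0f} kN")
```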

Keywords: pile axial design, reliability, spatial variability, CPT

Procedia PDF Downloads 220
11255 Effects of Roughness Elements on Heat Transfer During Natural Convection

Authors: M. Yousaf, S. Usman

Abstract:

The present study focused on the investigation of the effects of roughness elements on heat transfer during natural convection in a rectangular cavity using a numerical technique. Roughness elements were introduced on the bottom hot wall with a normalized amplitude (A*/H) of 0.1. Thermal and hydrodynamic behavior was studied using a computational method based on the lattice Boltzmann method (LBM). Numerical studies were performed for laminar natural convection in the range of Rayleigh number (Ra) from 10³ to 10⁶ for a rectangular cavity of aspect ratio (L/H) 2 with a fluid of Prandtl number (Pr) 1.0. The presence of the sinusoidal roughness elements caused a decrease in heat transfer of 7% (minimum) to 17% (maximum) compared to the smooth enclosure. The results are presented as mean Nusselt number (Nu), isotherms, and streamlines.

Keywords: natural convection, Rayleigh number, surface roughness, Nusselt number, Lattice Boltzmann method

Procedia PDF Downloads 517
11254 Conditions on Expressing a Matrix as a Sum of α-Involutions

Authors: Ric Joseph R. Murillo, Edna N. Gueco, Dennis I. Merino

Abstract:

Let F be C or R, where C and R are the sets of complex numbers and real numbers, respectively, and let n be a natural number. An n-by-n matrix A over the field F is called an α-involutory matrix, or an α-involution, if there exists an α in the field such that the square of the matrix is equal to αI, where I is the n-by-n identity matrix. If α is a complex number or a nonnegative real number, then an n-by-n matrix A over the field F can be written as a sum of n-by-n α-involutory matrices over the field F if and only if the trace of that matrix is an integral multiple of the square root of α. Meanwhile, if α is a negative real number, then a 2n-by-2n matrix A over R can be written as a sum of 2n-by-2n α-involutory matrices over R if and only if the trace of the matrix is zero. Some other properties of α-involutory matrices are also determined.
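
A small numerical check of the definition and the trace condition for α = 4 (so √α = 2); the matrices are simple hand-picked examples, not taken from the paper:

```python
import numpy as np

alpha = 4.0                                   # so sqrt(alpha) = 2

def is_alpha_involution(A, alpha, tol=1e-10):
    """Check A @ A == alpha * I, the defining property of an alpha-involution."""
    return np.allclose(A @ A, alpha * np.eye(A.shape[0]), atol=tol)

# Two 2-by-2 alpha-involutions: A1^2 = A2^2 = 4I, each with eigenvalues +/-2
A1 = np.array([[2.0, 0.0], [0.0, -2.0]])      # trace 0
A2 = np.array([[0.0, 2.0], [2.0, 0.0]])       # trace 0
S = A1 + A2
print(is_alpha_involution(A1, alpha), is_alpha_involution(A2, alpha))
print("trace of the sum:", np.trace(S))       # 0 = 0 * sqrt(alpha), consistent
# A 2x2 matrix with trace 3 could not be a sum of two 4-involutions: each
# summand has eigenvalues +/-2, hence trace in {-4, 0, 4}, and 3 is not a
# sum of such values (consistent with the integral-multiple-of-2 condition).
```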

Keywords: α-involutory matrices, sum of α-involutory matrices, trace, matrix theory

Procedia PDF Downloads 166
11253 Automatic Number Plate Recognition System Based on Deep Learning

Authors: T. Damak, O. Kriaa, A. Baccar, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the last few years, Automatic Number Plate Recognition (ANPR) systems have become widely used for safety, security, and commercial purposes. Accordingly, several methods and techniques have been developed to achieve better levels of accuracy and real-time execution. This paper proposes a computer vision algorithm for Number Plate Localization (NPL) and Character Segmentation (CS). In addition, it proposes an improved method for Optical Character Recognition (OCR) based on Deep Learning (DL) techniques. In order to identify the numbers on the detected plate after the NPL and CS steps, a Convolutional Neural Network (CNN) algorithm is proposed. A DL model is developed using four convolutional layers, two max-pooling layers, and six fully connected layers. The model was trained on a number-image database on the NVIDIA Jetson TX2 target. The achieved accuracy is 95.84%.
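
A sketch of the described topology in Keras; the filter counts, kernel sizes, input size and layer widths are assumptions, since the abstract only fixes the layer counts:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Four convolutional layers, two max-pooling layers, six fully connected
# layers; the output is one of ten digit classes for a segmented character.
model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", padding="same",
                  input_shape=(32, 32, 1)),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(512, activation="relu"),
    layers.Dense(256, activation="relu"),
    layers.Dense(128, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(32, activation="relu"),
    layers.Dense(10, activation="softmax"),    # sixth fully connected layer
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```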

Keywords: ANPR, CS, CNN, deep learning, NPL

Procedia PDF Downloads 284
11252 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks

Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev

Abstract:

One possible approach for maintaining the security of communication systems relies on Physical Layer Security mechanisms. However, in wireless time division duplex systems, where the uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, whose aim is to have an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each of which belongs to a constellation shifted from the original N-PSK symbols by a certain number of degrees. In this paper, the legitimate pilots' offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the detection probability for pilot contamination attacks (PCA) and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations for the shifted constellations makes such an attack difficult to mount successfully. For this reason, the number of optimal shift value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from the signals in other cells should also be taken into account. Therefore, the impact of inter-cell interference on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of the Shifted 2-N-PSK decreases inversely with the signal-to-interference-plus-noise ratio.
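
A sketch of the pilot-generation side of the scheme, assuming arbitrary example offsets (the paper's contribution concerns the relation between the two offsets, which is not fixed here):

```python
import numpy as np

def shifted_pilots(N=4, offsets_deg=(15.0, 40.0), rng=None):
    """Sketch of Shifted 2-N-PSK pilot generation: two legitimate pilots are
    drawn from N-PSK constellations rotated by two secret offsets. The offset
    values here are arbitrary examples."""
    rng = rng or np.random.default_rng()
    pilots = []
    for off in offsets_deg:
        k = rng.integers(N)                    # random N-PSK symbol index
        pilots.append(np.exp(1j * (2 * np.pi * k / N + np.deg2rad(off))))
    return pilots

p1, p2 = shifted_pilots()
print(np.angle(p1, deg=True), np.angle(p2, deg=True))
# A legitimate receiver checks that the estimated pilot phases fall on the
# two shifted grids; an attacker ignorant of the offsets lands off-grid,
# which is the basis for pilot-contamination detection.
```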

Keywords: channel estimation, inter-cell interference, pilot contamination attacks, wireless communications

Procedia PDF Downloads 192
11251 DeepNIC: A Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep Learning (DL) is a very powerful tool for analyzing image data. But for tabular data, it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL be the absolute tool for data classification? All current solutions consist in repositioning the variables in a 2D matrix using their correlation proximity. In doing so, one obtains an image whose pixels are the variables. We implement a technology, DeepNIC, that offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which disobeys Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in 3 dimensions: performance, complexity and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 super-parameters used in the Neurops. By varying these 2 super-parameters, we obtain a matrix of probabilities for each NIC. We can combine the 10 NICs with the functions AND, OR, and XOR; the total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels. The intensity of the pixels is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison across several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
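
A very loose sketch of the final image-building step only, with random placeholder grids standing in for the 10 NIC probability maps (the ROP trees, Neurops and NIC definitions themselves are specific to the paper and are not reproduced here):

```python
import numpy as np

def nic_image(nic_maps):
    """Tile per-variable probability maps into one grey-level image that a
    plain CNN can consume: probability in [0, 1] -> grey level in [0, 255]."""
    tiles = [np.uint8(255 * m) for m in nic_maps]
    return np.hstack(tiles)

rng = np.random.default_rng(0)
maps = [rng.uniform(size=(64, 64)) for _ in range(10)]   # placeholder NIC grids
img = nic_image(maps)
print(img.shape, img.dtype)                              # (64, 640) uint8
```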

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 89
11250 Voxel Models as Input for Heat Transfer Simulations with Siemens NX Based on X-Ray Microtomography Images of Random Fibre Reinforced Composites

Authors: Steven Latré, Frederik Desplentere, Ilya Straumit, Stepan V. Lomov

Abstract:

A method is proposed to create a three-dimensional finite element model representing fibre reinforced insulation materials for the simulation software Siemens NX. VoxTex software, a tool for quantification of µCT images of fibrous materials, is used for the transformation of microtomography images of random fibre reinforced composites into finite element models. An automatic tool was developed to import the models into the thermal solver module of Siemens NX. The paper describes the numerical tools used for the image quantification and the transformation and illustrates them on several thermal simulations of fibre reinforced insulation blankets filled with fillers of low thermal conductivity. The calculation of thermal conductivity is validated by comparison with experimental data.

Keywords: analysis, modelling, thermal, voxel

Procedia PDF Downloads 270
11249 Reduced Power Consumption by Randomization for DSI3

Authors: David Levy

Abstract:

The newly released Distributed System Interface 3 (DSI3) Bus Standard specification defines 3 modulation levels from which 16 valid symbols are coded. This structure creates power consumption variations of a factor of more than 2 between minimum and maximum, depending on the transmitted data. The power generation unit therefore has to be designed for the worst-case maximum consumption at all times. This paper proposes a method to reduce both the average and the worst-case current consumption. The transmitter randomizes the data using several pseudo-random sequences. It then estimates the energy consumption of the generated frames and selects for transmission the one that consumes the least. The transmitter also prepends the index of the pseudo-random sequence, which is not randomized, to allow the receiver to recover the original data using the correct sequence. We show that, in the case where the frame occupies most of the DSI3 synchronization period, average power consumption is reduced by up to 13% and worst-case power consumption by 17.7%.
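
A sketch of the selection mechanism, assuming XOR scrambling and an invented per-symbol energy table (the real DSI3 symbol costs and framing are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
SEQS = [rng.integers(0, 2, 256) for _ in range(8)]   # 8 fixed scrambling sequences

# Illustrative per-symbol energy: DSI3 codes 16 symbols over 3 modulation
# levels, so different symbols cost different current. Costs are made up.
SYMBOL_COST = np.arange(16) / 15.0

def frame_energy(bits):
    """Map each 4-bit group to one of 16 symbols and sum the symbol costs."""
    symbols = bits.reshape(-1, 4) @ (1 << np.arange(4))
    return SYMBOL_COST[symbols].sum()

def transmit(data_bits):
    """Pick the scrambling sequence whose XOR-randomized frame costs the least
    energy; prepend its (un-randomized) index so the receiver can descramble."""
    candidates = [(frame_energy(data_bits ^ s[:len(data_bits)]), i)
                  for i, s in enumerate(SEQS)]
    _, best = min(candidates)
    return best, data_bits ^ SEQS[best][:len(data_bits)]

data = rng.integers(0, 2, 256)
idx, frame = transmit(data)
recovered = frame ^ SEQS[idx][:len(frame)]           # receiver side
assert np.array_equal(recovered, data)
print("chosen sequence:", idx, "energy:", frame_energy(frame))
```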

Keywords: DSI3, energy, power consumption, randomization

Procedia PDF Downloads 513
11248 Formulation and Test of a Model to Explain the Complexity of Road Accident Events in South Africa

Authors: Dimakatso Machetele, Kowiyou Yessoufou

Abstract:

Whilst several studies have indicated that road accident events might be more complex than thought, we have a limited scientific understanding of this complexity in South Africa. The present project proposes and tests a more comprehensive metamodel that integrates multiple causality relationships among variables previously linked to road accidents. This was done by fitting a structural equation model (SEM) to data collected from various sources. The study also fitted a GARCH (Generalized Auto-Regressive Conditional Heteroskedasticity) model to predict the future of road accidents in the country. The analysis shows that the number of road accidents has been increasing since 1935. The road fatality rate follows a polynomial shape given by the equation y = -0.0114x² + 1.2378x - 2.2627 (R² = 0.76), with y = death rate and x = year. This trend results in an average death rate of 23.14 deaths per 100,000 people. Furthermore, the analysis shows that the number of crashes can be significantly explained by the total number of vehicles (P < 0.001), the number of registered vehicles (P < 0.001), the number of unregistered vehicles (P = 0.003) and the population of the country (P < 0.001). Contrary to expectation, the number of driver licenses issued and the total distance traveled by vehicles do not correlate significantly with the number of crashes (P > 0.05). Furthermore, the analysis reveals that the number of casualties can be linked significantly to the number of registered vehicles (P < 0.001) and the total distance traveled by vehicles (P = 0.03). As for the number of fatal crashes, the analysis reveals that the total number of vehicles (P < 0.001), the number of registered (P < 0.001) and unregistered vehicles (P < 0.001), the population of the country (P < 0.001) and the total distance traveled by vehicles (P < 0.001) correlate significantly with the number of fatal crashes. However, the number of casualties and, again, the number of driver licenses do not seem to determine the number of fatal crashes (P > 0.05). Finally, the number of crashes is predicted to remain roughly constant over time at 617,253 accidents for the next 10 years, with the worst-case scenario suggesting that this number may reach 1,896,667. The number of casualties is also predicted to remain roughly constant at 93,531 over time, although it may reach 661,531 in the worst-case scenario. Although the number of fatal crashes may decrease over time, it is forecast to reach 11,241 within the next 10 years, with the worst-case estimate at 19,034 within the same period. Finally, the number of fatalities is also predicted to remain roughly constant at 14,739 but may reach 172,784 in the worst-case scenario. Overall, the present study reveals the complexity of road accidents and allows us to propose several recommendations aimed at reducing the trend of road accidents, casualties, fatal crashes, and deaths in South Africa.
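
A sketch of the forecasting half of this workflow with the `arch` package; the crash-count series below is synthetic, standing in for the South African data, and the GARCH(1,1) specification is an assumption about the fitted model:

```python
import numpy as np
from arch import arch_model

# Toy monthly crash-count series (synthetic; not the paper's data)
rng = np.random.default_rng(0)
crashes = 600_000 + np.cumsum(rng.normal(0, 2_000, 120))   # 10 years, monthly
changes = np.diff(crashes)

# Fit a GARCH(1,1) on the differenced series and forecast 12 months ahead
model = arch_model(changes, mean="Constant", vol="GARCH", p=1, q=1)
res = model.fit(disp="off")
fc = res.forecast(horizon=12)
print(res.params)
print("forecast variance, month 12:", fc.variance.iloc[-1, -1])
```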

Keywords: road accidents, South Africa, statistical modelling, trends

Procedia PDF Downloads 140
11247 Markov Random Field-Based Segmentation Algorithm for Detection of Land Cover Changes Using Uninhabited Aerial Vehicle Synthetic Aperture Radar Polarimetric Images

Authors: Mehrnoosh Omati, Mahmod Reza Sahebi

Abstract:

Information on land use/land cover change plays an essential role in environmental assessment, planning and management in regional development. Remotely sensed imagery is widely used to provide information in many change detection applications. Polarimetric synthetic aperture radar (PolSAR) imagery, with its capability to discriminate between different scattering mechanisms, is a powerful tool for environmental monitoring applications. This paper proposes a new boundary-based segmentation algorithm as a fundamental step for land cover change detection. In this method, two PolSAR images are first segmented using an integration of the marker-controlled watershed algorithm and a coupled Markov random field (MRF). Then, object-based classification is performed to determine changed/unchanged image objects. Compared with a pixel-based support vector machine (SVM) classifier, this novel segmentation algorithm significantly reduces the speckle effect in PolSAR images and improves the accuracy of binary classification at the object-based level. The experimental results on Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) polarimetric images show a 3% and 6% improvement in overall accuracy and kappa coefficient, respectively. The proposed method can also correctly distinguish homogeneous image parcels.
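
To illustrate the MRF ingredient on its own, here is a toy iterated-conditional-modes (ICM) smoother with a Potts prior and a per-pixel data term; the paper's coupled MRF and its integration with the watershed algorithm are considerably more involved:

```python
import numpy as np

def icm(obs, beta=2.0, n_iter=5):
    """ICM on a binary Potts MRF: each pixel takes the label that minimizes a
    data term (disagreement with the observation) plus beta times the number
    of disagreeing 4-neighbours. Greedy, but enough to suppress speckle."""
    lab = obs.copy()
    H, W = obs.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best, best_e = lab[i, j], np.inf
                for cand in (0, 1):
                    e = float(cand != obs[i, j])          # data term
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < H and 0 <= nj < W:
                            e += beta * (cand != lab[ni, nj])
                    if e < best_e:
                        best, best_e = cand, e
                lab[i, j] = best
    return lab

rng = np.random.default_rng(0)
truth = np.zeros((32, 32), int)
truth[8:24, 8:24] = 1                                     # a square "parcel"
obs = np.where(rng.uniform(size=truth.shape) < 0.2, 1 - truth, truth)  # speckle
den = icm(obs)
print("noisy error:", (obs != truth).mean(), "-> MRF error:", (den != truth).mean())
```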

Keywords: coupled Markov random field (MRF), environment, object-based analysis, polarimetric SAR (PolSAR) images

Procedia PDF Downloads 199
11246 Structural Reliability Analysis Using Extreme Learning Machine

Authors: Mehul Srivastava, Sharma Tushar Ravikant, Mridul Krishn Mishra

Abstract:

In structural design, the evaluation of the safety and failure probability of a structure is of significant importance, especially when the variables are random. For real structures, structural reliability can be evaluated by obtaining an implicit limit state function. The structural reliability limit state function is expressed in terms of the statistically independent variables. In this analysis of reliability, we considered the statistically independent random variables to be the applied load intensity and the depth (or height) of the beam member. There are many approaches to structural reliability problems. In this paper, the Extreme Learning Machine (ELM) technique and the First Order Second Moment (FOSM) method are used to determine the reliability indices for the same set of variables. The reliability index obtained using the ELM is compared with the reliability index obtained using FOSM. The higher the reliability index, the more feasible the method for determining reliability.
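
A compact ELM uses random hidden weights and a pseudo-inverse solve for the output weights. The sketch below trains one as a surrogate for an invented beam limit state g(q, h) and extracts a Cornell-type reliability index; the limit state form, distributions and index definition are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Extreme Learning Machine: random input weights and biases, analytic
    output weights via the pseudo-inverse of the hidden-layer activations."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y
    return lambda Xn: np.tanh(Xn @ W + b) @ beta

# Toy implicit limit state for a beam: capacity(h) minus demand(q)
def g(q, h):
    return 25.0 * h**2 - q

X = np.column_stack([rng.normal(10, 2, 500),        # load intensity q
                     rng.normal(0.9, 0.05, 500)])   # beam depth h
y = g(X[:, 0], X[:, 1])
g_hat = elm_fit(X, y)                               # surrogate limit state

# Monte Carlo estimate of a Cornell-type reliability index from the surrogate
Xmc = np.column_stack([rng.normal(10, 2, 100_000),
                       rng.normal(0.9, 0.05, 100_000)])
gm = g_hat(Xmc)
beta_idx = gm.mean() / gm.std()
print(f"reliability index ~ {beta_idx:.2f}, P_f ~ {(gm < 0).mean():.4f}")
```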

Keywords: reliability, reliability index, statistically independent, extreme learning machine

Procedia PDF Downloads 656
11245 Some Yield Parameters of Wheat Genotypes

Authors: Shatha A. Yousif, Hatem Jasim, Ali R. Abas, Dheya P. Yousef

Abstract:

To study the effect of the cross direction in bread wheat, three hybrid combinations (Babyle 113, Iratome), (Sawa, Tamose2) and (Al Hashymya, Al Iraq) were tested for plant height, number of tillers/m, number of grains per spike, weight of grains per spike, 1000-grain weight and grain yield. The results revealed that the direction of the cross had a significant effect on the number of grains per spike, tillers/m and grain yield. Grain yield was positively and significantly correlated with 1000-grain weight, number of grains per spike and tillers. Based on the results for heritability and genetic advance, it is suggested that 1000-grain weight, number of grains per spike and tillers should be given emphasis in future wheat yield improvement programs.

Keywords: correlation, genetic advance, heritability, wheat, yield traits

Procedia PDF Downloads 406
11244 Phone Number Spoofing Attack in VoLTE 4G

Authors: Joo-Hyung Oh

Abstract:

The number of users of 4G VoLTE (Voice over LTE) services running over LTE data networks is rapidly growing. VoLTE, based on an all-IP network, enables clearer and higher-quality voice calls than 3G. It does, however, pose new challenges: carrying voice calls over IP networks makes them vulnerable to security threats such as wiretapping and forged or falsified information. In particular, stealing other users' phone numbers and forging or falsifying call request messages for outgoing voice calls within VoLTE result in considerable losses, including fraudulent user billing and voice phishing of acquaintances. This paper focuses on the threat of caller phone number spoofing in VoLTE and on countermeasure technology as a safety measure for mobile communication networks.

Keywords: LTE, 4G, VoLTE, phone number spoofing

Procedia PDF Downloads 410
11243 Comparing Effects of Supervised Exercise Therapy versus Home-Based Exercise Therapy on Low Back Pain Severity, Muscle Strength and Anthropometric Parameters in Patients with Nonspecific Chronic Low Back Pain

Authors: Haleh Dadgostar, Faramarz Akbari, Hosien Vahid Tari, Masoud Solaymani-Dodaran, Mohammad Razi

Abstract:

Introduction: A number of exercise protocols have been applied to improve low back pain. We compared the effect of supervised exercise therapy and home-based exercise therapy among patients with nonspecific chronic low back pain. Methods: 70 patients with nonspecific chronic low back pain were randomly divided (using a random number generator in Excel) into two groups to compare the effects of the two types of exercise therapy. After a common educational session on how to live with low back pain and how to use core training protocols to strengthen the muscles, the subjects were randomly assigned to follow supervised exercise therapy (n = 31) or home-based exercise therapy (n = 34) for 20 weeks. Results: Although both types of exercise program resulted in reduced pain, pain decreased more significantly in the supervised exercise program. All fitness scores improved significantly in the supervised exercise group, but only the knee extensor strength score increased in the home-based exercise group. Conclusion: Comparing the two types of exercise, supervised exercise proved more effective. Reduction in low back pain severity and improvement in muscle flexibility and strength can be better achieved with a 20-week supervised exercise program than with a home-based exercise program in patients with nonspecific chronic low back pain.

Keywords: low back pain, anthropometric parameters, supervised exercise therapy, home-based exercise therapy

Procedia PDF Downloads 302
11242 User-Awareness from Eye Line Tracing During Specification Writing to Improve Specification Quality

Authors: Yoshinori Wakatake

Abstract:

Many defects found after the release of software packages are caused by the omission of necessary test items from test specifications. Poor test specifications are detected by manual review, which imposes a high human load. The prevention of omissions depends on the end-user awareness of test specification writers. If test specifications were written while envisioning the behavior of end-users, the number of omitted test items would be greatly reduced. This paper draws attention to the point that writers who can achieve this differ from those who cannot, not only in the richness of their descriptions but also in their gaze behavior. It proposes a method to estimate the degree of user-awareness of writers through the analysis of their gaze information while writing test specifications. We conduct an experiment to obtain the gaze information of writers of test specifications. Test specifications are then automatically classified using this gaze information. In this method, a Random Forest model is constructed for the classification, and the classification is highly accurate. Examining the explanatory variables that turn out to be important reveals the behavioral features that distinguish test specifications of high quality from others: pupil diameter and the number and duration of blinks. The paper also investigates the test specifications automatically classified with gaze information to discuss the writing characteristics at each quality level. The proposed method enables us to automatically classify test specifications. It also prevents test item omissions, because it reveals the writing features that high-quality test specifications should satisfy.

Keywords: blink, eye tracking, gaze information, pupil diameter, quality improvement, specification document, user-awareness

Procedia PDF Downloads 45
11241 Conjugate Mixed Convection Heat Transfer and Entropy Generation of Cu-Water Nanofluid in an Enclosure with Thick Wavy Bottom Wall

Authors: Sanjib Kr Pal, S. Bhattacharyya

Abstract:

Mixed convection of a Cu-water nanofluid in an enclosure with a thick wavy bottom wall has been investigated numerically. A co-ordinate transformation method is used to transform the computational domain into an orthogonal co-ordinate system. The governing equations in the computational domain are solved through a pressure-correction based iterative algorithm. The fluid flow and heat transfer characteristics are analyzed for a wide range of Richardson number (0.1 ≤ Ri ≤ 5), nanoparticle volume concentration (0.0 ≤ ϕ ≤ 0.2), amplitude (0.0 ≤ α ≤ 0.1) of the thick wavy bottom wall and wave number (ω) at a fixed Reynolds number. The results show that the heat transfer rate increases remarkably with the addition of nanoparticles. The heat transfer rate depends on the wavy wall amplitude and wave number and decreases with increasing Richardson number for fixed amplitude and wave number. The Bejan number and the entropy generation are determined to analyze the thermodynamic optimization of the mixed convection.

Keywords: conjugate heat transfer, mixed convection, nanofluid, wall waviness

Procedia PDF Downloads 239
11240 The Effectiveness and Accuracy of the Schulte Holt IOL Toric Calculator Processor in Comparison to Manually Input Data into the Barrett Toric IOL Calculator

Authors: Gabrielle Holt

Abstract:

This paper seeks to demonstrate the efficacy of the Schulte Holt IOL Toric Calculator Processor (Schulte Holt ITCP). The study was completed by manually inputting data into the Barrett Toric Calculator and recording the number of minutes taken to complete the toric calculations, the number of errors identified during completion, and distractions during completion. These data were then compared with the number of minutes taken by the Schulte Holt ITCP, also using the Barrett method, as well as the number of errors identified in the Schulte Holt ITCP. The data clearly demonstrate a substantial advantage for the Schulte Holt ITCP, which notably reduces the time spent on toric calculations as well as the number of errors. With the ever-growing number of cataract surgeries taking place around the world and waitlists increasing, the Schulte Holt IOL Toric Calculator Processor may well demonstrate a way forward to increase the availability of ophthalmologists and ophthalmic staff while maintaining patient safety.

Keywords: Toric, toric lenses, ophthalmology, cataract surgery, toric calculations, Barrett

Procedia PDF Downloads 60