Search results for: probability and statistics.
571 Text Summarization for Oil and Gas Drilling Topic
Authors: Y. Y. Chen, O. M. Foong, S. P. Yong, Kurniawan Iwan
Abstract:
Information sharing and gathering are important in this era of rapid technological advancement. The existence of the WWW has caused rapid growth and an explosion of information. Readers are overloaded with lengthy text documents when they are more interested in shorter versions. The oil and gas industry cannot escape this predicament. In this paper, we develop an Automated Text Summarization System, known as AutoTextSumm, to extract the salient points of oil and gas drilling articles by incorporating a statistical approach, keyword identification, synonym words and sentence position. In this study, we conducted interviews with Petroleum Engineering experts and English Language experts to identify the list of most commonly used keywords in the oil and gas drilling domain. The system performance of AutoTextSumm is evaluated using the formulae of precision, recall and F-score. Based on the experimental results, AutoTextSumm produced satisfactory performance with an F-score of 0.81.
Keywords: Keyword probability, synonym sets.
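To make the statistical scoring idea concrete, the sketch below ranks sentences by domain-keyword frequency and position and keeps the top-scoring ones. It is only an illustration of the general approach described in the abstract; the keyword list, weights and synonym handling of AutoTextSumm are assumptions, not the system's actual configuration.

```python
# Minimal extractive-summarization sketch: score sentences by domain-keyword
# frequency and sentence position, then keep the top-ranked ones. The keyword
# list and weights are illustrative, not the ones elicited from domain experts.
import re
from collections import Counter

DOMAIN_KEYWORDS = {"drilling", "wellbore", "mud", "casing", "reservoir"}  # hypothetical list

def summarize(text, ratio=0.3):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words_per_sentence = [re.findall(r"[a-z]+", s.lower()) for s in sentences]
    freq = Counter(w for ws in words_per_sentence for w in ws)
    total = sum(freq.values()) or 1

    scores = []
    for i, ws in enumerate(words_per_sentence):
        keyword_score = sum(freq[w] / total for w in ws if w in DOMAIN_KEYWORDS)
        position_score = 1.0 - i / max(len(sentences) - 1, 1)   # earlier sentences weighted higher
        scores.append(0.7 * keyword_score + 0.3 * position_score)

    n_keep = max(1, int(len(sentences) * ratio))
    keep = sorted(sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)[:n_keep])
    return " ".join(sentences[i] for i in keep)
```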
570 Estimating Regression Parameters in Linear Regression Model with a Censored Response Variable
Authors: Jesus Orbe, Vicente Nunez-Anton
Abstract:
In this work we study the effect of several covariates X on a censored response variable T with unknown probability distribution. In this context, most of the studies in the literature fall into two general classes of regression models: models that study the effect the covariates have on the hazard function, and models that study the effect the covariates have on the censored response variable. The proposals in this paper belong to the second class of models and, more specifically, to the least-squares-based model approach. Thus, using the bootstrap estimate of the bias, we try to improve the estimation of the regression parameters by reducing their bias for small sample sizes. Simulation results presented in the paper show that, for reasonable sample sizes and censoring levels, the bias is always smaller for the new proposals.
Keywords: Censored response variable, regression, bias.
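The bias-reduction step described above can be illustrated with a plain bootstrap bias correction of least-squares coefficients. The sketch below shows only that step in an uncensored toy setting; the paper's treatment of the censored response is not reproduced.

```python
# Sketch of bootstrap bias correction for least-squares regression coefficients.
# Illustrative only: the censored-response machinery of the paper is omitted.
import numpy as np

def bootstrap_bias_corrected_ols(X, y, n_boot=1000, rng=None):
    rng = np.random.default_rng(rng)
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])          # add intercept
    beta_hat, *_ = np.linalg.lstsq(Xd, y, rcond=None)

    boot = np.empty((n_boot, Xd.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)                 # resample pairs (X_i, y_i)
        bb, *_ = np.linalg.lstsq(Xd[idx], y[idx], rcond=None)
        boot[b] = bb

    bias = boot.mean(axis=0) - beta_hat             # bootstrap estimate of the bias
    return beta_hat - bias                          # bias-corrected estimate

# Example with a small sample: y = 1 + 2x + noise
rng = np.random.default_rng(0)
x = rng.normal(size=30)
y = 1 + 2 * x + rng.normal(scale=0.5, size=30)
print(bootstrap_bias_corrected_ols(x, y, rng=1))
```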
569 An Exploratory Environment for Concurrency Control Algorithms
Authors: Jinhua Guo
Abstract:
Designing, implementing, and debugging concurrency control algorithms in a real system is a complex, tedious, and error-prone process. Further, understanding concurrency control algorithms and distributed computations is itself a difficult task. Visualization can help with both of these problems. Thus, we have developed an exploratory environment in which people can prototype and test various versions of concurrency control algorithms, study and debug distributed computations, and view performance statistics of distributed systems. In this paper, we describe the exploratory environment and show how it can be used to explore concurrency control algorithms for the interactive steering of distributed computations.
Keywords: Consistency, Distributed Computing, Interactive Steering, Simulation, Visualization.
568 Systemic Approach to Risk Measurement of Drainage Systems in Urban Areas
Authors: Jadwiga Królikowska, Andrzej Królikowski, Jarosław Bajer
Abstract:
The work delineates the threats posed by the inadequate capacity of rain canals, designed and built in the early 20th century, in the face of heavy rainfall, especially in summer. This is the cause of the so-called 'urban floods.' It relates directly to the rapid increase of paved surfaces in cities. Resolving this problem requires a change in the philosophy of draining rainfall, through wider use of retention, infiltration and usage of rainwater.
In systemic approach to managing the safety of urban drainage systems the risk, which is directly connected to safety failures, has been accepted as a measure. The risk level defines the probability of occurrence of losses greater than the ones forecast for a given time frame. The procedure of risk modelling, enabling its numeric analysis by using appropriate weights, is a significant issue in this paper.
Keywords: Drainage system, urban areas, risk measurement.
567 A Method for Modeling Multiple Antenna Channels
Authors: S. Rajabi, M. ArdebiliPoor, M. Shahabadi
Abstract:
In this paper we propose a method for modeling the correlation between the signals received by two or more antennas operating in a multipath environment. Considering the maximum excess delay in the channel being modeled, an elliptical region surrounding both the transmitter and receiver antennas is produced. A number of scatterers are randomly distributed in this region and scatter the incoming waves. The amplitude and phase of the incoming waves are computed and used to obtain statistical properties of the received signals. This model has the distinguishing advantage of being applicable for any configuration of antennas. Furthermore, the joint PDF (Probability Distribution Function) of received wave amplitudes for any pair of antennas can be calculated and used to produce statistical parameters of the received signals.
Keywords: MIMO (Multiple Input Multiple Output), SIMO (Single Input Multiple Output), GBSBEM (Geometrically Based Single Bounce Elliptical Model).
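A Monte Carlo sketch of the geometrically based single-bounce elliptical idea is given below: scatterers are dropped uniformly inside an ellipse sized by the maximum excess delay, per-path amplitudes and phases are summed at each antenna, and the envelope correlation between two receive antennas is estimated. The geometry and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative Monte Carlo sketch of a single-bounce elliptical scattering model.
import numpy as np

def antenna_correlation(d=100.0, tau_max=0.5e-6, spacing=0.075,
                        wavelength=0.15, n_scatterers=50, n_trials=2000, seed=0):
    c = 3e8
    a = (d + c * tau_max) / 2.0                 # semi-major axis of the ellipse
    b = np.sqrt(a**2 - (d / 2.0)**2)            # semi-minor axis
    tx = np.array([-d / 2.0, 0.0])
    rx1 = np.array([d / 2.0, 0.0])
    rx2 = rx1 + np.array([0.0, spacing])        # second receive antenna

    rng = np.random.default_rng(seed)
    h1 = np.empty(n_trials, complex)
    h2 = np.empty(n_trials, complex)
    for t in range(n_trials):
        # rejection-sample scatterers uniformly inside the ellipse
        pts = []
        while len(pts) < n_scatterers:
            p = rng.uniform([-a, -b], [a, b])
            if (p[0] / a)**2 + (p[1] / b)**2 <= 1.0:
                pts.append(p)
        pts = np.array(pts)
        for rx, h in ((rx1, h1), (rx2, h2)):
            path = (np.linalg.norm(pts - tx, axis=1) +
                    np.linalg.norm(pts - rx, axis=1))
            h[t] = np.sum(np.exp(-2j * np.pi * path / wavelength) / path)

    return np.corrcoef(np.abs(h1), np.abs(h2))[0, 1]   # envelope correlation

print(antenna_correlation())
```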
566 Human Body Configuration using Bayesian Model
Authors: Rui Zhang, Yiming Pi
Abstract:
In this paper we present a novel approach to human body configuration based on the silhouette. We propose to address this problem under the Bayesian framework. We use an effective model-based MCMC (Markov Chain Monte Carlo) method to solve the configuration problem, in which the best configuration is defined as the MAP (maximum a posteriori probability) estimate in the Bayesian model. This model-based MCMC utilizes the human body model to drive the MCMC sampling from the solution space. It converts the original high-dimensional space into a restricted sub-space constructed by the human model and uses a hybrid sampling algorithm. We choose an explicit human model and carefully select the likelihood functions to represent the best configuration solution. The experiments show that this method obtains an accurate configuration in a time-saving manner for different humans from multiple views.
Keywords: Bayesian framework, MCMC, model based, human body configuration.
565 Performance Analysis of M-Ary Pulse Position Modulation in Multihop Multiple Input Multiple Output-Free Space Optical System over Uncorrelated Gamma-Gamma Atmospheric Turbulence Channels
Authors: Hechmi Saidi, Noureddine Hamdi
Abstract:
The performance of a Decode and Forward (DF) multihop Free Space Optical (FSO) scheme deploying a Multiple Input Multiple Output (MIMO) configuration under the Gamma-Gamma (GG) statistical distribution, which adopts M-ary Pulse Position Modulation (MPPM) coding, is investigated. Exact and approximate values of the Symbol Error Rate (SER) are extracted. A closed-form formula for the Probability Density Function (PDF) is expressed for the designed system. Thanks to the use of the DF multihop MIMO FSO configuration and MPPM signaling, atmospheric turbulence is combatted; hence the transmitted signal quality is improved.
Keywords: FSO, MIMO, multihop, DF, SER, GG channel.
564 Shot Detection Using Modified Dugad Model
Authors: Lenka Krulikovská, Jaroslav Polec
Abstract:
In this paper we present a modification to an existing model of the threshold for shot cut detection, which is able to adapt itself to the sequence statistics and operate in real time, because it uses only previously evaluated frames for its calculation. The efficiency of the proposed modified adaptive threshold scheme was verified through extensive experiments with several similarity metrics, and the achieved results were compared to the results reached by the original model. According to the results, the proposed threshold scheme reached higher accuracy than the existing original model.
Keywords: Abrupt cut, shot cut detection, adaptive threshold.
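As an illustration of an adaptive threshold that uses only previously evaluated frames, the sketch below declares a cut when the current frame-difference metric exceeds the mean plus k standard deviations of a sliding window of past values. This is a generic scheme in the spirit of the abstract, not the authors' specific modification of the Dugad model.

```python
# Generic sliding-window adaptive threshold for abrupt shot-cut detection.
# A cut is declared when the current frame-difference metric exceeds
# mean + k * std of the previous window; only already-evaluated frames are
# used, so the detector can run in real time. Window size and k are
# illustrative choices, not the paper's calibrated values.
import numpy as np

def detect_cuts(frame_differences, window=10, k=3.0):
    cuts = []
    history = []
    for i, diff in enumerate(frame_differences):
        if len(history) >= window:
            mu, sigma = np.mean(history[-window:]), np.std(history[-window:])
            if diff > mu + k * sigma:
                cuts.append(i)
        history.append(diff)
    return cuts

# Example: small differences with one abrupt jump at index 20
diffs = list(np.random.default_rng(0).normal(1.0, 0.1, 40))
diffs[20] = 5.0
print(detect_cuts(diffs))   # -> [20]
```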
563 Parametric and Nonparametric Analysis of Breast Cancer Treatments
Authors: Chunling Cong, Chris P. Tsokos
Abstract:
The objective of the present research manuscript is to perform parametric, nonparametric, and decision tree analysis to evaluate two treatments that are being used for breast cancer patients. Our study is based on real data which was initially used in "Tamoxifen with or without breast irradiation in women of 50 years of age or older with early breast cancer" [1], and the data was supplied to us by N.A. Ibrahim, "Decision tree for competing risks survival probability in breast cancer study" [2]. Our findings agree with the published results on certain aspects. However, in this manuscript, we focus on the relapse time of breast cancer patients instead of survival time, and parametric analysis instead of semi-parametric decision tree analysis is applied to provide more precise recommendations on the effectiveness of the two treatments with respect to the recurrence of breast cancer.
Keywords: decision tree, breast cancer treatments, parametric analysis, non-parametric analysis.
562 On Identity Disclosure Risk Measurement for Shared Microdata
Authors: M. N. Huda, S. Yamada, N. Sonehara
Abstract:
Probability-based identity disclosure risk measurement may give the same overall risk for different anonymization strategies applied to the same dataset. Some entities in the anonymous dataset may have higher identification risks than the others. Individuals are more concerned about risks higher than the average and are more interested to know whether they could be under such higher risk. A notion of overall risk in the above measurement method doesn't indicate whether some of the involved entities have a higher identity disclosure risk than the others. In this paper, we introduce an identity disclosure risk measurement method that not only implies the overall risk, but also indicates whether some of the members have a higher risk than the others. The proposed method quantifies the overall risk based on the individual risk values, the percentage of records that have a risk value higher than the average, and how much larger the higher risk values are compared to the average. We analyze the disclosure risks for different disclosure control techniques applied to the original microdata and present the results.
Keywords: Anonymization, microdata, disclosure risk, privacy.
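A possible way to report such a distribution-aware risk summary is sketched below: it returns the average individual risk, the fraction of records above that average, and how much larger those above-average risks are, and folds them into one overall figure. The particular combination rule is an illustrative assumption, not the authors' formula.

```python
# Sketch of a disclosure-risk summary that reports not only an overall value
# but also how the risk is distributed across records, along the lines the
# abstract describes. The combination rule is illustrative only.
import numpy as np

def disclosure_risk_summary(record_risks):
    r = np.asarray(record_risks, dtype=float)
    mean_risk = r.mean()
    above = r[r > mean_risk]
    frac_above = len(above) / len(r)                                 # share of records above average
    excess_ratio = above.mean() / mean_risk if len(above) else 1.0   # how much larger they are
    overall = mean_risk * (1.0 + frac_above * (excess_ratio - 1.0))
    return {"mean_risk": mean_risk,
            "fraction_above_average": frac_above,
            "excess_ratio": excess_ratio,
            "overall_risk": overall}

# Two datasets with the same average risk but different tails get different overall scores:
print(disclosure_risk_summary([0.1, 0.1, 0.1, 0.1]))
print(disclosure_risk_summary([0.02, 0.02, 0.02, 0.34]))
```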
561 Optimal Location of the I/O Point in the Parking System
Authors: Jing Zhang, Jie Chen
Abstract:
In this paper, we deal with the optimal I/O point location in an automated parking system. In this system, the S/R machine (storage and retrieval machine) travels independently in the vertical and horizontal directions. Based on the characteristics of the parking system and the basic principle of an AS/RS (Automated Storage and Retrieval System), we obtain the continuous model in units of time. For the single command cycle using the randomized storage policy, we calculate the probability density function for the system travel time and thus develop the travel time model. We confirm that the travel time model performs well by comparing it with the discrete case. Finally, we establish the optimal model by minimizing the expected travel time, and it is shown that the optimal location of the I/O point is at the middle of the upper left-hand corner.
Keywords: Parking system, optimal location, response time, S/R machine.
560 Predictive Factors of Exercise Behaviors of Junior High School Students in Chonburi Province
Authors: Tanida Julvanichpong
Abstract:
Exercise has been regarded as a necessary and important aspect to enhance physical performance and psychological health. Body weight statistics of junior high school students in Chonburi Province exceed the standard, indicating a risk of obesity. To promote exercise among junior high school students in Chonburi Province, essential knowledge concerning the factors influencing exercise is needed. Therefore, this study aims to (1) determine the levels of perceived exercise behavior, exercise behavior in the past, perceived barriers to exercise, perceived benefits of exercise, perceived self-efficacy to exercise, feelings associated with exercise behavior, influence of the family on exercise, influence of friends on exercise, and the perceived influence of the environment on exercise; and (2) examine the predicting ability of each of the above factors, together with personal factors (sex, educational level), for exercise behavior. Pender's Health Promotion Model was used as a guide for the study. The sample included 652 students in junior high schools in Chonburi Province, selected by multi-stage random sampling. Data were collected using self-administered questionnaires and analyzed using descriptive statistics, Pearson's product moment correlation coefficient, Eta, and stepwise multiple regression analysis. The research results showed that: 1. Perceived benefits of exercise, influence of teachers, influence of the environment, and feelings associated with exercise behavior were at a high level. Influence of the family on exercise, exercise behavior, exercise behavior in the past, perceived self-efficacy to exercise and influence of friends were at a moderate level. Perceived barriers to exercise were at a low level. 2. Exercise behavior was positively and significantly related to perceived benefits of exercise, influence of the family on exercise, exercise behavior in the past, perceived self-efficacy to exercise, influence of friends, influence of teachers, influence of the environment and feelings associated with exercise behavior (p < .01, respectively), and was negatively and significantly related to educational level and perceived barriers to exercise (p < .01, respectively). Exercise behavior was significantly related to sex (Eta = 0.243, p = .000). 3. Exercise behavior in the past and influence of the family on exercise significantly contributed 60.10 percent of the variance to the prediction of exercise behavior in male students (p < .01). Exercise behavior in the past, perceived self-efficacy to exercise, perceived barriers to exercise, and educational level significantly contributed 52.60 percent of the variance to the prediction of exercise behavior in female students (p < .01).
Keywords: Predictive factors, exercise behaviors, junior high school.
559 Bail-in Capital: The New Box
Authors: Manu Krishnan, Phil Jacoby
Abstract:
In this paper, we discuss the paradigm shift in bank capital from the "gone concern" to the "going concern" mindset. We then propose a methodology for pricing a product of this shift called Contingent Capital Notes ("CoCos"). The Merton Model can determine a price for credit risk by treating the firm's equity value as a call option on its assets. Our pricing methodology for CoCos also uses the credit spread implied by the Merton Model in a subsequent derivative form created by John Hull et al. Here, a market-implied asset volatility is calculated by using observed market CDS spreads. This implied asset volatility is then used to estimate the probability of triggering a predetermined "contingency event" given the distance-to-trigger (DTT). The paper then investigates the effect of varying DTTs and recovery assumptions on the CoCo yield. We conclude with an investment rationale.
Keywords: CoCo, Contingent Capital, Bank Capital, Tier 1 Capital.
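Under a strongly simplified lognormal view of the bank's assets, the trigger-probability step can be sketched as a normal CDF of the log distance-to-trigger, as below. The implied asset volatility is taken as a given input rather than backed out of CDS spreads, and the drift and parameter values are illustrative assumptions, not the paper's calibration.

```python
# Heavily simplified sketch of the trigger-probability step: assuming lognormal
# asset dynamics with a market-implied asset volatility (taken as given here),
# the probability that the asset value ends below the contingency trigger over
# horizon T is a normal CDF of the log distance-to-trigger.
from math import log, sqrt
from scipy.stats import norm

def trigger_probability(asset_value, trigger_level, sigma, T, drift=0.0):
    dtt = log(asset_value / trigger_level)          # distance-to-trigger in log terms
    return norm.cdf((-dtt - (drift - 0.5 * sigma**2) * T) / (sigma * sqrt(T)))

# A larger distance-to-trigger or a lower implied volatility lowers the probability:
print(trigger_probability(asset_value=120.0, trigger_level=100.0, sigma=0.10, T=1.0))
print(trigger_probability(asset_value=120.0, trigger_level=100.0, sigma=0.25, T=1.0))
```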
558 The Announcer Trainee Satisfaction by National Broadcasting and Telecommunications Commission of Thailand
Authors: Nareenad Panbun
Abstract:
The objective is to study the knowledge utilization of the participants of the announcer training program run by the National Broadcasting and Telecommunications Commission (NBTC). This study is quantitative research based on surveys and self-administered questionnaires. The population of this study is 100 participants chosen by a non-probability sampling method. The results show that most of the participants, 37 people representing 37%, were satisfied with the topics of general knowledge about the broadcasting and television business, followed by the topics of broadcasting techniques, legal issues, consumer rights, television business ethics, credibility of the media, the media's role and responsibilities in society, and the use of language for successful communication. Communication language skills are therefore the most important for all of the trainees and will also build up the image of the broadcasting center.
Keywords: Announcer training program, participant, requirements announced, theory of utilization.
557 A Novel Fuzzy Technique for Image Noise Reduction
Authors: Hamed Vahdat Nejad, Hameed Reza Pourreza, Hasan Ebrahimi
Abstract:
A new fuzzy filter is presented for the noise reduction of images corrupted with additive noise. The filter consists of two stages. In the first stage, all the pixels of the image are processed to determine noisy pixels. For this, a fuzzy rule-based system associates a degree with each pixel. The degree of a pixel is a real number in the range [0,1], which denotes the probability that the pixel is not noisy. In the second stage, another fuzzy rule-based system is employed. It uses the output of the previous fuzzy system to perform fuzzy smoothing by weighting the contributions of neighboring pixel values. Experimental results are presented to show the feasibility of the proposed filter. These results are also compared to other filters by numerical measure and visual inspection.
Keywords: Additive noise, Fuzzy logic, Image processing, Noise reduction.
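A compact sketch of the two-stage idea follows: stage one assigns each pixel a degree of being noise-free from its deviation from the local median, and stage two smooths by weighting neighbours with those degrees. The ramp membership function is an illustrative stand-in for the paper's fuzzy rule bases.

```python
# Two-stage denoising sketch: stage 1 derives a "noise-free" degree per pixel
# from its deviation from the 3x3 median; stage 2 replaces each pixel by a
# degree-weighted average of its neighbourhood. Thresholds are illustrative.
import numpy as np
from scipy.ndimage import median_filter

def fuzzy_denoise(img, low=10.0, high=60.0):
    img = img.astype(float)
    deviation = np.abs(img - median_filter(img, size=3))
    # degree in [0, 1]: 1 = certainly not noisy, 0 = certainly noisy
    degree = np.clip((high - deviation) / (high - low), 0.0, 1.0)

    out = img.copy()
    pad_i = np.pad(img, 1, mode="edge")
    pad_d = np.pad(degree, 1, mode="edge")
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            win_i = pad_i[r:r + 3, c:c + 3]
            win_d = pad_d[r:r + 3, c:c + 3]
            if win_d.sum() > 0:
                # fuzzy smoothing: neighbours judged noise-free contribute more
                out[r, c] = (win_i * win_d).sum() / win_d.sum()
    return out
```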
556 Design of Bayesian MDS Sampling Plan Based on the Process Capability Index
Authors: Davood Shishebori, Mohammad Saber Fallah Nezhad, Sina Seifi
Abstract:
In this paper, a variable multiple dependent state (MDS) sampling plan is developed based on the process capability index using a Bayesian approach. The optimal parameters of the developed sampling plan with respect to constraints related to the risks of the consumer and producer are presented. Two comparison studies have been done. First, the double sampling plan, the sampling plan for resubmitted lots and the repetitive group sampling (RGS) plan are elaborated, and the average sample numbers of the developed MDS plan and these classical methods are compared. Second, a comparison study between the developed MDS plan based on the Bayesian approach and the one based on the exact probability distribution is carried out.
Keywords: MDS sampling plan, RGS plan, sampling plan for resubmitted lots, process capability index, average sample number, Bayesian approach.
555 Segmentation of Images through Clustering to Extract Color Features: An Application for Image Retrieval
Authors: M. V. Sudhamani, C. R. Venugopal
Abstract:
This paper deals with an application of content-based image retrieval that extracts color features from natural images stored in the image database by segmenting each image through clustering. We employ a class of nonparametric techniques in which the data points are regarded as samples from an unknown probability density. Explicit computation of the density is avoided by using the mean shift procedure, a robust clustering technique, which does not require prior knowledge of the number of clusters and does not constrain the shape of the clusters. A non-parametric technique for the recovery of significant image features is presented, and a segmentation module is developed using the mean shift algorithm to segment each image. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray level or color images are accepted as inputs. Extensive experimental results illustrate excellent performance.
Keywords: Segmentation, Clustering, Image Retrieval, Features.
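A minimal version of the segmentation step can be written with scikit-learn's mean shift clustering applied to pixel colours, as below; as in the abstract, the only user-set parameter is the analysis resolution (the kernel bandwidth), and the number of clusters is not fixed in advance. This is a sketch of the general technique, not the paper's implementation.

```python
# Minimal mean-shift colour segmentation sketch using scikit-learn.
import numpy as np
from sklearn.cluster import MeanShift

def segment_by_color(image, bandwidth=20.0):
    """image: H x W x 3 array (RGB); returns a label map and the cluster colours."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(float)
    ms = MeanShift(bandwidth=bandwidth, bin_seeding=True)
    labels = ms.fit_predict(pixels)
    return labels.reshape(h, w), ms.cluster_centers_
    # the mean colour of each segment can then be indexed as the colour
    # feature for content-based retrieval
```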
554 Spatial Analysis and Statistics for Zoning of Urban Areas
Authors: Benedetto Manganelli, Beniamino Murgante
Abstract:
The use of statistical data and of neural networks, capable of processing series of data and territorial information, has allowed the construction of a model useful for subdividing urban places into zones that are homogeneous with respect to the social, real estate, environmental and urban-planning profile of a city. The delineation of homogeneous zones has fiscal and urban-planning advantages. The tools in the proposed model, able to adapt to the dynamic changes of the city, allow a fast and dynamic application of the zoning.
Keywords: Homogeneous Urban Areas, Multidimensional Scaling, Neural Network, Real Estate Market, Urban Planning.
553 Accreditation and Quality Assurance of Nigerian Universities: The Management Imperative
Authors: F. O. Anugom
Abstract:
The general functions of the university, amongst other things, include teaching, research and community service. Universities are recognized as the apex of learning, accumulating and imparting knowledge and skills of all kinds to students to enable them to be productive, earn their living and make optimum contributions to national development. This is equivalent to the production of human capital in the form of the high-level manpower needed to administer the educational society, be useful to the society and manage the economy. Quality has become a matter of major importance for university education in Nigeria. Accreditation is the systematic review of educational programs to ensure that acceptable standards of education, scholarship and infrastructure are being maintained. Accreditation ensures that institutions maintain quality. The process is designed to determine whether or not an institution has met or exceeded the published standards for accreditation, and whether it is achieving its mission and stated purposes. Ensuring quality assurance in the accreditation process falls in the hands of university management, which justified the need for this study. This study examined accreditation and quality assurance as the management imperative. Three research questions and three hypotheses guided the study. The design was a correlation survey with a population of 2,893 university administrators, out of which 578 Heads of Department and Deans of Faculties were sampled. The instrument for data collection was titled the Programme Accreditation Exercise Scale, with high levels of reliability. The research questions were answered with Pearson's 'r' statistics. T-test statistics were used to test the hypotheses. It was found, among other things, that the quality of an accredited programme depends on the level of funding of universities in Nigeria. It was also indicated that the quality of programme accreditation and the physical facilities of universities in Nigeria have a high relationship. It was further revealed that programme accreditation is positively related to staffing in Nigerian universities. Based on the findings of the study, the researcher recommends that academic administrators should be included in the team of those who ensure quality programs in the universities. Private sector partnership should be encouraged to fund programs to ensure the quality of programmes in the universities. Independent agencies should be engaged to monitor the activities of accreditation teams to avoid bias.
Keywords: Accreditation, quality assurance, NUC, physical facilities, staffing.
552 Mathematical Modeling for Dengue Transmission with the Effect of Season
Authors: R. Kongnuy, P. Pongsumpun
Abstract:
Mathematical models can be used to describe the transmission of disease. Dengue disease is the most significant mosquito-borne viral disease of humans. It is now a leading cause of childhood deaths and hospitalizations in many countries. Variations in environmental conditions, especially seasonal climatic parameters, affect the transmission of the dengue viruses and their principal mosquito vector, Aedes aegypti. A transmission model for dengue disease is discussed in this paper. We assume that the human and vector populations are constant. We show that the local stability is completely determined by the threshold parameter B0. If B0 is less than one, the disease-free equilibrium state is stable. If B0 is more than one, a unique endemic equilibrium state exists and is stable. Numerical results are shown for different values of the transmission probability from the vector to the human population.
Keywords: Dengue disease, mathematical model, season, threshold parameters.
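For illustration, a generic constant-population host-vector model of this type can be integrated numerically to see how the infective level responds to the vector-to-human transmission probability, as the abstract's numerical study does. The equations and all parameter values below are placeholder assumptions, not the paper's calibrated model.

```python
# Hedged sketch of a constant-population host-vector transmission model,
# integrated numerically so the effect of the vector-to-human transmission
# probability can be explored. All parameter values are placeholders.
import numpy as np
from scipy.integrate import odeint

def dengue_model(state, t, b_vh, b_hv, bite_rate, mu_h, mu_v, gamma):
    S_h, I_h, S_v, I_v = state
    new_h = bite_rate * b_vh * S_h * I_v        # vector-to-human infections
    new_v = bite_rate * b_hv * S_v * I_h        # human-to-vector infections
    dS_h = mu_h * (1 - S_h) - new_h
    dI_h = new_h - (gamma + mu_h) * I_h
    dS_v = mu_v * (1 - S_v) - new_v
    dI_v = new_v - mu_v * I_v
    return [dS_h, dI_h, dS_v, dI_v]

t = np.linspace(0, 2000, 4000)                  # days
init = [0.99, 0.01, 0.99, 0.01]                 # fractions of each population
for b_vh in (0.2, 0.5):                         # vary the vector-to-human probability
    sol = odeint(dengue_model, init, t,
                 args=(b_vh, 0.3, 0.5, 1/25000.0, 1/14.0, 1/7.0))
    print(f"b_vh={b_vh}: human infective fraction at t=2000 days ~ {sol[-1, 1]:.4f}")
```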
551 Evolutionary Program Based Approach for Manipulator Grasping Color Objects
Authors: Y. Harold Robinson, M. Rajaram, Honey Raju
Abstract:
Image segmentation and color identification are important processes used in various emerging fields such as intelligent robotics. A method is proposed for a manipulator to grasp a color object and place it into the correct location. Existing methods such as PSO have problems like slow convergence speed and convergence to a local minimum, leading to suboptimal performance. To improve the performance, we use the watershed algorithm for segmentation and EPSO for color identification. The EPSO method is used to reduce the probability of being stuck in a local minimum. The proposed method offers the particles a more powerful global exploration capability. EPSO methods can determine the particles stuck in a local minimum and can also enhance learning speed as the particle movement will be faster.
Keywords: Color information, EPSO, hue, saturation, value (HSV), image segmentation, particle swarm optimization (PSO), active contour, GMM.
550 Reliability Based Optimal Design of Laterally Loaded Pile with Limited Residual Strain Energy Capacity
Authors: M. Movahedi Rad
Abstract:
In this study, a general approach to the reliability-based limit analysis of laterally loaded piles is presented. In engineering practice, uncertainties play a very important role. The aim of this study is to evaluate the lateral load capacity of free-head and fixed-head long piles when plastic limit analysis is considered. In addition to the plastic limit analysis used to control the plastic behaviour of the structure, an uncertain bound on the complementary strain energy of the residual forces is also applied. This bound has a significant effect on the load parameter. The solution to the reliability-based problems is obtained by a computer program which is governed by the reliability index calculation.
Keywords: Reliability, laterally loaded pile, residual strain energy, probability, limit analysis.
549 Enhanced Bidirectional Selection Sort
Authors: Jyoti Dua
Abstract:
An algorithm is a well-defined procedure that takes some input in the form of values, processes them and gives the desired output. Sorting forms the basis of many other algorithms such as searching, pattern matching and digital filters, and has found applications in database systems, data statistics and processing, data communications and pattern matching. This paper introduces the "Enhanced Bidirectional Selection" sort algorithm, which is bidirectional and stable. It is said to be bidirectional as it selects two values in each pass, the smallest for the front and the largest for the rear, and assigns them to their appropriate locations, thus reducing the number of passes to half the total number of elements as compared to selection sort.
Keywords: Bubble sort, cocktail sort, selection sort, heap sort.
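A minimal sketch of the double-ended selection pass is shown below: each pass places the current smallest element at the front and the current largest at the rear, so roughly half as many passes are needed as in plain selection sort. The paper's further enhancements are not reproduced.

```python
# Minimal bidirectional selection sort: each pass places the smallest element
# at the current front and the largest at the current rear, so roughly n/2
# passes suffice instead of n.
def bidirectional_selection_sort(a):
    left, right = 0, len(a) - 1
    while left < right:
        min_i, max_i = left, left
        for i in range(left, right + 1):
            if a[i] < a[min_i]:
                min_i = i
            if a[i] > a[max_i]:
                max_i = i
        a[left], a[min_i] = a[min_i], a[left]
        # if the maximum was sitting at `left`, it has just been moved to min_i
        if max_i == left:
            max_i = min_i
        a[right], a[max_i] = a[max_i], a[right]
        left += 1
        right -= 1
    return a

print(bidirectional_selection_sort([5, 1, 4, 2, 8, 0, 3]))  # [0, 1, 2, 3, 4, 5, 8]
```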
548 3D Object Model Reconstruction Based on Polywogs Wavelet Network Parametrization
Authors: Mohamed Othmani, Yassine Khlifi
Abstract:
This paper presents a technique for compact three-dimensional (3D) object model reconstruction using wavelet networks. It consists of transforming the input surface vertices into signals and using wavelet network parameters for signal approximation. To this end, we use a wavelet network architecture founded on several mother wavelet families. POLYnomials WindOwed with Gaussians (POLYWOG) wavelet families are used to maximize the probability of selecting the best wavelets, which ensures good generalization of the network. To achieve a better reconstruction, the network is trained over several iterations to optimize the wavelet network parameters until the error criterion is small enough. Experimental results show that our proposed technique can effectively reconstruct irregular 3D object models when using the optimized wavelet network parameters. We also show that an accurate reconstruction depends on the best choice of the mother wavelets.
Keywords: 3D object, optimization, parametrization, Polywog wavelets, reconstruction, wavelet networks.
547 Exponentiated Transmuted Weibull Distribution: A Generalization of the Weibull Distribution
Authors: Abd El Hady N. Ebraheim
Abstract:
This paper introduces a new generalization of the two-parameter Weibull distribution. To this end, the quadratic rank transmutation map has been used. This new distribution is named the exponentiated transmuted Weibull (ETW) distribution. The ETW distribution has the advantage of being capable of modeling various shapes of aging and failure criteria. Furthermore, eleven lifetime distributions, such as the Weibull, exponentiated Weibull, Rayleigh and exponential distributions, among others, follow as special cases. The properties of the new model are discussed, and maximum likelihood estimation is used to estimate the parameters. Explicit expressions are derived for the quantiles. The moments of the distribution are derived, and the order statistics are examined.
Keywords: Exponentiated, Inversion Method, Maximum Likelihood Estimation, Transmutation Map.
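Following the usual quadratic rank transmutation map convention, the ETW cumulative distribution function can be sketched as below: the baseline Weibull CDF G is transmuted to (1+λ)G − λG² and then raised to a power a. The parameterisation is assumed from that convention and may differ in notation from the paper.

```python
# Sketch of the exponentiated transmuted Weibull CDF built from the baseline
# Weibull CDF via the quadratic rank transmutation map plus a power parameter.
import numpy as np

def weibull_cdf(x, scale, shape):
    return 1.0 - np.exp(-(np.asarray(x, float) / scale) ** shape)

def etw_cdf(x, scale, shape, lam, a):
    """lam in [-1, 1] (transmutation parameter), a > 0 (exponentiation parameter)."""
    G = weibull_cdf(x, scale, shape)
    return ((1.0 + lam) * G - lam * G**2) ** a

# Special cases: lam = 0, a = 1 recovers the Weibull; shape = 2 gives the
# Rayleigh family; shape = 1 gives the exponential family.
x = np.linspace(0, 5, 6)
print(etw_cdf(x, scale=1.0, shape=1.0, lam=0.0, a=1.0))   # equals 1 - exp(-x)
```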
546 Active Contours with Prior Corner Detection
Authors: U.A.A. Niroshika, Ravinda G.N. Meegama
Abstract:
Deformable active contours are widely used in computer vision and image processing applications for image segmentation, especially in biomedical image analysis. The active contour or "snake" deforms towards a target object by controlling the internal, image and constraint forces. However, if the contour is initialized with a small number of control points, there is a high probability of surpassing the sharp corners of the object during deformation of the contour. In this paper, a new technique is proposed to construct the initial contour by incorporating prior knowledge of significant corners of the object detected using the Harris operator. This reconstructed contour then begins to deform, attracting the snake towards the targeted object without missing the corners. Experimental results with several synthetic images show the ability of the new technique to deal with sharp corners with higher accuracy than traditional methods.
Keywords: Active Contours, Image Segmentation, Harris Operator, Snakes.
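The construction of a corner-aware initial contour can be sketched as below: Harris corners of the target object are detected and merged with a coarse ring of control points around the object's centroid, so the initial snake already passes through the sharp corners. Only the initialization step is shown; the snake evolution is not reproduced, scikit-image is used for corner detection, and the geometry of the coarse ring is an illustrative assumption.

```python
# Sketch of the contour-initialization step: detect Harris corners of the
# target object and merge them with a coarse set of boundary points so the
# initial snake already passes through the sharp corners.
import numpy as np
from skimage.feature import corner_harris, corner_peaks

def initial_contour_with_corners(binary_mask, n_coarse=16):
    # coarse control points: sample a circle around the object's centroid
    rows, cols = np.nonzero(binary_mask)
    cy, cx = rows.mean(), cols.mean()
    radius = 0.6 * max(np.ptp(rows), np.ptp(cols))
    theta = np.linspace(0, 2 * np.pi, n_coarse, endpoint=False)
    coarse = np.column_stack([cy + radius * np.sin(theta),
                              cx + radius * np.cos(theta)])

    # significant corners of the object, detected with the Harris operator
    corners = corner_peaks(corner_harris(binary_mask.astype(float)),
                           min_distance=5)

    # merge and order all control points by angle around the centroid
    pts = np.vstack([coarse, corners])
    order = np.argsort(np.arctan2(pts[:, 0] - cy, pts[:, 1] - cx))
    return pts[order]
```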
545 Integral Operators Related to Problems of Interface Dynamics
Authors: Pa Pa Lin
Abstract:
This research work is concerned with the eigenvalue problem for the integral operators which are obtained by linearization of a nonlocal evolution equation. The purpose of Section II.A is to describe the nature of the problem and the objective of the project. The problem is related to the "stable solution" of the evolution equation, the so-called "instanton", which describes the interface between two stable phases. The analysis of the instanton and its asymptotic behavior are described in Section II.C by imposing the Green function and making use of a probability kernel. As a result, a classical theorem which is important for an instanton is proved. Section III is devoted to a study of the integral operators related to interface dynamics, concerning the analysis of the Cauchy problem for the evolution equation with initial data close to different phases in different regions of space.
Keywords: Evolution, Green function, instanton, integral operators.
544 New Analysis Methods on Strict Avalanche Criterion of S-Boxes
Authors: Phyu Phyu Mar, Khin Maung Latt
Abstract:
S-boxes (substitution boxes) are keystones of modern symmetric cryptosystems (block ciphers as well as stream ciphers). S-boxes bring nonlinearity to cryptosystems and strengthen their cryptographic security. They are used for confusion in data security. An S-box satisfies the strict avalanche criterion (SAC) if and only if, for any single input bit of the S-box, inverting it changes each output bit with probability one half. If a function (cryptographic transformation) is complete, then each output bit depends on all of the input bits. Thus, if it were possible to find the simplest Boolean expression for each output bit in terms of the input bits, each of these expressions would have to contain all of the input bits if the function is complete. Among the important properties of an S-box, the most interesting property, SAC (Strict Avalanche Criterion), is presented, and three analysis methods are proposed to analyze this property.
Keywords: S-boxes, cryptosystems, strict avalanche criterion, function, analysis methods.
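The SAC itself can be checked exhaustively for a small S-box as sketched below: for every single input-bit flip, the fraction of inputs for which each output bit changes is tallied and compared with one half. The 3-bit S-box in the example is a toy permutation for illustration only, not a proposed design.

```python
# Direct exhaustive check of the strict avalanche criterion for an n-bit to
# m-bit S-box: for every single-bit input flip, each output bit should change
# with probability 1/2 over all inputs.
def sac_matrix(sbox, n_in, n_out):
    """Entry [i][j] = probability that flipping input bit i changes output bit j."""
    counts = [[0] * n_out for _ in range(n_in)]
    for x in range(2 ** n_in):
        for i in range(n_in):
            diff = sbox[x] ^ sbox[x ^ (1 << i)]        # output change for an input-bit-i flip
            for j in range(n_out):
                counts[i][j] += (diff >> j) & 1
    total = 2 ** n_in
    return [[c / total for c in row] for row in counts]

def satisfies_sac(sbox, n_in, n_out, tol=0.0):
    return all(abs(p - 0.5) <= tol for row in sac_matrix(sbox, n_in, n_out) for p in row)

toy_sbox = [3, 6, 0, 5, 7, 1, 4, 2]                    # a 3x3 permutation, illustrative only
print(sac_matrix(toy_sbox, 3, 3))
print(satisfies_sac(toy_sbox, 3, 3, tol=0.25))
```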
543 Characteristics of Turbulent Round Jets in its Potential-Core Region
Authors: S. Sivakumar, Ravikiran Sangras, Vasudevan Raghavan
Abstract:
In this work, stationary hot-wire measurements are carried out to investigate the characteristics of a round free jet in its potential core region (0 ≤ x/d ≤ 10). Measurements are carried out on an incompressible round jet for a range of Reynolds numbers from 4000 to 8000, calculated based on the jet exit mean velocity and the nozzle diameter. The effect of flow velocity on the development characteristics of the jet in the core region is analyzed. Time-averaged statistics and spectra of velocity and its higher order moments are presented and explained.
Keywords: Contoured nozzle, hot-wire anemometer, Reynolds number, velocity fluctuations, velocity spectra.
542 The Modified Eigenface Method using Two Thresholds
Authors: Yan Ma, ShunBao Li
Abstract:
A new approach is adopted in this paper based on Turk and Pentland's eigenface method. It was found that the probability density function of the distance between the projection vector of the input face image and the average projection vector of the subject in the face database follows a Rayleigh distribution. In order to decrease the false acceptance rate and increase the recognition rate, the input face image is recognized using two thresholds, the acceptance threshold and the rejection threshold. We also find that the values of the two thresholds approach each other as the number of trials increases. During training, in order to reduce the number of trials, the projection vectors for each subject are averaged. Recognition experiments using the proposed algorithm show that the recognition rate reaches 92.875%, while the average number of judgments is only 2.56.
Keywords: Eigenface, Face Recognition, Threshold, Rayleigh Distribution, Feature Extraction.
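The two-threshold decision rule can be sketched as below: the distance from the input's projection to the closest subject's averaged projection is compared with an acceptance threshold and a rejection threshold, with intermediate distances triggering another trial. Threshold values and the toy data are illustrative, not the paper's calibrated settings.

```python
# Sketch of the two-threshold decision on eigenface projections: distances
# below the acceptance threshold are accepted, distances above the rejection
# threshold are rejected, and anything in between triggers another trial.
import numpy as np

def recognize(projection, subject_mean_projections, t_accept, t_reject):
    """subject_mean_projections: dict subject_id -> averaged projection vector."""
    ids = list(subject_mean_projections)
    dists = np.array([np.linalg.norm(projection - subject_mean_projections[s])
                      for s in ids])
    best = int(np.argmin(dists))
    if dists[best] <= t_accept:
        return ids[best], "accepted"
    if dists[best] >= t_reject:
        return None, "rejected"
    return ids[best], "uncertain - request another trial"

# Example with made-up 2-D "projections":
means = {"alice": np.array([1.0, 0.0]), "bob": np.array([0.0, 1.0])}
print(recognize(np.array([0.9, 0.1]), means, t_accept=0.3, t_reject=1.0))
print(recognize(np.array([3.0, 3.0]), means, t_accept=0.3, t_reject=1.0))
```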