Search results for: poisson random measures
1395 Oxide Based Resistive Random Access Memory Device for High Density Non Volatile Memory Applications
Authors: Z. Fang, X. P. Wang, G. Q. Lo, D. L. Kwong
Abstract:
In this work, we demonstrate a vertical RRAM device fabricated on the sidewall of contact-hole structures for possible future 3-D stacking integration. The fabricated devices exhibit polarity-dependent bipolar resistive switching with small operation voltages of less than 1 V for both the set and reset processes. Good retention is observed, with a memory window of about 50 times maintained after 1000 s of voltage bias.
Keywords: Bipolar switching, non volatile memory, resistive random access memory, 3-D stacking.
1394 On Speeding Up Support Vector Machines: Proximity Graphs Versus Random Sampling for Pre-Selection Condensation
Authors: Xiaohua Liu, Juan F. Beltran, Nishant Mohanchandra, Godfried T. Toussaint
Abstract:
Support vector machines (SVMs) are considered to be the best machine learning algorithms for minimizing the predictive probability of misclassification. However, their drawback is that, for large data sets, the computation of the optimal decision boundary is a time-consuming function of the size of the training set. Hence, several methods have been proposed to speed up the SVM algorithm. Here, three methods used to speed up the computation of the SVM classifiers are compared experimentally using a musical genre classification problem. The simplest method pre-selects a random sample of the data before the application of the SVM algorithm. Two additional methods use proximity graphs to pre-select data that are near the decision boundary: one uses k-nearest neighbor graphs and the other relative neighborhood graphs to accomplish the task.
Keywords: Machine learning, data mining, support vector machines, proximity graphs, relative-neighborhood graphs, k-nearest neighbor graphs, random sampling, training data condensation.
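As a rough illustration of the simplest of the three speed-up strategies, the sketch below pre-selects a random subset of the training data before fitting an SVM; the synthetic dataset, the 10% subsample fraction, and the scikit-learn usage are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal sketch (assumed setup): random-sample condensation before SVM training.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=20000, n_features=40, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Pre-selection: keep only a random 10% of the training set (hypothetical fraction).
rng = np.random.default_rng(0)
idx = rng.choice(len(X_train), size=len(X_train) // 10, replace=False)

svm_condensed = SVC(kernel="rbf").fit(X_train[idx], y_train[idx])
print("accuracy with 10% random condensation:", svm_condensed.score(X_test, y_test))
```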
1393 Improved Segmentation of Speckled Images Using an Arithmetic-to-Geometric Mean Ratio Kernel
Abstract:
In this work, we improve a previously developed segmentation scheme aimed at extracting edge information from speckled images using a maximum likelihood edge detector. The scheme was based on finding a threshold for the probability density function of a new kernel defined as the arithmetic mean-to-geometric mean ratio field over a circular neighborhood set and, in a general context, is founded on a likelihood random field model (LRFM). The segmentation algorithm was applied to discriminated speckle areas obtained using simple elliptic discriminant functions based on measures of the signal-to-noise ratio with fractional order moments. A rigorous stochastic analysis was used to derive an exact expression for the cumulative distribution function of the random field from its probability density function. Based on this, an accurate probability of error was derived and the performance of the scheme was analysed. The improved segmentation scheme performed well for both simulated and real images and showed superior results to those previously obtained using the original LRFM scheme and standard edge detection methods. In particular, the false alarm probability was markedly lower than that of the original LRFM method, with oversegmentation artifacts virtually eliminated. The importance of this work lies in the development of a stochastic-based segmentation, allowing an accurate quantification of the probability of false detection. Non-visual quantification of misclassification in medical ultrasound speckled images is relatively new and is of interest to clinicians.
Keywords: Discriminant function, false alarm, segmentation, signal-to-noise ratio, skewness, speckle.
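To make the kernel concrete, here is a minimal sketch of an arithmetic-to-geometric mean ratio field computed over a square sliding window; the paper uses a circular neighborhood, so the window shape, its size, and the simple quantile threshold below are simplifying assumptions.

```python
# Minimal sketch (assumptions noted above): AM/GM ratio field over a sliding window.
import numpy as np
from scipy.ndimage import uniform_filter

def am_gm_ratio_field(image, size=7, eps=1e-12):
    """Arithmetic-to-geometric mean ratio in a size x size neighborhood."""
    img = image.astype(float) + eps                          # avoid log(0) on dark pixels
    arithmetic = uniform_filter(img, size)                   # local arithmetic mean
    geometric = np.exp(uniform_filter(np.log(img), size))    # local geometric mean
    return arithmetic / geometric                            # >= 1; larger near edges in speckle

# Toy speckled image: piecewise-constant scene multiplied by exponential speckle.
rng = np.random.default_rng(1)
clean = np.ones((128, 128)); clean[:, 64:] = 4.0
speckled = clean * rng.exponential(1.0, clean.shape)
ratio = am_gm_ratio_field(speckled)
edges = ratio > np.quantile(ratio, 0.95)                     # hypothetical threshold for edge pixels
print("fraction flagged as edge:", edges.mean())
```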
1392 Proposal of a Model Supporting Decision-Making Based On Multi-Objective Optimization Analysis on Information Security Risk Treatment
Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization, and to decide which risks should be treated, to what level, and at what cost. However, such decision-making is not usually easy, because various measures for risk treatment must be selected at suitable application levels. In addition, some measures may have objectives that conflict with each other, which also makes the selection difficult. Moreover, risks generally have trends, which should also be considered in risk treatment. Therefore, this paper provides an extension of the model proposed in the previous study. The original model supports the selection of measures by applying a combination of the weighted average method and the goal programming method for multi-objective analysis to find an optimal solution. The extended model adds weights to the risks, where a larger weight indicates a higher-priority risk.
Keywords: Information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization.
1391 Multi-Objective Random Drift Particle Swarm Optimization Algorithm Based on RDPSO and Crowding Distance Sorting
Authors: Yiqiong Yuan, Jun Sun, Dongmei Zhou, Jianan Sun
Abstract:
In this paper, we present a Multi-Objective Random Drift Particle Swarm Optimization algorithm (MORDPSO-CD) based on RDPSO and crowding distance sorting, aimed at improving convergence and distribution at a lower computational cost. MORDPSO-CD makes the most of RDPSO to approach the true Pareto-optimal solutions quickly. We adopt the crowding distance sorting technique to update and maintain the archive of optimal solutions. Introducing the crowding distance technique into MORDPSO enables the leader particles to ultimately find the true Pareto solutions. The simulation results reveal that the proposed algorithm has better convergence and distribution.
Keywords: Multi-objective optimization, random drift particle swarm optimization, crowding distance, Pareto optimal solution.
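For reference, a minimal sketch of the standard crowding distance computation used to keep an archive well spread along the Pareto front is given below; the objective values are hypothetical and the code is not tied to the MORDPSO-CD implementation.

```python
# Minimal sketch: crowding distance of non-dominated solutions (NSGA-II style).
import numpy as np

def crowding_distance(objectives):
    """objectives: (n_solutions, n_objectives) array; returns a distance per solution."""
    n, m = objectives.shape
    distance = np.zeros(n)
    for j in range(m):
        order = np.argsort(objectives[:, j])
        span = objectives[order[-1], j] - objectives[order[0], j]
        distance[order[0]] = distance[order[-1]] = np.inf   # boundary solutions always kept
        if span > 0:
            gaps = objectives[order[2:], j] - objectives[order[:-2], j]
            distance[order[1:-1]] += gaps / span
    return distance

# Hypothetical archive of 6 solutions with 2 objectives (to be minimized).
front = np.array([[0.0, 1.0], [0.2, 0.7], [0.4, 0.5], [0.6, 0.35], [0.8, 0.2], [1.0, 0.0]])
print(crowding_distance(front))   # least-crowded (largest) solutions are retained first
```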
1390 Study of Measures to Secure Video Phone Service Safety through a Preliminary Evaluation of the Information Security of the New IT Service
Authors: DongHoon Shin, Yunmook Nah, HoSeong Kim, Gang Shin Lee, Jae-Il Lee
Abstract:
The rapid advance of communication technology is evolving the network environment into the broadband convergence network (BcN). Likewise, the IT services operated in individual networks are quickly being converged in the broadband convergence network environment. VoIP and IPTV are two examples of such new services. Efforts are being made to develop the video phone service, which is an advanced form of the voice-oriented VoIP service. However, the new IT services will be subject to stability and reliability vulnerabilities if the relevant security issues are not addressed during the convergence of the existing IT services, currently operated in individual networks, into the wider broadband network environment. To resolve such problems, this paper attempts to analyze the possible threats and identify the necessary security measures before the deployment of the new IT services. Furthermore, it measures the quality of an example encryption algorithm application and describes the appropriate algorithm, in order to present security technology that will have no negative impact on the quality of the video phone service.
Keywords: BcN, security measures, video phone.
1389 An Embedded System Design for SRAM SEU Test
Authors: Kyoung Kun Lee, Soongyu Kwon, Jong Tae Kim
Abstract:
An embedded system for SEU (single event upset) testing needs to be designed to prevent system failure caused by high-energy particles while SEUs are being measured. An SEU is a phenomenon in which data are temporarily changed in a semiconductor device by high-energy particles. In this paper, we present an embedded system for SRAM (static random access memory) SEU testing. The SRAMs are placed on the DUT (device under test), which is separated from the control board that manages the DUT and measures the occurrence of SEUs. The design must prevent system failure while managing the DUT and making an accurate measurement of SEUs. We measure the occurrence of SEUs in five different SRAMs at three different cyclotron beam energies: 30, 35, and 40 MeV. The average number of SEUs per SRAM ranges from 3.75 to 261.00.
Keywords: Embedded system, single event upset, SRAM.
1388 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising
Authors: Jianwei Ma, Diriba Gemechu
Abstract:
In seismic data processing, attenuation of random noise is the basic step for improving the quality of data for further application in exploration and development in the gas and oil industries. The signal-to-noise ratio of the data also largely determines the quality of seismic data. This factor affects the reliability as well as the accuracy of the seismic signal during interpretation for different purposes in different companies. To use seismic data for further application and interpretation, we need to improve the signal-to-noise ratio while attenuating random noise effectively. To improve the signal-to-noise ratio and attenuate seismic random noise while preserving important features and information about seismic signals, we introduce an anisotropic total fractional order denoising algorithm. The anisotropic total fractional order variation model, defined in the space of fractional order bounded variation, is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised result is compared with F-X deconvolution and the non-local means denoising algorithm.
Keywords: Anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, split Bregman algorithm.
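Whatever the exact variational model, split Bregman iterations alternate a quadratic sub-problem for the image with a closed-form shrinkage step on the auxiliary gradient variables. A minimal sketch of that shrinkage (soft-thresholding) step is shown below; the fractional-order derivative operator and the full alternation are left out, since they depend on the paper's specific formulation.

```python
# Minimal sketch: soft-thresholding ("shrink") step used inside split Bregman
# iterations for (anisotropic) total-variation-type regularization.
import numpy as np

def shrink(v, threshold):
    """Component-wise solution of min_d |d| + (1/(2*threshold)) * (d - v)^2."""
    return np.sign(v) * np.maximum(np.abs(v) - threshold, 0.0)

# In an anisotropic split Bregman loop one would typically update, per axis k:
#   d_k <- shrink(D_k u + b_k, 1/lambda)   # D_k: (fractional) derivative along axis k
#   b_k <- b_k + D_k u - d_k               # Bregman variable update
# with u obtained from a linear (e.g. FFT-based) solve; those steps are omitted here.
v = np.array([-2.0, -0.3, 0.0, 0.4, 1.5])
print(shrink(v, 0.5))   # -> [-1.5, -0.0, 0.0, 0.0, 1.0]
```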
1387 Challenges in Anti-Counterfeiting of Cyber-Physical Systems
Authors: Daniel Kliewe, Arno Kühn, Roman Dumitrescu, Jürgen Gausemeier
Abstract:
This paper examines system protection for cyber-physical systems (CPS). CPS are particularly characterized by the networking of their system components, which means they are able to adapt to the needs of their users and their environment. With this ability, CPS have new, specific requirements regarding protection against counterfeiting, know-how loss and manipulation. They increase the requirements on system protection because piracy attacks can be more diverse, for example due to an increasing number of interfaces or through the networking abilities. The new requirements were identified and, in a next step, matched with existing protective measures. The identified gap means that the development of new protection measures has to be pushed forward in order to close it. Moreover, a comparison of the effectiveness of selected measures was carried out, and the first results are presented in this paper.
Keywords: Anti-counterfeiting, cyber-physical systems, intellectual property (IP) and knowledge management, system protection.
1386 A Sequential Approach to Random-Effects Meta-Analysis
Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya
Abstract:
The objective of meta-analysis is to combine results from several independent studies in order to generalize findings and provide an evidence base for decision making. However, recent studies show that the magnitude of effect size estimates reported in many areas of research has changed significantly over time, and this can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring the effect size estimates in meta-analysis. However, they are based on statistical theory applicable only to the fixed-effect model (FEM) of meta-analysis. For the random-effects model (REM), the analysis incorporates the heterogeneity variance, τ², whose estimation creates complications. In this paper, we study the use of a truncated CUSUM-type test with asymptotically valid critical values for sequential monitoring in the REM. Simulation results show that the test does not control the Type I error well and is not recommended. Further work is required to derive an appropriate test in this important area of application.
Keywords: Meta-analysis, random-effects model, sequential testing, temporal changes in effect sizes.
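A minimal sketch of the general idea, sequentially monitoring a cumulative sum of standardized deviations of incoming study effects from the pooled random-effects mean, is given below. The statistic, the DerSimonian-Laird τ² estimator, and the signalling threshold used here are generic illustrations, not the truncated CUSUM-type test studied in the paper.

```python
# Minimal sketch (illustrative, not the paper's test): CUSUM-style monitoring
# of study effects around a DerSimonian-Laird random-effects pooled mean.
import numpy as np

def dersimonian_laird_tau2(y, v):
    """Heterogeneity variance estimate from effects y and within-study variances v."""
    w = 1.0 / v
    mu_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    return max(0.0, (q - (len(y) - 1)) / c)

rng = np.random.default_rng(0)
y = rng.normal(0.3, 0.25, size=25)         # hypothetical study effect estimates
v = rng.uniform(0.02, 0.08, size=25)       # hypothetical within-study variances

cusum, threshold = 0.0, 4.0                # hypothetical signalling threshold
for k in range(3, len(y) + 1):             # start once a few studies have accrued
    tau2 = dersimonian_laird_tau2(y[:k], v[:k])
    w = 1.0 / (v[:k] + tau2)
    mu = np.sum(w * y[:k]) / np.sum(w)
    z = (y[k - 1] - mu) * np.sqrt(w[-1])   # standardized deviation of the newest study
    cusum = max(0.0, cusum + z)            # one-sided CUSUM accumulation
    if cusum > threshold:
        print(f"signal at study {k}: cumulative drift detected"); break
```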
1385 Memory Effects in Randomly Perturbed Nematic Liquid Crystals
Authors: Amid Ranjkesh, Milan Ambrožič, Samo Kralj
Abstract:
We study the typical domain size and configuration character of a randomly perturbed system exhibiting continuous symmetry breaking. As a model system we use rod-like objects within a cubic lattice interacting via a Lebwohl–Lasher-type interaction. We describe their local direction with a headless unit director field. Examples of such systems are nematic liquid crystals and nanotubes. We further introduce impurities of concentration p, which impose random-anisotropy-field-type disorder on the directors. We study the domain-type pattern of molecules as a function of p, the anchoring strength w between a neighboring director and an impurity, temperature, and the history of the samples. In the simulations, we quenched the directors either from a random or a homogeneous initial configuration. Our results show that the history of the system strongly influences: i) the average domain coherence length; and ii) the range of ordering in the system. In the random case, the obtained order is always short-ranged (SR). On the contrary, in the homogeneous case, SR order is obtained only for strong enough anchoring and large enough concentration p. In other cases, the ordering is either quasi-long-range (QLR) or long-range (LR). We further studied memory effects for the random initial configuration. With an increasing external ordering field B, either QLR or LR is realized.
Keywords: Lebwohl-Lasher model, liquid crystals, disorder, memory effect, orientational order.
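To fix ideas, a minimal sketch of the lattice energy for a Lebwohl-Lasher-type model with quenched random-anisotropy impurities is given below; the lattice size, the couplings, the quadratic form of the impurity term, and the absence of any Monte Carlo update loop are simplifying assumptions rather than the paper's simulation protocol.

```python
# Minimal sketch: energy of a headless director field on a cubic lattice with
# Lebwohl-Lasher nearest-neighbour coupling and random-anisotropy impurities.
import numpy as np

def p2(c):
    return 1.5 * c ** 2 - 0.5   # second Legendre polynomial P2(cos theta)

def lattice_energy(n, impurity_axis, impurity_mask, eps=1.0, w=0.5):
    """n: (L,L,L,3) unit directors; impurity axes/mask define quenched disorder."""
    energy = 0.0
    for axis in range(3):                             # sum over +x, +y, +z neighbour bonds
        dot = np.sum(n * np.roll(n, -1, axis=axis), axis=-1)
        energy += -eps * np.sum(p2(dot))              # headless: only (n_i . n_j)^2 enters
    dot_imp = np.sum(n * impurity_axis, axis=-1)
    energy += -w * np.sum(p2(dot_imp)[impurity_mask]) # coupling to impurity easy axes
    return energy

L, p = 8, 0.1
rng = np.random.default_rng(0)
n = rng.normal(size=(L, L, L, 3)); n /= np.linalg.norm(n, axis=-1, keepdims=True)
e = rng.normal(size=(L, L, L, 3)); e /= np.linalg.norm(e, axis=-1, keepdims=True)
mask = rng.random((L, L, L)) < p                      # impurity concentration p
print("energy per site:", lattice_energy(n, e, mask) / L ** 3)
```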
1384 Relevant LMA Features for Human Motion Recognition
Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier
Abstract:
Motion recognition from videos is a very complex task due to the high variability of motions. This paper describes the challenges of human motion recognition, especially the motion representation step with relevant features. Our descriptor vector is inspired by the Laban Movement Analysis (LMA) method. We select discriminative features using the Random Forest algorithm in order to remove redundant features and make learning algorithms operate faster and more effectively. We validate our method on the MSRC-12 and UTKinect datasets.
Keywords: Human motion recognition, discriminative LMA features, random forest, feature reduction.
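As a rough sketch of this kind of Random Forest based feature selection, the snippet below ranks features by impurity-based importance and keeps only the most informative ones before training a downstream classifier; the synthetic data and the median cut-off are placeholder assumptions, not the LMA descriptor pipeline.

```python
# Minimal sketch (assumed data and cut-off): Random Forest feature selection.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=3000, n_features=60, n_informative=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
selector = SelectFromModel(forest, threshold="median", prefit=True)   # drop half the features
X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr_sel, y_tr)
print("kept features:", X_tr_sel.shape[1], "accuracy:", clf.score(X_te_sel, y_te))
```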
1383 New Product-Type Estimators for the Population Mean Using Quartiles of the Auxiliary Variable
Authors: Amer Ibrahim Falah Al-Omari
Abstract:
In this paper, we suggest new product-type estimators for the population mean of the variable of interest, exploiting the first or the third quartile of the auxiliary variable. We obtain the mean square error equations and the bias of the estimators. We study the properties of these estimators using simple random sampling (SRS) and ranked set sampling (RSS) methods. It is found that SRS and RSS produce approximately unbiased estimators of the population mean. However, the RSS estimators are more efficient than those obtained using SRS, based on the same number of measured units, for all values of the correlation coefficient.
Keywords: Product estimator, auxiliary variable, simple random sampling, extreme ranked set sampling
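For readers unfamiliar with ranked set sampling, the sketch below simulates the basic RSS procedure (ranking done here by an auxiliary variable) and compares the variance of the resulting mean estimator with that of SRS of the same size; the population model and the plain mean estimator are illustrative assumptions, not the paper's product-type estimators.

```python
# Minimal sketch: ranked set sampling (RSS) vs simple random sampling (SRS)
# for estimating a population mean, ranking by a correlated auxiliary variable x.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
x = rng.normal(10, 2, N)                       # auxiliary variable
y = 5 + 0.8 * x + rng.normal(0, 1, N)          # study variable, correlated with x

def srs_mean(m):
    return y[rng.choice(N, m, replace=False)].mean()

def rss_mean(set_size, cycles):
    vals = []
    for _ in range(cycles):
        for i in range(set_size):              # i-th set contributes its i-th ranked unit
            idx = rng.choice(N, set_size, replace=False)
            chosen = idx[np.argsort(x[idx])[i]]   # rank by the auxiliary variable
            vals.append(y[chosen])
    return np.mean(vals)

m, reps = 15, 2000                              # 15 measured units per sample
srs = [srs_mean(m) for _ in range(reps)]
rss = [rss_mean(set_size=5, cycles=3) for _ in range(reps)]   # 5 x 3 = 15 units
print("var(SRS):", np.var(srs), " var(RSS):", np.var(rss))    # RSS is typically smaller
```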
1382 Statistical Analysis of First Order Plus Dead-time System using Operational Matrix
Authors: Pham Luu Trung Duong, Moonyong Lee
Abstract:
To increase the precision and reliability of automatic control systems, we have to take into account the random factors affecting the control system. Thus, an operational matrix technique is used for the statistical analysis of a first order plus time delay system with uniform random parameters. Examples with deterministic and stochastic disturbances are considered to demonstrate the validity of the method. A comparison with the Monte Carlo method is made to show the computational effectiveness of the method.
Keywords: First order plus dead-time, Operational matrix, Statistical analysis, Walsh function.
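As a point of comparison only (the paper's operational matrix technique based on Walsh functions is not reproduced here), the sketch below estimates the mean and standard deviation of a first order plus dead-time step response by Monte Carlo sampling of a uniformly distributed parameter; the nominal model and the choice of which parameter is random are assumptions for illustration.

```python
# Minimal sketch (illustrative Monte Carlo baseline, not the operational matrix method):
# statistics of a first-order-plus-dead-time (FOPDT) step response with a uniform
# random time constant.
import numpy as np

def fopdt_step(t, K, tau, theta):
    """Unit-step response y(t) = K*(1 - exp(-(t - theta)/tau)) for t >= theta."""
    y = np.zeros_like(t)
    active = t >= theta
    y[active] = K * (1.0 - np.exp(-(t[active] - theta) / tau))
    return y

t = np.linspace(0.0, 20.0, 401)
K, theta = 1.0, 2.0                            # assumed nominal gain and dead time
rng = np.random.default_rng(0)
taus = rng.uniform(3.0, 5.0, size=5000)        # uniform random time constant

responses = np.array([fopdt_step(t, K, tau, theta) for tau in taus])
mean_y, std_y = responses.mean(axis=0), responses.std(axis=0)
i10 = np.argmin(np.abs(t - 10.0))
print("mean and std of y at t = 10 s:", mean_y[i10], std_y[i10])
```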
1381 Deterministic Random Number Generator Algorithm for Cryptosystem Keys
Authors: Adi A. Maaita, Hamza A. A. Al_Sewadi
Abstract:
One of the crucial parameters of digital cryptographic systems is the selection of the keys used and their distribution. The randomness of the keys has a strong impact on the system's security strength, since the keys must be difficult to predict, guess, reproduce, or discover by a cryptanalyst. Therefore, adequate key randomness generation is still sought for the benefit of stronger cryptosystems. This paper suggests an algorithm designed to generate and test pseudorandom number sequences intended for cryptographic applications. The algorithm is based on mathematically manipulating information publicly agreed upon between sender and receiver over a public channel. This information is used as a seed for performing some mathematical functions in order to generate a sequence of pseudorandom numbers that will be used for encryption/decryption purposes. This manipulation involves permutations and substitutions that fulfill Shannon's principle of "confusion and diffusion". ASCII code characters were utilized in the generation process instead of using bit strings initially, which adds more flexibility in testing different seed values. Finally, the obtained results indicate that it would be very difficult for attackers to guess the keys.
Keywords: Cryptosystems, Information Security agreement, key distribution, random numbers.
1380 Restricted Pedestrian Flow Performance Measures during Egress from a Complex Facility
Authors: Luthful A. Kawsar, Noraida A. Ghani, Anton A. Kamil, Adli Mustafa
Abstract:
In this paper, we use an M/G/C/C state-dependent queuing model within a complex network topology to determine different performance measures for pedestrian traffic flow. The occupants in this network topology need to go through source corridors, from which they can choose suitable exit corridors. The performance measures were calculated using arrival rates that maximize the throughputs of the source corridors. The results indicate that, in order to increase the throughput of the network, the flow direction of pedestrians through the corridors has to be restricted and the arrival rates to the source corridors need to be controlled.
Keywords: Arrival rate, multiple arrival sources, probability of blocking, state-dependent queuing networks, throughput.
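For a single corridor, the stationary distribution of the M/G/C/C state-dependent queue has a well-known product form; the sketch below evaluates it with a simple linear congestion model for walking speed and reports the blocking probability and throughput. The corridor dimensions, free-flow speed, arrival rate, and the linear speed-density model are illustrative assumptions, not the network topology analyzed in the paper.

```python
# Minimal sketch (single corridor, assumed parameters): M/G/C/C state-dependent
# queue with a linear walking-speed congestion model.
import numpy as np
from math import lgamma, log

length, width = 20.0, 2.0             # corridor dimensions in metres (assumed)
capacity = int(5 * length * width)    # ~5 pedestrians per square metre at jam density
v1 = 1.5                              # free-flow walking speed of a lone pedestrian (m/s)
lam = 3.0                             # arrival rate (pedestrians per second, assumed)

def f(n):
    """Relative walking speed V_n / V_1 under a linear congestion model."""
    return (capacity - n + 1) / capacity

# log of the unnormalized probability of n occupants:
#   p_n proportional to (lam * L / v1)^n / (n! * prod_{i=1..n} f(i))
log_terms = [0.0]
for n in range(1, capacity + 1):
    log_terms.append(n * log(lam * length / v1) - lgamma(n + 1)
                     - sum(log(f(i)) for i in range(1, n + 1)))
log_terms = np.array(log_terms)
p = np.exp(log_terms - log_terms.max()); p /= p.sum()

p_block = p[-1]                        # probability an arrival finds the corridor full
throughput = lam * (1.0 - p_block)     # effective pedestrians per second entering
print(f"blocking probability: {p_block:.4f}, throughput: {throughput:.3f} ped/s")
```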
1379 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods
Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun
Abstract:
Traditionally, the three important manufacturing functions of process planning, scheduling and due-date assignment are performed separately and sequentially. For a couple of decades, hundreds of studies have been carried out on integrated process planning and scheduling problems, and numerous works have addressed scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three functions is studied using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid and random search techniques. In addition, the importance of integrating these three functions and the power of meta-heuristics and of hybrid heuristics are studied.
Keywords: Process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics.
1378 A Review in Recent Development of Network Threats and Security Measures
Authors: Roza Dastres, Mohsen Soori
Abstract:
Networks are vulnerable devices due to their basic feature of facilitating remote access and data communication. The information in networks needs to be kept secured and safe in order to provide an effective communication and sharing device in the web of data. Due to the challenges and threats to data in networks, network security is one of the most important considerations in information technology infrastructures. As a result, security measures are applied in the network in order to decrease the probability of hackers accessing the secured data. The purpose of network security is to protect the network and its components from unauthorized access and abuse in order to provide a safe and secure communication device for the users. In the present research work, a review of recent developments in network threats and security measures is presented, and future research directions are also suggested. Different attacks on networks and the security measures against them are discussed in order to increase security in the web of data. Thus, by analyzing the published papers, new ideas in network security systems can be put forward in order to advance the research field.
Keywords: Network threats, network security, security measures, firewalls.
1377 Vision Based Hand Gesture Recognition Using Generative and Discriminative Stochastic Models
Authors: Mahmoud Elmezain, Samar El-shinawy
Abstract:
Many approaches to pattern recognition are founded on probability theory and can be broadly characterized as either generative or discriminative according to whether or not they model the distribution of the image features. Generative and discriminative models have very different characteristics, as well as complementary strengths and weaknesses. In this paper, we study these models to recognize the patterns of alphabet characters (A-Z) and numbers (0-9). To handle isolated patterns, a generative model, the Hidden Markov Model (HMM), and discriminative models, namely the Conditional Random Field (CRF), Hidden Conditional Random Field (HCRF) and Latent-Dynamic Conditional Random Field (LDCRF), with different window sizes are applied to the extracted pattern features. The gesture recognition rate improves initially as the window size increases, but degrades as the window size increases further. Experimental results show that the LDCRF gives the best results compared with CRF, HCRF and HMM at a window size of 4. Additionally, our results show that the overall recognition rates are 91.52%, 95.28%, 96.94% and 98.05% for CRF, HCRF, HMM and LDCRF, respectively.
Keywords: Statistical Pattern Recognition, Generative Model, Discriminative Model, Human Computer Interaction.
1376 Theoretical Analysis of the Effect of Accounting for Special Methods in Similarity-Based Cohesion Measurement
Authors: Jehad Al Dallal
Abstract:
Class cohesion is an important object-oriented software quality attribute; it refers to the degree of relatedness of class attributes and methods. Several class cohesion measures have been proposed in the literature, and for most of them the impact of considering the special methods (i.e., constructors, destructors, and access and delegation methods) in the cohesion calculation has not been thoroughly studied from a theoretical point of view. In this paper, we address this issue for three popular similarity-based class cohesion measures. For each of the considered measures, we theoretically study the impact of including or excluding special methods on the values obtained by applying the measure. This study is based on analyzing the definitions and formulas proposed for the measures. The results show that including or excluding special methods has a considerable effect on the obtained cohesion values and that this effect varies from one measure to another. The study shows the importance of considering the types of methods that have to be accounted for when proposing a similarity-based cohesion measure.
Keywords: Object-oriented class, software quality, class cohesion measure, class cohesion, special methods.
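To illustrate why the choice matters, the sketch below computes a generic similarity-based cohesion value (average Jaccard similarity of the attribute sets used by each pair of methods) for a toy class, once with and once without a constructor-like method; this generic metric and the toy class are assumptions for illustration, not one of the three measures analyzed in the paper.

```python
# Minimal sketch (generic similarity-based cohesion, not one of the paper's measures):
# average Jaccard similarity over method pairs, with and without a special method.
from itertools import combinations

def cohesion(method_attr_usage):
    """method_attr_usage: dict method -> set of attributes it reads or writes."""
    pairs = list(combinations(method_attr_usage.values(), 2))
    if not pairs:
        return 0.0
    sims = [len(a & b) / len(a | b) if a | b else 0.0 for a, b in pairs]
    return sum(sims) / len(sims)

usage = {
    "__init__": {"name", "balance", "history"},   # constructor touches everything
    "deposit":  {"balance", "history"},
    "withdraw": {"balance", "history"},
    "rename":   {"name"},
}
without_special = {m: a for m, a in usage.items() if m != "__init__"}
print("with constructor:   ", round(cohesion(usage), 3))          # inflated by the constructor
print("without constructor:", round(cohesion(without_special), 3))
```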
1375 Conflict of the Thai-Malaysian Gas Pipeline Project
Authors: Nopadol Burananuth
Abstract:
This research aimed to investigate (1) the relationship among local social movements, non-governmental organization (NGO) activities and the deployment of state measures; and (2) the effects of local social movements, NGO activities and the deployment of state measures on the conflict of local people over the Thai-Malaysian gas pipeline project. The respondents were 1,000 residents of four districts in Songkhla province. Data were analyzed using multiple regression analysis. The results showed that: (1) local social movements depended on information and mass communication, while the deployment of state measures depended on compromise, coordination and mass communication; and (2) the conflict of local people depended on mobilization, negotiation, and campaigning for participation of people in the project. Thus, it is recommended that, to successfully implement any government policy, consideration must be paid to the conflict of local people, mobilization, negotiation, and campaigning for people's participation in the project.
Keywords: Conflict, NGO activities, social movements, state measures.
1374 Predicting Protein-Protein Interactions from Protein Sequences Using Phylogenetic Profiles
Authors: Omer Nebil Yaveroglu, Tolga Can
Abstract:
In this study, a high-accuracy protein-protein interaction prediction method is developed. The importance of the proposed method is that it only uses sequence information of proteins while predicting interactions. The method extracts phylogenetic profiles of proteins from their sequence information. By combining the phylogenetic profiles of two proteins, checking for the existence of homologs in different species, and fitting this combined profile to a statistical model, it is possible to make predictions about the interaction status of two proteins. For this purpose, we apply a collection of pattern recognition techniques to the dataset of combined phylogenetic profiles of protein pairs. Support Vector Machines, feature extraction using ReliefF, Naive Bayes classification, k-Nearest Neighbor classification, Decision Trees, and Random Forest classification are the methods we applied to find the classification method that best predicts the interaction status of protein pairs. Random Forest classification outperformed all other methods with a prediction accuracy of 76.93%.
Keywords: Protein interaction prediction, phylogenetic profile, SVM, ReliefF, decision trees, random forest classification.
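A minimal sketch of the core idea, concatenating the binary phylogenetic profiles of a protein pair (presence/absence of a homolog per reference species) and feeding the combined vector to a Random Forest, is shown below; the random profiles, the synthetic labels, and the simple concatenation scheme are illustrative assumptions rather than the paper's exact feature construction.

```python
# Minimal sketch (synthetic data, assumed feature scheme): pair classification
# from concatenated binary phylogenetic profiles with a Random Forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_species, n_proteins, n_pairs = 60, 200, 1500

# profile[p, s] = 1 if protein p has a detectable homolog in species s (synthetic).
profiles = (rng.random((n_proteins, n_species)) < 0.4).astype(int)

pairs = rng.integers(0, n_proteins, size=(n_pairs, 2))
X = np.hstack([profiles[pairs[:, 0]], profiles[pairs[:, 1]]])    # concatenated profiles
# Synthetic label: pairs with highly similar profiles are marked "interacting".
similarity = (profiles[pairs[:, 0]] == profiles[pairs[:, 1]]).mean(axis=1)
y = (similarity > np.median(similarity)).astype(int)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```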
1373 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation
Authors: Constantin Z. Leshan
Abstract:
Hole Vacuum theory is based on a discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics, and allow teleportation of matter. All massive bodies emit a flux of holes which curves the spacetime; increasing the concentration of holes leads to length contraction and time dilation, because the holes do not have the properties of extension and duration. In the limiting case when space consists of holes only, the distance between every two points is equal to zero and time stops - outside of the Universe, the extension and duration properties do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation using its own properties only. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of the quantum behavior. Vacuum holes can explain entanglement, non-locality, the wave properties of matter, tunneling, the uncertainty principle and so on. Particles do not have trajectories because spacetime is discontinuous and has impassable microscopic 'walls': simple mechanical motion is impossible at small-scale distances, and it is impossible to 'trace' a straight line in the discontinuous spacetime because it contains the impassable holes. Spacetime 'boils' continually due to the appearance of the vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously at a random point of the Universe. Since a body disappears in one volume and reappears in another random volume without traversing the physical space between them, such a transportation method can be called teleportation (or Hole Teleportation). It is shown that Hole Teleportation does not violate causality and special relativity due to its random nature and other properties. Although Hole Teleportation has a random nature, it can be used for colonization of extrasolar planets with the help of the method called 'random jumps': after a large number of random teleportation jumps, there is a probability that the spaceship may appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: we must remove a body from the vessel without permitting another body to occupy this volume.
Keywords: Border of the universe, causality violation, perfect isolation, quantum jumps.
1372 Critical Assessment of Scoring Schemes for Protein-Protein Docking Predictions
Authors: Dhananjay C. Joshi, Jung-Hsin Lin
Abstract:
Protein-protein interactions (PPI) play a crucial role in many biological processes such as cell signalling, transcription, translation, replication, signal transduction, and drug targeting. Structural information about protein-protein interaction is essential for understanding the molecular mechanisms of these processes. Structures of protein-protein complexes are still difficult to obtain by biophysical methods such as NMR and X-ray crystallography, and therefore protein-protein docking computation is considered an important approach for understanding protein-protein interactions. However, reliable prediction of protein-protein complexes remains an open challenge. In the past decades, several grid-based docking algorithms based on the Katchalski-Katzir scoring scheme were developed, e.g., FTDock, ZDOCK, HADDOCK, RosettaDock, HEX, etc. However, the success rate of protein-protein docking prediction is still far from ideal. In this work, we first propose a more practical measure for evaluating the success of protein-protein docking predictions, the rate of first success (RFS), which is similar to the concept of mean first passage time (MFPT). Accordingly, we have assessed the ZDOCK bound and unbound benchmarks 2.0 and 3.0. We also created a new benchmark set for protein-protein docking predictions, in which the complexes have experimentally determined binding affinity data. We performed free energy calculations based on the solution of the non-linear Poisson-Boltzmann equation (nlPBE) to improve the binding mode prediction. We used the well-studied barnase-barstar system to validate the parameters for the free energy calculations. Besides, the nlPBE-based free energy calculations were conducted for the cases badly predicted by ZDOCK and ZRANK. We found that direct molecular mechanics energetics cannot be used to discriminate the native binding pose from the decoys. Our results indicate that nlPBE-based calculations appear to be one of the promising approaches for improving the success rate of binding pose predictions.
Keywords: protein-protein docking, protein-protein interaction, molecular mechanics energetics, Poisson-Boltzmann calculations
1371 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making
Authors: I. Arockiarani
Abstract:
The focus of this paper is to furnish entropy measures for a neutrosophic set and a neutrosophic soft set, which quantify the uncertainty that permeates discourse and systems. Various characterizations of the entropy measures are derived. Further, we exemplify this concept by applying entropy to various real-time decision-making problems.
Keywords: Entropy measure, Hausdorff distance, neutrosophic set, soft set.
1370 Variation of Uncertainty in Steady and Non-Steady Processes of Queuing Theory
Authors: Om Parkash, C.P.Gandhi
Abstract:
Probabilistic measures of uncertainty have been obtained as functions of time and of the birth and death rates in a queuing process. The variation of different entropy measures has been studied for steady and non-steady processes of queuing theory.
Keywords: Uncertainty, steady state, non-steady state, traffic intensity, monotonicity.
1369 On the Effectivity of Different Pseudo-Noise and Orthogonal Sequences for Speech Encryption from Correlation Properties
Authors: V. Anil Kumar, Abhijit Mitra, S. R. Mahadeva Prasanna
Abstract:
We analyze the effectivity of different pseudo-noise (PN) and orthogonal sequences for encrypting speech signals in terms of perceptual intelligibility. A speech signal can be viewed as a sequence of correlated samples, and each sample as a sequence of bits. The residual intelligibility of the speech signal can be reduced by removing the correlation among the speech samples. PN sequences have random-like properties that help in reducing the correlation among speech samples. The mean square aperiodic auto-correlation (MSAAC) and the mean square aperiodic cross-correlation (MSACC) measures are used to test the randomness of the PN sequences. Results of the investigation show the effectivity of large Kasami sequences for this purpose among many PN sequences.
Keywords: Speech encryption, pseudo-noise codes, maximal length, Gold, Barker, Kasami, Walsh-Hadamard, autocorrelation, cross-correlation, figure of merit.
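A minimal sketch of how such correlation measures can be computed is given below; the exact normalization of the MSAAC/MSACC figures of merit varies in the literature, so the averaging used here (and the random ±1 example codes) should be read as one plausible convention rather than the paper's definition.

```python
# Minimal sketch (one possible normalization): mean square aperiodic auto- and
# cross-correlation of +/-1 spreading sequences.
import numpy as np

def aperiodic_corr(a, b):
    """Aperiodic cross-correlation C(k) for shifts k = 0 .. N-1."""
    n = len(a)
    return np.array([np.dot(a[: n - k], b[k:]) for k in range(n)], dtype=float)

def msaac(a):
    c = aperiodic_corr(a, a) / len(a)      # normalize by sequence length
    return np.mean(c[1:] ** 2)             # exclude the zero-shift peak

def msacc(a, b):
    c = aperiodic_corr(a, b) / len(a)
    return np.mean(c ** 2)

rng = np.random.default_rng(0)
codes = rng.choice([-1, 1], size=(4, 63))  # hypothetical +/-1 sequences of length 63
print("MSAAC of code 0:       ", msaac(codes[0]))
print("MSACC of codes 0 and 1:", msacc(codes[0], codes[1]))
```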
1368 Influence of Fiber Packing on Transverse Plastic Properties of Metal Matrix Composites
Authors: Mohammad Tahaye Abadi
Abstract:
The present paper concerns the influence of fiber packing on the transverse plastic properties of metal matrix composites. A micromechanical modeling procedure is used to predict the effective mechanical properties of composite materials at large tensile and compressive deformations. The microstructure is represented by a repeating unit cell (RUC). Two fiber arrays are considered, including an ideal square fiber packing and a random fiber packing defined by a random sequential algorithm. The micromechanical modeling procedure is implemented for a graphite/aluminum metal matrix composite in which the reinforcement behaves as an elastic, isotropic solid and the matrix is modeled as an isotropic elastic-plastic solid following the von Mises criterion with isotropic hardening and the Ramberg-Osgood relationship between equivalent true stress and logarithmic strain. The deformation is increased to a considerable value to evaluate both the elastic and plastic behaviors of metal matrix composites. The yield strength and true elastic-plastic stress are determined for the graphite/aluminum composites.
Keywords: Fiber packing, metal matrix composites, micromechanics, plastic deformation, random.
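The random fiber arrangement can be generated with a random sequential algorithm; a minimal sketch that places non-overlapping circular fibers in a periodic square unit cell until a target volume fraction is reached is shown below. The fiber radius, target volume fraction, and minimum gap are illustrative assumptions.

```python
# Minimal sketch (assumed geometry): random sequential placement of non-overlapping
# circular fibers in a periodic square repeating unit cell (RUC).
import numpy as np

def random_fiber_packing(cell=1.0, radius=0.05, target_vf=0.4, min_gap=0.01, seed=0):
    rng = np.random.default_rng(seed)
    centers, fiber_area = [], np.pi * radius ** 2
    max_fibers = int(target_vf * cell ** 2 / fiber_area)
    attempts = 0
    while len(centers) < max_fibers and attempts < 200_000:
        attempts += 1
        candidate = rng.random(2) * cell
        if centers:
            d = np.abs(np.array(centers) - candidate)
            d = np.minimum(d, cell - d)                     # periodic (wrap-around) distance
            if np.min(np.hypot(d[:, 0], d[:, 1])) < 2 * radius + min_gap:
                continue                                     # overlap: reject and retry
        centers.append(candidate)
    vf = len(centers) * fiber_area / cell ** 2
    return np.array(centers), vf

centers, vf = random_fiber_packing()
print(f"placed {len(centers)} fibers, volume fraction = {vf:.3f}")
```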
1367 Second Order Statistics of Dynamic Response of Structures Using Gamma Distributed Damping Parameters
Authors: B. Chemali, B. Tiliouine
Abstract:
This article presents the main results of a numerical investigation of the uncertainty of the dynamic response of structures with statistically correlated, Gamma-distributed random damping. A computational method based on a Linear Statistical Model (LSM) is implemented to predict second order statistics for the response of a typical industrial building structure. The significance of random damping with correlated parameters and its implications for the sensitivity of the structural peak response in the neighborhood of a resonant frequency are discussed in light of considerable ranges of damping uncertainties and correlation coefficients. The results are compared to those generated using Monte Carlo simulation techniques. The numerical results obtained show the importance of damping uncertainty and of the statistical correlation of damping coefficients when obtaining accurate probabilistic estimates of the dynamic response of structures. Furthermore, the effectiveness of the LSM model in efficiently predicting uncertainty propagation for structural dynamic problems with correlated damping parameters is demonstrated.
Keywords: Correlated random damping, linear statistical model, Monte Carlo simulation, uncertainty of dynamic response.
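As a sketch of the Monte Carlo reference approach only (the LSM itself is not reproduced), the snippet below samples a Gamma-distributed damping ratio for a single-degree-of-freedom oscillator and estimates the mean and standard deviation of its steady-state dynamic amplification near resonance; the oscillator parameters and the Gamma moments are illustrative assumptions.

```python
# Minimal sketch (Monte Carlo reference, assumed parameters): second-order statistics
# of the dynamic amplification of an SDOF oscillator with Gamma-distributed damping.
import numpy as np

mean_zeta, cov = 0.03, 0.4                    # assumed mean damping ratio and coefficient of variation
shape = 1.0 / cov ** 2                        # Gamma shape k from the CoV
scale = mean_zeta / shape                     # Gamma scale so that k*scale equals the mean

rng = np.random.default_rng(0)
zeta = rng.gamma(shape, scale, size=100_000)  # sampled damping ratios

r = 0.98                                      # frequency ratio (near resonance, assumed)
amplification = 1.0 / np.sqrt((1.0 - r ** 2) ** 2 + (2.0 * zeta * r) ** 2)

print("mean amplification:", amplification.mean())
print("std of amplification:", amplification.std())
```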
1366 Texture Feature Extraction using Slant-Hadamard Transform
Authors: M. J. Nassiri, A. Vafaei, A. Monadjemi
Abstract:
Classification of random and natural textures is still one of the biggest challenges in the field of image processing and pattern recognition. In this paper, texture feature extraction using the Slant-Hadamard Transform (SHT) was studied and compared to other signal-processing-based texture classification schemes. A parametric SHT was also introduced and employed for natural texture feature extraction. We showed that a subtly modified parametric SHT can outperform the ordinary Walsh-Hadamard transform and the discrete cosine transform. Experiments were carried out on a subset of the Vistex random natural texture images using a kNN classifier.
Keywords: Texture analysis, Slant transform, Hadamard, DCT.
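For orientation, the sketch below extracts block-transform energy features with an ordinary Walsh-Hadamard transform (used here as a stand-in for the parametric Slant-Hadamard transform, whose parameters are not given in the abstract) and classifies synthetic textures with a kNN classifier; the texture generator and block size are assumptions.

```python
# Minimal sketch (ordinary Walsh-Hadamard transform as a stand-in for the parametric
# Slant-Hadamard transform): block-transform energy features + kNN texture classification.
import numpy as np
from scipy.linalg import hadamard
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

B = 8
H = hadamard(B) / np.sqrt(B)                     # orthonormal 8x8 Hadamard matrix

def block_features(img):
    """Mean absolute 2-D Hadamard coefficients over all non-overlapping BxB blocks."""
    h, w = img.shape
    blocks = img[: h - h % B, : w - w % B].reshape(h // B, B, w // B, B).swapaxes(1, 2)
    coeffs = H @ blocks @ H.T                    # 2-D transform of every block at once
    return np.abs(coeffs).mean(axis=(0, 1)).ravel()   # 64 energy features

def synthetic_texture(cls, rng, size=64):
    noise = rng.normal(size=(size, size))
    if cls == 0:                                 # horizontally smoothed texture
        return noise + np.roll(noise, 1, axis=1) + np.roll(noise, 2, axis=1)
    return noise + np.roll(noise, 1, axis=0) + np.roll(noise, 2, axis=0)   # vertical

rng = np.random.default_rng(0)
X = np.array([block_features(synthetic_texture(c, rng)) for c in ([0] * 100 + [1] * 100)])
y = np.array([0] * 100 + [1] * 100)
print("kNN accuracy:", cross_val_score(KNeighborsClassifier(5), X, y, cv=5).mean())
```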