Search results for: randomness
66 Statistical Randomness Testing of Some Second Round Candidate Algorithms of CAESAR Competition
Authors: Fatih Sulak, Betül A. Özdemir, Beyza Bozdemir
Abstract:
In order to advance symmetric-key research, several competitions have been arranged by organizations such as the National Institute of Standards and Technology (NIST) and the International Association for Cryptologic Research (IACR). In recent years, the importance of authenticated encryption has rapidly increased because of the necessity of simultaneously providing integrity, confidentiality and authenticity. Therefore, in January 2013, IACR announced the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR Competition), which selects secure and efficient algorithms for authenticated encryption. Cryptographic algorithms are expected to behave like random mappings; hence, it is important to apply statistical randomness tests to their outputs. In this work, the statistical randomness tests in the NIST Test Suite and other recently designed randomness tests are applied to six second-round algorithms of the CAESAR Competition. It is observed that AEGIS achieves randomness after 3 rounds, the Ascon permutation function after 1 round, the Joltik encryption function after 9 rounds, the Morus state update function after 3 rounds, Pi-cipher after 1 round, and Tiaoxin after 1 round.
Keywords: authenticated encryption, CAESAR competition, NIST test suite, statistical randomness tests
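For illustration, a minimal Python sketch of the NIST frequency (monobit) test, the simplest test in the suite referred to above, applied to a generator's output bits (this is not the full NIST Test Suite, and the generator here is only a stand-in for reduced-round cipher output):

```python
import math
import random

def monobit_test(bits, alpha=0.01):
    """NIST SP 800-22 frequency (monobit) test on a string of '0'/'1' characters."""
    n = len(bits)
    s = sum(1 if b == '1' else -1 for b in bits)   # map 0 -> -1, 1 -> +1 and sum
    s_obs = abs(s) / math.sqrt(n)
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value, p_value >= alpha               # pass if p-value >= significance level

# Stand-in output: 10,000 bits from Python's own PRNG
bits = ''.join(random.choice('01') for _ in range(10_000))
p, ok = monobit_test(bits)
print(f"p-value = {p:.4f}, passes frequency test: {ok}")
```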
Procedia PDF Downloads 315
65 On the Analysis of Pseudorandom Partial Quotient Sequences Generated from Continued Fractions
Authors: T. Padma, Jayashree S. Pillai
Abstract:
Random entities are an essential component in any cryptographic application. This paper analyzes the suitability, for cryptographic applications, of a novel number-theory-based pseudorandom sequence called the Pseudorandom Partial Quotient Sequence (PPQS), generated from the continued fraction expansion of irrational numbers. An approach to building the algorithm around a hard mathematical problem has been considered. The PPQS is tested for randomness, and its suitability as a cryptographic key is established by performing randomness analysis, key sensitivity and key space analysis, precision analysis, and an evaluation of its correlation properties.
Keywords: pseudorandom sequences, key sensitivity, correlation, security analysis, randomness analysis, sensitivity analysis
Procedia PDF Downloads 590
64 Design and Implementation of Pseudorandom Number Generator Using Android Sensors
Authors: Mochamad Beta Auditama, Yusuf Kurniawan
Abstract:
A smartphone or tablet requires strong randomness to establish secure encrypted communication, encrypt files, etc. Therefore, random number generation is one of the main keys to providing secrecy. Android devices are equipped with hardware-based sensors, such as the accelerometer, gyroscope, etc. Each of these sensors provides a stochastic process which has the potential to be used as an extra randomness source, in addition to the /dev/random and /dev/urandom pseudorandom number generators. Android sensors can provide randomness automatically. To obtain randomness from Android sensors, each sensor is used to construct an entropy source. After all entropy sources are constructed, their outputs are combined to provide more entropy. Then, a deterministic process is used to produce a sequence of random bits from the combined output. All of these processes are carried out in accordance with NIST SP 800-22 and the NIST SP 800-90 series. The operating conditions are 1) Android user space, and 2) the Android device placed motionless on a desk.
Keywords: Android hardware-based sensor, deterministic process, entropy source, random number generation/generators
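The paper does not reproduce its construction here; as a hedged sketch of the general shape of such a pipeline, raw sensor readings can be pooled into entropy buffers, conditioned with a hash, and expanded by a hash-based deterministic process in the spirit of NIST SP 800-90A. All names and the made-up readings below are illustrative assumptions, not the authors' implementation.

```python
import hashlib
import struct

def pool_sensor_samples(samples):
    """Concatenate raw sensor readings (floats) into an entropy buffer."""
    return b''.join(struct.pack('<d', s) for s in samples)

def condition(entropy_buffers):
    """Combine several entropy sources by hashing their concatenation."""
    h = hashlib.sha256()
    for buf in entropy_buffers:
        h.update(buf)
    return h.digest()  # 32-byte seed

def expand(seed, n_bytes):
    """Hash-based deterministic expansion of the seed into output bytes."""
    out, counter = b'', 0
    while len(out) < n_bytes:
        out += hashlib.sha256(seed + counter.to_bytes(4, 'big')).digest()
        counter += 1
    return out[:n_bytes]

# Made-up accelerometer and gyroscope readings standing in for live samples
accel = [0.0123, -0.4567, 9.8101, 0.0131]
gyro = [0.0012, 0.0009, -0.0015, 0.0011]
seed = condition([pool_sensor_samples(accel), pool_sensor_samples(gyro)])
print(expand(seed, 16).hex())
```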
Procedia PDF Downloads 374
63 Membership Surface and Arithmetic Operations of Imprecise Matrix
Authors: Dhruba Das
Abstract:
In this paper, a method has been developed to construct the membership surfaces of row and column vectors and to define arithmetic operations of an imprecise matrix. A matrix with imprecise elements is called an imprecise matrix. The membership surface of an imprecise vector has already been established based on the Randomness-Impreciseness Consistency Principle. The Randomness-Impreciseness Consistency Principle leads to defining a normal law of impreciseness using two different laws of randomness. In this paper, the author presents the row and column membership surfaces and the arithmetic operations of an imprecise matrix and demonstrates them with the help of a numerical example.
Keywords: imprecise number, imprecise vector, membership surface, imprecise matrix
Procedia PDF Downloads 386
62 Imprecise Vector: The Case of Subnormality
Authors: Dhruba Das
Abstract:
In this article, the author puts forward the mathematical explanation of a subnormal imprecise vector. Every subnormal imprecise vector has to be defined with reference to a membership surface. The membership surface of a normal imprecise vector has already been defined based on the Randomness-Impreciseness Consistency Principle. The Randomness-Impreciseness Consistency Principle leads to defining a normal law of impreciseness using two different laws of randomness. A normal imprecise vector is a special case of a subnormal imprecise vector. Nothing, however, is available in the literature about the membership surface when a subnormal imprecise vector is defined. The author shows here how to construct the membership surface of a subnormal imprecise vector.
Keywords: imprecise vector, membership surface, subnormal imprecise number, subnormal imprecise vector
Procedia PDF Downloads 320
61 Software Verification of Systematic Resampling for Optimization of Particle Filters
Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey
Abstract:
Systematic resampling is the most popular resampling method in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of possible samples taken of a particle is equal to the floor/ceiling value of the particle weight divided by the sampling interval, respectively. This allows for the creation of a randomness spectrum within which each resampling method falls. Methods on the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Although lower randomness introduces error through a larger bias towards the size of the weight, this bias also creates vulnerabilities to noise in the environment, e.g., jamming. This is the first step in characterizing each resampling method; it will allow target-tracking engineers to pick the best resampling method for their environment instead of choosing the most popular one.
Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking
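A minimal Python sketch of systematic resampling makes the verified property concrete: with N particles and normalized weights, the sampling interval is 1/N, and each particle is drawn between floor(w_i/interval) and ceil(w_i/interval) times. The weights below are illustrative.

```python
import random

def systematic_resample(weights):
    """Systematic resampling: returns the index drawn for each of the N new particles."""
    n = len(weights)
    interval = 1.0 / n                      # the sampling interval
    start = random.uniform(0.0, interval)   # a single random offset
    points = [start + i * interval for i in range(n)]
    indices, cum, j = [], weights[0], 0
    for p in points:
        while p > cum:
            j += 1
            cum += weights[j]
        indices.append(j)
    return indices

weights = [0.05, 0.35, 0.10, 0.50]          # already normalized
drawn = systematic_resample(weights)
counts = [drawn.count(i) for i in range(len(weights))]
# Each count lies between floor(w_i / interval) and ceil(w_i / interval)
print(counts)
```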
Procedia PDF Downloads 84
60 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This burgeoning field calls for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. By exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore," as the evaluation criterion. This score is created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias is explored through the equality-of-opportunity violation measurement. We also improve machine learning model performance through model calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
Procedia PDF Downloads 124
59 Modification Encryption Time and Permutation in Advanced Encryption Standard Algorithm
Authors: Dalal N. Hammod, Ekhlas K. Gbashi
Abstract:
Today, cryptography is used in many applications to achieve high security in data transmission and in real-time communications. AES has long gained global acceptance and is used for securing sensitive data in various industries, but it suffers from slow processing and takes a long time to transfer data. This paper suggests a method to enhance the Advanced Encryption Standard (AES) algorithm based on time and permutation. The suggested method (MAES) is based on modifying SubBytes and ShiftRows in the encryption part and InvSubBytes and InvShiftRows in the decryption part. After implementing the proposal and testing the results, the modified AES achieved good results, accomplishing communication with high performance in terms of randomness, encryption time, storage space, and avalanche effect. The proposed method produces ciphertext with good randomness, as it passed the NIST statistical tests against attacks; MAES also reduced the encryption time by 10% relative to the original AES and is therefore faster. The proposed method likewise showed good results in memory utilization, with a value of 54.36 for MAES versus 66.23 for the original AES. The avalanche effect, used to assess the diffusion property, is 52.08% for the modified AES and 51.82% for the original AES.
Keywords: modified AES, randomness test, encryption time, avalanche effects
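The avalanche figures quoted above come from a generic bit-difference measurement: the percentage of ciphertext bits that flip when a single plaintext bit is changed. A hedged sketch of that measurement is shown below, using standard AES from the third-party pycryptodome package as a stand-in cipher; this is not the authors' modified algorithm.

```python
import os
from Crypto.Cipher import AES   # pycryptodome

def avalanche(encrypt, plaintext, bit):
    """Percentage of ciphertext bits that flip when one plaintext bit is flipped."""
    flipped = bytearray(plaintext)
    flipped[bit // 8] ^= 1 << (bit % 8)
    c1, c2 = encrypt(bytes(plaintext)), encrypt(bytes(flipped))
    diff = sum(bin(a ^ b).count('1') for a, b in zip(c1, c2))
    return 100.0 * diff / (len(c1) * 8)

key = os.urandom(16)
cipher = AES.new(key, AES.MODE_ECB)      # standard AES as a stand-in for MAES
block = os.urandom(16)
print(f"avalanche effect: {avalanche(cipher.encrypt, block, bit=0):.2f}%")
```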
Procedia PDF Downloads 248
58 Generation of Symmetric Key Using Randomness of Hash Function
Authors: Sai Charan Kamana, Harsha Vardhan Nakkina, B.R. Chandavarkar
Abstract:
When current real-world cryptosystems are observed, randomness and random numbers play a key role in any highly secure and robust key generation process. Most present-day cryptographic protocols depend upon Random Number Generators (RNGs) and Pseudo-Random Number Generators (PRNGs). These protocols often use noisy channels such as disk seek time, CPU temperature, mouse pointer movement, and fan noise to obtain true random values. Despite being cost-effective, these noisy channels may need additional hardware devices to communicate with them continuously. Hash functions, on the other hand, are pseudo-random by design, so they are a good replacement for these noisy channels and have low hardware requirements. This paper discusses some key generation methodologies and their drawbacks, and explains how hash functions can be used in key generation and how to combine Key Derivation Functions with hash functions.
Keywords: key derivation, hash based key derivation, password based key derivation, symmetric key derivation
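As an illustration of the two directions discussed, the sketch below derives a key from a password with PBKDF2 and from existing secret material with an HMAC-based extract-and-expand construction. It is a generic hash-based example, not the authors' specific scheme, and the inputs are placeholders.

```python
import hashlib
import hmac
import os

# Password-based key derivation (PBKDF2 with SHA-256)
salt = os.urandom(16)
key_from_password = hashlib.pbkdf2_hmac('sha256', b'correct horse battery staple',
                                        salt, 200_000, 32)

# Key derivation from existing secret material, HKDF-style extract-and-expand
def hkdf_like(ikm, salt, info, length=32):
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()          # extract
    okm, block, counter = b'', b'', 1
    while len(okm) < length:                                     # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

session_key = hkdf_like(os.urandom(32), salt, b'session-encryption')
print(key_from_password.hex(), session_key.hex(), sep='\n')
```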
Procedia PDF Downloads 161
57 Deterministic Random Number Generator Algorithm for Cryptosystem Keys
Authors: Adi A. Maaita, Hamza A. A. Al Sewadi
Abstract:
One of the crucial parameters of digital cryptographic systems is the selection of the keys used and their distribution. The randomness of the keys has a strong impact on the system's security strength, as the keys must be difficult to predict, guess, reproduce or discover by a cryptanalyst. Therefore, adequate key randomness generation is still sought for the benefit of stronger cryptosystems. This paper suggests an algorithm designed to generate and test pseudorandom number sequences intended for cryptographic applications. The algorithm is based on mathematically manipulating information publicly agreed upon between sender and receiver over a public channel. This information is used as a seed for performing mathematical functions in order to generate a sequence of pseudorandom numbers that will be used for encryption/decryption purposes. The manipulation involves permutations and substitutions that fulfill Shannon's principle of "confusion and diffusion". ASCII code characters were utilized in the generation process instead of bit strings initially, which adds more flexibility in testing different seed values. Finally, the obtained results indicate that guessing the keys would be soundly difficult for attackers.
Keywords: cryptosystems, information security agreement, key distribution, random numbers
Procedia PDF Downloads 268
56 Randomness in Cybertext: A Study on Computer-Generated Poetry from the Perspective of Semiotics
Authors: Hongliang Zhang
Abstract:
The use of chance procedures and randomizers in poetry writing can be traced back to surrealist works, which, by appealing to Sigmund Freud's theories, remained logocentric. In the 1960s, random permutation and combination were used extensively by the Oulipo, John Cage and Jackson Mac Low, which further deconstructed the metaphysical presence of writing. Today, randomly generated digital poetry has emerged as a genre of cybertext that must be co-authored by its readers. At the same time, the classical theories have been updated by cybernetics and media theories. N. Katherine Hayles recast Jacques Lacan's 'floating signifiers' as 'flickering signifiers', arguing that technology itself has become a part of textual production. This paper makes a historical review of computer-generated poetry from the perspective of semiotics, emphasizing that randomly generated digital poetry, which hands the dual tasks of interpretation and writing over to the readers, demonstrates the intervention of media technology in literature. With the participation of computerized algorithms and programming languages, poems randomly generated by computers have not only blurred the boundary between encoder and decoder but also raised the issue of the human-machine relationship. It is also a significant feature of cybertext that the productive process of the text is full of randomness.
Keywords: cybertext, digital poetry, poetry generator, semiotics
Procedia PDF Downloads 175
55 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test
Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea
Abstract:
In this research, we conduct a Monte Carlo analysis of a two-dimensional χ² test, which is used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact, on the χ² test, of transforming initial data sets from any probability distribution into new signals with a uniform distribution using the Spearman rank correlation. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ² test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.
Keywords: chaotic signals, logistic map, Pearson's test, chi-square test, bivariate distribution, statistical independence
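A hedged sketch of this kind of procedure: pairs taken from a chaotic signal (here the logistic map) are rank-transformed, which is the basis of the Spearman correlation and yields approximately uniform marginals, then binned into a contingency table and checked with a chi-square test of independence. The sampling distance and bin count are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy import stats

# Chaotic signal from the logistic map x_{n+1} = 4 x_n (1 - x_n)
x = np.empty(20_000)
x[0] = 0.123456
for n in range(len(x) - 1):
    x[n + 1] = 4.0 * x[n] * (1.0 - x[n])

d = 5                                        # sampling distance between paired values
m = len(x) - d
u = stats.rankdata(x[:-d]) / m               # rank transform -> roughly uniform marginals
v = stats.rankdata(x[d:]) / m

# Bi-dimensional chi-square test of independence on a k x k contingency table
k = 10
table, _, _ = np.histogram2d(u, v, bins=k, range=[[0, 1], [0, 1]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p-value = {p:.3g}")
```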
Procedia PDF Downloads 97
54 Thermal Contact Resistance of Nanoscale Rough Surfaces
Authors: Ravi Prasher
Abstract:
In nanostructured materials, thermal transport is dominated by contact resistance. Theoretical models describing thermal transport at interfaces assume a perfectly flat surface, whereas in reality surfaces can be rough, with roughness ranging from sub-nanoscale dimensions to the micron scale. Here we introduce a model which includes both nanoscale contact mechanics and nanoscale heat transfer for rough nanoscale surfaces. This comprehensive model accounts for the effects of phonon acoustic mismatch, mechanical properties, chemical properties and the randomness of the rough surface.
Keywords: adhesion and contact resistance, Kapitza resistance of rough surfaces, nanoscale thermal transport
Procedia PDF Downloads 369
53 Rényi Entropy Correction to Expanding Universe
Authors: Hamidreza Fazlollahi
Abstract:
The Rényi entropy comprises a family of information measures that generalizes the well-known Shannon entropy, inheriting a considerable number of its properties. It appears as unconditional and conditional entropy, relative entropy, or mutual information, and has found numerous applications in information theory. In Rényi's argument, the area law of the black hole entropy plays a significant role. However, the total entropy can be modified by some quantum effects, motivated by the randomness of a system. In this note, by employing this modified entropy relation, we derive corrections to the Friedmann equations. Taking this entropy to be associated with the apparent horizon of the Friedmann-Robertson-Walker Universe and assuming that the first law of thermodynamics, dE = T_A dS_A + W dV, is satisfied on the apparent horizon, we reconsider the expanding Universe. The second law of thermodynamics is also examined.
Keywords: Friedmann equations, dark energy, first law of thermodynamics, Rényi entropy
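For reference, the standard Rényi entropy of order λ, its Shannon limit, and the first law quoted above can be written as follows (the horizon-entropy correction derived in the paper is not reproduced here):

```latex
H_\lambda = \frac{1}{1-\lambda}\,\ln\sum_i p_i^{\lambda},
\qquad
\lim_{\lambda\to 1} H_\lambda = -\sum_i p_i \ln p_i ,
\qquad
dE = T_A\, dS_A + W\, dV .
```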
Procedia PDF Downloads 94
52 Hyperchaos-Based Video Encryption for Device-To-Device Communications
Authors: Samir Benzegane, Said Sadoudi, Mustapha Djeddou
Abstract:
In this paper, we present a software implementation of video streaming encryption for Device-to-Device (D2D) communications using a Hyperchaos-based Random Number Generator (HRNG) implemented in C#. The software implements and uses the proposed HRNG to generate a key stream for encrypting and decrypting real-time video data. The HRNG consists of a hyperchaotic Lorenz system which produces four signal outputs taken as encryption keys. The generated keys are characterized by high-quality randomness, which is confirmed by passing the standard NIST statistical tests. Security analysis of the proposed encryption scheme confirms its robustness against different attacks.
Keywords: hyperchaos Lorenz system, hyperchaos-based random number generator, D2D communications, C#
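As a hedged illustration of the idea (not the authors' C# implementation), the sketch below integrates one commonly used hyperchaotic extension of the Lorenz system with a simple Euler step and quantizes its four state variables into keystream bytes; the equations, parameters, step size and quantization are all illustrative assumptions.

```python
def _step(x, y, z, w, a, b, c, r, dt):
    """One Euler step of a hyperchaotic Lorenz-type system (illustrative form)."""
    dx = a * (y - x) + w
    dy = c * x - y - x * z
    dz = x * y - b * z
    dw = -y * z + r * w
    return x + dx * dt, y + dy * dt, z + dz * dt, w + dw * dt

def hyperchaotic_lorenz_bytes(n_bytes, x=0.1, y=0.2, z=0.3, w=0.4,
                              a=10.0, b=8.0 / 3.0, c=28.0, r=-1.0, dt=0.001):
    """Toy keystream: quantize the four state variables to bytes. Illustrative only."""
    out = bytearray()
    for _ in range(10_000):                       # discard a transient
        x, y, z, w = _step(x, y, z, w, a, b, c, r, dt)
    while len(out) < n_bytes:
        x, y, z, w = _step(x, y, z, w, a, b, c, r, dt)
        for s in (x, y, z, w):
            out.append(int(abs(s) * 1e6) % 256)   # crude quantization of each state
    return bytes(out[:n_bytes])

keystream = hyperchaotic_lorenz_bytes(32)
ciphertext = bytes(p ^ k for p, k in zip(b'a 32-byte block of video payload', keystream))
print(ciphertext.hex())
```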
Procedia PDF Downloads 371
51 Achieving Better Security by Using Nonlinear Cellular Automata as a Cryptographic Primitive
Authors: Swapan Maiti, Dipanwita Roy Chowdhury
Abstract:
Nonlinear functions are essential in different crypto-primitives, as they play an important role in the security of cipher designs. Rule 30 was identified as a powerful nonlinear function for cryptographic applications. However, an attack (the MS attack) was mounted against Rule 30 Cellular Automata (CA). Nonlinear rules as well as maximum-period CA increase the randomness property. In this work, nonlinear rules of maximum-period nonlinear hybrid CA (M-NHCA) are studied, and the M-NHCA is shown to be a better crypto-primitive than Rule 30 CA. It is also shown that the M-NHCA with a single nonlinearity injection proposed in the literature is vulnerable to the MS attack, whereas M-NHCA with multiple nonlinearity injections provides a maximum-length cycle as well as better cryptographic primitives and is also secure against the MS attack.
Keywords: cellular automata, maximum period nonlinear CA, Meier and Staffelbach attack, nonlinear functions
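For context, plain Rule 30, the nonlinear rule referred to above, updates each cell as left XOR (centre OR right); the sketch below runs it on a circular register and taps the centre cell as a keystream bit. The M-NHCA constructions studied in the paper are not reproduced here.

```python
def rule30_step(cells):
    """One synchronous update of Rule 30 on a circular register:
    new[i] = left XOR (centre OR right)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def rule30_keystream(seed_cells, n_bits):
    cells, centre, bits = list(seed_cells), len(seed_cells) // 2, []
    for _ in range(n_bits):
        cells = rule30_step(cells)
        bits.append(cells[centre])     # tap the centre cell, as in Wolfram's CA generator
    return bits

seed = [0] * 64
seed[32] = 1                           # single seeded cell
print(''.join(map(str, rule30_keystream(seed, 64))))
```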
Procedia PDF Downloads 314
50 Chaotic Control, Masking and Secure Communication Approach of Supply Chain Attractor
Authors: Unal Atakan Kahraman, Yilmaz Uyaroğlu
Abstract:
The chaotic signals generated by chaotic systems have properties such as randomness, complexity and sensitive dependence on initial conditions, which make them particularly suitable for secure communications. Since the 1990s, the problem of secure communication based on chaos synchronization has been thoroughly investigated, and many methods, for instance robust and adaptive control approaches, have been proposed to realize chaos synchronization. In this paper, an improved secure communication model is proposed based on control of a supply chain management system. Control and masking-communication simulation results are used to visualize the effectiveness of the chaotic supply chain system, and the application of secure communication to the chaotic system is also demonstrated. In this way, we uncover the secure phenomenon of chaos amplification in the supply chain system.
Keywords: chaotic analysis, control, secure communication, supply chain attractor
Procedia PDF Downloads 516
49 Modeling of System Availability and Bayesian Analysis of Bivariate Distribution
Authors: Muhammad Farooq, Ahtasham Gul
Abstract:
To meet the desired standard, it is important to monitor and analyze different engineering processes to obtain the desired output. Bivariate distributions have received a lot of attention in recent years for describing the randomness of natural as well as artificial mechanisms. In this article, a bivariate model is constructed using two independent models developed by the nesting approach, to study the effect of each component on reliability for better understanding. Further, the Bayesian analysis of system availability is studied by considering prior parametric variations in the failure time and repair time distributions. Basic statistical characteristics of the marginal distribution, like the mean, median and quantile function, are discussed. We use an inverse-gamma prior and study its frequentist properties by conducting a Markov Chain Monte Carlo (MCMC) sampling scheme.
Keywords: reliability, system availability, Weibull, inverse Lomax, Markov Chain Monte Carlo, Bayesian
Procedia PDF Downloads 71
48 Stochastic Variation of the Hubble's Parameter Using Ornstein-Uhlenbeck Process
Authors: Mary Chriselda A
Abstract:
This paper deals with the fact that the Hubble parameter is not constant and tends to vary stochastically with time. This premise is established by converting it into a stochastic differential equation using the Ornstein-Uhlenbeck process. The formulated stochastic differential equation is then solved analytically using the Euler and Kolmogorov forward equations, and the probability density function is obtained via the Fourier transformation, thereby showing that the Hubble parameter varies stochastically. This is further corroborated by simulating the observations using Python and R software to validate the postulated premise. We can further conclude that the randomness in the forces contributing to the white noise can ultimately affect the Hubble parameter, leading to scale invariance and thereby causing stochastic fluctuations in the density and the rate of expansion of the Universe.
Keywords: Chapman-Kolmogorov forward differential equations, Fourier transformation, Hubble's parameter, Ornstein-Uhlenbeck process, stochastic differential equations
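A minimal Euler-Maruyama sketch of an Ornstein-Uhlenbeck process, dH = θ(μ - H)dt + σ dW, of the kind used above to model a stochastically varying Hubble parameter; the parameter values are illustrative, not fitted to observations.

```python
import numpy as np

def ornstein_uhlenbeck(h0, theta, mu, sigma, dt, n_steps, rng=None):
    """Euler-Maruyama simulation of dH = theta*(mu - H) dt + sigma dW."""
    rng = rng or np.random.default_rng(0)
    h = np.empty(n_steps + 1)
    h[0] = h0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        h[k + 1] = h[k] + theta * (mu - h[k]) * dt + sigma * dW
    return h

# Illustrative parameters only (level in km/s/Mpc, arbitrary time unit)
path = ornstein_uhlenbeck(h0=70.0, theta=0.5, mu=70.0, sigma=2.0, dt=0.01, n_steps=1_000)
print(path.mean(), path.std())
```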
Procedia PDF Downloads 201
47 Design and Application of a Model Eliciting Activity with Civil Engineering Students on Binomial Distribution to Solve a Decision Problem Based on Samples Data Involving Aspects of Randomness and Proportionality
Authors: Martha E. Aguiar-Barrera, Humberto Gutierrez-Pulido, Veronica Vargas-Alejo
Abstract:
Identifying and modeling random phenomena is a fundamental cognitive process for understanding and transforming reality. Recognizing situations governed by chance and giving them a scientific interpretation, without being carried away by beliefs or intuitions, is basic training for citizens. Hence the importance of generating teaching-learning processes, supported by technology, that pay attention to model creation rather than only to executing mathematical calculations. In order to develop students' knowledge of basic probability distributions and decision making, a model eliciting activity (MEA) is reported in this work. The intention was to apply the Models and Modeling Perspective to design an activity related to civil engineering that would be understandable for students while involving them in its solution. Furthermore, the activity should pose a decision-making challenge based on sample data, and the use of the computer should be considered. The activity was designed considering the six design principles for MEAs proposed by Lesh and collaborators: model construction, reality, self-evaluation, model documentation, shareable and reusable, and prototype. The application and refinement of the activity were carried out during three school cycles in the Probability and Statistics class for Civil Engineering students at the University of Guadalajara. The way in which the students sought to solve the activity was analyzed using audio and video recordings, as well as the students' individual and team reports. The information obtained was categorized according to the activity phase (individual or team) and the category of analysis (sample, linearity, probability, distributions, mechanization, and decision-making). With the results obtained through the MEA, four obstacles to understanding and applying the binomial distribution were identified: first, the students' resistance to moving from the linear to the probabilistic model; second, the difficulty of visualizing (inferring) the behavior of the population through the sample data; third, viewing the sample as an isolated event and not as part of a random process that must be viewed in the context of a probability distribution; and fourth, the difficulty of making decisions with the support of probabilistic calculations. These obstacles have also been identified in the literature on the teaching of probability and statistics. Recognizing these concepts as obstacles to understanding probability distributions, and that they do not change after an intervention, allows these interventions and the MEA to be modified so that students may themselves identify erroneous solutions when carrying out the MEA. The MEA also proved to be democratic, since several students who had little participation and low grades in the first units improved their participation. Regarding the use of the computer, the RStudio software was useful in several tasks, for example plotting the probability distributions and exploring different sample sizes. In conclusion, with the models created to solve the MEA, the Civil Engineering students improved their probabilistic knowledge and their understanding of fundamental concepts such as sample, population, and probability distribution.
Keywords: linear model, models and modeling, probability, randomness, sample
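The abstract mentions RStudio; a comparable exploration of the sample-based decision step in Python (with illustrative numbers only, e.g. judging a 10% defect specification from a sample of 50 items) might look like:

```python
from scipy import stats

n, p0 = 50, 0.10          # sample size and the proportion assumed under the specification
defects_observed = 9

# Probability of seeing at least this many defects if the true proportion were p0
p_value = stats.binom.sf(defects_observed - 1, n, p0)
print(f"P(X >= {defects_observed} | p = {p0}) = {p_value:.3f}")

# Full distribution, useful for plotting and for exploring other sample sizes
pmf = [stats.binom.pmf(k, n, p0) for k in range(n + 1)]
print(f"expected defects: {n * p0:.1f}, most likely count: {pmf.index(max(pmf))}")
```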
Procedia PDF Downloads 118
46 Quantifying Freeway Capacity Reductions by Rainfall Intensities Based on Stochastic Nature of Flow Breakdown
Authors: Hoyoung Lee, Dong-Kyu Kim, Seung-Young Kho, R. Eddie Wilson
Abstract:
This study quantifies the decrease in freeway capacity during rainfall. Traffic and rainfall data were gathered from highway agencies and the Wunderground weather service. Three inter-urban freeway sections and their nearest weather stations were selected as experimental sites. Capacity analysis found reductions in the maximum and mean pre-breakdown flow rates due to rainfall. The Kruskal-Wallis test also provided some evidence to suggest that the variance in the pre-breakdown flow rate is statistically insignificant. Potential applications of this study lie in the operation of real-time traffic management schemes such as Variable Speed Limits (VSL), Hard Shoulder Running (HSR), and Ramp Metering Systems (RMS), where speed or flow limits could be set based on a number of factors, including rainfall events and their intensities.
Keywords: capacity randomness, flow breakdown, freeway capacity, rainfall
Procedia PDF Downloads 381
45 Performance Complexity Measurement of Tightening Equipment Based on Kolmogorov Entropy
Authors: Guoliang Fan, Aiping Li, Xuemei Liu, Liyun Xu
Abstract:
The performance of tightening equipment declines over the course of the working process in a manufacturing system. The main manifestations are the increasing randomness and degree of discretization of the tightening performance. To accurately evaluate the degradation tendency of the tightening performance, a complexity measurement approach based on Kolmogorov entropy is presented. First, the states of the performance index are partitioned to calibrate the degree of discretization. Then, the complexity measurement model based on Kolmogorov entropy is built. The model describes the performance degradation tendency of tightening equipment quantitatively. Finally, a case study is used to verify the efficiency and validity of the approach. The results show that the presented complexity measurement can effectively evaluate the degradation tendency of tightening equipment and can provide a theoretical basis for preventive maintenance and life prediction of the equipment.
Keywords: complexity measurement, Kolmogorov entropy, manufacturing system, performance evaluation, tightening equipment
Procedia PDF Downloads 259
44 A New Concept for Deriving the Expected Value of Fuzzy Random Variables
Authors: Liang-Hsuan Chen, Chia-Jung Chang
Abstract:
Fuzzy random variables (FRVs) have been introduced as an imprecise concept of numeric values for characterizing imprecise knowledge. Descriptive parameters can be used to describe the primary features of a set of fuzzy random observations. In fuzzy environments, expected values are usually represented as fuzzy-valued, interval-valued or numeric-valued descriptive parameters using various metrics. Instead of the concept of the area metric that is usually adopted in the relevant studies, a numeric expected value is proposed in this study based on the concept of a distance metric, drawing on the two characteristics (fuzziness and randomness) of FRVs. Compared with the existing measures, the results show that the proposed numeric expected value coincides with those obtained using the other metric when only triangular membership functions are used. However, the proposed approach has the advantages of intuitiveness and computational efficiency when the membership functions are not of triangular type. An example with three datasets is provided to verify the proposed approach.
Keywords: fuzzy random variables, distance measure, expected value, descriptive parameters
Procedia PDF Downloads 343
43 Valuation of Caps and Floors in a LIBOR Market Model with Markov Jump Risks
Authors: Shih-Kuei Lin
Abstract:
The characterization of the arbitrage-free dynamics of interest rates is developed in this study in the presence of Markov jump risks, when the term structure of interest rates is modeled through simple forward rates. We consider Markov jump risks by allowing randomness in jump sizes and independence between jump sizes and jump times. The Markov jump diffusion model is used to capture empirical phenomena and to accurately describe interest-rate jump risks in a financial market. We derive the arbitrage-free model of simple forward rates under the spot measure. Moreover, analytical pricing formulas for a cap and a floor are derived under the forward measure when the jump size follows a lognormal distribution. In our empirical analysis, we find that the LIBOR market model with Markov jump risk better accounts for changes from/to different states and different rates.
Keywords: arbitrage-free, cap and floor, Markov jump diffusion model, simple forward rate model, volatility smile, EM algorithm
Procedia PDF Downloads 421
42 Spaces of Interpretation: Personal Space
Authors: Yehuda Roth
Abstract:
In quantum theory, a system's time evolution is predictable unless an observer performs a measurement, as the measurement process can randomize the system. This randomness appears when the measuring device does not accurately describe the measured item, i.e., when the states characterizing the measuring device appear as a superposition of those being measured. When such a mismatch occurs, the measured data randomly collapse into a single eigenstate of the measuring device. This scenario resembles the interpretation process, in which the observer does not experience an objective reality but interprets it based on preliminary descriptions initially ingrained in his or her mind. This distinction is the motivation for the present study, in which the collapse scenario is regarded as part of the observer's interpretation process. By adopting the formalism of quantum theory, we present a complete mathematical approach that describes the interpretation process. We demonstrate this process by applying the proposed interpretation formalism to the ambiguous image "My wife and mother-in-law" to identify whether the woman in the picture is young or old.
Keywords: quantum-like interpretation, ambiguous image, determination, quantum-like collapse, classified representation
Procedia PDF Downloads 104
41 An Axiomatic Approach to Constructing an Applied Theory of Possibility
Authors: Oleksii Bychkov
Abstract:
The fundamental difference between randomness and vagueness is that the former requires statistical research. These issues were studied by L. Zadeh, D. Dubois and H. Prade. The theory of possibility works with expert assessments, hypotheses, etc., and gives an idea of the characteristics of the problem situation, the nature of the goals and the real limitations. Possibility theory examines experiments that are not repeated. The article discusses issues related to the formalization of a fuzzy, uncertain idea of reality. The author proposes to expand the classical model of the theory of possibility by introducing a measure of necessity. The proposed model of the theory of possibility allows the measures of possibility and necessity to be extended onto a Boolean algebra while preserving the properties of the measure. Thus, upper and lower estimates are obtained to describe the fact that an event will occur. Knowledge of the patterns that govern mass random, uncertain, fuzzy events allows us to predict how these events will proceed. The article quite fully reveals the essence of the construction and use of the theory of probability and the theory of possibility.
Keywords: possibility, artificial, modeling, axiomatics, intellectual approach
Procedia PDF Downloads 32
40 Elitist Self-Adaptive Step-Size Search in Optimum Sizing of Steel Structures
Authors: Oğuzhan Hasançebi, Saeid Kazemzadeh Azad
Abstract:
Keywords: structural design optimization, optimal sizing, metaheuristics, self-adaptive step-size search, steel trusses, steel frames
Procedia PDF Downloads 375
39 Comparison of Different Machine Learning Models for Time-Series Based Load Forecasting of Electric Vehicle Charging Stations
Authors: H. J. Joshi, Satyajeet Patil, Parth Dandavate, Mihir Kulkarni, Harshita Agrawal
Abstract:
As the world looks towards a sustainable future, electric vehicles have become increasingly popular. Millions worldwide are looking to switch to electric cars from the previously favored combustion-engine-powered cars. This demand has led to an increase in Electric Vehicle Charging Stations. The big challenge is that the randomness of electrical energy demand makes it tough for these charging stations to provide an adequate amount of energy over a specific amount of time. Thus, it has become increasingly crucial to model these patterns and forecast the energy needs of charging stations. This paper aims to analyze how different machine learning models perform on Electric Vehicle charging time-series data. The data set consists of authentic Electric Vehicle data from the Netherlands; it contains an overview of ten thousand transactions from public stations operated by EVnetNL.
Keywords: forecasting, smart grid, electric vehicle load forecasting, machine learning, time series forecasting
Procedia PDF Downloads 106
38 Effect of Specimen Thickness on Probability Distribution of Grown Crack Size in Magnesium Alloys
Authors: Seon Soon Choi
Abstract:
Fatigue crack growth is stochastic because fatigue behavior involves uncertainty and randomness. Therefore, it is necessary to determine the probability distribution of the grown crack size at a specific fatigue crack propagation life for the maintenance of structures as well as for reliability estimation. The essential purpose of this study is to present the probability distribution that best fits the grown crack size at a specified fatigue life in a rolled magnesium alloy under different specimen thickness conditions. Fatigue crack propagation experiments are carried out in laboratory air under three specimen thickness conditions using AZ31 to investigate the stochastic crack growth behavior. The goodness-of-fit test for the probability distribution of the grown crack size under different specimen thickness conditions is performed with the Anderson-Darling test. The effect of specimen thickness on the variability of the grown crack size is also investigated.
Keywords: crack size, fatigue crack propagation, magnesium alloys, probability distribution, specimen thickness
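As an illustration of the goodness-of-fit step, the Anderson-Darling test in scipy can compare candidate distributions; the crack-size values below are made up, not the AZ31 measurements, and a lognormal fit is checked by testing the log-transformed data for normality.

```python
import numpy as np
from scipy import stats

# Made-up grown-crack-size data (mm) at a fixed fatigue life; replace with measurements
crack_sizes = np.array([5.1, 5.4, 5.6, 5.8, 6.0, 6.1, 6.3, 6.4, 6.7, 7.0,
                        7.2, 7.5, 7.9, 8.3, 8.8])

for label, sample in [("normal", crack_sizes), ("lognormal", np.log(crack_sizes))]:
    result = stats.anderson(sample, dist='norm')   # lognormal fit = normality of log-data
    a2 = result.statistic
    crit5 = result.critical_values[list(result.significance_level).index(5.0)]
    verdict = 'accepted' if a2 < crit5 else 'rejected'
    print(f"{label:9s}: A^2 = {a2:.3f}, 5% critical value = {crit5:.3f}, fit {verdict}")
```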
Procedia PDF Downloads 499
37 Empirical Study of Running Correlations in Exam Marks: Same Statistical Pattern as Chance
Authors: Weisi Guo
Abstract:
It is well established that there may be running correlations in sequential exam marks because students sit in the order of course registration patterns. As such, random, non-sequential sampling of exam marks is a standard recommended practice. Here, the paper examines a large body of exam data stretching over several years and different modules to see the degree to which this is true. Using the real mark distribution as a generative process, it was found that randomly simulated data had no more sequential randomness than the real data; that is to say, the running correlations one often observes are statistically identical to chance. Digging deeper, it was found that some high running correlations involve students who indeed share a common course history and make similar mistakes. However, at the statistical scale of a module question, the combined effect is statistically similar to a random shuffling of papers. As such, there may be no need to take random samples of marks, but it remains good practice to mark papers in a random sequence to reduce repetitive marking bias and errors.
Keywords: data analysis, empirical study, exams, marking
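A sketch of the kind of comparison described, with simulated marks standing in for the real data: the running (lag-1) correlation of the mark sequence is compared with its distribution under random shuffling; all sizes and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def lag1_autocorr(marks):
    """Correlation between each mark and the next one in the submission order."""
    return np.corrcoef(marks[:-1], marks[1:])[0, 1]

# Simulated module marks drawn from a plausible mark distribution (illustrative)
marks = np.clip(rng.normal(62, 14, size=200), 0, 100)

observed = lag1_autocorr(marks)
# Shuffling baseline: distribution of the same statistic under pure chance
shuffled = np.array([lag1_autocorr(rng.permutation(marks)) for _ in range(2_000)])
p_value = np.mean(np.abs(shuffled) >= abs(observed))
print(f"observed running correlation = {observed:+.3f}, permutation p-value = {p_value:.3f}")
```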
Procedia PDF Downloads 181