Search results for: classical theory of probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6487

6427 Theorem on Inconsistency of The Classical Logic

Authors: T. J. Stepien, L. T. Stepien

Abstract:

This abstract concerns an extremely fundamental issue: the problem of consistency, which is fundamental to science. We present a theorem stating that the classical calculus of quantifiers is inconsistent in the traditional sense. We first introduce notation and then recall the definition of consistency in the traditional sense. S1 is the set of all well-formed formulas in the calculus of quantifiers. RS1 denotes the set of all rules over the set S1. Cn(R, X) is the set of all formulas standardly provable from X by the rules R, where R is a subset of RS1 and X is a subset of S1. The couple < R,X > is called a system whenever R is a subset of RS1 and X is a subset of S1. Definition: The system < R,X > is consistent in the traditional sense if there does not exist any formula in S1 such that both this formula and its negation are provable from X using the rules in R. Finally, < R0+, L2 > denotes the classical calculus of quantifiers, where R0+ consists of Modus Ponens and the generalization rule, and L2 is the set of all formulas valid in the classical calculus of quantifiers. Main Result: The system < R0+, L2 > is inconsistent in the traditional sense.
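The definition quoted in the abstract can be restated compactly (a restatement of the abstract's notation, not part of the authors' proof):

```latex
\[
\langle R, X \rangle \text{ is consistent in the traditional sense}
\iff
\neg \exists A \in S_1 : \; A \in Cn(R, X) \ \text{and} \ \neg A \in Cn(R, X).
\]
```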

Keywords: classical calculus of quantifiers, classical predicate calculus, generalization rule, consistency in the traditional sense, Modus Ponens

Procedia PDF Downloads 191
6426 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing

Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed

Abstract:

Classically, an energy detector is implemented in the time domain (TD). However, a frequency-domain (FD) energy detector has demonstrated improved performance. This paper presents a comparison between the two approaches to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and the probability of detection (Pd) are derived for both approaches. The derived expressions lead naturally to analytical as well as intuitive explanations for the improved Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on buffer size: Pf is improved in FD, whereas Pd is enhanced in TD energy detectors. Finally, Monte Carlo simulation results confirm the conclusions reached by the derived expressions.
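The TD energy detector and its Monte Carlo evaluation can be sketched as follows. This is a minimal illustration, not the paper's setup: the buffer size, target Pf, and SNR are hypothetical, and the threshold is calibrated empirically from noise-only windows rather than from the closed-form chi-square quantile.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64            # samples per sensing window (buffer size, illustrative)
pf_target = 0.1   # design false-alarm probability
trials = 20000

def td_energy(x):
    """Classical time-domain energy statistic: sum of squared samples."""
    return np.sum(x**2, axis=-1)

# Calibrate the threshold empirically from noise-only windows so the
# false-alarm rate matches pf_target.
calib = td_energy(rng.standard_normal((trials, N)))
thresh = np.quantile(calib, 1 - pf_target)

# Fresh noise-only windows (H0): Monte Carlo estimate of Pf.
pf_hat = np.mean(td_energy(rng.standard_normal((trials, N))) > thresh)

# Windows with a weak Gaussian signal added (H1, hypothetical 0.5 linear
# SNR per sample): Monte Carlo estimate of Pd.
snr = 0.5
h1 = rng.standard_normal((trials, N)) + np.sqrt(snr) * rng.standard_normal((trials, N))
pd_hat = np.mean(td_energy(h1) > thresh)
```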

Keywords: cognitive radio, energy detector, periodogram, spectrum sensing

Procedia PDF Downloads 364
6425 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies

Authors: Yuanjin Liu

Abstract:

Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. It is obtained by simulation based on the 2019 Social Security period life table, the IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and inflation rates. Several popular machine learning models are built: a generalized additive model, random forest, support vector machine, extreme gradient boosting, and an artificial neural network. Model validation and selection are based on test errors using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained from the optimal predictive model.
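The simulation step that generates the ruin-probability labels can be sketched as below. This is a toy reduction, not the paper's pipeline: it uses a fixed horizon instead of a life table, lognormal annual real returns, and illustrative return, volatility, and inflation parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def ruin_probability(withdrawal_rate, years=30, n_paths=5000,
                     mean_return=0.05, vol=0.12, inflation=0.02):
    """Fraction of simulated paths on which the portfolio is exhausted
    before the horizon. Initial wealth is 1; the first withdrawal equals
    withdrawal_rate and grows with inflation. Parameters are illustrative."""
    wealth = np.ones(n_paths)
    spend = withdrawal_rate
    ruined = np.zeros(n_paths, dtype=bool)
    for _ in range(years):
        growth = np.exp(rng.normal(mean_return - 0.5 * vol**2, vol, n_paths))
        wealth = (wealth - spend) * growth       # withdraw, then earn returns
        ruined |= wealth <= 0                    # once ruined, always ruined
        wealth = np.maximum(wealth, 0.0)
        spend *= 1 + inflation
    return ruined.mean()

p4 = ruin_probability(0.04)   # classic 4% rule
p6 = ruin_probability(0.06)   # more aggressive withdrawal
```

A predictive model in the paper's sense would then be fit on many such (inputs, ruin probability) pairs.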

Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model

Procedia PDF Downloads 64
6424 Gravitational Frequency Shifts for Photons and Particles

Authors: Jing-Gang Xie

Abstract:

This research considers the integration of quantum field theory and general relativity. Although both are successful models of particle behavior, they are incompatible because they operate at different mass and energy scales, as evidenced by the difficulties they face in describing black holes and the formation of the universe. Previous efforts to merge the two theories include string theory, quantum gravity models, and others. Aiming toward an actionable experiment, the paper starts from derivations of the existing theories and then tests those derivations by applying the same initial assumptions together with several deviations. The resulting equations reproduce the results of the classical Newtonian model, quantum mechanics, and general relativity under normal conditions. Outcomes differ under extreme conditions, but with no breakdowns even below the Schwarzschild radius or at the Planck length. This supports the possibility of integrating the two theories.
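For reference, the standard general-relativistic frequency shift that any such model must reproduce under normal conditions is the textbook result (quoted for context, not a formula from the paper): for a photon emitted at radius $r$ outside a mass $M$ and observed at infinity,

```latex
\[
\frac{f_\infty}{f_{\mathrm{emit}}}
= \sqrt{1 - \frac{2GM}{r c^{2}}}
\;\approx\; 1 - \frac{GM}{r c^{2}}
\qquad \left(r \gg r_s = \frac{2GM}{c^{2}}\right),
\]
```

with the classical breakdown occurring as $r \to r_s$, precisely the regime the abstract claims to handle without divergence.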

Keywords: general relativity theory, particles, photons, Quantum Gravity Model, gravitational frequency shift

Procedia PDF Downloads 349
6423 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons argue for the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, this variability raises doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probabilities of treatment success and differing assessments of the outcomes of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem is the selection of the best option from a set of choices. The problem is what is meant by "best option", that is, which criteria should guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. Three types of situations arise: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences, and the probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians.
Decision theory can make decisions more transparent: first, by systematically clarifying the data relevant to the problem, and second, by making explicit the basic principles that should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus assists the patient and doctor in their choices.
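The "risky situation" case, where outcome probabilities are known, can be illustrated with a toy treatment choice ranked by expected utility. The treatments, probabilities, and utility values below are entirely hypothetical:

```python
# Toy decision under risk: each option has known outcome probabilities
# and utilities (hypothetical values on a 0-100 quality-of-life scale).
treatments = {
    "surgery":    [(0.80, 95), (0.20, 10)],   # (probability, utility) pairs
    "medication": [(0.95, 70), (0.05, 40)],
}

def expected_utility(outcomes):
    """Expected utility of an option: probability-weighted utility sum."""
    return sum(p * u for p, u in outcomes)

best = max(treatments, key=lambda t: expected_utility(treatments[t]))
```

In an uncertain situation the probabilities themselves would be unknown, and a criterion such as maximin would replace the expectation.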

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 570
6422 Solution of Insurance Pricing Model Giving Optimum Premium Level for Both Insured and Insurer by Game Theory

Authors: Betul Zehra Karagul

Abstract:

A game consists of the strategies available to each actor, the rules governing how strategies may be chosen, and the way actors evaluate their knowledge and the utility of the resulting outcomes. Game theory examines the behavior (preferences) of actors in strategic situations, in which each actor takes into account the actions others will take in response to his own moves. A strategy profile from which no player can benefit by deviating unilaterally is called a Nash equilibrium. Insurance is a two-person game in which the insurer and the insured are the actors. Both sides act according to their own utility functions. The insured must pay a premium to buy the insurance cover and will want to pay a low premium, while the insurer wants to receive a high premium. In this study, the equilibrium of insurance pricing is examined from the viewpoints of both the insurer and the insured using game theory.
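The equilibrium concept can be made concrete with a tiny bimatrix game and a brute-force pure-strategy Nash search. The payoff numbers are entirely hypothetical, chosen only to illustrate that too high a premium deters purchase:

```python
import itertools
import numpy as np

# Toy insurance game: the insurer chooses a premium level
# (rows: low, mid, high); the insured chooses to buy or not (columns).
# Payoffs are hypothetical utilities.
insurer = np.array([[2, 0],
                    [4, 0],
                    [1, 0]])   # insurer earns nothing if the insured declines
insured = np.array([[5, 1],
                    [3, 1],
                    [0, 1]])   # high premium makes declining preferable

def pure_nash(a, b):
    """All pure-strategy Nash equilibria of a bimatrix game: cells where
    each player's payoff is a best response to the other's choice."""
    eqs = []
    for i, j in itertools.product(range(a.shape[0]), range(a.shape[1])):
        if a[i, j] == a[:, j].max() and b[i, j] == b[i, :].max():
            eqs.append((i, j))
    return eqs

eqs = pure_nash(insurer, insured)
```

Here the trade occurs at the mid premium (row 1, "buy"), alongside a no-trade equilibrium at the high premium, mirroring the tension the abstract describes.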

Keywords: game theory, insurance pricing, Nash equilibrium, utility function

Procedia PDF Downloads 344
6421 Analytical Downlink Effective SINR Evaluation in LTE Networks

Authors: Marwane Ben Hcine, Ridha Bouallegue

Abstract:

The aim of this work is to provide an original analytical framework for downlink effective SINR evaluation in LTE networks. The classical single-carrier SINR performance evaluation is extended to multi-carrier systems operating over frequency-selective channels. The extension is achieved by expressing the link outage probability in terms of the statistics of the effective SINR. For effective SINR computation, the exponential effective SINR mapping (EESM) method is used in this work. A closed-form expression for the link outage probability is obtained for the single-carrier case assuming a log skew normal approximation. We then rely on the lognormal approximation to express the exponential effective SINR distribution as a function of the mean and standard deviation of the SINR of a generic subcarrier. The resulting formulas are easily computable and can be obtained for a user equipment (UE) located at any distance from its serving eNodeB. Simulations show that the proposed framework provides results accurate to within 0.5 dB.
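The EESM compression itself is simple to compute. The sketch below uses the standard mapping; the calibration parameter beta is illustrative (in practice it is fitted per modulation and coding scheme):

```python
import numpy as np

def eesm(sinr_db, beta=1.0):
    """Exponential effective SINR mapping: compress per-subcarrier SINRs
    into one effective SINR. beta is a link-adaptation calibration
    parameter (value here is illustrative)."""
    sinr = 10 ** (np.asarray(sinr_db) / 10)            # dB -> linear
    eff = -beta * np.log(np.mean(np.exp(-sinr / beta)))
    return 10 * np.log10(eff)                          # linear -> dB

flat = eesm([10.0, 10.0, 10.0])       # flat channel: no compression loss
selective = eesm([20.0, 10.0, 0.0])   # frequency-selective channel
```

On a flat channel the effective SINR equals the per-subcarrier SINR, while frequency selectivity pulls it toward the weakest subcarriers, which is what makes the mapping suitable for outage analysis.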

Keywords: LTE, OFDMA, effective SINR, log skew normal approximation

Procedia PDF Downloads 351
6420 Contingent Presences in Architecture: Vitruvian Theory as a Beginning

Authors: Zelal Çınar

Abstract:

This paper claims that architecture is a contingent discipline, despite the fact that its contingency has long been denied through a retreat to Vitruvian writing. It is evident that contingency is rejected not only by architecture but also by modernity as a whole. Vitruvius attempted to cover the entire field of architecture in a systematic form in order to bring the whole body of this great discipline to a complete order. His theory has endured not only because it is the only major work on the architecture of Classical Antiquity to have survived, but also because of its conformity with the project of modernity. In the scope of the paper, it will be argued that contingency should be taken into account rather than avoided as a potential threat.

Keywords: architecture, contingency, modernity, Vitruvius

Procedia PDF Downloads 275
6419 Molecular Dynamics Simulation of Free Vibration of Graphene Sheets

Authors: Seyyed Feisal Asbaghian Namin, Reza Pilafkan, Mahmood Kaffash Irzarahimi

Abstract:

This paper considers the vibration of single-layered graphene sheets using molecular dynamics (MD) and nonlocal elasticity theory. For the MD simulations, the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), an open-source code, is used to obtain fundamental frequencies. In parallel, governing equations are derived using nonlocal elasticity and first-order shear deformation theory (FSDT) and solved using the generalized differential quadrature (GDQ) method. The small-scale effect enters the governing equations of motion through the nonlocal parameter. The effects of different side lengths, boundary conditions, and the nonlocal parameter are inspected for both methods. Results obtained from the MD simulations are compared with those of nonlocal elasticity theory to calibrate appropriate values of the nonlocal parameter, and a value is suggested for graphene sheets with various boundary conditions. Furthermore, it is shown that the nonlocal elasticity approach using classical plate theory (CLPT) assumptions overestimates the natural frequencies.
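The small-scale effect referred to above enters through Eringen's nonlocal constitutive relation, quoted here in its standard differential form for context (the study calibrates the nonlocal parameter $e_0 a$ appearing in it):

```latex
\[
\left(1 - (e_0 a)^2 \nabla^2\right) \sigma_{ij} = C_{ijkl}\,\varepsilon_{kl},
\]
```

where setting $e_0 a = 0$ recovers the classical (local) elasticity underlying CLPT, which is why CLPT overestimates the natural frequencies at small scales.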

Keywords: graphene sheets, molecular dynamics simulations, fundamental frequencies, nonlocal elasticity theory, nonlocal parameter

Procedia PDF Downloads 502
6418 How to Assess the Attractiveness of Business Location According to the Mainstream Concepts of Comparative Advantages

Authors: Philippe Gugler

Abstract:

Goal of the study: The concept of competitiveness has been addressed by economic theorists and policymakers for several hundred years, with both groups trying to understand the drivers of economic prosperity and social welfare. The goal of this contribution is to review the major theoretical contributions that help identify the main drivers of a territory’s competitiveness. We first present the major contributions found in the classical and neo-classical theories. Then, we concentrate on two major schools offering significant insights into the competitiveness of locations: the Economic Geography (EG) school and the International Business (IB) school. Methodology: The study is based on a literature review of the classical and neo-classical theories, the economic geography theories, and the international business theories, and it establishes links between these theoretical mainstreams. The review is designed to respond to our research question and to support further research in this field. Results: The pioneering classical and neo-classical theories provide initial insights that territories are different and that these differences explain the discrepancies in their levels of prosperity and standards of living. These theories emphasize different factors affecting the level and growth of productivity in a given area, and therefore the degree of its competitiveness. However, they are not sufficient to identify more precisely the drivers and enablers of location competitiveness and to explain, in particular, the factors that drive the creation of economic activities, the expansion of economic activities, the creation of new firms, and the attraction of foreign firms. Prosperity is due to economic activities created by firms.
Therefore, we need further theoretical insights to scrutinize the competitive advantages of territories or, in other words, their ability to offer the conditions that enable economic agents to achieve higher rates of productivity in open markets. Two major bodies of theory provide, to a large extent, the needed insights: economic geography theory and international business theory. The economic geography studies scrutinized here, from Marshall to Porter, aim to explain the drivers of the concentration of specific industries and activities in specific locations. These agglomerations of activity may be due to the creation of new enterprises, the expansion of existing firms, and the attraction of firms located elsewhere. Regarding this last possibility, the international business (IB) theories focus on the comparative advantages of locations as far as the strategies of multinational enterprises (MNEs) are concerned. According to international business theory, the comparative advantages of a location serve firms not only by exploiting their ownership advantages (mostly in market-seeking, resource-seeking, and efficiency-seeking investments) but also by augmenting and/or creating new ownership advantages (strategic asset-seeking investments). The impact of a location on the competitiveness of firms is considered from both sides: the MNE’s home country and the MNE’s host country.

Keywords: competitiveness, economic geography, international business, attractiveness of businesses

Procedia PDF Downloads 132
6417 COVID-19 Teaches Probability Risk Assessment

Authors: Sean Sloan

Abstract:

Probability Risk Assessment (PRA) can be a difficult concept for students to grasp. In searching for different ways to describe PRA and relate it to students' lives, COVID-19 came up. The parallels are striking. Soon students began analyzing acceptable risk with respect to the virus, which helped them quantify just how dangerous "dangerous" is. The original lesson was set aside, and for the remainder of the period the probability of risk and the lethality of risk became the topic. Spreading events, such as a COVID carrier on an airline, became analogous to single-fault casualties such as a tsunami. The odds of spreading became the odds of backup diesel generator failure, as at Fukushima Daiichi. Fatalities from the disease became expected fatalities due to radiation spread. Quantification took the discussion from hyperbole and emotion to one in which guidelines could be rationally based. It has been one of the most effective educational devices observed.

Keywords: COVID, education, probability, risk

Procedia PDF Downloads 146
6416 Construction of Graph Signal Modulations via Graph Fourier Transform and Its Applications

Authors: Xianwei Zheng, Yuan Yan Tang

Abstract:

The classical windowed Fourier transform has been widely used in signal processing, image processing, machine learning, and pattern recognition, and the related Gabor transform is powerful enough to capture the texture information of a given dataset. Recently, in the emerging field of graph signal processing, researchers have been developing a theory to handle so-called graph signals. Within this theory, the windowed graph Fourier transform has been constructed to establish a time-frequency analysis framework for graph signals. It is defined using translation and modulation operators for graph signals, following calculations similar to those of the classical windowed Fourier transform. Both operators are defined via the Laplacian eigenvectors: whereas classical translation can be expressed using the Fourier atoms, graph signal translation is defined analogously using the Laplacian eigenvectors, and graph modulation is likewise established via the Laplacian eigenvectors. The windowed graph Fourier transform based on these two operators has been applied to obtain time-frequency representations of graph signals. In the existing construction, the modulation operator mimics classical modulation by multiplying a graph signal with the entries of each Fourier atom. However, a single Laplacian eigenvector entry cannot play the same role as a Fourier atom, and this definition ignores the relationship between the translation and modulation operators. In this paper, a new definition of the modulation operator is proposed, and thus another time-frequency framework for graph signals is constructed.
Specifically, the relationship between translation and modulation can be established through the Fourier transform: for any signal, the Fourier transform of its translation is the modulation of its Fourier transform. Thus, the modulation of a signal can be defined as the inverse Fourier transform of the translation of its Fourier transform. Analogously, the graph modulation of a graph signal can be defined as the inverse graph Fourier transform of the translation of its graph Fourier transform. This novel definition establishes a relationship between the translation and modulation operations. The new modulation operation, together with the original translation operation, is used to construct a new framework for graph signal time-frequency analysis. Furthermore, a windowed graph Fourier frame theory is developed: necessary and sufficient conditions for constructing windowed graph Fourier frames, tight frames, and dual frames are presented. The novel framework is applied to signals defined on well-known graphs, e.g., the Minnesota road graph, and on random graphs. Experimental results show that the novel framework captures new features of graph signals.
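The proposed construction can be sketched numerically. The code below is an interpretation, not the authors' implementation: it uses the standard generalized graph translation (the Laplacian-eigenvector analogue of a shift) and then defines modulation as the inverse GFT of the translated GFT, as the abstract describes, on a small cycle graph.

```python
import numpy as np

# Small cycle graph: its Laplacian eigenvectors form the graph Fourier basis.
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)          # columns of U: graph Fourier atoms

def gft(f):
    return U.T @ f                  # graph Fourier transform

def igft(fh):
    return U @ fh                   # inverse graph Fourier transform

def translate(f, i):
    """Generalized graph translation of f to vertex i (spectral product
    with the eigenvector values at vertex i)."""
    return np.sqrt(n) * igft(gft(f) * U[i, :])

def modulate(f, k):
    """Proposed modulation: inverse GFT of the translated GFT of f
    (a sketch of the construction described in the abstract)."""
    return igft(translate(gft(f), k))

x = np.zeros(n)
x[0] = 1.0                          # a delta signal at vertex 0
g = modulate(x, 3)
```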

Keywords: graph signals, windowed graph Fourier transform, windowed graph Fourier frames, vertex frequency analysis

Procedia PDF Downloads 327
6415 A Game-Theory-Based Price-Optimization Algorithm for the Simulation of Markets Using Agent-Based Modelling

Authors: Juan Manuel Sanchez-Cartas, Gonzalo Leon

Abstract:

A price competition algorithm for agent-based models (ABMs), based on game theory principles, is proposed for the simulation of theoretical market models. The algorithm is applied to the classical Hotelling model and to a two-sided market model, and it is shown to lead to the optimal behavior predicted by the theoretical models. Moreover, when the theoretical models fail to predict the equilibrium, the algorithm is still capable of reaching a feasible outcome. The results highlight that the algorithm can be implemented in other simulation models to guarantee rational users and endogenous optimal behavior, and that it can serve as a verification tool, given that it is theoretically grounded.
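The idea of recovering the theoretical equilibrium by iterated price adjustment can be sketched on the classical Hotelling line. This is a simple best-response iteration under textbook assumptions (firms at the endpoints, linear transport cost t, marginal cost c), not the paper's ABM algorithm; the known equilibrium it should reach is p* = c + t.

```python
# Hotelling linear city with firms at the endpoints: demand splits at
# x = 1/2 + (p2 - p1) / (2t), so firm i's best response to the rival's
# price is p_i = (p_j + t + c) / 2, obtained from the first-order
# condition of profit (p_i - c) * x_i.
def best_response(p_other, t, c):
    return (p_other + t + c) / 2

def iterate_prices(t=1.0, c=0.5, p1=5.0, p2=0.1, rounds=60):
    """Simultaneous best-response dynamics from arbitrary starting prices."""
    for _ in range(rounds):
        p1, p2 = best_response(p2, t, c), best_response(p1, t, c)
    return p1, p2

p1, p2 = iterate_prices()   # both converge to p* = c + t = 1.5
```

The map is a contraction (deviations from p* halve each round), which is why the iteration converges regardless of the starting prices.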

Keywords: agent-based models, algorithmic game theory, multi-sided markets, price optimization

Procedia PDF Downloads 432
6414 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground

Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane

Abstract:

Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to account for uncertainties related to hazards, the properties of the materials used, and the applied loads. However, the use of these safety factors in the design process does not assure an optimal and reliable solution and can sometimes lead to a lack of robustness in the structure. Reliability theory, based on a probabilistic formulation of structural safety, can respond in a better-adapted manner: it allows the construction of a model in which uncertain data are represented by random variables, and therefore permits a better appreciation of safety margins through confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the tank, with the seismic acceleration treated as a random variable.
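The Monte Carlo step can be sketched with a generic limit state g = R - S, capacity minus seismic demand. All distributions and numbers below are hypothetical placeholders, not the tank model of the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Hypothetical limit state g = R - S(a): capacity R of the tank wall
# minus a demand proportional to the peak ground acceleration a, both
# random. Values are illustrative only.
R = rng.normal(loc=12.0, scale=1.5, size=n)               # capacity (MPa)
a = rng.lognormal(mean=np.log(0.2), sigma=0.4, size=n)    # PGA (g), lognormal
S = 30.0 * a                                              # demand (MPa)

g = R - S
pf = np.mean(g <= 0)                 # Monte Carlo failure probability
beta = np.mean(g) / np.std(g)        # crude second-moment reliability index
```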

Keywords: reliability approach, storage tanks, Monte Carlo simulation, seismic acceleration

Procedia PDF Downloads 296
6413 A Hazard Rate Function for the Time of Ruin

Authors: Sule Sahin, Basak Bulut Karageyik

Abstract:

This paper introduces a hazard rate function for the time of ruin to calculate the conditional probability of ruin for very small intervals. We call this function the force of ruin (FoR). We obtain the expected time of ruin and conditional expected time of ruin from the exact finite time ruin probability with exponential claim amounts. Then we introduce the FoR which gives the conditional probability of ruin and the condition is that ruin has not occurred at time t. We analyse the behavior of the FoR function for different initial surpluses over a specific time interval. We also obtain FoR under the excess of loss reinsurance arrangement and examine the effect of reinsurance on the FoR.
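The force of ruin can be approximated numerically from simulated ruin times in a classical risk process, as FoR(t) ≈ [ψ(t+Δ) − ψ(t)] / (Δ(1 − ψ(t))). The sketch below uses exponential claims in a Cramér-Lundberg model with illustrative parameters; it is a numerical illustration of the definition, not the paper's exact calculation.

```python
import numpy as np

rng = np.random.default_rng(7)

def ruin_times(u=2.0, c=1.2, lam=1.0, mu=1.0, horizon=30.0, n=5000):
    """Simulated ruin times (np.inf if no ruin before the horizon) for a
    classical risk process: initial surplus u, premium rate c, Poisson
    claim rate lam, exponential claim mean mu. Parameters illustrative."""
    out = np.full(n, np.inf)
    for k in range(n):
        t, claims = 0.0, 0.0
        while True:
            t += rng.exponential(1 / lam)      # next claim arrival
            if t > horizon:
                break
            claims += rng.exponential(mu)      # claim amount
            if u + c * t - claims < 0:         # surplus just after the claim
                out[k] = t
                break
    return out

T = ruin_times()
psi = lambda t: np.mean(T <= t)                # finite-time ruin probability

t0, dt = 5.0, 1.0
force_of_ruin = (psi(t0 + dt) - psi(t0)) / (dt * (1 - psi(t0)))
```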

Keywords: conditional time of ruin, finite time ruin probability, force of ruin, reinsurance

Procedia PDF Downloads 390
6412 Saliency Detection Using a Background Probability Model

Authors: Junling Li, Fang Meng, Yichun Zhang

Abstract:

Image saliency detection has long been studied, yet several challenging problems remain unsolved, such as inaccurate saliency detection in complex scenes or the suppression of salient objects near image borders. In this paper, we propose a new saliency detection algorithm to address these problems. We represent the image as a graph with superpixels as nodes. By considering the appearance similarity between the boundary and the background, the proposed method chooses non-salient boundary nodes as background priors to construct a background probability model. The probability that each node belongs to the model is then computed, measuring its similarity to the background, and saliency is calculated from the transformed probability as a metric. We compare our algorithm with ten state-of-the-art saliency detection methods on a public database. Experimental results show that our simple and effective approach handles challenging cases that have long been problematic in image saliency detection.
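The core idea, background probability from boundary priors with saliency as its complement, can be reduced to a toy sketch. This is a drastic simplification of the paper's graph model: regions stand in for superpixels, the background model is just the mean feature of border regions, and the Gaussian similarity kernel is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(3)

# Each region (stand-in for a superpixel) is a mean-color feature vector;
# regions touching the image border act as background priors.
features = rng.uniform(0.0, 1.0, size=(20, 3))   # 20 regions, RGB means
is_border = np.zeros(20, dtype=bool)
is_border[:8] = True                             # first 8 regions touch the border

# Background model: mean feature of the border regions. Background
# probability decays with feature distance to that model; saliency is
# its complement.
bg_mean = features[is_border].mean(axis=0)
dist = np.linalg.norm(features - bg_mean, axis=1)
sigma = dist[is_border].mean() + 1e-9            # scale from the priors themselves
p_background = np.exp(-((dist / sigma) ** 2))
saliency = 1.0 - p_background
```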

Keywords: visual saliency, background probability, boundary knowledge, background priors

Procedia PDF Downloads 416
6411 Subjective Probability and the Intertemporal Dimension of Probability to Correct the Misrelation Between Risk and Return of a Financial Asset as Perceived by Investors. Extension of Prospect Theory to Better Describe Risk Aversion

Authors: Roberta Martino, Viviana Ventre

Abstract:

From a theoretical point of view, the relationship between the risk associated with an investment and its expected value is directly proportional, in the sense that the market grants a greater return to those who are willing to take a greater risk. However, empirical evidence shows that this relationship is distorted in the minds of investors and is perceived exactly the opposite way. To understand the discrepancy between investors' actual actions and the theoretical predictions, this paper analyzes the essential parameters used in the valuation of financial assets, with particular attention to two elements: probability and the passage of time. Although these may at first glance seem distinct, they are closely related. In particular, the error in the theoretical description of the relationship between risk and return lies in the failure to consider the impatience generated in the decision-maker when events that have not yet happened enter the decision-making context. In this context, probability loses its objective meaning; in relation to the psychological state of the investor, it can only be understood as the degree of confidence the investor has in the occurrence or non-occurrence of an event. Moreover, the concept of objective probability considers neither the intertemporality that characterizes financial activities nor the limited cognitive capacity of the decision-maker. Cognitive psychology has shown that the mind compromises between quality and effort when faced with very complex choices. To evaluate an event that has not yet happened, one must imagine it happening; this projection into the future requires cognitive effort, and it is what differentiates choice under risk from choice under uncertainty.
In choices under risk, the receipt of the outcome is imminent, so the mechanism of self-projection into the future is not needed to imagine the consequence of the choice, and decision-makers can dwell on an objective analysis of the possibilities. Financial activities, on the other hand, develop over time, and objective probability is too static to capture the anticipatory emotions that the self-projection mechanism generates in the investor. Assuming that uncertainty is inherent in the valuation of events that have not yet occurred, the focus must shift from risk management to uncertainty management. Only in this way can the intertemporal dimension of the decision-making environment, and the haste generated by the financial market, be properly taken into account. This work considers an extension of prospect theory with a temporal component, with the aim of describing the attitude toward risk with respect to the passage of time.
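The distortion of objective probability that prospect theory already captures (before any temporal extension) is the probability weighting function: small probabilities are overweighted and moderate-to-large ones underweighted. The sketch below uses the Tversky-Kahneman form with their original gains estimate gamma = 0.61:

```python
def weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function. For gamma < 1 it
    overweights small probabilities and underweights moderate-to-large
    ones (gamma = 0.61 is the original estimate for gains)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

w_small = weight(0.01)   # perceived weight of a rare event
w_large = weight(0.90)   # perceived weight of a likely event
```

An intertemporal extension of the kind the abstract proposes would let this perceived weight also depend on the delay before the outcome is received.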

Keywords: impatience, risk aversion, subjective probability, uncertainty

Procedia PDF Downloads 100
6410 Comparing Sounds of the Singing Voice

Authors: Christel Elisabeth Bonin

Abstract:

This experiment aims to show that classical singing and belting each have distinct singing qualities, whereas singing with a speaking voice has no singing quality. For this purpose, a female singing voice was recorded on four different pitches, singing the vowel 'a' using three different vocal productions: a classically trained voice, a belting voice, and a speaking voice. The recordings were entered into the software Praat. The formants of each recorded tone were then compared to one another and related to the singer's formant. The visible results are taken as an indicator of comparable sound quality between a classically trained female voice and a belting female voice, concerning the concentration of overtones in F1 to F5, and of a lack of sound quality in the speaking voice for singing purposes. The results also show that classical singing and belting are both valuable vocal techniques due to their richness of overtones, and that belting is not comparable to shouting or screaming. Singing with a speaking voice, in contrast, should not be called singing because of its lack of overtones, which by definition means there is no musical tone.

Keywords: formants, overtone, singer’s formant, singing voice, belting, classical singing, singing with the speaking voice

Procedia PDF Downloads 317
6409 Simple Procedure for Probability Calculation of Tensile Crack Occurring in Rigid Pavement: A Case Study

Authors: Aleš Florian, Lenka Ševelová, Jaroslav Žák

Abstract:

The formation of tensile cracks in the concrete slabs of rigid pavement can be, among other things, the initiation point of other, more serious failures, which can ultimately lead to complete degradation of the concrete slab and thus of the whole pavement. Two measures can be used for reliability assessment of this phenomenon: the probability of failure and/or the reliability index. Different methods can be used for their calculation; the simplest are moment methods and simulation techniques. Two methods - the FOSM method and the simple random sampling method - are verified and compared. The influence of information about the probability distributions and statistical parameters of the input variables, as well as of the limit state function, on the calculated reliability index and failure probability is studied at three points on the lower surface of concrete slabs of an older type of rigid pavement formerly used in the Czech Republic.
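The two methods compared in the abstract can be sketched on a linear limit state g = R - S with independent normal variables, where FOSM is exact and simple random sampling should agree with it. The capacity and load values are illustrative, not the pavement data of the study:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(11)

# Linear limit state g = R - S for a slab tensile check (illustrative):
mu_r, sd_r = 3.0, 0.3   # tensile strength (MPa)
mu_s, sd_s = 2.0, 0.4   # tensile stress (MPa)

# FOSM: first-order second-moment reliability index. For a linear g
# with independent normal inputs this is exact.
beta = (mu_r - mu_s) / sqrt(sd_r**2 + sd_s**2)
pf_fosm = 0.5 * (1 - erf(beta / sqrt(2)))   # Phi(-beta)

# Simple random sampling: direct Monte Carlo estimate of the same Pf.
n = 400_000
g = rng.normal(mu_r, sd_r, n) - rng.normal(mu_s, sd_s, n)
pf_mc = np.mean(g <= 0)
```

For nonlinear limit states or non-normal inputs the two estimates diverge, which is the sensitivity the study examines.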

Keywords: failure, pavement, probability, reliability index, simulation, tensile crack

Procedia PDF Downloads 535
6408 Existence Theory for First Order Functional Random Differential Equations

Authors: Rajkumar N. Ingle

Abstract:

In this paper, the existence of a solution of nonlinear functional random differential equations of the first order is proved under the Caratheodory condition. The study of functional random differential equations is important in the random analysis of the dynamical systems of universal phenomena. Objectives: Nonlinear functional random differential equations (N.F.R.D.E.s) are useful to scientists, engineers, and mathematicians engaged in analyzing universal random phenomena governed by nonlinear random initial value problems of differential equations, with applications in the theory of diffusion and heat conduction. Methodology: Using the concepts of probability theory and functional analysis, the existence theorems for nonlinear F.R.D.E.s are proved with tools such as fixed point theorems. Significance of the study: Our contribution is the generalization of some well-known results in the theory of nonlinear F.R.D.E.s. Further, our study should be useful to scientists, engineers, economists, and mathematicians in their endeavors to analyze the nonlinear random problems of the universe in a better way.

Keywords: Random Fixed Point Theorem, functional random differential equation, N.F.R.D.E., universal random phenomenon

Procedia PDF Downloads 480
6407 The Magic Bullet in Africa: Exploring an Alternative Theoretical Model

Authors: Daniel Nkrumah

Abstract:

The Magic Bullet theory was a popular media effects theory that defined the power of the mass media to alter the beliefs and perceptions of its audiences. However, following the People's Choice study, the theory was said to have been disproved and was supplanted by the Two-Step Flow theory. This paper examines the relevance of the Magic Bullet theory in Africa and establishes whether it is still relevant in Africa's media spaces and societies. Using selected cases on the continent, it adopts a grounded theory approach and explores a new theoretical model that advances the argument that the Two-Step Flow theory, though important and valid, was ill-conceived as a direct replacement for the Magic Bullet theory.

Keywords: magic bullet theory, two-step flow theory, media effects, african media

Procedia PDF Downloads 111
6406 [Keynote Talk]: Evidence Fusion in Decision Making

Authors: Mohammad Abdullah-Al-Wadud

Abstract:

In the current era of automation and artificial intelligence, systems increasingly depend on the decision-making capabilities of machines. Such systems/applications range from simple classifiers to sophisticated surveillance systems based on traditional sensors and related equipment, which are becoming more common in the Internet of Things (IoT) paradigm. However, the available data for such problems are usually imprecise and incomplete, which leads to uncertainty in decisions made by traditional probability-based classifiers. This calls for a robust fusion framework to combine the available information sources with some degree of certainty. The theory of evidence provides such a method for combining evidence from different (and possibly unreliable) sources/observers. This talk will address the employment of the Dempster-Shafer theory of evidence in some practical applications.
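Dempster's rule of combination, the core operation of the Dempster-Shafer theory discussed above, can be sketched as follows. The surveillance scenario and the mass values are purely illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements
    are frozensets, renormalizing by the conflict mass K."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # incompatible focal elements
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two sensors report on whether an intruder is present ('yes'/'no').
# Mass on the full frame {'yes', 'no'} expresses each sensor's ignorance.
m1 = {frozenset({'yes'}): 0.7, frozenset({'yes', 'no'}): 0.3}
m2 = {frozenset({'yes'}): 0.6, frozenset({'no'}): 0.1,
      frozenset({'yes', 'no'}): 0.3}
fused = dempster_combine(m1, m2)
```

Note how the rule lets each source keep mass on the whole frame of discernment rather than forcing a probability on every singleton, which is exactly what makes it suitable for imprecise and incomplete data.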

Keywords: decision making, dempster-shafer theory, evidence fusion, incomplete data, uncertainty

Procedia PDF Downloads 412
6405 Pairwise Relative Primality of Integers and Independent Sets of Graphs

Authors: Jerry Hu

Abstract:

Let G = (V, E) with V = {1, 2, ..., k} be a graph. The k positive integers a₁, a₂, ..., aₖ are G-wise relatively prime if (aᵢ, aⱼ) = 1 for every {i, j} ∈ E. We use an inductive approach to give an asymptotic formula for the number of k-tuples of integers that are G-wise relatively prime. An exact formula is obtained for the probability that k positive integers are G-wise relatively prime. As a corollary, we also provide an exact formula for the probability that k positive integers have exactly r relatively prime pairs.
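The notion of G-wise relative primality can be checked empirically. The sketch below estimates the probability by sampling integers uniformly from a large interval (an assumption standing in for natural density); in the single-edge case k = 2 it recovers the classical density 6/π² of coprime pairs.

```python
import math
import random

def gwise_rel_prime(nums, edges):
    """Check coprimality only for the pairs joined by an edge of G."""
    return all(math.gcd(nums[i], nums[j]) == 1 for i, j in edges)

def estimate_probability(k, edges, bound=10**6, trials=100_000, seed=7):
    """Monte Carlo estimate of the probability that k integers drawn
    uniformly from [1, bound] are G-wise relatively prime."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if gwise_rel_prime([rng.randint(1, bound) for _ in range(k)], edges)
    )
    return hits / trials

# k = 2 with the single edge {0, 1}: the density of coprime pairs
# approaches 6/pi^2 ≈ 0.6079.
p = estimate_probability(k=2, edges=[(0, 1)])
```

For a graph with no edges the probability is trivially 1, and adding edges can only lower it, which matches the intuition that G-wise relative primality interpolates between no constraint and full pairwise coprimality.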

Keywords: graph, independent set, G-wise relatively prime, probability

Procedia PDF Downloads 80
6404 Static and Dynamic Analysis of Timoshenko Microcantilever Using the Finite Element Method

Authors: Mohammad Tahmasebipour, Hosein Salarpour

Abstract:

Microcantilevers are one of the components used in the manufacture of micro-electromechanical systems. Epoxy microcantilevers have a variety of applications in the manufacture of micro-sensors and micro-actuators. In this paper, a Timoshenko microcantilever was statically and dynamically analyzed using the finite element method. First, all boundary conditions and initial conditions governing microcantilevers were considered. The effect of size on the deflection, angle of rotation, natural frequencies, and mode shapes was then analyzed and evaluated under different frequencies. It was observed that an increased microcantilever thickness reduces the deflection, rotation, and resonant frequency. Good agreement was observed between our results and those obtained by the couple stress theory, the classical theory, and the strain gradient elasticity theory.
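The difference between the classical (Euler-Bernoulli) and Timoshenko beam models can be illustrated with the static tip deflection of an end-loaded cantilever. The epoxy properties and microcantilever dimensions below are hypothetical, chosen only to show that the Timoshenko shear term is a small correction for a slender beam and grows as the beam becomes shorter and thicker.

```python
def tip_deflection(P, L, E, I, G, A, k_shear=5 / 6):
    """Static tip deflection of a cantilever under an end load P:
    the Euler-Bernoulli bending term PL^3/(3EI), and the Timoshenko
    result with the additional shear term PL/(k*G*A)."""
    bending = P * L**3 / (3 * E * I)
    shear = P * L / (k_shear * G * A)
    return bending, bending + shear

# Illustrative epoxy microcantilever (hypothetical values):
E = 3.5e9                        # Young's modulus of epoxy, Pa
G = 1.3e9                        # shear modulus, Pa
b, h, L = 20e-6, 5e-6, 200e-6    # width, thickness, length, m
I = b * h**3 / 12                # second moment of area
A = b * h                        # cross-sectional area
eb, timo = tip_deflection(P=1e-6, L=L, E=E, I=I, G=G, A=A)
```

Doubling the thickness h multiplies I by eight while only doubling A, so the bending term collapses much faster than the shear term; this is consistent with the observation above that thicker microcantilevers deflect less.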

Keywords: microcantilever, microsensor, epoxy, dynamic behavior, static behavior, finite element method

Procedia PDF Downloads 407
6403 Experimental and Computational Analysis of Glass Fiber Reinforced Plastic Beams with Piezoelectric Fibers

Authors: Selin Kunc, Srinivas Koushik Gundimeda, John A. Gallagher, Roselita Fragoudakis

Abstract:

This study investigates the behavior of Glass Fiber Reinforced Plastic (GFRP) laminated beams additionally reinforced with piezoelectric fibers. The electromechanical behavior of piezoelectric materials coupled with high-strength/low-weight GFRP laminated beams can have significant applications in a wide range of industries. Energy scavenging through mechanical vibrations is the focus of this study, with possible applications in the automotive industry. This study examines the behavior of such composite laminates using Classical Lamination Theory (CLT) under three-point bending conditions. Fiber orientation is optimized for the desired stiffness and deflection that yield maximum energy output. Finite element models built in ABAQUS/CAE are verified through experimental testing. The optimum stacking sequences examined are [0°]s, [0°/45°]s, and [45°/-45°]s. Results show the superiority of the stacking sequence [0°/45°]s, which provides higher strength at a lower weight and maximum energy output. Furthermore, laminated GFRP beams additionally reinforced with piezoelectric fibers can be used under bending not only to replace metallic components while providing similar strength at a lower weight, but also to provide an energy output.

Keywords: classical lamination theory (CLT), energy scavenging, glass fiber reinforced plastics (GFRP), piezoelectric fibers

Procedia PDF Downloads 295
6402 Understanding the Prevalence and Expression of Virulence Factors Harbored by Enterotoxigenic Escherichia Coli

Authors: Debjyoti Bhakat, Indranil Mondal, Asish K. Mukhopadayay, Nabendu S. Chatterjee

Abstract:

Enterotoxigenic Escherichia coli (ETEC) is one of the leading causes of diarrhea in infants and travelers in developing countries. Colonization factors play an important role in pathogenesis and are one of the main targets for ETEC vaccine development. However, ETEC vaccines have performed poorly in the past, as the prevalence of colonization factors is region-dependent. More than 25 classical colonization factors are presently known to be expressed by ETEC, although they are not all expressed together; multiple non-classical virulence factors have also been identified. Here, the presence and expression of common classical and non-classical virulence factors were studied, followed by studies of the expression of the prevalent colonization factors in different strains. Prevalence was determined by multiplex polymerase chain reaction (PCR) and confirmed by simplex PCR. Quantitative RT-PCR was used to study the RNA expression of these virulence factors. Strains negative for colonization factor expression were confirmed by SDS-PAGE. Among the clinical isolates, the most prevalent toxin profile was est+elt, followed by est and elt, while the pattern was reversed in the control strains. 29% of the clinical strains and 40% of the control strains were negative for any classical colonization factor (CF) or non-classical virulence factor (NCVF). Among CF-positive ETEC strains, CS6 and CS21 were the most prevalent in the clinical strains, whereas in the control strains CS6 was predominant. Among NCVF genes, eatA was the most prevalent in the clinical isolates and etpA in the controls. CS6 was the most expressed CF, and eatA the predominantly expressed NCVF, in both clinical and control ETEC isolates. CS6 expression was higher in strains carrying CS6 alone, and different strains expressed CS6 at different levels; not all strains expressed their respective virulence factors. Understanding the prevalent colonization factor, CS6, and the nature of its expression will contribute to designing an effective vaccine against ETEC in this region of the globe. The expression pattern of CS6 will also help in examining the relatedness between ETEC subtypes.

Keywords: classical virulence factors, CS6, diarrhea, enterotoxigenic escherichia coli, expression, non-classical virulence factors

Procedia PDF Downloads 140
6401 Extreme Value Theory Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are established ways to calculate reliability, unreliability, failure density, and failure rate. In this paper, the reliability of diesel generator fans was calculated through Extreme Value Theory. Extreme Value Theory is not widely used in the engineering field; its usage is well known in other areas such as hydrology, meteorology, and finance. The significance of this theory lies in the fact that, unlike other statistical methods, it focuses on rare and extreme values rather than on averages. It should be noted that the theory is not designed exclusively for extreme events, but for extreme values in any event. Therefore, this is a great opportunity to apply the theory and test whether it can be applied in this situation. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that no technical details are needed, and it can be implemented for any part for which we need to know the time to failure in order to schedule appropriate maintenance, but also to maximize usage and minimize costs. In this case, calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with higher-quality fans to prevent future failures. The results of this method show an approximation of the time for which the fans will work as they should, and the probability that the fans will work longer than a certain estimated time. Extreme Value Theory can thus be applied not only to rare and extreme events, but to any event that has values we can consider extreme.
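As a minimal sketch of the approach, failure times can be fitted to a Gumbel (Type I extreme value) distribution by the method of moments, and the reliability read off as R(t) = P(T > t). The paper's actual estimation procedure may differ, and the failure times below are illustrative rather than the field data used in the study.

```python
import math

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_fit(times):
    """Method-of-moments fit of a Gumbel distribution to failure times:
    scale = s*sqrt(6)/pi, location = mean - gamma*scale."""
    n = len(times)
    mean = sum(times) / n
    var = sum((t - mean) ** 2 for t in times) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def reliability(t, mu, beta):
    """R(t) = P(T > t) = 1 - F(t) under the fitted Gumbel model."""
    return 1.0 - math.exp(-math.exp(-(t - mu) / beta))

# Hypothetical fan failure times in thousands of hours:
times = [4.5, 4.6, 11.5, 19.5, 21.0, 28.5, 31.0, 34.5, 46.0, 48.5]
mu, beta = gumbel_fit(times)
r20 = reliability(20.0, mu, beta)  # chance a fan survives 20,000 hours
```

The fitted curve directly answers the question posed above: the percentage probability that the fans keep working beyond any chosen time, without requiring any technical detail about the fans themselves.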

Keywords: extreme value theory, lifetime, reliability analysis, statistics, time to failure

Procedia PDF Downloads 318
6400 Effect of Specimen Thickness on Probability Distribution of Grown Crack Size in Magnesium Alloys

Authors: Seon Soon Choi

Abstract:

Fatigue crack growth is stochastic because fatigue behavior involves uncertainty and randomness. It is therefore necessary to determine the probability distribution of the grown crack size at a specific fatigue crack propagation life, both for structural maintenance and for reliability estimation. The essential purpose of this study is to identify the probability distribution that best fits the grown crack size at a specified fatigue life in a rolled magnesium alloy under different specimen thickness conditions. Fatigue crack propagation experiments were carried out in laboratory air under three specimen thickness conditions using AZ31 to investigate the stochastic crack growth behavior. The goodness of fit of the probability distributions of the grown crack size under the different specimen thickness conditions is assessed by the Anderson-Darling test. The effect of specimen thickness on the variability of the grown crack size is also investigated.
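The Anderson-Darling statistic used in the goodness-of-fit test above can be computed directly. Below is a minimal sketch for a normal candidate distribution with parameters estimated from the sample; the crack sizes are hypothetical, not the measured AZ31 data from the study.

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def anderson_darling_normal(sample):
    """Anderson-Darling statistic A^2 for a normal fit:
    A^2 = -n - (1/n) * sum (2i-1) [ln F(x_(i)) + ln(1 - F(x_(n+1-i)))],
    where x_(1) <= ... <= x_(n). Smaller values indicate a better fit."""
    xs = sorted(sample)
    n = len(xs)
    mu = sum(xs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in xs) / (n - 1))
    s = 0.0
    for i, x in enumerate(xs, start=1):
        fi = normal_cdf(x, mu, sigma)
        fni = normal_cdf(xs[n - i], mu, sigma)   # x_(n+1-i)
        s += (2 * i - 1) * (math.log(fi) + math.log(1.0 - fni))
    return -n - s / n

# Hypothetical grown-crack sizes (mm) at a fixed number of cycles:
cracks = [5.02, 5.11, 5.18, 5.24, 5.30, 5.37, 5.41, 5.49, 5.58, 5.70]
a2 = anderson_darling_normal(cracks)
```

Unlike the Kolmogorov-Smirnov statistic, A² weights deviations in the tails heavily, which is why it is a common choice when the distribution of extreme crack sizes matters for reliability.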

Keywords: crack size, fatigue crack propagation, magnesium alloys, probability distribution, specimen thickness

Procedia PDF Downloads 487
6399 Daily Probability Model of Storm Events in Peninsular Malaysia

Authors: Mohd Aftar Abu Bakar, Noratiqah Mohd Ariff, Abdul Aziz Jemain

Abstract:

Storm Event Analysis (SEA) provides a method to define rainfall events as storms, where each storm has its own amount and duration. By modelling the daily probability of different types of storms, the onset, offset, and cycle of rainfall seasons can be determined and investigated. Furthermore, researchers in the field of meteorology will be able to study the dynamical characteristics of rainfall and make predictions for future reference. In this study, four categories of storms (short, intermediate, long, and very long) are introduced based on the length of storm duration. Daily probability models of storms are built for these four categories in Peninsular Malaysia. The models are constructed by using the Bernoulli distribution and applying linear regression to the first Fourier harmonic equation. From the models obtained, it is found that the daily probability of storms in the eastern part of Peninsular Malaysia shows a unimodal pattern, with a high probability of rain beginning at the end of the year and lasting until early in the next year. This is very likely due to the Northeast monsoon season, which occurs from November to March every year. Meanwhile, short and intermediate storms in other regions of Peninsular Malaysia show a bimodal cycle due to the two inter-monsoon seasons. Overall, these models indicate that Peninsular Malaysia can be divided into four distinct regions based on the daily pattern of the probability of various storm events.
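The modelling step described above, a Bernoulli daily storm indicator whose probability follows the first Fourier harmonic, can be sketched as follows. The storm data are synthetic, generated only to exercise the fit, and the sketch assumes whole years of data so that the harmonic regressors are nearly orthogonal and ordinary least squares reduces to simple averages.

```python
import math
import random

def fit_first_harmonic(days, occurred, period=365.25):
    """Fit p(d) = a0 + a1*cos(w*d) + b1*sin(w*d) to binary daily storm
    indicators. Over whole years cos and sin are (nearly) orthogonal
    to each other and to the constant, so OLS reduces to averages."""
    w = 2 * math.pi / period
    n = len(days)
    a0 = sum(occurred) / n
    a1 = 2 * sum(y * math.cos(w * d) for d, y in zip(days, occurred)) / n
    b1 = 2 * sum(y * math.sin(w * d) for d, y in zip(days, occurred)) / n
    return a0, a1, b1

def predict(d, a0, a1, b1, period=365.25):
    w = 2 * math.pi / period
    return a0 + a1 * math.cos(w * d) + b1 * math.sin(w * d)

# Synthetic ten-year record with a unimodal, monsoon-like cycle
# peaking near the year boundary (day 0):
rng = random.Random(0)
days = list(range(10 * 365))
true_p = [0.3 + 0.2 * math.cos(2 * math.pi * d / 365.25) for d in days]
occurred = [1 if rng.random() < p else 0 for p in true_p]
a0, a1, b1 = fit_first_harmonic(days, occurred)
```

A unimodal season shows up as a single dominant harmonic, as in the eastern-region result above; a bimodal inter-monsoon cycle would require keeping the second harmonic as well.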

Keywords: daily probability model, monsoon seasons, regions, storm events

Procedia PDF Downloads 331
6398 Solution of Some Boundary Value Problems of the Generalized Theory of Thermo-Piezoelectricity

Authors: Manana Chumburidze

Abstract:

We consider a non-classical model of dynamical problems for a conjugate system of differential equations arising in thermo-piezoelectricity, as formulated by Toupin and Mindlin. The basic concepts and the general theory of solvability for isotropic homogeneous elastic media are considered. They are treated using the Laplace integral transform, the potential method, and singular integral equations. Approximate solutions of mixed boundary value problems for a finite domain, bounded by some closed surface, are constructed; they are solved explicitly by using the generalized Fourier series method.

Keywords: thermo-piezoelectricity, boundary value problems, Fourier's series, isotropic homogeneous elastic media

Procedia PDF Downloads 452