Search results for: first passage probability theory

5686 Implementation of a Non-Poissonian Model in a Low-Seismicity Area

Authors: Ludivine Saint-Mard, Masato Nakajima, Gloria Senfaute

Abstract:

In areas with low to moderate seismicity, probabilistic seismic hazard analysis frequently uses a Poisson approach, which assumes independence of events in time and space to determine the annual probability of earthquake occurrence. Nevertheless, in countries with a high seismic rate, such as Japan, non-Poissonian models, which assume that the next earthquake occurrence depends on the date of the previous one, are frequently used. The objective of this paper is to apply a non-Poissonian model in a region of low to moderate seismicity to get feedback on the following questions: can we overcome the lack of data to determine some key parameters, and can we deal with the uncertainties so as to apply this methodology widely in an industrial context? The Brownian Passage Time model was applied to a fault located in France, and we conclude that even if the lack of data can be overcome with some calculations, the amount of uncertainty and the number of scenarios lead to numerous branches in the PSHA, making this method difficult to apply at a large scale across low- to moderate-seismicity areas and in an industrial context.
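
A minimal illustrative sketch (not taken from the paper) of how a Brownian Passage Time renewal model yields a conditional earthquake probability; the mean recurrence time, aperiodicity, elapsed time and exposure window below are hypothetical placeholder values.

```python
import numpy as np
from scipy.integrate import quad

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean recurrence
    time mu and aperiodicity alpha."""
    return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
        np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

def conditional_probability(elapsed, window, mu, alpha):
    """P(next event within `window` years | `elapsed` years since the last one)."""
    num, _ = quad(bpt_pdf, elapsed, elapsed + window, args=(mu, alpha))
    den, _ = quad(bpt_pdf, elapsed, np.inf, args=(mu, alpha))
    return num / den

# Hypothetical fault: 2000-year mean recurrence, aperiodicity 0.5,
# 1500 years elapsed since the last event, 50-year exposure window.
print(conditional_probability(elapsed=1500.0, window=50.0, mu=2000.0, alpha=0.5))
```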

Keywords: probabilistic seismic hazard, non-Poissonian model, earthquake occurrence, low seismicity

Procedia PDF Downloads 16
5685 An Approach to Determine the in Transit Vibration to Fresh Produce Using Long Range Radio (LORA) Wireless Transducers

Authors: Indika Fernando, Jiangang Fei, Roger Stanely, Hossein Enshaei

Abstract:

Ever-increasing consumer demand for quality fresh produce has placed far greater pressure on post-harvest supply chains in recent years. Mechanical injury to fresh produce is a critical factor in produce wastage, especially as supply chains physically extend to thousands of miles. The impact of vibration damage in transit was identified as a specific area of focus, as it results in the wastage of a significant portion of fresh produce, at times ranging from 10% to 40% in some countries. Several studies have concentrated on quantifying the impact of vibration on fresh produce, but it has been a challenge to collect vibration impact data continuously due to limitations in the battery life or memory capacity of the devices. Therefore, study samples have been limited to a stretch of the transit passage or a limited time of the journey. This may or may not give an accurate understanding of the vibration impacts encountered throughout the transit passage, which limits the accuracy of the results. Consequently, an approach that can extend the capacity and ability to determine vibration signals throughout the transit passage would contribute to accurately analyzing vibration damage along the post-harvest supply chain. A mechanism was developed to address this challenge, capable of measuring in-transit vibration continuously throughout the transit passage subject to a minimum acceleration threshold (0.1 g). A system consisting of six tri-axial vibration transducers installed in different locations inside the cargo (produce) pallets in the truck transmits vibration signals through LoRa (Long Range Radio) technology to a central device installed inside the container. The central device processes and records the vibration signals transmitted by the portable transducers, along with the GPS location. This method conserves the power of the portable transducers, maximizing the capability of measuring vibration impacts over transit passages extending to days in the distribution process. Trial tests conducted using the approach reveal that it is a reliable method to measure and quantify in-transit vibrations along the supply chain. The GPS capability enables identification of the locations in the supply chain where significant vibration impacts were encountered. This method contributes to determining the causes, susceptibility and intensity of vibration impact damage to fresh produce in the post-harvest supply chain. More broadly, the approach could be used to determine vibration impacts not only for fresh produce but for other products in supply chains whose transit may extend from a few hours to several days.
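
A minimal sketch (not the authors' firmware) of the kind of threshold-gated logging the abstract describes: a transducer only reports samples whose resultant acceleration exceeds 0.1 g, which is what keeps power and memory use low. The packet format and field names are assumptions.

```python
import math
import time

G_THRESHOLD = 0.1  # minimum resultant acceleration (in g) worth recording

def resultant_g(ax, ay, az):
    """Resultant of a tri-axial accelerometer reading, in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def make_packet(node_id, ax, ay, az, lat, lon):
    """Hypothetical payload sent over LoRa to the central recorder."""
    return {
        "node": node_id,
        "t": time.time(),
        "g": round(resultant_g(ax, ay, az), 3),
        "gps": (lat, lon),
    }

def maybe_transmit(node_id, sample, gps, send):
    """Only samples above the 0.1 g threshold are transmitted, saving power."""
    ax, ay, az = sample
    if resultant_g(ax, ay, az) >= G_THRESHOLD:
        send(make_packet(node_id, ax, ay, az, *gps))

# Example: one reading from transducer 3, printed instead of sent over radio.
maybe_transmit(3, (0.05, 0.02, 0.12), (41.0, -73.9), print)
```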

Keywords: post-harvest, supply chain, wireless transducers, LORA, fresh produce

Procedia PDF Downloads 234
5684 'Call Drop': A Problem for Handover Minimizing the Call Drop Probability Using Analytical and Statistical Method

Authors: Anshul Gupta, T. Shankar

Abstract:

In this paper, we analyze call drops in order to provide a good quality of service to users. By minimizing call drops, we can increase the coverage area and also reduce the interference and congestion created in a network. Handover is the transfer of a call from one cell site to another during the call. Here we analyze the whole network with two methods: a statistical model and an analytical model. In the statistical model, we collected all the data of a network during the busy hour and over a normal 24-hour period, while in the analytical model we derive the equation through which the call drop probability is found. By avoiding unnecessary handovers, we can increase the number of calls per hour. The most important parameter is the coefficient of variation, around which the whole paper is built.
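
A small sketch of the statistical side of the abstract: computing the coefficient of variation of hourly dropped-call counts and a naive empirical drop probability. The sample data are made up.

```python
import statistics

# Hypothetical hourly counts collected during busy-hour monitoring.
dropped_calls = [12, 9, 15, 11, 14, 10, 13]
total_calls = [420, 390, 450, 400, 430, 410, 440]

mean = statistics.mean(dropped_calls)
std = statistics.stdev(dropped_calls)

# Coefficient of variation: dispersion relative to the mean.
cv = std / mean

# Empirical call drop probability over the observation period.
p_drop = sum(dropped_calls) / sum(total_calls)

print(f"mean={mean:.2f}, std={std:.2f}, CV={cv:.3f}, P(drop)={p_drop:.4f}")
```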

Keywords: coefficient of variation, mean, standard deviation, call drop probability, handover

Procedia PDF Downloads 455
5683 Cyber Security Enhancement via Software Defined Pseudo-Random Private IP Address Hopping

Authors: Andre Slonopas, Zona Kostic, Warren Thompson

Abstract:

Obfuscation is one of the most useful tools for preventing network compromise. Previous research focused on obfuscating the network communications between external-facing edge devices. This work proposes the use of two edge devices, one external-facing and one internal-facing, which communicate via private IPv4 addresses with software-defined pseudo-random IP hopping. This methodology does not require additional IP addresses and/or resources to implement. Statistical analyses demonstrate that the hopping surface must be at least 1e3 IP addresses in size with a broad standard deviation to minimize the possibility of coincidence between monitored and communication IPs. Breaking the hopping algorithm requires a collection of at least 1e6 samples, which for large hopping surfaces would take years to collect. The probability of dropped packets is controlled via memory buffers and the frequency of hops and can be reduced to levels acceptable for video streaming. This methodology provides an impenetrable layer of security ideal for information systems and supervisory control and data acquisition systems.
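
A minimal sketch (an assumption, not the authors' implementation) of software-defined pseudo-random hopping over a private IPv4 range: both edge devices derive the same address for each time slot from a shared seed, so no extra addresses are consumed and nothing about the schedule crosses the wire. The hop interval, network range and secret are placeholders.

```python
import hashlib
import ipaddress
import time

HOP_INTERVAL = 5  # seconds per hop; a placeholder value
NETWORK = ipaddress.ip_network("10.10.0.0/20")  # hopping surface well above 1e3 addresses

def current_hop_address(shared_secret: bytes, slot: int) -> ipaddress.IPv4Address:
    """Derive the private address used in a given time slot.

    Both edge devices hold the secret, so each computes the same address
    independently for the same slot number.
    """
    digest = hashlib.sha256(shared_secret + slot.to_bytes(8, "big")).digest()
    index = int.from_bytes(digest[:4], "big") % NETWORK.num_addresses
    return NETWORK[index]

secret = b"example-shared-secret"
slot = int(time.time()) // HOP_INTERVAL
print("current hop address:", current_hop_address(secret, slot))
```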

Keywords: moving target defense, cybersecurity, network security, hopping randomization, software defined network, network security theory

Procedia PDF Downloads 154
5682 Probability Model Accidents of Motorcyclist Based on Driver's Personality

Authors: Margareth E. Bolla, Ludfi Djakfar, Achmad Wicaksono

Abstract:

The increase in the number of motorcycle users in Indonesia is in line with the increase in accidents involving motorcycles. Several previous studies have shown that humans are the biggest factor causing accidents and that a driver's personality affects his behavior on the road. This study was conducted to see how a person's personality traits affect the probability of having an accident while driving. The Big Five Inventory (BFI) questionnaire and the Honda Riding Trainer (HRT) simulator were used as measuring tools, and logistic regression was used for the analysis. The descriptive analysis of respondents' personalities based on the BFI shows that the majority of drivers have neuroticism as the dominant character (34%), while the smallest group is drivers with openness as the dominant character (6%). The percentage of motorists who were not involved in an accident was 54%. The logistic regression analysis yields the mathematical model Y = -3.852 - 0.288 X1 + 0.596 X2 + 0.429 X3 - 0.386 X4 - 0.094 X5 + 0.436 X6 + 0.162 X7, where the results of hypothesis testing indicate that the variables openness, conscientiousness, extraversion, agreeableness, neuroticism, history of traffic accidents and age at starting driving did not have a significant effect on the probability of a motorcyclist being involved in an accident.
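
A small sketch of how a logistic model of this form converts predictor values into an accident probability: the linear predictor reported in the abstract is passed through the logistic function. The example input values for X1 to X7 are hypothetical.

```python
import math

# Coefficients as reported in the abstract (intercept first, then X1..X7).
COEF = [-3.852, -0.288, 0.596, 0.429, -0.386, -0.094, 0.436, 0.162]

def accident_probability(x):
    """Logistic transform of the linear predictor Y for one respondent."""
    y = COEF[0] + sum(b * xi for b, xi in zip(COEF[1:], x))
    return 1.0 / (1.0 + math.exp(-y))

# Hypothetical scores for X1..X7 (openness ... age at starting driving).
print(accident_probability([3.1, 2.8, 3.5, 3.0, 3.9, 1.0, 17.0]))
```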

Keywords: accidents, BFI, probability, simulator

Procedia PDF Downloads 114
5681 Evaluation of DNA Paternity Testing Accuracy of Child Trafficking Cases

Authors: Wing Kam Fung, Kexin Yu

Abstract:

Child trafficking has been a serious problem in modern China. The Chinese government has established a national anti-trafficking DNA database to help reunite missing children with their families. The database collects DNA information from missing children's parents and from trafficked and homeless children, then conducts paternity tests to find matched pairs. This paper considers the matching accuracy in such cases by looking into the exclusion probability in paternity testing. First, the situation of child trafficking in China is introduced. Next, derivations of the exclusion probability for both the one-parent and two-parent cases are given, followed by an extension to allow for one or two mutations. The accuracy of paternity testing in child trafficking cases is then assessed using the exclusion probabilities and available data. Finally, the number of loci that should be used to ensure a correct match is investigated.
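
A minimal sketch of the combined power of exclusion across independent loci, the quantity behind the paper's question of how many loci are needed; the per-locus exclusion probabilities below are illustrative, not values from the paper.

```python
def combined_exclusion_probability(per_locus_pe):
    """CPE = 1 - prod(1 - PE_i) over independent loci."""
    not_excluded = 1.0
    for pe in per_locus_pe:
        not_excluded *= (1.0 - pe)
    return 1.0 - not_excluded

def loci_needed(pe_each, target):
    """Smallest number of identical loci whose CPE reaches the target."""
    n, cpe = 0, 0.0
    while cpe < target:
        n += 1
        cpe = combined_exclusion_probability([pe_each] * n)
    return n

# Illustrative values: each locus excludes a random non-parent ~60% of the time.
print(combined_exclusion_probability([0.6] * 15))   # CPE with 15 such loci
print(loci_needed(0.6, 0.999999))                   # loci needed for 99.9999% CPE
```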

Keywords: child trafficking, DNA database, exclusion probability, paternity testing

Procedia PDF Downloads 422
5680 Identifying Chaotic Architecture: Origins of Nonlinear Design Theory

Authors: Mohammadsadegh Zanganehfar

Abstract:

Since the modernism movement and the appearance of modern architecture, an aggressive desire for a general design theory has emerged in the theoretical works of architects in the form of books and essays. Since Robert Venturi and Denise Scott Brown published Complexity and Contradiction in Architecture in 1966, the discourse of complexity and volumetric composition has been an important and controversial issue in the discipline, and various theories and essays have since been involved in it. This paper attempts to identify chaos theory as a scientific model of complexity and to relate it to architectural design theory by conducting a qualitative analysis with a multidisciplinary critical approach drawing on architecture and basic science resources. As a result, we identify chaotic architecture, the correlation of chaos theory and architecture, as an independent nonlinear design theory with specific characteristics and properties.

Keywords: architecture complexity, chaos theory, fractals, nonlinear dynamic systems, nonlinear ontology

Procedia PDF Downloads 337
5679 Hybrid EMPCA-Scott Approach for Estimating Probability Distributions of Mutual Information

Authors: Thuvanan Borvornvitchotikarn, Werasak Kurutach

Abstract:

Mutual information (MI) is widely used in medical image registration. In the analysis of different medical images, it is difficult to choose an optimal number of bins for calculating the probability distributions in MI. Consequently, this paper presents a new adaptive bin-number selection approach, named the hybrid EMPCA-Scott approach. This work combines expectation maximization principal component analysis (EMPCA) and a modified Scott's rule. The proposed approach solves the binning problem arising from the various intensity values in medical images. Experimental results of this work show lower registration errors compared to other adaptive binning approaches.
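
A brief sketch of the classical ingredient the hybrid approach modifies: Scott's rule for the bin width, here used to pick a bin count for a histogram-based mutual information estimate between two intensity arrays. The synthetic random "images" stand in for real registered slices and are not from the paper.

```python
import numpy as np

def scott_bins(x):
    """Number of bins from Scott's rule: width h = 3.49 * sigma * n**(-1/3)."""
    h = 3.49 * np.std(x) * len(x) ** (-1.0 / 3.0)
    return max(1, int(np.ceil((x.max() - x.min()) / h)))

def mutual_information(a, b, bins):
    """Histogram-based MI estimate between two intensity arrays."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Two synthetic "images": the second is a noisy copy of the first.
rng = np.random.default_rng(0)
img_a = rng.normal(100, 20, size=(64, 64))
img_b = img_a + rng.normal(0, 5, size=(64, 64))
bins = scott_bins(img_a.ravel())
print("bins:", bins, "MI:", mutual_information(img_a, img_b, bins))
```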

Keywords: mutual information, EMPCA, Scott, probability distributions

Procedia PDF Downloads 218
5678 Performance of Nakagami Fading Channel over Energy Detection Based Spectrum Sensing

Authors: M. Ranjeeth, S. Anuradha

Abstract:

Spectrum sensing is the main feature of cognitive radio technology. Spectrum sensing gives an idea of detecting the presence of primary users in a licensed spectrum. In this paper, we compare the theoretical detection probability for different fading environments, such as Rayleigh, Rician and Nakagami-m fading channels, with simulation results using energy-detection-based spectrum sensing. The numerical results are plotted as P_f vs. P_d for different SNR values and fading parameters. It is observed that the Nakagami fading channel performs better than the other fading channels when energy detection is used in spectrum sensing. A MATLAB simulation test bench has been implemented to evaluate the performance of energy detection in different fading channel environments.
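
A compact Monte Carlo sketch (in Python rather than the MATLAB test bench the authors mention) of energy-detection sensing: the threshold is set from a target false-alarm rate on noise-only samples, and the detection probability is then measured under Nakagami-m fading, whose channel power gain follows a gamma distribution. All numeric settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000          # samples per sensing event
trials = 20000    # Monte Carlo sensing events
snr_db = -10.0
m = 3.0           # Nakagami fading parameter
target_pf = 0.1

snr = 10 ** (snr_db / 10.0)

# Noise-only energies -> empirical threshold for the target false-alarm rate.
noise = rng.normal(size=(trials, N))
noise_energy = np.sum(noise ** 2, axis=1)
threshold = np.quantile(noise_energy, 1.0 - target_pf)

# Signal + noise: Nakagami-m fading means the power gain is Gamma(m, 1/m) with unit mean.
gain = rng.gamma(shape=m, scale=1.0 / m, size=trials)
signal = rng.normal(size=(trials, N)) * np.sqrt(snr * gain)[:, None]
rx_energy = np.sum((signal + rng.normal(size=(trials, N))) ** 2, axis=1)

pd = np.mean(rx_energy > threshold)
pf = np.mean(noise_energy > threshold)
print(f"P_f ~ {pf:.3f}, P_d ~ {pd:.3f} at SNR {snr_db} dB, m = {m}")
```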

Keywords: spectrum sensing, energy detection, fading channels, probability of detection, probability of false alarm

Procedia PDF Downloads 495
5677 Young’s Modulus Variability: Influence on Masonry Vault Behavior

Authors: Abdelmounaim Zanaz, Sylvie Yotte, Fazia Fouchal, Alaa Chateauneuf

Abstract:

This paper presents a methodology for the probabilistic assessment of bearing capacity and the prediction of the failure mechanism of masonry vaults at the ultimate state, with consideration of the natural variability of the Young's modulus of the stones. First, the computation model is explained. The failure mode is the most commonly reported mode, i.e. the four-hinge mechanism. Based on this assumption, the study of a vault composed of 16 segments is presented. The Young's modulus of the segments is considered a random variable defined by a mean value and a coefficient of variation CV. A relationship linking the vault bearing capacity to the modulus variation of the voussoirs is proposed. The failure mechanisms, in addition to the one observed in the deterministic case, are identified for each CV value, as well as their probability of occurrence. The results show that the mechanism observed in the deterministic case has a decreasing probability of occurrence as CV increases, while the number of other mechanisms and their probability of occurrence increase with the coefficient of variation of the Young's modulus. This means that if a significant variation in the Young's modulus of the segments is proven, taking it into account in computations becomes mandatory, both for determining the vault bearing capacity and for predicting its failure mechanism.

Keywords: masonry, mechanism, probability, variability, vault

Procedia PDF Downloads 413
5676 An Exploratory Study on 'Sub-Region Life Circle' in Chinese Big Cities Based on Human High-Probability Daily Activity: Characteristic and Formation Mechanism as a Case of Wuhan

Authors: Zhuoran Shan, Li Wan, Xianchun Zhang

Abstract:

With the increasing trend of regionalization and polycentricity in Chinese contemporary big cities, the "sub-region life circle" turns out to be an effective method for the rational organization of urban function and spatial structure. Using questionnaires, network big data, route inversion on internet maps, GIS spatial analysis and logistic regression, this article investigates the characteristics and formation mechanism of the "sub-region life circle" based on human high-probability daily activity in Chinese big cities. Firstly, it shows that the "sub-region life circle" has become a new general spatial sphere of residents' high-probability daily activity and mobility in China. Unlike former analyses of the whole metropolis or the micro community, the "sub-region life circle" has its own characteristics in geographical sphere, functional elements, spatial morphology and land distribution. Secondly, according to the results of a binary logistic regression model, the research shows that seven factors, including the degree of mixed land use and bus station density, impact the formation of the "sub-region life circle" most, and it then analyzes the critical value of the index for each factor. Finally, to establish a smarter "sub-region life circle", this paper indicates that several strategies, including jobs-housing fit, service cohesion and space reconstruction, are the keys to optimizing its spatial organization. This study expands the understanding of cities' inner sub-region spatial structure based on human daily activity and contributes to the theory of the "life circle" at the urban meso-scale.

Keywords: sub-region life circle, characteristic, formation mechanism, human activity, spatial structure

Procedia PDF Downloads 266
5675 A Strategy for the Application of Second-Order Monte Carlo Algorithms to Petroleum Exploration and Production Projects

Authors: Obioma Uche

Abstract:

Due to the recent volatility in oil & gas prices as well as increased development of non-conventional resources, it has become even more essential to critically evaluate the profitability of petroleum prospects prior to making any investment decisions. Traditionally, simple Monte Carlo (MC) algorithms have been used to randomly sample probability distributions of economic and geological factors (e.g. price, OPEX, CAPEX, reserves, productive life, etc.) in order to obtain probability distributions for profitability metrics such as Net Present Value (NPV). In recent years, second-order MC algorithms have been shown to offer an advantage over simple MC techniques due to the added consideration of uncertainties associated with the probability distributions of the relevant variables. Here, a strategy for the application of the second-order MC technique to a case study is demonstrated to analyze its effectiveness as a tool for portfolio management.
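
A minimal sketch of the second-order idea described here, with made-up numbers: an outer loop samples the uncertain parameters of the input distributions (e.g., the mean oil price or mean reserves), an inner loop performs ordinary Monte Carlo sampling with those parameters fixed, and the result is a spread of NPV percentiles rather than a single NPV distribution. All priors, costs and the cash-flow relation are placeholders, not the paper's case-study data.

```python
import numpy as np

rng = np.random.default_rng(42)
OUTER, INNER = 200, 2000     # parameter-uncertainty loop, variable-sampling loop
discount = 0.10
years = np.arange(1, 11)

p10_npv, p90_npv = [], []
for _ in range(OUTER):
    # Outer loop: sample the *parameters* of the input distributions (hypothetical priors).
    price_mean = rng.normal(70.0, 10.0)        # $/bbl
    opex_mean = rng.normal(15.0, 3.0)          # $/bbl
    reserves_mean = rng.normal(8.0e6, 1.5e6)   # bbl

    # Inner loop: ordinary Monte Carlo with those parameters fixed.
    price = rng.normal(price_mean, 5.0, INNER)
    opex = rng.normal(opex_mean, 2.0, INNER)
    reserves = rng.normal(reserves_mean, 0.5e6, INNER)
    capex = 120.0e6

    yearly = (price - opex)[:, None] * (reserves / len(years))[:, None]
    npv = (yearly / (1 + discount) ** years).sum(axis=1) - capex

    p10_npv.append(np.percentile(npv, 10))
    p90_npv.append(np.percentile(npv, 90))

# Spread of the P10/P90 NPV across the outer loop reflects parameter uncertainty.
print("P10 NPV range:", np.percentile(p10_npv, [5, 95]))
print("P90 NPV range:", np.percentile(p90_npv, [5, 95]))
```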

Keywords: Monte Carlo algorithms, portfolio management, profitability, risk analysis

Procedia PDF Downloads 294
5674 Modeling Binomial Dependent Distribution of the Values: Synthesis Tables of Probabilities of Errors of the First and Second Kind of Biometrics-Neural Network Authentication System

Authors: B. S. Akhmetov, S. T. Akhmetova, D. N. Nadeyev, V. Yu. Yegorov, V. V. Smogoonov

Abstract:

We estimate the probabilities of errors of the first and second kind for non-ideal biometrics-neural transducers with 256 outputs and construct nomograms of the error probabilities for "own" and "alien" patterns as functions of the mathematical expectation and standard deviation of the normalized Hamming measures.
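
A small sketch, under the normal approximation the abstract appeals to, of how the two error probabilities follow from the mean and standard deviation of the normalized Hamming measures for "own" and "alien" presentations; the threshold and statistics below are placeholders.

```python
from statistics import NormalDist

def error_probabilities(theta, mu_own, sigma_own, mu_alien, sigma_alien):
    """First-kind error: a genuine 'own' pattern is rejected (distance > theta).
    Second-kind error: an 'alien' pattern is accepted (distance <= theta)."""
    p1 = 1.0 - NormalDist(mu_own, sigma_own).cdf(theta)
    p2 = NormalDist(mu_alien, sigma_alien).cdf(theta)
    return p1, p2

# Placeholder statistics of the normalized Hamming measure on 256 outputs.
p1, p2 = error_probabilities(theta=0.25, mu_own=0.10, sigma_own=0.05,
                             mu_alien=0.47, sigma_alien=0.06)
print(f"P(first kind) = {p1:.2e}, P(second kind) = {p2:.2e}")
```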

Keywords: modeling, errors, probability, biometrics, neural network, authentication

Procedia PDF Downloads 455
5673 Random Access in IoT Using Naïve Bayes Classification

Authors: Alhusein Almahjoub, Dongyu Qiu

Abstract:

This paper deals with the random access procedure in next-generation networks and presents a solution to reduce the total service time (TST), which is one of the most important performance metrics in current and future internet of things (IoT) based networks. The proposed solution focuses on the calculation of the optimal transmission probability, which maximizes the success probability and reduces the TST. It uses the information about the number of idle preambles in every time slot and, based on it, estimates the number of backlogged IoT devices using Naïve Bayes estimation, which is a type of supervised learning in the machine learning domain. The estimation of backlogged devices is necessary since the optimal transmission probability depends on it and the eNodeB does not have this information. The simulations are carried out in MATLAB and verify that the proposed solution gives excellent performance.
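
A toy sketch of the control step the abstract describes: once the number of backlogged devices is estimated (here by a crude Poisson-approximation inversion of the idle-preamble count, a stand-in for the paper's Naïve Bayes estimator), the transmission probability is set so that the expected number of transmitting devices roughly matches the number of available preambles. Preamble count and inputs are illustrative.

```python
import math

def estimate_backlog(n_preambles, idle_preambles, p_tx):
    """Stand-in for the estimation step: invert the Poisson approximation
    E[idle] ~= K * exp(-N * p / K) to estimate the number of backlogged devices N."""
    idle_frac = max(idle_preambles, 1) / n_preambles
    return -(n_preambles / p_tx) * math.log(idle_frac)

def optimal_transmission_probability(n_preambles, backlog):
    """Expected transmitters ~= available preambles maximizes the success probability."""
    return 1.0 if backlog <= 0 else min(1.0, n_preambles / backlog)

K = 54  # preambles per random-access slot
n_hat = estimate_backlog(K, idle_preambles=30, p_tx=0.4)
print("estimated backlog:", round(n_hat, 1),
      "-> next p_tx:", round(optimal_transmission_probability(K, n_hat), 3))
```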

Keywords: random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation

Procedia PDF Downloads 120
5672 Econophysics: The Use of Entropy Measures in Finance

Authors: Muhammad Sheraz, Vasile Preda, Silvia Dedu

Abstract:

Concepts of econophysics are usually used to solve problems related to uncertainty and nonlinear dynamics. In the theory of option pricing, the risk-neutral probabilities play a very important role. The application of entropy in finance can be regarded as an extension of both information entropy and probability entropy. It can be an important tool in various financial methods such as the measurement of risk, portfolio selection, option pricing and asset pricing. Gulko applied Entropy Pricing Theory (EPT) to the pricing of stock options and introduced an alternative to the Black-Scholes framework for pricing European stock options. In this article, we present solutions to maximum entropy problems based on the Tsallis, weighted-Tsallis, Kaniadakis and weighted-Kaniadakis entropies to obtain risk-neutral densities. We have also obtained the values of European call and put options in this framework.
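
A short sketch of the Tsallis entropy referenced here, S_q = (1 - sum(p_i^q)) / (q - 1), which recovers the Shannon entropy as q approaches 1; the probability vector is an arbitrary example, not a risk-neutral density from the paper.

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum(p_i^q)) / (q - 1); tends to the Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))   # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = [0.1, 0.2, 0.3, 0.4]
for q in (0.5, 0.99, 1.0, 1.5, 2.0):
    print(f"q = {q}: S_q = {tsallis_entropy(p, q):.4f}")
```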

Keywords: option pricing, Black-Scholes model, Tsallis entropy, Kaniadakis entropy, weighted entropy, risk-neutral density

Procedia PDF Downloads 266
5671 Domain Adaptive Dense Retrieval with Query Generation

Authors: Rui Yin, Haojie Wang, Xun Li

Abstract:

Recently, mainstream dense retrieval methods have obtained state-of-the-art results on some datasets and tasks. However, they require large amounts of training data, which is not available in most domains. The severe performance degradation of dense retrievers on new data domains has limited the use of dense retrieval methods to only a few domains with large training datasets. In this paper, we propose an unsupervised domain-adaptive approach based on query generation. First, a generative model is used to generate relevant queries for each passage in the target corpus, and then, the generated queries are used for mining negative passages. Finally, the query-passage pairs are labeled with a cross-encoder and used to train a domain-adapted dense retriever. We also explore contrastive learning as a method for training domain-adapted dense retrievers and show that it leads to strong performance in various retrieval settings. Experiments show that our approach is more robust than previous methods in target domains that require less unlabeled data.

Keywords: dense retrieval, query generation, contrastive learning, unsupervised training

Procedia PDF Downloads 57
5670 Optimal Mitigation of Slopes by Probabilistic Methods

Authors: D. De-León-Escobedo, D. J. Delgado-Hernández, S. Pérez

Abstract:

A probabilistic formulation to assess slope safety under the hazard of strong storms is presented and illustrated through a slope in Mexico. The formulation is based on the classical safety factor (SF) used in practice to appraise slope stability, but the treatment of uncertainties is introduced, and the slope failure probability is calculated as the probability that SF < 1. As the main hazard is rainfall on the area, statistics of rainfall intensity and duration are considered and modeled with an exponential distribution. The expected life-cycle cost is assessed by assigning a monetary value to the consequences of slope failure. Alternative mitigation measures are simulated, and the formulation is used to identify the optimal one (minimum life-cycle cost). For the example, the optimal mitigation measure is a reduction of the slope inclination angle.
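
A toy Monte Carlo sketch of the kind of formulation described: rainfall intensity is exponentially distributed, the safety factor is a decreasing function of intensity and slope angle (the SF relation used here is a placeholder, not the paper's geotechnical model), the failure probability is P(SF < 1), and the expected life-cycle cost adds the initial cost of each mitigation option to the expected cost of failure. All costs and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

def failure_probability(mean_intensity, slope_angle_deg):
    """P(SF < 1) under exponentially distributed storm intensity (placeholder SF relation)."""
    intensity = rng.exponential(mean_intensity, N)          # mm/h
    sf = 1.6 - 0.004 * intensity - 0.008 * (slope_angle_deg - 30.0)
    return np.mean(sf < 1.0)

def expected_life_cycle_cost(initial_cost, failure_cost, pf, years=50):
    """Initial (mitigation) cost plus expected failure losses over the design life."""
    return initial_cost + failure_cost * (1.0 - (1.0 - pf) ** years)

for angle, cost in [(45.0, 0.0), (38.0, 2.0e5), (32.0, 5.0e5)]:   # mitigation options
    pf = failure_probability(mean_intensity=60.0, slope_angle_deg=angle)
    lcc = expected_life_cycle_cost(cost, 5.0e6, pf)
    print(f"angle {angle:>4}: Pf per storm year = {pf:.4f}, E[LCC] = {lcc:,.0f}")
```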

Keywords: expected life-cycle cost, failure probability, slopes failure, storms

Procedia PDF Downloads 125
5669 Evidence Theory Based Emergency Multi-Attribute Group Decision-Making: Application in Facility Location Problem

Authors: Bidzina Matsaberidze

Abstract:

It is known that, in emergency situations, multi-attribute group decision-making (MAGDM) models are characterized by insufficient objective data and a lack of time to respond to the task. Evidence theory is an effective tool for describing such incomplete information in decision-making models when the expert and his knowledge are involved in the estimation of the MAGDM parameters. We consider an emergency decision-making model in which expert assessments of the humanitarian aid distribution centers (HADCs) are represented as q-rung orthopair fuzzy numbers and the data structure is described within the body-of-evidence framework. Based on focal probability construction and the experts' evaluations, an objective function (a distribution centers' selection ranking index) is constructed. Our approach to solving the constructed bicriteria partitioning problem consists of two phases. In the first phase, based on the covering matrix, we generate a matrix whose columns allow us to find all possible partitionings of the HADCs among the service centers. Some constraints are also taken into consideration while generating the matrix. In the second phase, based on this matrix and using our exact algorithm, we find the partitionings (allocations of the HADCs to the centers) that correspond to the Pareto-optimal solutions. To illustrate the obtained results, a numerical example is given for the facility location-selection problem.

Keywords: emergency MAGDM, q-rung orthopair fuzzy sets, evidence theory, HADC, facility location problem, multi-objective combinatorial optimization problem, Pareto-optimal solutions

Procedia PDF Downloads 59
5668 Inverse Matrix in the Theory of Dynamical Systems

Authors: Renata Masarova, Bohuslava Juhasova, Martin Juhas, Zuzana Sutova

Abstract:

In dynamical system theory, a mathematical model is often used to describe a system's properties. In order to find the transfer matrix of a dynamic system, we need to calculate an inverse matrix. The paper combines the classical theory and the procedures used in the theory of automated control for calculating the inverse matrix. The final part of the paper models the given problem in MATLAB.
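
A short symbolic sketch of the inverse-matrix step in question: for a state-space model (A, B, C, D), the transfer matrix is G(s) = C (sI - A)^(-1) B + D, so computing it reduces to inverting (sI - A). The example system is arbitrary and the computation is shown in Python/SymPy rather than the MATLAB used in the paper.

```python
import sympy as sp

s = sp.symbols('s')

# Arbitrary two-state, single-input, single-output example system.
A = sp.Matrix([[0, 1], [-2, -3]])
B = sp.Matrix([[0], [1]])
C = sp.Matrix([[1, 0]])
D = sp.Matrix([[0]])

# Transfer matrix G(s) = C (sI - A)^(-1) B + D; the core step is the matrix inverse.
G = sp.simplify(C * (s * sp.eye(2) - A).inv() * B + D)
print(G)   # Matrix([[1/(s**2 + 3*s + 2)]])
```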

Keywords: dynamic system, transfer matrix, inverse matrix, modeling

Procedia PDF Downloads 479
5667 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately available in the IoT. Through internal communication and interaction, meaningful objects provide real-time services for users. Therefore, services with appropriate decision-making have become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical hidden Markov model for inference. The statistical analysis was conducted to achieve the following objectives. First, define user behaviors and predict user behavior routes with the environment model in order to analyze user purposes. Second, construct the hierarchical hidden Markov model according to the logic framework and establish the sequential intensity among behaviors to become acquainted with the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between the probability of objects being used and the objects themselves; this indicator can describe the possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.

Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object

Procedia PDF Downloads 199
5666 A Statistical Model for the Dynamics of Single Cathode Spot in Vacuum Cylindrical Cathode

Authors: Po-Wen Chen, Jin-Yu Wu, Md. Manirul Ali, Yang Peng, Chen-Te Chang, Der-Jun Jan

Abstract:

The dynamics of cathode spots has become a major topic in vacuum arc discharge research, with high academic interest and wide application potential. In this article, using a three-dimensional statistical model, we simulate the distribution of the ignition probability of a new cathode spot occurring under different magnetic pressures on the old cathode spot surface and at different arcing times. This model for the ignition probability of a new cathode spot was proposed for two typical situations: one with pure isotropic random walk in the absence of an external magnetic field, the other with retrograde motion in an external magnetic field parallel to the cathode surface. We mainly focus on the relationship developed between the ignition probability density distribution of a new cathode spot and the external magnetic field.

Keywords: cathode spot, vacuum arc discharge, transverse magnetic field, random walk

Procedia PDF Downloads 403
5665 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit for failure data and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter, i.e., a time before which failure cannot occur, which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation of the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small-sample cases. The maximum likelihood estimation method is also applied in this study.

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 241
5664 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited for realistic situations (the contact tracing probability is small, or the probability of detection of index cases is small). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 45
5663 Chinese Fantasy Novel: New Word Teaching for Non-Native Learners

Authors: Bok Check Meng, Goh Ying Soon

Abstract:

Giving additional learning materials such as Chinese fantasy novels to non-native learners can be strenuous. Instructors have to understand the cognitive theories underpinning new word instruction. This paper discusses these underpinning theories, and relevant literature reviews are given. Five major areas of cognition-related theory are covered in this article: motivational learning theory, the affective theory of learning, cognitive psychology theory, vocabulary acquisition theory and Bloom's cognitive levels theory. A theoretical framework has been constructed, which will help ensure that non-native learners gain positive outcomes in the instruction process. Instructors who are interested in teaching new words from Chinese fantasy novels specifically to support additional learning may gain insights from this article.

Keywords: Chinese fantasy novel, new word teaching, non-native learners, cognitive theory, bloom

Procedia PDF Downloads 698
5662 Implementation of Statistical Parameters to Form an Entropic Mathematical Models

Authors: Gurcharan Singh Buttar

Abstract:

It has been discovered that although statistics and information theory are independent in nature, they can be combined to create applications in multidisciplinary mathematics. This is because, in the field of statistics, statistical parameters (measures) play an essential role with reference to the population (distribution) under investigation, while an information measure is crucial in the study of the ambiguity, assortment and unpredictability present in an array of phenomena. The following communication is a link between the two, and it demonstrates that the well-known conventional statistical measures can be used as measures of information.

Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency

Procedia PDF Downloads 129
5661 Phase Control in Population Inversion Using Chirped Laser

Authors: Avijit Datta

Abstract:

We present a phase control scheme for population transfer using chirped laser fields. A chirped pulse can drive population transfer from one level to another via adiabatic rapid passage accessible by a one-photon dipole transition. We propose to use a pair of phase-locked chirped pulses of the same frequency w(t) instead of a single chirped pulse of frequency w(t). The simultaneous action of phase-controlled interference, in addition to the rapid adiabatic passage due to the chirped pulses, leads to phase control over the population transfer dynamics. We demonstrate the proposed phase control scheme for the population distribution from the initial level X(v=0,j=0) to the C(v=2,j=1) level of the hydrogen molecule using a pair of phase-locked and similarly chirped laser pulses. We extend this two-level system to the three-level 1+1 ladder system of the hydrogen molecule from the X level to the final J(v=2,j=2) level via the intermediate C level using two pairs of laser pulses with frequencies w(t) and w'(t), respectively, and obtain considerable control over the population distribution among the three levels. We also present some results on the interference effects of w₁(t) and its third harmonic w₃(t).

Keywords: phase control, population transfer, chirped laser pulses, rapid adiabatic passage, laser-molecule interaction

Procedia PDF Downloads 327
5660 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs

Authors: Lokesh Varshney, R. K. Saket

Abstract:

This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. Parallel capacitors with the calculated minimum capacitance were connected across the terminals of the induction motor operating as a SEIG with unregulated shaft speed during the experimental study. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine operating as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.

Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation

Procedia PDF Downloads 532
5659 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. The long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data for a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal and log-Pearson distributions. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, the simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of the individual events. The results of the trend analysis showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit to the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
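
A brief sketch of the copula step described in the abstract: with marginal non-exceedance probabilities u and v for rainfall and water level, a Gumbel copula C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)) gives the joint probability, from which a joint "AND" return period of both variables exceeding their design values follows. The copula family, marginals and theta below are illustrative, not fitted values from the study.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))."""
    return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance_return_period(u, v, theta):
    """Return period (years) of both variables exceeding their design values,
    given marginal annual non-exceedance probabilities u and v."""
    p_exceed_both = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / p_exceed_both

# Illustrative 100-year rainfall and 100-year water level with dependence theta = 2.
u = v = 1.0 - 1.0 / 100.0
print("joint 'AND' return period:",
      round(joint_exceedance_return_period(u, v, theta=2.0), 1), "years")
```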

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 322
5658 Contextualizing Theory Z of Motivation Among Indian Universities of Higher Education

Authors: Janani V., Tanika Singh, Bala Subramanian R., Santosh Kumar Sharma

Abstract:

Higher education across the globe is undergoing a sea change. This has created varied management practices in Indian universities, and therefore we find no universal law regarding HR policies and practices in these universities. As a result, faculty retention is very low, which is a serious concern for educational leaders such as vice-chancellors or directors working in the higher education sector. We can understand this phenomenon in the light of various management theories, among which Theory Z, proposed by William Ouchi, is a prominent one. With this backdrop, the present article strives to contextualize Theory Z in Indian higher education. For this purpose, a qualitative methodology has been adopted, and accordingly, propositions have been generated. We believe that this article will motivate other researchers to empirically test the generated propositions and thereby contribute to the existing literature.

Keywords: education, management, motivation, Theory X, Theory Y, Theory Z, faculty members, universities, India

Procedia PDF Downloads 49
5657 Logic of the Prospect Theory: The Decision Making Process of the First Gulf War and the Crimean Annexation

Authors: Zhengyang Ma, Zhiyao Li, Jiayi Zhang

Abstract:

This article examines prospect theory's arguments about decision-making through two case studies: the First Gulf War and Russia's annexation of Crimea. The article uses the methods of comparative case analysis and process tracing to investigate prospect theory's fundamental arguments. Through evidence derived from existing primary and secondary sources, this paper argues that both former U.S. President Bush and Russian President Putin viewed their situations as a domain of loss and made risky decisions to prevent further deterioration, which attests to the arguments of prospect theory. After the two case studies, this article also discusses how prospect theory could be used to analyze the decision-making process that led to the current Russia-Ukraine War.

Keywords: prospect theory, international relations, the First Gulf War, the Crimea crisis

Procedia PDF Downloads 82