Search results for: classical theory of probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6554

5984 Dark Gravity Confronted with Supernovae, Baryonic Oscillations and Cosmic Microwave Background Data

Authors: Frederic Henry-Couannier

Abstract:

Dark Gravity is a natural extension of general relativity in the presence of a flat, non-dynamical background. Matter and radiation fields from its dark sector, as soon as their gravity dominates over that of the fields on our side, produce a constant-acceleration law for the scale factor. After a brief reminder of the foundations of the Dark Gravity theory, the confrontation with the main cosmological probes is carried out. We show that, remarkably, the sudden transition predicted by the theory between the usual matter-dominated decelerated expansion law a(t) ∝ t^(2/3) and the accelerated expansion law a(t) ∝ t² should be able to fit not only the main cosmological probes (SN, BAO, CMB and the ages of the oldest stars) but also direct H₀ measurements with only two free parameters: H₀ and the transition redshift.
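
The fit described above can be reproduced schematically. Below is a minimal sketch assuming a piecewise expansion history in which H(z) scales as (1+z)^(3/2) above a transition redshift z_t (matter-like, a ∝ t^(2/3)) and as (1+z)^(1/2) below it (constant acceleration, a ∝ t²); the functional form, the parameter values and the flat geometry are illustrative assumptions, not the paper's derivation.

```python
# Minimal sketch (not the author's code): a two-phase expansion history with
# a(t) ∝ t^(2/3) above a transition redshift z_t and a(t) ∝ t^2 below it,
# parameterised only by H0 and z_t, used to compute distance moduli for SN fits.
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s

def hubble(z, H0, z_t):
    """Piecewise H(z): decelerated (matter-like) above z_t, constant acceleration below."""
    z = np.asarray(z, dtype=float)
    # a ∝ t^(2/3)  =>  H ∝ (1+z)^(3/2);  a ∝ t^2  =>  H ∝ (1+z)^(1/2)
    H_acc = H0 * (1.0 + z) ** 0.5
    H_dec = H0 * (1.0 + z_t) ** 0.5 * ((1.0 + z) / (1.0 + z_t)) ** 1.5
    return np.where(z <= z_t, H_acc, H_dec)

def luminosity_distance(z, H0, z_t, n=2000):
    """Comoving integral of c/H(z), then the (1+z) factor; flat geometry assumed."""
    zs = np.linspace(0.0, z, n)
    integrand = C_KM_S / hubble(zs, H0, z_t)
    return (1.0 + z) * np.trapz(integrand, zs)  # Mpc

def distance_modulus(z, H0, z_t):
    return 5.0 * np.log10(luminosity_distance(z, H0, z_t)) + 25.0

if __name__ == "__main__":
    for z in (0.1, 0.5, 1.0):
        print(z, round(distance_modulus(z, H0=70.0, z_t=0.6), 3))
```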

Keywords: anti-gravity, negative energies, time reversal, field discontinuities, dark energy theory

Procedia PDF Downloads 49
5983 The Optimal Order Policy for the Newsvendor Model under Worker Learning

Authors: Sunantha Teyarachakul

Abstract:

We consider the worker-learning Newsvendor Model under the case of lost sales for unmet demand, with the research objective of proposing the cost-minimizing order policy and lot size, scheduled to arrive at the beginning of the selling period. In general, the Newsvendor Model is used to find the optimal order quantity for perishable items such as fashionable products or those with seasonal demand or short life cycles. Technically, it is used when product demand is stochastic and the product is available for a single selling season, and when there is only a one-time opportunity for the vendor to purchase, possibly with long ordering lead times. Our work differs from the classical Newsvendor Model in that we incorporate the human factor (specifically worker learning) and its influence on the costs of processing units into the model. We describe this by using the well-known Wright's learning curve. Most of the assumptions of the classical Newsvendor Model are maintained in our work, such as the constant per-unit costs of leftover and shortage, the zero initial inventory, and continuous time. Our problem is challenging in that the best order quantity in the classical model, which balances the over-stocking and under-stocking costs, is no longer optimal. Specifically, when the cost saving from worker learning is added to the expected total cost, the convexity of the cost function will likely not be maintained. This calls for a new way of determining the optimal order policy. In response to these challenges, we found a number of characteristics of the expected cost function and its derivatives, which we then used in formulating the optimal ordering policy. Examples of such characteristics are: the optimal order quantity exists and is unique if the demand follows a uniform distribution; if the demand follows a beta distribution with some specific properties of its parameters, the second derivative of the expected cost function has at most two roots; and there exists a specific lot size that satisfies the first-order condition. Our research results could be helpful for the analysis of supply chain coordination and of the periodic review system for similar problems.
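
As a rough illustration of the kind of cost function involved, the following sketch adds a Wright's-learning-curve processing cost c(n) = c1·n^(−b) to the usual overage and underage costs and finds the minimizing order quantity by direct search, since convexity cannot be relied on; the demand distribution, cost figures and 80% learning rate are assumptions made for the example, not the paper's data or model.

```python
# Minimal sketch (assumptions, not the paper's model): expected total cost for a
# lost-sales newsvendor whose per-unit processing cost follows Wright's learning
# curve c(n) = c1 * n**(-b), added to the usual overage/underage costs, with the
# optimal order quantity found by direct numerical search because the cost need
# not stay convex once learning is included.
import numpy as np
from scipy import stats

def processing_cost(Q, c1=5.0, b=0.322):          # b = -log2(0.8) for an 80% learning curve
    units = np.arange(1, int(Q) + 1)
    return np.sum(c1 * units ** (-b))

def expected_cost(Q, demand, overage=2.0, underage=8.0, n_mc=20000, seed=0):
    rng = np.random.default_rng(seed)
    d = demand.rvs(size=n_mc, random_state=rng)
    over = overage * np.maximum(Q - d, 0.0)        # leftover units
    under = underage * np.maximum(d - Q, 0.0)      # lost sales
    return processing_cost(Q) + np.mean(over + under)

if __name__ == "__main__":
    demand = stats.uniform(loc=50, scale=100)      # Uniform(50, 150), one of the cases studied
    Qs = np.arange(40, 200)
    costs = [expected_cost(Q, demand) for Q in Qs]
    print("Q* =", Qs[int(np.argmin(costs))])
```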

Keywords: inventory management, Newsvendor model, order policy, worker learning

Procedia PDF Downloads 412
5982 Modeling of Glycine Transporters in Mammals Using the Probability Approach

Authors: K. S. Zaytsev, Y. R. Nartsissov

Abstract:

Glycine is one of the key inhibitory neurotransmitters in the central nervous system (CNS), and glycinergic transmission is highly dependent on its appropriate reuptake from the synaptic cleft. Glycine transporters (GlyT) of types 1 and 2 are the enzymes providing glycine transport back into neuronal and glial cells along with Na⁺ and Cl⁻ co-transport. The distribution and stoichiometry of GlyT1 and GlyT2 differ in detail, and GlyT2 is of greater interest for this research, as it takes glycine back up into neurons, whereas GlyT1 is located in glial cells. In the process of GlyT2 activity, the translocation of the amino acid is accompanied by the sequential binding of one chloride ion and three sodium ions (two sodium ions for GlyT1). In the present study, we developed a computer simulator of GlyT2 and GlyT1 activity, based on known experimental data, for the quantitative estimation of membrane glycine transport. The functioning of a single protein was described using a probability approach in which each enzyme state was considered separately. The resulting scheme of transporter functioning, realized as a sequence of elementary steps, made it possible to take into account each event of substrate association and dissociation. Computer experiments using up-to-date kinetic parameters yielded the number of translocated glycine molecules and Na⁺ and Cl⁻ ions per time period. The flexibility of the developed software makes it possible to evaluate the glycine reuptake pattern over time under different internal characteristics of the enzyme's conformational transitions. We investigated the behavior of the system over a wide range of the equilibrium constant (from 0.2 to 100), which has not been determined experimentally. A significant influence of the equilibrium constant on the glycine transfer process is shown in the range from 0.2 to 10. Environmental conditions such as ion and glycine concentrations become decisive when the value of the constant lies outside the specified range.
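
The per-state, per-step treatment described above can be illustrated with a toy continuous-time Markov simulation of a single transporter; the four states, the rate constants and the cycle counting below are hypothetical placeholders, not the authors' kinetic scheme.

```python
# Minimal sketch (hypothetical states and rate constants, not the authors' kinetic
# scheme): a Markov-chain simulation of a single transporter cycling through
# discrete states, where each elementary association/dissociation/translocation
# step occurs with a probability derived from its rate, and completed cycles are
# counted as translocated glycine molecules.
import numpy as np

# States: 0 empty (outward), 1 +Na/Cl bound, 2 +glycine bound, 3 inward-facing (release)
RATES = {  # per-millisecond rate constants (illustrative values only)
    (0, 1): 2.0, (1, 0): 0.5,
    (1, 2): 1.5, (2, 1): 0.3,
    (2, 3): 1.0, (3, 0): 4.0,   # release and reset completes one transport cycle
}

def simulate(t_max_ms=1000.0, seed=1):
    rng = np.random.default_rng(seed)
    state, t, transported = 0, 0.0, 0
    while t < t_max_ms:
        moves = [(nxt, k) for (cur, nxt), k in RATES.items() if cur == state]
        total = sum(k for _, k in moves)
        t += rng.exponential(1.0 / total)                   # Gillespie waiting time
        r, acc = rng.uniform(0.0, total), 0.0
        for nxt, k in moves:
            acc += k
            if r <= acc:
                if state == 3 and nxt == 0:                 # inward release -> cycle done
                    transported += 1
                state = nxt
                break
    return transported

if __name__ == "__main__":
    print("glycine molecules translocated in 1 s:", simulate())
```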

Keywords: glycine, inhibitory neurotransmitters, probability approach, single protein functioning

Procedia PDF Downloads 112
5981 Probability Sampling in Matched Case-Control Study in Drug Abuse

Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell

Abstract:

Background: Although random sampling is generally considered to be the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling, despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users who then identified “friend controls,” and the other using a random sample of non-drug users (controls) who then identified “friend cases.” Models to predict drug abuse based on risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using a bootstrapping method and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed a wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve for the model derived from the random-sample data set was similar when the model was fitted to either data set (0.93 for random-sample data vs. 0.91 for snowball-sample data, p = 0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be superior from a statistical perspective to snowball sampling and may represent a viable alternative to snowball sampling.
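
The bootstrap-precision and ROC comparisons can be mimicked on synthetic data. The sketch below uses ordinary (not conditional) logistic regression and invented risk factors purely to show the mechanics of bootstrapping coefficient standard errors and comparing AUCs across two differently sampled data sets; none of the numbers correspond to the study's data.

```python
# Minimal sketch (synthetic data, ordinary logistic regression standing in for the
# paper's conditional logistic regression): bootstrap the coefficient standard
# errors of a risk-factor model and compare ROC AUCs of models fitted on two
# differently sampled data sets.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

def make_data(n, noise):
    X = rng.normal(size=(n, 3))                       # three hypothetical risk factors
    logit = 1.2 * X[:, 0] - 0.8 * X[:, 1] + noise * rng.normal(size=n)
    y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)
    return X, y

X_rand, y_rand = make_data(400, noise=0.5)            # "random-sample" style data
X_snow, y_snow = make_data(400, noise=2.0)            # noisier "snowball-sample" style data

def bootstrap_se(X, y, n_boot=100):
    """Bootstrap variability of the fitted coefficients."""
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        coefs.append(LogisticRegression(max_iter=1000).fit(X[idx], y[idx]).coef_[0])
    return np.std(coefs, axis=0)

print("bootstrap SEs (random-sample data): ", bootstrap_se(X_rand, y_rand))
print("bootstrap SEs (snowball-sample data):", bootstrap_se(X_snow, y_snow))

# Fit on one data set, evaluate discrimination on both
model = LogisticRegression(max_iter=1000).fit(X_rand, y_rand)
print("AUC on random-sample data:  ", roc_auc_score(y_rand, model.predict_proba(X_rand)[:, 1]))
print("AUC on snowball-sample data:", roc_auc_score(y_snow, model.predict_proba(X_snow)[:, 1]))
```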

Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling

Procedia PDF Downloads 488
5980 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance. In this way, statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Therefore, based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
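
A crude numerical analogue of this comparison is sketched below: a Monte Carlo estimate of the failure probability of an infinite, cohesionless slope with FS = tan(φ)/tan(β), contrasting a symmetric (normal) model of the friction angle with a skewed one (a lognormal stands in here for the Dagum fit reported in the paper); the slope angle and distribution parameters are assumed, and the paper's analytical PDF derivation is not reproduced.

```python
# Minimal sketch (hypothetical slope geometry and distribution parameters): Monte
# Carlo estimate of the probability of failure of an infinite, cohesionless slope
# with factor of safety FS = tan(phi) / tan(beta), treating the friction angle phi
# as a random variable and comparing a symmetric (normal) model with a skewed
# (lognormal, standing in for the Dagum fit reported in the paper) model.
import numpy as np

rng = np.random.default_rng(7)
N = 200_000
beta = np.radians(25.0)                    # slope inclination (assumed)

# Two candidate models for the friction angle, both around a 30-degree mean
phi_normal = rng.normal(loc=30.0, scale=3.0, size=N)
phi_skewed = np.degrees(np.arctan(rng.lognormal(mean=np.log(np.tan(np.radians(30.0))),
                                                sigma=0.10, size=N)))

def failure_probability(phi_deg):
    fs = np.tan(np.radians(phi_deg)) / np.tan(beta)
    return np.mean(fs < 1.0)

print("P(failure), normal friction angle:", failure_probability(phi_normal))
print("P(failure), skewed friction angle:", failure_probability(phi_skewed))
```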

Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables

Procedia PDF Downloads 330
5979 The Philosophy of Language Theory in the Standard Malay Primary School Curriculum in Malaysia

Authors: Mohd Rashid Bin Hj. Md Idris, Lajiman Bin Janoory, Abdullah Bin Yusof, Mahzir Bin Ibrahim

Abstract:

The Malay language curriculum at the primary school level in Malaysia is instrumental in ensuring the status of the language as the official and national language, the language of instruction, as well as the language that unites the various ethnic groups in Malaysia. Research addressing issues related to the curriculum standard is, therefore, essential to provide value-added quality to the existing National Education Philosophy in ongoing efforts to produce individuals who are balanced in their intellectual, spiritual, emotional and physical development. The objectives of this study are to examine the Philosophy of Language Theory, to review the content of the Malay language subject in relation to the Standard Curriculum for Primary Schools (KSSR), and to identify aspects of the Philosophy of Language Theory in the Standard Curriculum for Primary Schools. The Malay language primary school curriculum is designed to enable students to be competent speakers and communicators of the language in order to gain knowledge, skills, information, values, and ideas and to enhance skills in social relations. Therefore, this study is designed to help educators achieve all the stated goals. At the same time, students at the primary school level are expected to be able to apply the principle of language perfection as stated in the Philosophy of Language Theory, enabling them to understand, appreciate and take pride in being Malaysians who speak the language well.

Keywords: language, philosophy, theory, curriculum, standard, national education philosophy

Procedia PDF Downloads 590
5978 CE Method for Development of Japan's Stochastic Earthquake Catalogue

Authors: Babak Kamrani, Nozar Kishi

Abstract:

A stochastic catalogue represents the event module of earthquake loss estimation models. It includes a series of events with different magnitudes and corresponding frequencies/probabilities. For the development of a stochastic catalogue, random or uniform sampling methods are used to sample events from the seismicity model. To cover the whole Magnitude-Frequency Distribution (MFD), a huge number of events must be generated with these methods. The Characteristic Event (CE) method instead chooses events based on the interests of the insurance industry. We divide the MFD of each source into bins, chosen according to the probabilities of interest to the insurance industry. First, we collected the information for the available seismic sources. Sources are divided into fault sources, subduction sources, and events without a specific fault source. We developed the MFD for each individual and areal source based on the seismicity of the sources. Afterward, we calculated the CE magnitudes based on the desired probabilities. To develop the stochastic catalogue, we also introduced uncertainty into the locations of the events.
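
To illustrate the binning step in code, the sketch below divides a Gutenberg-Richter magnitude-frequency distribution into half-magnitude bins and assigns each bin a single characteristic event carrying the bin's total annual rate; the a and b values and the bin width are assumed for the example and are unrelated to the Japanese sources used in the paper.

```python
# Minimal sketch (illustrative Gutenberg-Richter parameters): divide a source's
# magnitude-frequency distribution into bins and represent each bin by a single
# characteristic event whose magnitude and annual rate preserve the bin's total
# rate, instead of randomly sampling a huge number of events.
import numpy as np

a, b = 4.0, 1.0                 # assumed Gutenberg-Richter parameters: log10 N(>=M) = a - b*M
m_min, m_max = 5.0, 8.0

def cumulative_rate(m):
    return 10.0 ** (a - b * m)

bin_edges = np.arange(m_min, m_max + 0.5, 0.5)
for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
    rate = cumulative_rate(lo) - cumulative_rate(hi)       # annual rate of events in the bin
    m_char = (lo + hi) / 2.0                               # characteristic magnitude for the bin
    print(f"CE magnitude {m_char:.2f}: annual rate {rate:.4f}, return period {1.0 / rate:.1f} yr")
```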

Keywords: stochastic catalogue, earthquake loss, uncertainty, characteristic event

Procedia PDF Downloads 288
5977 Error Probability of Multi-User Detection Techniques

Authors: Komal Babbar

Abstract:

Multiuser detection is the intelligent estimation/demodulation of transmitted bits in the presence of multiple access interference. The authors present the bit error rate (BER) achieved by linear multi-user detectors: the matched filter (which treats the MAI as AWGN), the decorrelating detector, and the MMSE detector. In this work, the authors investigate the bit error probability analysis for the matched filter, decorrelating, and MMSE detectors. This problem arises in several practical CDMA applications where the receiver may not have full knowledge of the number of active users and their signature sequences. In particular, the behavior of the MAI at the output of the multi-user detectors (MUD) is examined under various asymptotic conditions, including large signal-to-noise ratio, large near-far ratios, and a large number of users. In the last section, the authors also show MATLAB simulation results for the multiuser detection techniques, i.e., matched filter, decorrelating, and MMSE, for 2 users and 10 users.
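
A compact way to reproduce this kind of comparison (sketched here in Python rather than the authors' MATLAB) is shown below for a synchronous 2-user CDMA link with random spreading codes: the matched-filter bank output y is either sliced directly, decorrelated by R⁻¹y, or passed through the MMSE filter (R + σ²A⁻²)⁻¹y; the spreading gain, user amplitudes and SNR points are assumptions made for the example.

```python
# Minimal sketch (synthetic 2-user synchronous CDMA with random spreading codes,
# not the authors' MATLAB setup): simulated BER of the matched filter, the
# decorrelating detector R^{-1} y, and the MMSE detector (R + sigma^2 A^{-2})^{-1} y.
import numpy as np

rng = np.random.default_rng(0)
K, N, n_bits = 2, 16, 20000                              # users, spreading gain, bits per user
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)    # signature sequences
A = np.diag([1.0, 2.0])                                  # user amplitudes (near-far situation)
R = S.T @ S                                              # cross-correlation matrix

def ber(snr_db):
    sigma = np.sqrt(10 ** (-snr_db / 10.0))              # nominal per-chip noise level
    b = rng.choice([-1.0, 1.0], size=(K, n_bits))
    r = S @ (A @ b) + sigma * rng.normal(size=(N, n_bits))
    y = S.T @ r                                          # matched-filter bank output
    dec = {"matched filter": y,
           "decorrelating": np.linalg.solve(R, y),
           "MMSE": np.linalg.solve(R + sigma**2 * np.linalg.inv(A @ A), y)}
    return {k: np.mean(np.sign(v) != b) for k, v in dec.items()}

for snr in (4, 8, 12):
    print(snr, "dB:", ber(snr))
```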

Keywords: code division multiple access, decorrelating, matched filter, minimum mean square error (MMSE) detection, multiple access interference (MAI), multiuser detection (MUD)

Procedia PDF Downloads 521
5976 Hydraulic Characteristics of Mine Tailings by Metaheuristics Approach

Authors: Akhila Vasudev, Himanshu Kaushik, Tadikonda Venkata Bharat

Abstract:

A large volume of mine tailings is produced every year as part of the extraction process for phosphates, gold, copper, and other materials. Mine tailings are high in water content and have very slow dewatering behavior. The efficient design of tailings dams and the economical disposal of these slurries require knowledge of the tailings' consolidation behavior. Large-strain consolidation theory closely predicts the self-weight consolidation of these slurries, as the theory accounts for the conservation of mass and momentum and treats the hydraulic conductivity as a function of the void ratio. Classical laboratory techniques, such as the settling column test and the seepage consolidation test, are expensive and time-consuming for estimating the variation of hydraulic conductivity with void ratio. Inverse estimation of the constitutive relationships from the measured settlement-versus-time curves is explored. In this work, inverse analysis based on metaheuristic techniques is explored for predicting the hydraulic conductivity parameters of mine tailings from the base excess pore water pressure dissipation curve and the initial conditions of the mine tailings. The proposed inverse model uses the particle swarm optimization (PSO) algorithm, which is based on the social behavior of animals searching for food sources. The finite-difference numerical solution of the forward analytical model is integrated with the PSO algorithm to solve the inverse problem. The method is tested on synthetic base excess pore pressure dissipation curves generated using the finite difference method. The effectiveness of the method is verified using a base excess pore pressure dissipation curve obtained from a settling column experiment and further ensured through comparison with available predicted hydraulic conductivity parameters.
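
The inverse-analysis loop can be illustrated with a stripped-down example: a plain PSO fitted to a synthetic exponential dissipation curve that stands in for the large-strain consolidation forward model; the two fitted parameters, the swarm settings and the noise level are all assumptions made for the sketch, not the paper's forward model or data.

```python
# Minimal sketch (a generic PSO fitted to a synthetic exponential dissipation
# curve, not the paper's large-strain consolidation forward model): each particle
# carries a candidate parameter pair, and the swarm minimises the misfit between
# the predicted and "measured" base excess pore pressure dissipation curves.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 100.0, 200)                          # time (days)

def forward(params):                                      # stand-in dissipation model
    u0, c = params
    return u0 * np.exp(-c * t)

u_measured = forward([50.0, 0.05]) + rng.normal(0, 0.3, t.size)   # synthetic "data"

def misfit(params):
    return np.sum((forward(params) - u_measured) ** 2)

# Plain PSO: velocity update with inertia, cognitive and social terms
n_particles, n_iter = 30, 200
lb, ub = np.array([1.0, 0.001]), np.array([100.0, 0.5])
x = rng.uniform(lb, ub, size=(n_particles, 2))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([misfit(p) for p in x])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_particles, 1))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lb, ub)
    vals = np.array([misfit(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("recovered (u0, c):", np.round(gbest, 4))           # should be close to (50, 0.05)
```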

Keywords: base excess pore pressure, hydraulic conductivity, large strain consolidation, mine tailings

Procedia PDF Downloads 129
5975 Numerical Analysis of the Coanda Effect on the Classical Interior Ejectors

Authors: Alexandru Dumitrache, Florin Frunzulica, Octavian Preotu

Abstract:

The problem of mitigating flow detachment near solid surfaces by exploiting the Coanda effect, which results in improved overall aerodynamic performance, has been addressed extensively in the literature since 1940. The research continues to be developed using modern computational means and new experimental methods. This paper examines in detail the behavior of a classical interior ejector assisted by the Coanda effect, as used in propulsion systems. For the numerical investigations, an implicit formulation of the RANS equations for axisymmetric flow with a shear stress transport k-ω (SST) turbulence model is used. The numerical results obtained emphasize the efficiency of the ejector, depending on the physical parameters of the flow and the geometric configuration. Furthermore, numerical investigations are carried out regarding the evolution of the Reynolds number when the jet is attached to the wall, considering three geometric configurations: sudden expansion, open cavity, and sudden expansion with a divergent section at the inlet. Therefore, further insight is provided into complexities such as the variety of flow structures and the related bifurcations and flow instabilities. Thus, the conditions and the limits within which one can benefit from the advantages of Coanda-type flows are determined.

Keywords: Coanda effect, Coanda ejector, CFD, stationary bifurcation, sudden expansion

Procedia PDF Downloads 209
5974 Mathematics Anxiety among Male and Female Students

Authors: Wern Lin Yeo, Choo Kim Tan, Sook Ling Lew

Abstract:

Mathematics anxiety refers to the feeling of anxiety that arises when one has difficulties in solving mathematical problems. It is the most common type of anxiety occurring among students. However, the levels of anxiety among males and females differ. A few past studies have been conducted to determine the relationship between anxiety and gender, but they have not produced conclusive results. Hence, the purpose of this study is to determine the relationship between anxiety level and gender among undergraduates at a private university in Malaysia. A convenience sampling method was used, in which students were selected based on the grouping assigned by the faculty. A total of 214 undergraduates who had registered for probability courses participated in this study. The Mathematics Anxiety Rating Scale (MARS) was the instrument used to determine students' anxiety level towards probability. Reliability and validity testing of the instrument was carried out before the main study was conducted. In the main study, students were briefed about the study; participation was voluntary. Students were given a consent form to indicate whether they agreed to participate. Students were given two weeks to complete the online questionnaire. The data collected were analyzed using the Statistical Package for the Social Sciences (SPSS) to determine the level of anxiety. There were three anxiety levels: low, average, and high. Students' anxiety levels were determined by comparing their scores with the mean and standard deviation: scores more than one standard deviation below the mean were classified as low, scores within one standard deviation of the mean as average, and scores more than one standard deviation above the mean as high. Results showed that both genders had an average anxiety level. Males showed higher frequencies across all three anxiety levels (low, average, and high) compared to females. Accordingly, the mean value obtained for males (M = 3.62) was higher than that for females (M = 3.42). For the difference in anxiety level between genders to be significant, the p-value should be less than .05. The p-value obtained in this study was .117, which is greater than .05. Thus, there was no significant difference in anxiety level between genders; in other words, anxiety level was not related to gender.
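
The three-level classification described above can be expressed in a few lines of code. The sketch below uses simulated scores and reads the rule as "more than one standard deviation below the mean = low, within one standard deviation = average, more than one above = high"; the score distribution is invented, and this reading of the original classification rule is an interpretation.

```python
# Minimal sketch (simulated MARS scores, with the three-level rule read as low /
# average / high bands around the mean): classify each respondent's anxiety level.
import numpy as np

rng = np.random.default_rng(11)
scores = rng.normal(loc=3.5, scale=0.6, size=214)     # hypothetical mean item scores

mean, sd = scores.mean(), scores.std(ddof=1)
levels = np.where(scores < mean - sd, "low",
          np.where(scores > mean + sd, "high", "average"))

for level in ("low", "average", "high"):
    print(level, int(np.sum(levels == level)))
```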

Keywords: anxiety level, gender, mathematics anxiety, probability and statistics

Procedia PDF Downloads 288
5973 Two Brazilian Medeas: The Cases of Mata Teu Pai and Medeia Negra

Authors: Jaqueline Bohn Donada

Abstract:

The significance of Euripides' Medea for contemporary literature is noticeable. Even if the bulk of Classical Reception studies does not tend to look carefully and consistently at the literature produced outside the Anglophone world, Brazilian literature offers abundant material for such studies. Indeed, a certain Classical background can be observed in Brazilian literature at least since 1975, when Gota d'Água [The Final Straw, in English] appeared, a play that recreates the story of Medea and sets it in a favela in Rio de Janeiro. Also worthy of notice is Ivo Bender's Trilogia Perversa [Perverse Trilogy, in English], a series of three historical plays set in Southern Brazil, based on Aeschylus' Oresteia and on Euripides' Iphigenia in Aulis and published in the 1980s. Since then, a number of works directly inspired by the plays of Aeschylus, Sophocles and Euripides have been published, not to mention several adaptations of Homer's two epic poems. This paper proposes a comparative analysis of two such works, Grace Passô's 2017 play Mata teu Pai [Kill Your Father, in English] and Marcia Limma's 2019 play Medeia Negra [Black Medea, in English], from the perspective of Classical Reception Studies in intersection with feminist literary criticism. The paper looks at the endurance of Euripides' character in contemporary Brazilian literature, with a focus on how the character seems to have acquired special relevance to the treatment of pressing issues of the twenty-first century. Whereas Grace Passô's play sets Medea at the center of a group of immigrant women, Marcia Limma has the character enact the dilemmas of incarcerated women in Brazil. The hypothesis that this research aims to test is that both artists preserve the pathos of Euripides' original character while recreating his Medea in the concrete circumstances of contemporary Brazilian social reality. Ultimately, the research aims to establish the significance of the Medea theme for contemporary Brazilian literature.

Keywords: Euripides, Medea, Grace Passô, Marcia Limma, Brazilian literature

Procedia PDF Downloads 126
5972 A Recognition Method of Ancient Yi Script Based on Deep Learning

Authors: Shanxiong Chen, Xu Han, Xiaolong Wang, Hui Ma

Abstract:

The Yi are an ethnic group living mainly in mainland China, with their own spoken and written language systems developed over thousands of years. Ancient Yi is one of the six ancient languages in the world; it keeps a record of the history of the Yi people and offers documents valuable for research into human civilization. Recognition of the characters in ancient Yi helps to transform the documents into electronic form, making their storage and dissemination convenient. Due to historical and regional limitations, research on the recognition of ancient characters is still inadequate. Thus, deep learning technology was applied to the recognition of such characters. Five models were developed on the basis of a four-layer convolutional neural network (CNN). Alpha-Beta divergence was taken as a penalty term to re-encode the output neurons of the five models. Two fully connected layers performed the compression of the features. Finally, at the softmax layer, the orthographic features of ancient Yi characters were re-evaluated, their probability distributions were obtained, and the characters with the highest-probability features were recognized. Tests show that the method achieves higher precision than the traditional CNN model for handwriting recognition of ancient Yi.
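
A skeleton of the kind of network described (four convolutional layers followed by two fully connected layers and a softmax output) is sketched below in PyTorch; the input size, channel counts and number of character classes are placeholders, and the Alpha-Beta divergence penalty term used in the paper is not reproduced.

```python
# Minimal sketch (a generic four-convolutional-layer classifier in PyTorch, with
# hypothetical input size and class count; the Alpha-Beta divergence penalty of
# the paper is not reproduced here): conv layers for feature extraction, two
# fully connected layers for feature compression, and a softmax output.
import torch
import torch.nn as nn

class YiCharCNN(nn.Module):
    def __init__(self, n_classes=1000):              # number of Yi characters (assumed)
        super().__init__()
        def block(c_in, c_out):
            return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1),
                                 nn.ReLU(), nn.MaxPool2d(2))
        self.features = nn.Sequential(block(1, 32), block(32, 64),
                                      block(64, 128), block(128, 128))
        self.classifier = nn.Sequential(              # two fully connected layers
            nn.Flatten(),
            nn.Linear(128 * 4 * 4, 256), nn.ReLU(),
            nn.Linear(256, n_classes))                # softmax applied below / in the loss

    def forward(self, x):
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = YiCharCNN()
    dummy = torch.randn(8, 1, 64, 64)                 # batch of 64x64 grayscale glyphs
    logits = model(dummy)
    probs = torch.softmax(logits, dim=1)              # per-character probability distribution
    print(logits.shape, probs.sum(dim=1)[:2])
```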

Keywords: recognition, CNN, Yi character, divergence

Procedia PDF Downloads 160
5971 Belief-Based Games: An Appropriate Tool for Uncertain Strategic Situation

Authors: Saied Farham-Nia, Alireza Ghaffari-Hadigheh

Abstract:

Game theory is a mathematical tool for studying the behavior of rational and strategic decision-makers; it analyzes the equilibria that exist in situations of conflicting interests and provides appropriate mechanisms for cooperation between two or more players. It is applicable to any strategic, conflicting-interest situation in politics, management, economics, sociology, and other fields. Real-world decisions are usually made in a state of indeterminacy, and players often lack information about the other players' payoffs or even their own, which leads to games in uncertain environments. When historical data for estimating the distributions of decision parameters are unavailable, we may have no choice but to use expert belief degrees, which represent the strength with which we believe an event will happen. To deal with belief degrees, we use uncertainty theory, introduced and developed by Liu on the basis of the normality, duality, subadditivity, and product axioms for modeling personal belief degrees. A personal belief degree depends heavily on personal knowledge of the event, and when that knowledge changes, the belief degree changes too. Uncertainty theory is not only theoretically self-consistent but also the best among competing theories for modeling belief degrees in practical problems. In this work, we first reintroduce the expected utility function in an uncertain environment according to the axioms of uncertainty theory in order to extract payoffs. Then, we employ the Nash equilibrium to investigate the solutions. For more practical issues, the Stackelberg leader-follower game and the Bertrand game are discussed as benchmark models. Compared to existing articles on similar topics, the game models and solution concepts introduced in this article can serve as a framework for problems in uncertain competitive situations based on experienced experts' belief degrees.

Keywords: game theory, uncertainty theory, belief degree, uncertain expected value, Nash equilibrium

Procedia PDF Downloads 411
5970 A Joint Possibilistic-Probabilistic Tool for Load Flow Uncertainty Assessment-Part I: Formulation

Authors: Morteza Aien, Masoud Rashidinejad, Mahmud Fotuhi-Firuzabad

Abstract:

As energy and environmental issues receive more and more attention around the world, the penetration of distributed energy resources (DERs), mainly those harvesting renewable energies (REs), is rising at an unprecedented rate. This introduces more uncertainties into the power system context; hence, uncertainty analysis of system performance is a necessity. The uncertainties of any system can be represented probabilistically or possibilistically. Since sufficient historical data are not available for all system variables, some of them do not have a probability density function (PDF) and must be represented possibilistically. When some of the uncertain system variables are probabilistic and some are possibilistic, neither conventional pure probabilistic nor pure possibilistic methods can be applied. Hence, a combined solution is called for. The first paper of this two-part series formulates a new possibilistic-probabilistic tool for load flow uncertainty assessment. The proposed methodology is based on evidence theory and the joint propagation of possibilistic and probabilistic uncertainties. This possibilistic-probabilistic formulation is solved in the second, companion paper for an uncertain load flow (ULF) study problem.

Keywords: probabilistic uncertainty modeling, possibilistic uncertainty modeling, uncertain load flow, wind turbine generator

Procedia PDF Downloads 558
5969 Theoretical Studies on the Structural Properties of 2,3-Bis(Furan-2-Yl)Pyrazino[2,3-F][1,10]Phenanthroline Derivatives

Authors: Zahra Sadeghian

Abstract:

This paper reports the optimized geometrical parameters of the stationary point for 2,3-bis(furan-2-yl)pyrazino[2,3-f][1,10]phenanthroline. The calculations are performed using the density functional theory (DFT) method at the B3LYP/LanL2DZ level. We determined bond length and bond angle values for the compound and also calculated the degree of bond hybridization according to natural bond orbital (NBO) theory. The energies of the frontier orbitals (HOMO and LUMO) were computed. In addition, the calculated data were carefully compared with the experimental results. This comparison shows that our theoretical data are in reasonable agreement with the experimental values.

Keywords: 2,3-Bis(furan-2-yl)pyrazino[2,3-f][1,10]phenanthroline, density functional theory, theoretical calculations, LanL2DZ level, B3LYP level

Procedia PDF Downloads 367
5968 Early Talent Identification and Its Impact on Children’s Growth and Development: An Examination of “The Social Learning Theory” by Albert Bandura

Authors: Michael Subbey, Kwame Takyi Danquah

Abstract:

Finding a child's exceptional skills and abilities at a young age and nurturing them is a challenging process. The Social Learning Theory (SLT) of Albert Bandura is used to analyze the effects of early talent identification on children's growth and development. The study examines both the advantages and disadvantages of early talent identification and stresses the significance of a moral strategy that puts the welfare of the child first. The paper emphasizes the value of a balanced approach to early talent identification that takes into account individual differences, cultural considerations, and the child's social environment.

Keywords: early talent development, social learning theory, child development, child welfare

Procedia PDF Downloads 101
5967 An Analysis of a Queueing System with Heterogeneous Servers Subject to Catastrophes

Authors: M. Reni Sagayaraj, S. Anand Gnana Selvam, R. Reynald Susainathan

Abstract:

This study analyzes a queueing system with blocking and no waiting line. Customers arrive according to a Poisson process, and the service times follow an exponential distribution. There are two non-identical servers in the system. The queue discipline is FCFS, and customers select the servers on a fastest-server-first (FSF) basis. The service times are exponentially distributed with parameters μ1 and μ2 at servers I and II, respectively. In addition, catastrophes occur in the system in a Poisson manner with rate γ. When server I is busy or blocked, a customer who arrives in the system leaves without being served; such customers are called lost customers. The probability of losing a customer was computed for the system. The explicit time-dependent probabilities of the system size are obtained, and a numerical example is presented in order to show the managerial insights of the model. Finally, the probability that an arriving customer finds the system busy and the average number of busy servers in the steady state are obtained numerically.
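
For intuition, a Monte Carlo stand-in for such a system is sketched below: two heterogeneous exponential servers, no waiting line, fastest-server-first routing, and catastrophes that flush all customers in service. Unlike the paper's analytical treatment, this sketch assumes a customer is lost only when both servers are occupied, and all rates are made up.

```python
# Minimal sketch (a Monte Carlo continuous-time simulation, not the paper's
# analytical solution): two heterogeneous exponential servers with no waiting
# line, fastest-server-first routing, Poisson arrivals with rate lam, and
# Poisson catastrophes with rate gamma that flush all customers in service;
# the loss probability and the average number of busy servers are estimated.
import numpy as np

def simulate(lam=1.0, mu1=1.5, mu2=0.8, gamma=0.1, t_max=200000.0, seed=5):
    rng = np.random.default_rng(seed)
    busy = [False, False]                       # server I (faster), server II
    t, arrivals, lost, busy_time = 0.0, 0, 0, 0.0
    while t < t_max:
        rates = [lam, mu1 if busy[0] else 0.0, mu2 if busy[1] else 0.0, gamma]
        total = sum(rates)
        dt = rng.exponential(1.0 / total)
        busy_time += dt * sum(busy)             # accumulates time-average of busy servers
        t += dt
        event = rng.choice(4, p=np.array(rates) / total)
        if event == 0:                          # arrival: fastest free server, else lost
            arrivals += 1
            if not busy[0]:
                busy[0] = True
            elif not busy[1]:
                busy[1] = True
            else:
                lost += 1
        elif event in (1, 2):                   # service completion at server I or II
            busy[event - 1] = False
        else:                                   # catastrophe: system is emptied
            busy = [False, False]
    return lost / arrivals, busy_time / t

loss_prob, avg_busy = simulate()
print("loss probability ≈", round(loss_prob, 4), " average busy servers ≈", round(avg_busy, 4))
```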

Keywords: queueing system, blocking, poisson process, heterogeneous servers, queue discipline FCFS, busy period

Procedia PDF Downloads 499
5966 Role of Interlayer Coupling for the Power Factor of CuSbS2 and CuSbSe2

Authors: Najebah Alsaleh, Nirpendra Singh, Udo Schwingenschlogl

Abstract:

The electronic and transport properties of bulk and monolayer CuSbS2 and CuSbSe2 are determined using density functional theory and semiclassical Boltzmann transport theory in order to investigate the role of interlayer coupling in the thermoelectric properties. The calculated band gaps of the bulk compounds are in agreement with experiments and significantly higher than those of the monolayers, which thus show lower Seebeck coefficients. Since the electrical conductivity is also lower, the monolayers are characterized by lower power factors. Therefore, interlayer coupling is found to be essential for the excellent thermoelectric response of CuSbS2 and CuSbSe2, even though it is weak.

Keywords: density functional theory, thermoelectric, electronic properties, monolayer

Procedia PDF Downloads 319
5965 Grounded Theory of Consumer Loyalty: A Perspective through Video Game Addiction

Authors: Bassam Shaikh, R. S. A. Jumain

Abstract:

Game addiction has become an extremely important topic among psychology researchers, particularly in understanding and explaining why individuals become addicted to video games. Previous studies have discussed the effects of online game addiction on social responsibilities, health problems, government action, and individuals' purchasing behavior, as well as the causes that make individuals addicted to video games. Extending these concepts to marketing, it could be argued that the phenomenon could enlighten and extend our understanding of consumer loyalty. This study took the grounded theory approach and found motivation, satisfaction, fulfillment, exploration, and achievement to be among the important elements that build consumer loyalty.

Keywords: grounded theory, consumer loyalty, video games, video game addiction

Procedia PDF Downloads 528
5964 Performance Optimization on Waiting Time Using Queuing Theory in an Advanced Manufacturing Environment: Robotics to Enhance Productivity

Authors: Ganiyat Soliu, Glen Bright, Chiemela Onunka

Abstract:

Performance optimization plays a key role in controlling waiting time during manufacturing in an advanced manufacturing environment in order to improve productivity. Queuing mathematical modeling theory was used to examine the performance of a multi-stage production line. Robotics, as a disruptive technology, was introduced into a virtual manufacturing scenario during the packaging process to study the effect of waiting time on productivity. The queuing mathematical model was used to determine the optimum service rate required by the robots during the packaging stage of manufacturing to yield an optimum production cost. Different production rates were assumed in the virtual manufacturing environment, and the cost of packaging was estimated together with the optimum production cost. An equation was generated using queuing mathematical modeling theory, and the Newton-Raphson method was adopted for the analysis of the scenario. The queuing theory presented here provides an adequate analysis of the number of robots required to regulate waiting time in order to increase output. The product arrival rate was high, which shows that the queuing mathematical model was effective in minimizing service cost and waiting time during manufacturing. At a reduced waiting time, there was an improvement in the number of products obtained per hour. Overall productivity improved under the assumptions used in the queuing modeling theory implemented in the virtual manufacturing scenario.
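
One concrete way to cast the "optimum service rate via Newton-Raphson" step is sketched below for a single M/M/1 packaging station: the hourly cost is the robot service cost plus a waiting cost proportional to the mean number of products in the station, and Newton-Raphson solves the first-order condition for the service rate; the arrival rate and cost coefficients are assumptions, and the paper's multi-stage model is not reproduced.

```python
# Minimal sketch (an M/M/1 packaging station with assumed cost coefficients, not
# the paper's full multi-stage model): the total cost per hour is the robot
# service cost c_s*mu plus the waiting cost c_w*L, with L = lam/(mu - lam), and
# Newton-Raphson finds the service rate mu* that minimises it.
lam = 40.0          # product arrival rate (units/hour), assumed
c_s = 2.0           # cost per unit of service rate
c_w = 5.0           # cost per product waiting in the station per hour

def d_cost(mu):     # derivative of c_s*mu + c_w*lam/(mu - lam) with respect to mu
    return c_s - c_w * lam / (mu - lam) ** 2

def d2_cost(mu):
    return 2.0 * c_w * lam / (mu - lam) ** 3

mu = lam + 5.0                       # initial guess above the arrival rate
for _ in range(50):
    step = d_cost(mu) / d2_cost(mu)  # Newton-Raphson update on the first-order condition
    mu -= step
    if abs(step) < 1e-10:
        break

L = lam / (mu - lam)                 # mean number of products in the station
W = 1.0 / (mu - lam)                 # mean time in the station (hours)
print(f"optimum service rate mu* = {mu:.3f}/h, mean queue length = {L:.3f}, waiting time = {W*60:.1f} min")
```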

Keywords: performance optimization, productivity, queuing theory, robotics

Procedia PDF Downloads 145
5963 Examining the Attitude and Behavior Towards Household Waste in China With the Theory of Planned Behavior and PEST Analysis

Authors: Yuxuan Liu, Jianli Hao, Ruoyu Zhang, Lin Lin, Nelsen Andreco Muljadi, Yu Song, Guobin Gong

Abstract:

With the increase in municipal waste in China, household waste management (HWM) has become a key issue for sustainable development. In this study, an online survey questionnaire was conducted with the aim of assessing the current attitudes and behaviors of households in China towards waste separation and recycling practices. Related influential factors are also determined within the context of the theory of planned behavior and PEST analysis. The survey received a total of 551 valid responses. Results showed that the sample has overall positive attitudes and behavior toward participating in HWM, but only 16.3% of the respondents regularly segregate their waste. Society and policy are also found to be the two most impactful factors.

Keywords: household waste management, theory of planned behavior, attitude, behavior

Procedia PDF Downloads 194
5962 Geometrically Non-Linear Axisymmetric Free Vibration Analysis of Functionally Graded Annular Plates

Authors: Boutahar Lhoucine, El Bikri Khalid, Benamar Rhali

Abstract:

In this paper, the non-linear free axisymmetric vibration of a thin annular plate made of functionally graded material (FGM) has been studied using the energy method and a multimode approach. The FGM properties vary continuously, and hence non-homogeneously, through the thickness direction of the plate. The theoretical model is based on classical plate theory and the von Kármán geometrical non-linearity assumptions. An approximation adopted in the present work consists of neglecting the in-plane deformation in the formulation. Hamilton's principle is used to derive the governing equation of motion. The problem is solved by a numerical iterative procedure in order to obtain more accurate results for vibration amplitudes up to 1.5 times the plate thickness. The numerical results are given for the first axisymmetric non-linear mode shape over a wide range of vibration amplitudes and are presented in tabular or graphical form; they show that the vibration amplitude and the variation in material properties have significant effects on the frequencies and bending stresses in the large-amplitude vibration of the functionally graded annular plate.

Keywords: non-linear vibrations, annular plates, large amplitudes, functionally graded material

Procedia PDF Downloads 360
5961 An Experimental Investigation of the Cognitive Noise Influence on the Bistable Visual Perception

Authors: Alexander E. Hramov, Vadim V. Grubov, Alexey A. Koronovskii, Maria K. Kurovskaуa, Anastasija E. Runnova

Abstract:

The perception of visual signals in the brain was among the first issues discussed in terms of multistability, which was introduced to provide mechanisms for information processing in biological neural systems. In this work, the influence of cognitive noise on the visual perception of multistable pictures has been investigated. The study includes an experiment with the bistable Necker cube illusion and the theoretical background explaining the experimental results obtained. In our experiments, Necker cubes with different wireframe contrasts were shown repeatedly to different people, and the probability of choosing one of the cube's projections was calculated for each picture. The Necker cube was placed at the middle of a computer screen as black lines on a white background. The contrast of the three middle lines centered on the left middle corner was used as a control parameter. Between two successive demonstrations of Necker cubes, another picture was shown to distract attention and to make the perception of the next Necker cube more independent of the previous one. Eleven subjects, male and female, aged 20 through 45, were studied. The choice of the Necker cube projection was detected with the electroencephalograph recorder Encephalan-EEGR-19/26 (Medicom MTD). To treat the experimental results, we carried out a theoretical analysis using the simplest double-well potential model in the presence of noise, which leads to the Fokker-Planck equation for the probability density of the stochastic process. For the first time, an analytical solution for the probability of selecting one of the Necker cube projections has been obtained for different values of wireframe contrast. Furthermore, using the experimental measurements and the method of least squares, we calculated the value of the parameter corresponding to the cognitive noise of the person being studied. The cognitive noise parameter values for the studied subjects fell in the range [0.08, 0.55]. It should be noted that the experimental results are well reproducible: the same person studied again on another day produces very similar data with very close levels of cognitive noise. We found excellent agreement between the analytically derived probability and the experimental results. The good qualitative agreement between theoretical and experimental results indicates that even such a simple model makes it possible to simulate cognitive brain dynamics and to estimate an important cognitive characteristic of the brain, such as brain noise.
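
The least-squares estimation of the noise parameter can be illustrated as follows. For a double-well potential whose asymmetry depends on the wireframe contrast c, stationary Fokker-Planck statistics give a Boltzmann-like choice probability; the sketch below assumes the simple parameterisation P(c) = 1 / (1 + exp(−k(c − 0.5)/D)) and fits the noise intensity D to invented choice frequencies. Both the parameterisation and the data are illustrative assumptions, not the paper's analytical solution or measurements.

```python
# Minimal sketch (an illustrative parameterisation, not the authors' exact
# solution): a Boltzmann-like choice probability P(c) = 1 / (1 + exp(-k (c - 0.5) / D))
# for a contrast-dependent double-well potential, with the cognitive-noise
# intensity D estimated from measured choice frequencies by non-linear least squares.
import numpy as np
from scipy.optimize import curve_fit

def choice_probability(c, D, k=1.0):
    return 1.0 / (1.0 + np.exp(-k * (c - 0.5) / D))

# hypothetical measured frequencies of choosing one projection vs. wireframe contrast
contrast = np.array([0.0, 0.15, 0.3, 0.45, 0.55, 0.7, 0.85, 1.0])
observed = np.array([0.04, 0.10, 0.27, 0.44, 0.58, 0.76, 0.92, 0.97])

(D_hat,), _ = curve_fit(choice_probability, contrast, observed, p0=[0.2])
print("estimated cognitive noise D ≈", round(D_hat, 3))
```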

Keywords: bistability, brain, noise, perception, stochastic processes

Procedia PDF Downloads 443
5960 Seismic Response Mitigation of Structures Using Base Isolation System Considering Uncertain Parameters

Authors: Rama Debbarma

Abstract:

The present study deals with the performance of a linear base isolation system in mitigating the seismic response of structures characterized by random system parameters. This involves optimization of the tuning ratio and damping properties of the base isolation system considering uncertain system parameters. However, the efficiency of the base isolator may be reduced if, due to the unavoidable presence of system parameter uncertainty, it is not tuned to the vibration mode it is designed to suppress. With the aid of matrix perturbation theory and a first-order Taylor series expansion, the total probability concept is used to evaluate the unconditional response of the primary structures considering random system parameters. For this, the conditional second-order information of the response quantities is obtained in a random vibration framework using a state-space formulation. Subsequently, the maximum unconditional root mean square displacement of the primary structures is used as the objective function to obtain the optimum damping parameters. A numerical study is performed to elucidate the effect of parameter uncertainties on the optimization of the linear base isolator parameters and on system performance.

Keywords: linear base isolator, earthquake, optimization, uncertain parameters

Procedia PDF Downloads 428
5959 Rule Based Architecture for Collaborative Multidisciplinary Aircraft Design Optimisation

Authors: Nickolay Jelev, Andy Keane, Carren Holden, András Sóbester

Abstract:

In aircraft design, the jump from the conceptual to the preliminary design stage introduces a level of complexity which cannot realistically be handled by a single optimiser, be that a human (chief engineer) or an algorithm. The design process is often partitioned along disciplinary lines, with each discipline given a level of autonomy. This introduces a number of challenges including, but not limited to: coupling of design variables; coordinating disciplinary teams; handling of large amounts of analysis data; and reaching an acceptable design within time constraints. A number of classical Multidisciplinary Design Optimisation (MDO) architectures exist in academia, specifically designed to address these challenges. Their limited use in the industrial aircraft design process has inspired the authors of this paper to develop an alternative strategy based on well-established ideas from Decision Support Systems. The proposed rule-based architecture sacrifices possibly elusive guarantees of convergence for an attractive return in simplicity. The method is demonstrated on analytical and aircraft design test cases, and its performance is compared to a number of classical distributed MDO architectures.

Keywords: Multidisciplinary Design Optimisation, Rule Based Architecture, Aircraft Design, Decision Support System

Procedia PDF Downloads 350
5958 Testing the Life Cycle Theory on the Capital Structure Dynamics of Trade-Off and Pecking Order Theories: A Case of Retail, Industrial and Mining Sectors

Authors: Freddy Munzhelele

Abstract:

Setting: Empirical research has shown that the life cycle theory has an impact on firms' financing decisions, particularly dividend pay-outs. Accordingly, the life cycle theory posits that as a firm matures, it reaches a level and capacity at which it distributes more cash as dividends. Young firms, on the other hand, prioritise investment opportunity sets and their financing; thus, they pay little or no dividends. Research on firms' financing decisions has also demonstrated, among other things, the adoption of the trade-off and pecking order theories for the dynamics of firms' capital structure. The trade-off theory speaks to firms holding a favourable position regarding debt structures, particularly the costs and benefits thereof, while the pecking order theory is concerned with firms preferring a hierarchical order when choosing financing sources. The life cycle hypothesis explaining financial managers' decisions regarding firms' capital structure dynamics appears to be an interesting link, yet it has been neglected in corporate finance research. If this link is explored empirically, financial decision-making alternatives will be enhanced immensely, since no conclusive evidence has yet been found on the dynamics of capital structure. Aim: The aim of this study is to examine the impact of the life cycle theory on the trade-off and pecking order dynamics of the capital structure of firms listed in the retail, industrial and mining sectors of the JSE. These sectors are among the key contributors to GDP in the South African economy. Design and methodology: Following the postpositivist research paradigm, the study is quantitative in nature and utilises secondary data obtained from the financial statements of the sampled firms for the period 2010–2022. The firms' financial statements will be extracted from the IRESS database. Since the data will be in panel form, a combination of static and dynamic panel data estimators will be used for the analysis. The overall data analysis will be done using the STATA program. Value added: This study directly investigates the link between the life cycle theory and the dynamics of capital structure decisions, particularly the trade-off and pecking order theories.

Keywords: life cycle theory, trade-off theory, pecking order theory, capital structure, JSE listed firms

Procedia PDF Downloads 59
5957 Applying Neural Networks for Solving Record Linkage Problem via Fuzzy Description Logics

Authors: Mikheil Kalmakhelidze

Abstract:

The record linkage (RL) problem has become more and more important in recent years due to the growing interest in big data analysis. The problem can be formulated in a very simple way: given two entries a and b of a database, decide whether they represent the same object or not. There are two classical ways of solving the RL problem: deterministic and probabilistic. Using a simple Bayes classifier often produces useful results, but they sometimes prove to be poor. In recent years, several successful approaches have been made towards solving specific RL problems with neural network algorithms, including the single-layer perceptron, multilayer back-propagation networks, etc. In our work, we model the RL problem for a specific dataset of student applications in fuzzy description logic (FDL), where the linkage of a specific pair (a, b) depends on the truth value of the corresponding formula A(a, b) in a canonical FDL model. As a main result, we build a neural network for deciding the truth values of FDL formulas in a canonical model and thus link the RL problem to machine learning. We apply the approach to a dataset with 10,000 entries and also compare it with classical RL approaches. The results prove to be more accurate than those of the standard probabilistic approach.

Keywords: description logic, fuzzy logic, neural networks, record linkage

Procedia PDF Downloads 271
5956 An Encapsulation of a Navigable Tree Position: Theory, Specification, and Verification

Authors: Nicodemus M. J. Mbwambo, Yu-Shan Sun, Murali Sitaraman, Joan Krone

Abstract:

This paper presents a generic data abstraction that captures a navigable tree position. The mathematical modeling of the abstraction encapsulates the current tree position, which can be used to navigate and modify the tree. The encapsulation of the tree position in the data abstraction specification avoids the use of explicit references and aliasing, thereby simplifying verification of (imperative) client code that uses the data abstraction. To ease the tasks of such specification and verification, a general tree theory, rich with mathematical notations and results, has been developed. The paper contains an example to illustrate automated verification ramifications. With sufficient tree theory development, automated proving seems plausible even in the absence of a special-purpose tree solver.
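
To give a flavour of what an encapsulated navigable tree position can look like in executable form, the sketch below implements a Huet-style zipper in Python: the position is the focused subtree plus the path of contexts above it, and navigation and modification return new positions without exposing explicit references or aliasing. This is an illustrative analogue of the idea, not the paper's specification or its verification machinery.

```python
# Minimal sketch (a Huet-style "zipper", offered as one executable analogue of an
# encapsulated navigable tree position; the paper's specification language and
# tree theory are not reproduced here): the current position is a subtree plus
# the path of contexts above it, and navigation/modification return new positions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)

@dataclass
class Crumb:                      # everything surrounding a focused child
    parent_label: str
    left: List[Node]
    right: List[Node]

@dataclass
class Position:                   # the encapsulated "navigable tree position"
    focus: Node
    path: List[Crumb] = field(default_factory=list)

    def down(self, i: int) -> "Position":
        kids = self.focus.children
        crumb = Crumb(self.focus.label, kids[:i], kids[i + 1:])
        return Position(kids[i], self.path + [crumb])

    def up(self) -> "Position":
        crumb = self.path[-1]
        rebuilt = Node(crumb.parent_label, crumb.left + [self.focus] + crumb.right)
        return Position(rebuilt, self.path[:-1])

    def replace(self, node: Node) -> "Position":
        return Position(node, self.path)

if __name__ == "__main__":
    tree = Node("root", [Node("a"), Node("b", [Node("c")])])
    pos = Position(tree).down(1).down(0).replace(Node("c2"))
    while pos.path:               # navigate back to the root, rebuilding along the way
        pos = pos.up()
    print(pos.focus)              # root with "c" replaced by "c2"
```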

Keywords: automation, data abstraction, maps, specification, tree, verification

Procedia PDF Downloads 161
5955 Employers’ Preferences when Employing Solo Self-employed: a Vignette Study in the Netherlands

Authors: Lian Kösters, Wendy Smits, Raymond Montizaan

Abstract:

The number of solo self-employed in the Netherlands has been increasing for years, and the relative increase is among the largest in the EU. To explain this increase, most studies have focused on the supply side: workers who offer themselves as solo self-employed. Studies that focus on the demand side, the employers who hire the solo self-employed, are still scarce. The studies of employer behaviour conducted so far show that employers mainly choose self-employed workers when they have a temporary need for specialist knowledge, but also during projects or production peaks. These studies do not provide insight into employers' considerations for different contract types. In this study, interviews with employers were conducted, and the available literature was consulted, to provide an overview of the factors employers use to compare different contract types. That input was used to set up a vignette study, which was carried out at the end of 2021 among almost 1,000 business owners, HR managers, and business leaders of Dutch companies. Each respondent was given two sets of five fictitious candidates for two possible positions in their organization and was asked to rank these candidates. The positions varied with regard to the type of tasks (core tasks or support tasks) and the time it took to train new people for the position. The respondents were asked additional questions about the positions, such as the required level of education, the duration, and the degree of predictability of the tasks. The fictitious candidates varied, among other things, in the type of contract on which they would come to work for the organization. The results were analyzed using a rank-ordered logit analysis. This vignette setup makes it possible to see which factors are most important for employers when choosing to hire a solo self-employed person rather than workers on other contract types. The results show no indication that employers would want to hire solo self-employed workers en masse; they prefer regular employee contracts. The probability of being chosen on a solo self-employed contract over someone who comes to work as a temporary employee is 32 percent. This probability is even lower than for on-call and temporary agency workers. For a permanent contract, this probability is 46 percent. The results indicate that employers consider knowledge and skills more important than the solo self-employed contract type and that these can compensate for it. A solo self-employed candidate with 10 years of work experience has a 63 percent probability of being found attractive by an employer compared to a temporary employee without work experience. This suggests that employers are willing to hire someone on a contract that is less attractive to the employer if the worker so wishes. The results also show that the probability of a solo self-employed person being preferred over a candidate with a temporary employee contract is somewhat higher in business economics, administrative and technical professions. No significant results were found for factors where it was expected that solo self-employed workers would be preferred more often, such as unpredictable or temporary work.

Keywords: employer behaviour, rank-ordered logit analysis, solo self-employment, temporary contract, vignette study

Procedia PDF Downloads 70