Search results for: probability theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2086

1876 Conspiracy Theory in Discussions of the Coronavirus Pandemic in the Gulf Region

Authors: Rasha Salameh

Abstract:

In light of the tense relationship between Saudi Arabia and Iran, this research paper sheds light on the Saudi-owned television network Al-Arabiya's reporting of the Coronavirus in the Gulf region. Because most of the early cases came from Iran, some programs of this Saudi channel embraced a conspiracy theory, and hate speech was used in discussions of the topic. The results of these discussions are detailed in this paper as percentages with respect to the research sample, which includes five programs on the Al-Arabiya channel: ‘DNA’, ‘Marraya’ (Mirrors), ‘Panorama’, ‘Tafaolcom’ (Your Interaction) and ‘Diplomatic Street’, in the period between January 19, the date of the first case in Iran, and April 10, 2020. The research shows the use of a conspiracy theory in the programs, in addition to some professional violations. The surveyed sample also shows that the matter receded as the Arab Gulf states became preoccupied with the steadily increasing number of cases appearing there since the start of the pandemic. The results indicate that hate speech was present in the sample at a rate of 98.1%, and that most of the programs that dealt with the Iranian issue under the Coronavirus pandemic on Al-Arabiya used the conspiracy theory at a rate of 75.5%.

Keywords: Al-Arabiya, Iran, COVID-19, hate speech, conspiracy theory, politicization of the pandemic

1875 Dependency Theory on Examining the Relationship between the United States and the Middle East: In the Case of Iran, Saudi Arabia, and Turkey

Authors: Abdelhafez Abdel Hafez

Abstract:

Dependency theory has been developed since the 1950s, originally with economic concerns. It divided the world into two parts: the peripheral states (third-world countries) and the core states (the developed capitalist countries). Another perspective added to the theory the idea of semi-peripheral states in the new world order. Using these divisions (core, periphery, semi-periphery), this study aims to develop a concept from the perspective of dependency theory to understand the nature of the U.S. relationship with the Middle East region through its relations with Iran, Saudi Arabia, and Turkey. The countries examined (Saudi Arabia, Iran, and Turkey) are each seeking a foothold and an influential role in the region. The paper argues that the U.S. directs its policies toward the region in a way that guarantees that no country of the region reaches the semi-peripheral level (which could create competition with, or danger to, U.S. interests). Accordingly, U.S. policies in the region have ranged from declaring war to diplomatic engagement and, at times, deliberate neglect. The paper is based on dependency theory, alongside other international relations theories used to study the Middle East in its international context.

Keywords: Dependency, hegemony, imperialism, Middle East.

1874 Gravitational Frequency Shifts for Photons and Particles

Authors: Jing-Gang Xie

Abstract:

This research considers the integration of quantum field theory and general relativity. Although both are successful models for explaining the behavior of particles, they are incompatible because they operate at different mass and energy scales, as evidenced by the description of black holes and the formation of the universe. This remains the case despite previous efforts to merge the two theories, including string theory, quantum gravity models, and others. Aiming toward an actionable experiment, the paper's approach starts from derivations of the existing theories. It then tests the derivations by applying the same initial assumptions together with several deviations. The resulting equations reproduce the results of the classical Newtonian model, quantum mechanics, and general relativity under normal conditions. However, the outcomes differ under extreme conditions; in particular, the equations do not break down even at radii smaller than the Schwarzschild radius or at the Planck length. This demonstrates the possibility of integrating the two theories.
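
For reference (the standard general-relativistic result, not the paper's modified derivation), the frequency shift for a photon emitted by a static source at radius r_e and received by a static observer at radius r_o around a mass M is

    \frac{f_o}{f_e} = \sqrt{\frac{1 - r_s/r_e}{1 - r_s/r_o}}, \qquad r_s = \frac{2GM}{c^2},

which in the weak-field limit reduces to \Delta f / f \approx g h / c^2. Any modified theory of the kind described above is expected to recover these expressions under normal conditions.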

Keywords: General relativity theory, particles, photons, quantum gravity model, gravitational frequency shift.

1873 The Problems of Employment Form Selection of Capital Group Management Team Members in the Light of Chosen Company Management Theories

Authors: D. Bąk-Grabowska, A. Jagoda

Abstract:

Managing a capital group is a complex and specific process. It creates special conditions for introducing team-based organization of managerial work. The selection of a manager's employment form is a problem that becomes more complicated in the case of management teams. The possibilities considered are employment-based and non-employment managerial contracts, which can be based either on the activity performed or on formulating definite expectations regarding the results of a manager's work. The problem of choosing between individual and collegial settlement of managers' work is also pointed out. The deliberations are based on the assumptions of selected company management theories, including transaction cost theory, agency theory, nexus-of-contracts theory, stewardship theory, and theories referring directly to management teams, i.e., upper echelons theory.

Keywords: Capital group, employment forms, management teams, managers.

1872 Applying Bowen’s Theory to Intern Supervision

Authors: Jeff A. Tysinger, Dawn P. Tysinger

Abstract:

The aim of this paper is to theoretically apply Bowen's understanding of triangulation and triads to school psychology intern supervision, so that it can assist in conceptualizing the dynamics of intern supervision and provide key methods for addressing common issues. The school psychology internship is the capstone experience for the school psychologist in training. It involves three key participants whose relationships will determine the success of the internship. To understand the potential effect, Bowen's family systems theory can be applied to the supervision relationship. Bowen describes a way of resolving stress between two people by triangulating, or bringing in, a third person. He applies this to the nuclear family, but school psychology intern supervision requires the joining of an intern, a field supervisor, and a university supervisor, thus setting all three up for possible triangulation. The consequences of triangulation can apply to standards and requirements, direct supervision, and intern evaluation. Strategies from family systems theory for decreasing the negative impact of supervision triangulation are presented.

Keywords: Family systems theory, intern supervision, triangulation, school psychology.

1871 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. Therefore, the purpose of stochastic modeling is to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might happen under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who have considered it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, as the explicit expression of the process, its trends, and its distribution by transforming the diffusion process in a Wiener process as shown in the Ricciaardi theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance in its application to real data with the use of the convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. According to the data that is available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
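
The abstract does not reproduce the model equations; as a rough, illustrative sketch (the exact drift specification below is an assumption for illustration, not the paper's), a diffusion whose trend term is proportional to the two-parameter Weibull density can be simulated with an Euler-Maruyama scheme:

    import numpy as np

    def weibull_pdf(t, shape, scale):
        # Two-parameter Weibull probability density function.
        return (shape / scale) * (t / scale) ** (shape - 1) * np.exp(-(t / scale) ** shape)

    def simulate_paths(x0=1.0, shape=2.0, scale=5.0, sigma=0.1,
                       t_max=10.0, n_steps=1000, n_paths=100, seed=0):
        # Euler-Maruyama discretization of dX = a(t) X dt + sigma X dW,
        # with the drift factor a(t) taken proportional to the Weibull pdf
        # (an assumed, illustrative form of the "trend" term).
        rng = np.random.default_rng(seed)
        dt = t_max / n_steps
        t_grid = np.linspace(dt, t_max, n_steps)
        x = np.full(n_paths, x0)
        paths = np.empty((n_steps, n_paths))
        for i, t in enumerate(t_grid):
            drift = weibull_pdf(t, shape, scale) * x
            x = x + drift * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(n_paths)
            paths[i] = x
        return t_grid, paths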

Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trends functions, bi-parameters Weibull density function.

1870 Modulation Identification Algorithm for Adaptive Demodulator in Software Defined Radios Using Wavelet Transform

Authors: P. Prakasam, M. Madheswaran

Abstract:

A generalized digital modulation identification algorithm for an adaptive demodulator has been developed and is presented in this paper. The developed algorithm is verified, using the wavelet transform and histogram computation, for identifying QPSK and QAM along with GMSK and M-ary FSK modulations. It has been found that the histogram peaks simplify the identification procedure. The simulation results show that correct modulation identification is possible down to lower SNR bounds of 5 dB and 12 dB for GMSK and QPSK, respectively. When the SNR is above 5 dB, the throughput of the proposed algorithm is more than 97.8%. The receiver operating characteristics (ROC) have been computed to measure the performance of the proposed algorithm, and the analysis shows that the probability of detection (Pd) drops rapidly when the SNR is 5 dB, while the probability of false alarm (Pf) remains smaller than 0.3. The performance of the proposed algorithm has been compared with existing methods and found to identify all of the digital modulation schemes at low SNR.

Keywords: Bit Error rate, Receiver Operating Characteristics, Software Defined Radio, Wavelet Transform.

1869 Belief Theory-Based Classifiers Comparison for Static Human Body Postures Recognition in Video

Authors: V. Girondel, L. Bonnaud, A. Caplier, M. Rombaut

Abstract:

This paper presents the results of various classifiers from a system that can automatically recognize four different static human body postures in video sequences. The considered postures are standing, sitting, squatting, and lying. The three classifiers considered are a naïve one and two based on belief theory. The belief theory-based classifiers use either a classic or a restricted plausibility criterion to make a decision after data fusion. The data come from 2D segmentation of the people and from their face localization. Measurements consist of distances relative to a reference posture. The efficiency and the limits of the different classifiers within the recognition system are highlighted through the analysis of a large number of results. The system allows real-time processing.
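
The plausibility criterion mentioned above can be sketched generically as follows (a maximum-plausibility decision over singleton hypotheses; the masses shown are illustrative and the "restricted" variant is not reproduced here):

    def plausibility(mass, hypothesis):
        # Plausibility Pl(A) = sum of masses of all focal elements that
        # intersect A (mass: dict mapping frozenset -> mass).
        return sum(m for focal, m in mass.items() if focal & hypothesis)

    def decide_posture(mass, postures):
        # Maximum-plausibility decision over the singleton hypotheses.
        return max(postures, key=lambda p: plausibility(mass, frozenset({p})))

    # toy combined mass over the frame {standing, sitting, squatting, lying}
    m = {frozenset({'standing'}): 0.5,
         frozenset({'standing', 'sitting'}): 0.3,
         frozenset({'lying'}): 0.2}
    print(decide_posture(m, ['standing', 'sitting', 'squatting', 'lying']))   # -> standing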

Keywords: Belief theory, classifiers comparison, data fusion, human motion analysis, real-time processing, static posture recognition.

1868 Reliability Assessment of Bangladesh Power System Using Recursive Algorithm

Authors: Nahid-Al-Masood, Jubaer Ahmed, Amina Hasan Abedin, S. R. Deeba, Faeza Hafiz, Mahmuda Begum

Abstract:

An electric utility's main concern is to plan, design, operate, and maintain its power supply to provide an acceptable level of reliability to its users. This clearly requires that standards of reliability be specified and used in all three sectors of the power system, i.e., generation, transmission, and distribution. That is why the reliability of a power system is always a major concern to power system planners. This paper presents the reliability analysis of the Bangladesh Power System (BPS). The reliability index, loss of load probability (LOLP), of BPS is evaluated using a recursive algorithm and considering no de-rated states of generators. BPS has sixty-one generators and a total installed capacity of 5275 MW. The maximum demand of BPS is about 5000 MW. The relevant data on the generators and hourly load profiles are collected from the National Load Dispatch Center (NLDC) of Bangladesh, and the reliability index LOLP is assessed for the period of the last ten years.
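
The recursive, unit-by-unit construction of the capacity-outage probability table behind such a LOLP evaluation can be sketched as follows (a generic textbook form with illustrative numbers, not the BPS data):

    def capacity_outage_table(unit_capacities_mw, forced_outage_rates):
        # outage[c] = probability that exactly c MW of generating capacity
        # is on forced outage (two-state units, no de-rated states).
        outage = {0: 1.0}
        for cap, q in zip(unit_capacities_mw, forced_outage_rates):
            new = {}
            for c, p in outage.items():
                new[c] = new.get(c, 0.0) + p * (1.0 - q)        # unit available
                new[c + cap] = new.get(c + cap, 0.0) + p * q    # unit on outage
            outage = new
        return outage

    def lolp(unit_capacities_mw, forced_outage_rates, hourly_load_mw):
        installed = sum(unit_capacities_mw)
        outage = capacity_outage_table(unit_capacities_mw, forced_outage_rates)
        hourly = []
        for load in hourly_load_mw:
            # loss of load occurs when the available capacity falls below the load
            hourly.append(sum(p for c, p in outage.items() if installed - c < load))
        return sum(hourly) / len(hourly)

    # toy example: three 100 MW units with FOR = 0.02 and a flat 180 MW load
    print(lolp([100, 100, 100], [0.02, 0.02, 0.02], [180] * 24))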

Keywords: Recursive algorithm, LOLP, forced outage rate, cumulative probability.

1867 Real Time Speed Estimation of Vehicles

Authors: Azhar Hussain, Kashif Shahzad, Chunming Tang

Abstract:

This paper gives a novel approach to real-time speed estimation of multiple traffic vehicles using fuzzy logic and image processing techniques with a proper arrangement of camera parameters. The described algorithm consists of several important steps. First, the background is estimated by computing the median over a time window of specific frames. Second, the foreground is extracted using a fuzzy similarity approach (FSA) between the estimated background pixels and the current frame pixels containing foreground and background. Third, the traffic lanes are divided into two parts, one for each direction of travel, for parallel processing. Finally, the speeds of the vehicles are estimated by a Maximum a Posteriori Probability (MAP) estimator. True ground speed is determined using infrared sensors for three different vehicles, and comparison with the proposed algorithm shows an accuracy of ±0.74 km/h.
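
As a minimal illustration of the first two steps only (temporal-median background estimation and a crisp-threshold foreground mask, whereas the paper uses a fuzzy similarity approach and a MAP estimator for the speed itself):

    import numpy as np

    def estimate_background(frames):
        # Temporal-median background over a window of grayscale frames
        # (frames: array of shape [n_frames, H, W]).
        return np.median(frames, axis=0)

    def extract_foreground(frame, background, threshold=25):
        # Simple absolute-difference foreground mask; a crisp stand-in for
        # the fuzzy similarity approach described above.
        return np.abs(frame.astype(float) - background) > threshold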

Keywords: Defuzzification, fuzzy similarity approach, lane cropping, Maximum a Posteriori Probability (MAP) estimator, speed estimation.

1866 Application of Granular Computing Paradigm in Knowledge Induction

Authors: Iftikhar U. Sikder

Abstract:

This paper illustrates an application of granular computing approach, namely rough set theory in data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinning of rough set theory, which has been widely used by the data mining and the machine learning community. A real-world application is illustrated, and the classification performance is compared with other contending machine learning algorithms. The predictive performance of the rough set rule induction model shows comparative success with respect to other contending algorithms.
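
As a minimal illustration of the rough-set machinery referred to above (equivalence-class granules and the lower/upper approximations of a target concept), on a toy information table:

    def approximations(universe, attribute_values, target_set):
        # Lower and upper approximations of target_set under the
        # indiscernibility relation induced by attribute_values
        # (a dict: object -> tuple of attribute values).
        granules = {}
        for obj in universe:
            granules.setdefault(attribute_values[obj], set()).add(obj)
        lower, upper = set(), set()
        for granule in granules.values():
            if granule <= target_set:
                lower |= granule          # granule entirely inside the concept
            if granule & target_set:
                upper |= granule          # granule overlaps the concept
        return lower, upper

    # toy example: objects 1 and 2 are indiscernible
    U = {1, 2, 3, 4}
    A = {1: ('a',), 2: ('a',), 3: ('b',), 4: ('c',)}
    print(approximations(U, A, {1, 3}))   # lower = {3}, upper = {1, 2, 3}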

Keywords: Concept approximation, granular computing, reducts, rough set theory, rule induction.

1865 Analytical Study and Modeling of Free Vibrations of Functionally Graded Plates Using a Higher Shear Deformation Theory

Authors: A. Meftah, D. Zarga, M. Yahiaoui

Abstract:

In this paper, we use an analytical method to analyze the vibratory behavior of simply supported plates made of functionally graded materials, proposing a refined non-polynomial theory. The number of unknown functions involved in this theory is only four, as compared to five in the case of other higher-order shear deformation theories. The transverse shearing effects are studied through the thickness of the plate. The equations of motion for the FGM plates are obtained by applying Hamilton's principle, the solutions are obtained using the Navier method, and the fundamental frequencies are then found by solving an eigenvalue equation system. The results of this analysis are presented and compared to those available in the literature.
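
For a simply supported plate, the Navier double-sine expansion reduces the equations of motion, for each half-wave pair (m, n), to a generalized eigenvalue problem of the generic form (the specific stiffness and mass matrices of the refined four-unknown theory are not reproduced here):

    \left( [K_{mn}] - \omega_{mn}^{2}\,[M_{mn}] \right) \{\Delta_{mn}\} = \{0\},

where the fundamental frequency is the smallest \omega_{mn} over the modes considered.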

Keywords: FGM plates, Navier method, vibratory behavior.

1864 Conflation Methodology Applied to Flood Recovery

Authors: E. L. Suarez, D. E. Meeroff, Y. Yong

Abstract:

Current flooding risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being by nuisance flooding and its long-term effects on communities are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distribution’s means without the additional information provided by each individual distribution variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources, severe flooding events and nuisance flooding events.
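
For two continuous input densities f_1 (recovery from severe events) and f_2 (recovery from nuisance events), the conflation described above is the normalized product of the densities:

    c(x) = \frac{f_1(x)\, f_2(x)}{\int f_1(t)\, f_2(t)\, \mathrm{d}t}.

For two exponential inputs with rates \lambda_1 and \lambda_2, for example, the normalized product is again exponential with rate \lambda_1 + \lambda_2.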

Keywords: Community resilience, conflation, flood risk, nuisance flooding.

1863 Dempster-Shafer Evidence Theory for Image Segmentation: Application in Cells Images

Authors: S. Ben Chaabane, M. Sayadi, F. Fnaiech, E. Brassart

Abstract:

In this paper, we propose a new knowledge model using the Dempster-Shafer evidence theory for image segmentation and fusion. The proposed method is composed essentially of two steps. First, mass distributions in Dempster-Shafer theory are obtained from the membership degrees of each pixel covering the three image components (R, G and B). Each membership degree is determined by applying Fuzzy C-Means (FCM) clustering to the gray levels of the three images. Second, the fusion process consists in defining three discernment frames which are associated with the three images to be fused, and then combining them to form a new frame of discernment. The strategy used to define mass distributions in the combined framework is discussed in detail. The proposed fusion method is illustrated in the context of image segmentation. Experimental investigations and comparative studies with the other previous methods are carried out, showing the robustness and superiority of the proposed method in terms of image segmentation.
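
The combination step described above relies on Dempster's rule; a minimal sketch over a toy two-class frame (class labels and masses are illustrative, not the paper's FCM-derived values) is:

    from itertools import product

    def dempster_combine(m1, m2):
        # Dempster's rule of combination for two mass functions given as
        # dicts mapping frozenset focal elements to masses.
        combined, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb               # mass sent to the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: sources cannot be combined")
        return {s: m / (1.0 - conflict) for s, m in combined.items()}

    # toy masses for one pixel from two colour components, frame = {c1, c2}
    m_red   = {frozenset({'c1'}): 0.6, frozenset({'c1', 'c2'}): 0.4}
    m_green = {frozenset({'c2'}): 0.3, frozenset({'c1', 'c2'}): 0.7}
    print(dempster_combine(m_red, m_green))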

Keywords: Fuzzy C-means, Color image, data fusion, Dempster-Shafer's evidence theory

1862 Continuous Wave Interference Effects on Global Position System Signal Quality

Authors: Fang Ye, Han Yu, Yibing Li

Abstract:

Radio interference is one of the major concerns in using the global positioning system (GPS) for civilian and military applications. Interference signals are produced not only by all kinds of electronic systems but also by illegal jammers. Among the different types of interference, continuous wave (CW) interference has a strong adverse impact on the quality of the received signal. In this paper, we present a more detailed analysis of CW interference effects on GPS signal quality. Based on the C/A code spectrum lines, the influence of CW interference on the acquisition performance of GPS receivers is further analysed. This influence is supported by simulation results using a GPS software receiver. The mathematical expression for the bit error probability, one of the most important user-level parameters of GPS receivers, is also derived in the presence of CW interference, and the expression is consistent with Monte Carlo simulation results. The research on CW interference provides a theoretical basis and new insights for monitoring the radio noise environment and improving the anti-jamming ability of GPS receivers.
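
A minimal baseband Monte Carlo sketch of the same kind of experiment is shown below, for BPSK in AWGN plus a single CW interferer with one sample per bit (an illustrative model, not the paper's full GPS C/A receiver chain or its closed-form expression):

    import numpy as np

    def ber_bpsk_cw(ebn0_db, isr_db, n_bits=200_000, seed=1):
        # Monte Carlo bit-error rate of unit-energy BPSK corrupted by AWGN
        # and a continuous-wave interferer of given interference-to-signal
        # ratio (ISR); frequency offset of the CW tone is arbitrary.
        rng = np.random.default_rng(seed)
        bits = rng.integers(0, 2, n_bits)
        symbols = 2.0 * bits - 1.0
        noise_std = np.sqrt(1.0 / (2.0 * 10 ** (ebn0_db / 10)))
        amp_i = np.sqrt(2.0 * 10 ** (isr_db / 10))        # CW power = amp_i**2 / 2
        phase = 2 * np.pi * 0.123 * np.arange(n_bits)     # arbitrary normalized frequency
        received = symbols + amp_i * np.cos(phase) + noise_std * rng.standard_normal(n_bits)
        return np.mean((received > 0).astype(int) != bits)

    print(ber_bpsk_cw(ebn0_db=7.0, isr_db=-3.0))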

Keywords: GPS, CW interference, acquisition performance, bit error probability, Monte Carlo.

1861 Performance Analysis of a Dynamic Channel Reservation-Like Technique for Low Earth Orbit Mobile Satellite Systems

Authors: W. Kiamouche, S. Lasmari, M. Benslama

Abstract:

In order to derive important parameters concerning mobile subscribers (MSs) with ongoing calls in Low Earth Orbit Mobile Satellite Systems (LEO MSSs), a positioning system traditionally has to be integrated into the MSS to localize MSs and track them during the connection. Such integration is regarded as a complex implementation. In this paper, we propose a novel method, based on the advantages of the LEO MSS mobility model, which allows evaluation of the instant of the next handover of an MS even if its location is unknown. This method is used to propose a Dynamic Channel Reservation (DCR)-like scheme based on the DCR scheme previously proposed in the literature. The results presented show that the DCR-like technique yields different QoS performance than DCR: an improvement in handover blocking probability and an increase in new call blocking probability are observed for the DCR-like technique.

Keywords: cellular layout, DCR, LEO mobile satellite system, mobility model, positioning system

1860 Internet Shopping: A Study Based On Hedonic Value and Flow Theory

Authors: Pui-Lai To, E-Ping Sung

Abstract:

With the flourishing development of online shopping, an increasing number of customers see online shopping as an entertaining experience. Because the online consumer has a double identity as a shopper and an Internet user, online shopping should offer hedonic values of shopping and Internet usage. The purpose of this study is to investigate hedonic online shopping motivations from the perspectives of traditional hedonic value and flow theory. The study adopted a focus group interview method, including two online and two offline interviews. Four focus groups of shoppers consisted of online professionals, online college students, offline professionals and offline college students. The results of the study indicate that traditional hedonic values and dimensions of flow theory exist in the online shopping environment. The study indicated that online shoppers seem to appreciate being able to learn things and grow to become competitive achievers online. Comparisons of online hedonic motivations between groups are conducted. This study serves as a basis for the future growth of Internet marketing.

Keywords: Flow theory, hedonic motivation, internet shopping.

1859 Nonconforming Control Charts for Zero-Inflated Poisson Distribution

Authors: N. Katemee, T. Mayureesawan

Abstract:

This paper develops c-Charts for a Zero-Inflated Poisson (ZIP) process that is approximated by a geometric distribution with parameter p. The estimated p that fits the ZIP distribution is used to calculate the mean, median, and variance of the geometric distribution, from which the c-Chart is constructed by three different methods. For the cg-Chart, the control limits are constructed from the mean and variance of the geometric distribution. For the cmg-Chart, the mean is used to construct the control limits. The cme-Chart develops the control limits from the median and variance of the geometric distribution. The performance of the charts is assessed in terms of the Average Run Length and the Average Coverage Probability. We found that, for an in-control process, the cg-Chart is superior at low mean levels for all levels of the zero proportion. For an out-of-control process, the cmg-Chart and cme-Chart are the best for mean = 2, 3, and 4 at all parameter levels.
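
A minimal sketch of Shewhart-style 3-sigma limits built from the mean and variance of a geometric distribution, roughly the cg-Chart idea described above (the geometric parameterization used here is an assumption):

    import numpy as np

    def geometric_c_chart_limits(p):
        # Geometric distribution parameterized as the number of failures
        # before the first success: mean (1-p)/p, variance (1-p)/p**2.
        mean = (1 - p) / p
        var = (1 - p) / p ** 2
        ucl = mean + 3 * np.sqrt(var)
        lcl = max(0.0, mean - 3 * np.sqrt(var))
        return lcl, mean, ucl

    print(geometric_c_chart_limits(0.3))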

Keywords: average coverage probability, average run length, geometric distribution, zero-inflated Poisson distribution.

1858 Mobile Robot Path Planning Utilizing Probability Recursive Function

Authors: Ethar H. Khalil, Bahaa I. Kazem

Abstract:

In this work, a software simulation model is proposed for path planning of a two-driven-wheel mobile robot that can navigate in a dynamic environment with statically distributed obstacles. The work utilizes the Bezier curve method in a proposed N-order matrix form for engineering the mobile robot's path. The drawbacks of the Bezier curve in this field are diagnosed, and a two-direction (Up and Right) Probability Recursive Function (PRF) is proposed to overcome them. The PRF functionality is developed through a proposed obstacle detection function, an optimization function capable of predicting the optimum path without comparing all feasible paths, and an N-order Bezier curve function that ensures the drawing of the obtained path. The simulation results show that the mobile robot travels successfully from its starting point to its goal point. All obstacles located in its way are avoided, and this navigation is accomplished successfully using the proposed PRF technique.
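
The N-order Bezier curve used to draw the final path can be evaluated from its control points with the Bernstein basis; a minimal sketch (control points are illustrative, and the PRF itself is not reproduced) is:

    import numpy as np
    from math import comb

    def bezier(control_points, n_samples=100):
        # Evaluate an N-th order Bezier curve from its N+1 control points
        # (control_points: array-like of shape [N+1, 2]).
        pts = np.asarray(control_points, dtype=float)
        n = len(pts) - 1
        t = np.linspace(0.0, 1.0, n_samples)[:, None]
        curve = np.zeros((n_samples, pts.shape[1]))
        for i, p in enumerate(pts):
            curve += comb(n, i) * t**i * (1 - t)**(n - i) * p
        return curve

    # cubic path from start (0, 0) to goal (10, 10) bending around two waypoints
    path = bezier([(0, 0), (2, 6), (8, 4), (10, 10)])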

Keywords: Mobile robot, path planning, Bezier curve.

1857 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground

Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane

Abstract:

Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to take into account the uncertainties related to the properties of the materials used and the applied loads. However, the use of these safety factors in the design process does not assure an optimal and reliable solution and can sometimes lead to a lack of robustness of the structure. Reliability theory, based on a probabilistic formulation of structural safety, can respond in a more suitable manner. It allows the construction of a model in which uncertain data are represented by random variables, and therefore a better appreciation of safety margins through confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the concrete tank, considering the seismic acceleration as a random variable.
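
A crude Monte Carlo estimate of a failure probability has the generic form sketched below; the limit state and the distribution parameters are purely illustrative placeholders, not the tank model of the paper:

    import numpy as np

    def monte_carlo_failure_probability(limit_state, sample_inputs, n_sim=100_000, seed=0):
        # Crude Monte Carlo estimate of P(failure) = P(g(X) <= 0).
        rng = np.random.default_rng(seed)
        x = sample_inputs(rng, n_sim)
        return np.mean(limit_state(x) <= 0.0)

    def sample_inputs(rng, n):
        # seismic acceleration treated as the random variable (assumed lognormal, in g)
        return rng.lognormal(mean=np.log(0.25), sigma=0.4, size=n)

    def limit_state(pga):
        capacity = 0.45                      # assumed capacity, in g, for illustration only
        return capacity - pga

    print(monte_carlo_failure_probability(limit_state, sample_inputs))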

Keywords: Reliability approach, storage tanks, Monte Carlo simulation, seismic acceleration.

1856 The Applications of Four Fingers Theory: The Proof of 66 Acupoints under the Human Elbow and Knee

Authors: Chih-I. Tsai, Yu-Chien. Lin

Abstract:

Through clinical practice, it has been found that locations on the body four fingerbreadths above and below the joints are the points at which muscles connect to tendons. Since muscles and tendons possess opposite characteristics (muscles are full of blood but lack qi, while tendons are full of qi but lack blood), these points on the body become easily blocked. It is proposed that by applying acupuncture, or localized pressure with an elastic bandage, to the areas four fingerbreadths above and below the joints, we can help the energy, also known as qi, flow smoothly through the body and thereby improve health. Based on the Four Fingers Theory, human height is 22 four-fingerbreadths. In addition, qi and blood travel through the 24 meridians 50 times each day and flow through 6 cun with every breath, and the average number of human heartbeats is 75 per minute. The function of the qi-blood circulation system in Traditional Chinese Medicine corresponds to blood circulation in Western medical science. Informed by the Four Fingers Theory, this study further examined its applications in acupuncture practice. The research question is how the Four Fingers Theory proves what is mentioned in the Nei Jing, namely that there are 66 acupoints under a human's elbow and knee. In response to the research question, the theory confirms that there are 66 acupoints under a human's elbow and knee. The Four Fingers Theory facilitated the creation of an acupuncture naming and teaching system, and it is expected to serve as an approachable and effective way to deliver knowledge of acupuncture to the public worldwide.

Keywords: Four Fingers theory, Meridians circulation, 66 Acupoints under a human’s elbow and knee, acupuncture.

1855 Danger Theory and Intelligent Data Processing

Authors: Anjum Iqbal, Mohd Aizaini Maarof

Abstract:

The Artificial Immune System (AIS) is a relatively young paradigm for intelligent computation. The inspiration for AIS is derived from the natural Immune System (IS). Classically, it is believed that the IS strives to discriminate between self and non-self, and most of the existing AIS research is based on this approach. Danger Theory (DT) disputes this approach and proposes that the IS fights danger-producing elements and tolerates others. We, as computational researchers, are not concerned with the arguments among immunologists but try to extract from them novel abstractions for intelligent computation. This paper aims to follow the DT inspiration for intelligent data processing. The approach may introduce a new avenue in intelligent processing. The data used are system call data, which are potentially significant in intrusion detection applications.

Keywords: artificial immune system, danger theory, intelligent processing, system calls

1854 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage

Authors: Oh Hyeon Jeon, WooYoung Jung

Abstract:

In this study, seepage analysis was performed for the water level difference between the upstream and downstream sides of a weir structure in order to evaluate its safety against flooding. The Monte Carlo simulation method was employed, considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir foundation. Modeling of the weir structure was carried out using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, taking into account the uncertainty of the corresponding permeability coefficient. Subsequently, a fragility function was constructed from the numerical analysis response; these fragility results can be used to identify the weaknesses of a weir structure subjected to flooding disasters. They can also serve as reference data for comprehensively predicting the probability of failure and the degree of damage of a weir structure.
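
The shape of such an empirical fragility curve (failure probability versus water level, estimated by Monte Carlo sampling of the permeability coefficient) can be sketched as follows; the stand-in failure model and all numerical values are purely illustrative and replace the finite-element seepage run of the paper:

    import numpy as np

    def empirical_fragility(water_levels, failure_model, n_sim=1000, seed=0):
        # At each water level, the fraction of Monte Carlo trials with a
        # randomly sampled permeability that end in failure.
        rng = np.random.default_rng(seed)
        curve = []
        for h in water_levels:
            k = rng.lognormal(mean=np.log(1e-6), sigma=0.5, size=n_sim)   # assumed permeability, m/s
            curve.append(failure_model(h, k).mean())
        return np.array(curve)

    def toy_failure_model(h, k):
        # Purely illustrative stand-in for the seepage analysis: "failure"
        # when an assumed exit gradient exceeds a critical gradient of 1.
        exit_gradient = 0.15 * h * (1e-6 / k) ** 0.5
        return exit_gradient > 1.0

    levels = np.linspace(1.0, 6.0, 6)    # assumed head differences, m
    print(empirical_fragility(levels, toy_failure_model))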

Keywords: Weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo Simulation, permeability coefficient.

1853 Structural Study of Boron - Nitride Nanotube with Magnetic Resonance (NMR) Parameters Calculation via Density Functional Theory Method (DFT)

Authors: Asadollah Boshra, Ahmad Seif, Mehran Aghaei

Abstract:

A model of a (4, 4) single-walled boron-nitride nanotube, as a representative of armchair boron-nitride nanotubes, is studied. First, structure optimization is performed, and then the Nuclear Magnetic Resonance (NMR) parameters are calculated at the 11B and 15N nuclei by the Density Functional Theory (DFT) method. Evaluation of the resulting parameters reveals heterogeneity of the electrostatic environment along the nanotube, especially at the ends, whereas nuclei within the same layer feel the same electrostatic environment. All calculations were carried out using the Gaussian 98 software package.

Keywords: Boron-nitride nanotube, Density Functional Theory, Nuclear Magnetic Resonance (NMR).

1852 Embodied Cognition and Its Implications in Education: An Overview of Recent Literature

Authors: Panagiotis Kosmas, Panayiotis Zaphiris

Abstract:

Embodied Cognition (EC) as a learning paradigm is based on the idea of an inseparable link between body, mind, and environment. In recent years, the advent of theoretical learning approaches around EC theory has resulted in a number of empirical studies exploring the implementation of the theory in education. This systematic literature overview identifies the mainstream of EC research and emphasizes on the implementation of the theory across learning environments. Based on a corpus of 43 manuscripts, published between 2013 and 2017, it sets out to describe the range of topics covered under the umbrella of EC and provides a holistic view of the field. The aim of the present review is to investigate the main issues in EC research related to the various learning contexts. Particularly, the study addresses the research methods and technologies that are utilized, and it also explores the integration of body into the learning context. An important finding from the overview is the potential of the theory in different educational environments and disciplines. However, there is a lack of an explicit pedagogical framework from an educational perspective for a successful implementation in various learning contexts.

Keywords: Embodied cognition, embodied learning, education, technology, schools.

1851 Video-On-Demand QoE Evaluation across Different Age-Groups and Its Significance for Network Capacity

Authors: Mujtaba Roshan, John A. Schormans

Abstract:

Quality of Experience (QoE) drives churn in the broadband networks industry, and good QoE plays a large part in the retention of customers. QoE is known to be affected by the Quality of Service (QoS) factors of packet loss probability (PLP), delay, and delay jitter introduced by the network. Earlier results have shown that the relationship between these QoS factors and QoE is non-linear and may vary from application to application. We use the network emulator Netem as the basis for experimentation and evaluate how QoE varies as we change the emulated QoS metrics. Focusing on Video-on-Demand, we discovered that the reported QoE may differ widely between users of different age groups, and that the most demanding age group (the youngest) can require an order of magnitude lower PLP to achieve the same QoE than is required by the most widely studied age group of users. We then used a bottleneck TCP model to evaluate the capacity cost of achieving an order-of-magnitude decrease in PLP and found that (almost always) a 3-fold increase in link capacity was required.
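
One way to see the scale of that capacity cost is the standard square-root TCP throughput relation (a generic model, not necessarily the exact bottleneck model used in the paper). With N long-lived flows sharing a bottleneck of capacity C,

    C \approx N \cdot \frac{k\,\mathrm{MSS}}{\mathrm{RTT}\,\sqrt{p}}
    \quad\Longrightarrow\quad
    p \propto \frac{1}{C^{2}}
    \quad\Longrightarrow\quad
    \frac{C_{\text{new}}}{C_{\text{old}}} = \sqrt{\frac{p_{\text{old}}}{p_{\text{new}}}} = \sqrt{10} \approx 3.2,

which is consistent with the roughly 3-fold capacity increase reported above for an order-of-magnitude reduction in PLP.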

Keywords: Quality of experience, quality of service, packet loss probability, network capacity.

1850 Efficient Detection Using Sequential Probability Ratio Test in Mobile Cognitive Radio Systems

Authors: Yeon-Jea Cho, Sang-Uk Park, Won-Chul Choi, Dong-Jo Park

Abstract:

This paper proposes a smart design strategy for a sequential detector to reliably detect the primary user's signal, especially in fast-fading environments. We study the computation of the log-likelihood ratio for coping with fast-changing received signal and noise sample variances, which are considered random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when considering the fast-changing statistics of unknown parameters caused by fast-fading effects. Second, we propose an efficient sensing algorithm for performing the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared to the conventional method through simulation results with respect to the average number of samples required to reach a detection decision.
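
For context, Wald's classical sequential probability ratio test with fixed, known-statistics thresholds can be sketched as below; the paper's contribution concerns keeping such a test robust when the sample variances change quickly and are unknown, which this sketch does not address:

    import numpy as np

    def sprt(samples, llr_per_sample, alpha=0.01, beta=0.01):
        # Wald's SPRT: accumulate the log-likelihood ratio and stop at the
        # first threshold crossing (alpha, beta: target error probabilities).
        upper = np.log((1 - beta) / alpha)      # accept H1 (signal present)
        lower = np.log(beta / (1 - alpha))      # accept H0 (noise only)
        llr = 0.0
        for n, x in enumerate(samples, start=1):
            llr += llr_per_sample(x)
            if llr >= upper:
                return "H1", n
            if llr <= lower:
                return "H0", n
        return "undecided", len(samples)

    # example: detecting a mean shift 0 -> 1 in unit-variance Gaussian samples;
    # log f1(x)/f0(x) for N(1,1) vs N(0,1) is x - 0.5
    llr_gauss = lambda x: x - 0.5
    rng = np.random.default_rng(0)
    print(sprt(rng.normal(1.0, 1.0, 200), llr_gauss))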

Keywords: Cognitive radio, fast fading, sequential detection, spectrum sensing.

1849 Net-Banking System as a Game

Authors: N. Ghoualmi-Zine, A. Araar

Abstract:

In this article, we propose to model a net-banking system using game theory. We adopt an extensive-form game to model our web application. We present the model in terms of players and strategies, and we present a UML diagram related to the game protocol.

Keywords: Game theory, model, state, web application.

1848 Non-equilibrium Statistical Mechanics of a Driven Lattice Gas Model: Probability Function, FDT-violation, and Monte Carlo Simulations

Authors: K. Sudprasert, M. Precharattana, N. Nuttavut, D. Triampo, B. Pattanasiri, Y. Lenbury, W. Triampo

Abstract:

The study of non-equilibrium systems has attracted increasing interest in recent years, mainly because, unlike their equilibrium counterparts, they lack a general theoretical framework. Studying the steady state and/or simple systems is thus one of the main avenues of interest. Hence, in this work we have focused our attention on the driven lattice gas (DLG) model, consisting of interacting particles subject to an external field E. The dynamics of the system are given by the hopping of particles to nearby empty sites, with rates biased for jumps in the direction of E. Using small two-dimensional DLG systems, the stochastic properties at the non-equilibrium steady state were studied analytically. To understand the non-equilibrium phenomena, we applied an analytic approach via the master equation to calculate the probability function and to analyze the violation of detailed balance in terms of the fluctuation-dissipation theorem. Monte Carlo simulations have been performed to validate the analytic results.
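
A stripped-down Monte Carlo sweep of a driven lattice gas, with hard-core exclusion and hopping biased along the field direction but with the inter-particle interactions omitted for brevity, can be sketched as follows (an illustrative simplification, not the simulation used in the paper):

    import numpy as np

    def driven_lattice_gas_sweep(lattice, bias=0.8, rng=None):
        # One Monte Carlo sweep on a periodic 2D lattice: pick random sites,
        # and move a particle to a randomly chosen empty neighbour with
        # probabilities biased along +x by the external field E.
        rng = rng or np.random.default_rng()
        L = lattice.shape[0]
        moves = {(1, 0): (1 + bias) / 4, (-1, 0): (1 - bias) / 4,
                 (0, 1): 1 / 4, (0, -1): 1 / 4}
        dirs, probs = zip(*moves.items())
        probs = np.array(probs) / sum(probs)
        for _ in range(L * L):
            x, y = rng.integers(0, L, 2)
            if lattice[x, y] == 0:
                continue
            dx, dy = dirs[rng.choice(len(dirs), p=probs)]
            nx, ny = (x + dx) % L, (y + dy) % L
            if lattice[nx, ny] == 0:          # hard-core exclusion
                lattice[x, y], lattice[nx, ny] = 0, 1
        return lattice

    lat = (np.random.default_rng(0).random((16, 16)) < 0.5).astype(int)
    for _ in range(100):
        driven_lattice_gas_sweep(lat)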

Keywords: Non-equilibrium, lattice gas, stochastic process

1847 The Story of Mergers and Acquisitions: Using Narrative Theory to Understand the Uncertainty of Organizational Change

Authors: Philip T. Roundy

Abstract:

This paper examines the influence of communication form on employee uncertainty during mergers and acquisitions (M&As). Specifically, the author uses narrative theory to analyze how narrative organizational communication affects the three components of uncertainty – decreased predictive, explanatory, and descriptive ability. It is hypothesized that employees whose organizations use narrative M&A communication will have greater predictive, explanatory, and descriptive abilities than employees of organizations using non-narrative M&A communication. This paper contributes to the stream of research examining uncertainty during mergers and acquisitions and argues that narratives are an effective means of managing uncertainty in the mergers and acquisitions context.

Keywords: Narrative Theory, Mergers and Acquisitions, Employee Uncertainty.
