Search results for: first passage probability theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5901


5541 S-N-Pf Relationship for Steel Fibre Reinforced Concrete Made with Cement Additives

Authors: Gurbir Kaur, Surinder Pal Singh

Abstract:

The present study is part of a research programme on the effect of limestone powder (LP), silica fume (SF) and metakaolin (MK) on the flexural fatigue performance of steel fibre reinforced concrete (SFRC). Corrugated rectangular steel fibres of size 0.6×2.0×35 mm at a constant volume fraction of 1.0% were incorporated in all mix combinations as the reinforcing material. Three mix combinations were prepared by replacing 30% of ordinary Portland cement (OPC) by weight with these cement additives, in binary and ternary fashion, to demonstrate their contribution. An experimental programme was conducted to obtain the fatigue lives of all mix combinations at various stress levels. The fatigue life data have been analysed in an attempt to determine the relationship between the stress level ‘S’, the number of cycles to failure ‘N’ and the probability of failure ‘Pf’ for all mix combinations. The experimental coefficients of the fatigue equation have also been obtained from the fatigue data to represent the S-N-Pf curves analytically.
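The S-N-Pf analysis described above can be sketched numerically. The snippet below fits a two-parameter Weibull distribution to fatigue lives at a single stress level via median ranks and a linearised least-squares fit; the fatigue lives and the choice of the Weibull form are illustrative assumptions, not the study's data or method.

```python
import math

# Hypothetical fatigue lives (cycles) at one stress level; illustrative only.
lives = sorted([14500, 21000, 33000, 47000, 81000])

# Median-rank estimate of the failure probability Pf for each ordered life.
n = len(lives)
pf = [(i - 0.3) / (n + 0.4) for i in range(1, n + 1)]

# Linearised two-parameter Weibull fit: ln(-ln(1 - Pf)) = m*ln(N) - m*ln(Nc)
x = [math.log(N) for N in lives]
y = [math.log(-math.log(1.0 - p)) for p in pf]
mx = sum(x) / n
my = sum(y) / n
m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
Nc = math.exp(mx - my / m)   # characteristic life at this stress level

def pf_at(N_cycles):
    """Probability of failure before N_cycles (Weibull CDF)."""
    return 1.0 - math.exp(-((N_cycles / Nc) ** m))
```

Repeating the fit at each tested stress level S yields the family of S-N-Pf curves analytically.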

Keywords: cement additives, fatigue life, probability of failure, steel fibre reinforced concrete

Procedia PDF Downloads 411
5540 Diagnostic Yield of CTPA and the Value of Pre-Test Assessment in Predicting the Probability of Pulmonary Embolism

Authors: Shanza Akram, Sameen Toor, Heba Harb Abu Alkass, Zainab Abdulsalam Altaha, Sara Taha Abdulla, Saleem Imran

Abstract:

Acute pulmonary embolism (PE) is a common disease and can be fatal. The clinical presentation is variable and nonspecific, making accurate diagnosis difficult. Testing of patients with suspected acute PE has increased dramatically. However, the overuse of some tests, particularly CT and D-dimer measurement, may not improve care while potentially leading to patient harm and unnecessary expense. CTPA is the investigation of choice for PE. Its easy availability, accuracy and ability to provide an alternative diagnosis have lowered the threshold for performing it, resulting in its overuse. Guidelines recommend the use of clinical pretest probability tools such as the Wells score to assess the risk of suspected PE. Unfortunately, implementation of guidelines in clinical practice is inconsistent, which has led to low-risk patients being subjected to unnecessary imaging, exposure to radiation and possible contrast-related complications. Aim: To study the diagnostic yield of CTPA and the clinical pretest probability of patients according to the Wells score, and to determine whether CTPA is overused in our service. Methods: CT scans done on patients with suspected PE in our hospital from 1st January 2014 to 31st December 2014 were retrospectively reviewed. Medical records were reviewed to study demographics, clinical presentation and final diagnosis, and to establish whether the Wells score and D-dimer were used correctly in predicting the probability of PE and the need for subsequent CTPA. Results: 100 patients (51 male) underwent CTPA in the time period. Mean age was 57 years (range 24-91 years). The majority of patients presented with shortness of breath (52%). Other presenting symptoms included chest pain (34%), palpitations (6%), collapse (5%) and haemoptysis (5%). A D-dimer test was done in 69%. The Wells score was low (<2) in 28%, moderate (2-6) in 47% and high (>6) in 15% of patients. The Wells score was documented in the medical notes of only 20% of patients.
PE was confirmed in 12% (8 male) of patients; 4 had bilateral PEs. In the high-risk group (Wells >6) (n=15), there were 5 diagnosed PEs; in the moderate-risk group (Wells 2-6) (n=47), there were 6; and in the low-risk group (Wells <2) (n=28), one case of PE was confirmed. CT scans negative for PE showed pleural effusion in 30, consolidation in 20, atelectasis in 15 and a pulmonary nodule in 4 patients; 31 scans were completely normal. Conclusion: The yield of CTPA for pulmonary embolism was low in our cohort at 12%. A significant number of our patients who underwent CTPA had a low Wells score, suggesting that CTPA is overutilized in our institution. The Wells score was poorly documented in medical notes. CTPA was able to detect alternative pulmonary abnormalities explaining the patients' clinical presentation. CTPA requires concomitant pretest clinical probability assessment to be an effective diagnostic tool for confirming or excluding PE. Clinicians should use validated clinical prediction rules to estimate pretest probability in patients in whom acute PE is being considered. Combining Wells scores with clinical and laboratory assessment may reduce the need for CTPA.
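The three-tier Wells stratification used in the audit above can be encoded directly. The item weights below follow the commonly published Wells criteria for PE, but they are reproduced here only as an illustrative sketch and should be checked against the validated rule before any clinical use.

```python
# Commonly published Wells criteria weights for PE (illustrative; verify
# against the validated clinical rule before use).
WELLS_ITEMS = {
    "clinical_signs_dvt": 3.0,
    "pe_most_likely_dx": 3.0,
    "heart_rate_gt_100": 1.5,
    "immobilisation_or_surgery": 1.5,
    "previous_dvt_pe": 1.5,
    "haemoptysis": 1.0,
    "malignancy": 1.0,
}

def wells_category(findings):
    """Return (score, band) using the three-tier cut-offs in the abstract:
    low < 2, moderate 2-6, high > 6."""
    score = sum(WELLS_ITEMS[k] for k in findings)
    if score < 2:
        band = "low"
    elif score <= 6:
        band = "moderate"
    else:
        band = "high"
    return score, band
```

A patient with tachycardia and haemoptysis, for example, scores 2.5 and falls in the moderate band, so a D-dimer result would typically be weighed before requesting CTPA.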

Keywords: CTPA, D-dimer, pulmonary embolism, Wells score

Procedia PDF Downloads 225
5539 Probability Fuzzy Aggregation Operators in Vehicle Routing Problem

Authors: Anna Sikharulidze, Gia Sirbiladze

Abstract:

For the evaluation of unreliability levels of movement on closed routes in the vehicle routing problem, a family of fuzzy operators is constructed. The interactions between routing factors under extreme road conditions are considered, and a multi-criteria decision-making (MCDM) model is constructed. The constructed aggregations are based on the Choquet integral and the associated probability class of a fuzzy measure. Propositions on the correctness of the extension are proved. Connections between the operators and compositions of dual triangular norms are described, and the conjugate connections between the constructed operators are shown. The operators reflect interactions among all combinations of the factors in the fuzzy MCDM process. Several variants of the constructed operators are used in a decision-making problem regarding the assessment of unreliability and possibility levels of movement on closed routes.

Keywords: vehicle routing problem, associated probabilities of a fuzzy measure, Choquet integral, fuzzy aggregation operator

Procedia PDF Downloads 322
5538 Downtime Modelling for the Post-Earthquake Building Assessment Phase

Authors: S. Khakurel, R. P. Dhakal, T. Z. Yeow

Abstract:

Downtime is one of the major sources (alongside damage and injury/death) of financial loss incurred by a structure in an earthquake. The length of downtime associated with a building after an earthquake varies depending on the time taken for the reaction (to the earthquake), decision (on the future course of action) and execution (of the decided course of action) phases. Post-earthquake assessment of buildings is a key step in deciding the appropriate safety placarding, as well as whether a damaged building is to be repaired or demolished. The aim of the present study is to develop a model to quantify the downtime associated with the post-earthquake building-assessment phase in terms of two parameters: i) the duration of each assessment phase; and ii) the probability of each tag colour. Post-earthquake assessment of buildings includes three stages: a Level 1 Rapid Assessment, comprising a fast external inspection shortly after the earthquake; a Level 2 Rapid Assessment, including a visit inside the building; and a Detailed Engineering Evaluation (if needed). In this study, the durations of all three assessment phases are first estimated from the total number of damaged buildings, the total number of available engineers and the average time needed to assess each building. Then, the probability of each tag colour is computed from the 2010-11 Canterbury Earthquake Sequence database. Finally, a downtime model for the post-earthquake building inspection phase is proposed based on the estimated phase lengths and tag-colour probabilities. This model is expected to be used for rapid estimation of seismic downtime within the Loss Optimisation Seismic Design (LOSD) framework.
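The phase-length estimate described above can be sketched as a back-of-envelope calculation: duration equals the number of buildings divided by the assessment throughput of the engineering workforce. All numbers and the tag-colour probabilities below are illustrative assumptions; the study derives its probabilities from the Canterbury Earthquake Sequence database.

```python
def phase_duration_days(n_buildings, n_engineers, buildings_per_engineer_day):
    """Days needed for one assessment phase, assuming uniform throughput."""
    return n_buildings / (n_engineers * buildings_per_engineer_day)

# Illustrative tag-colour probabilities (green/yellow/red); not the study's values.
TAG_PROB = {"green": 0.7, "yellow": 0.2, "red": 0.1}

def expected_assessment_downtime(n_buildings, n_engineers, rate_l1, rate_l2):
    """Level 1 rapid assessment of every building, then Level 2 for the
    share that was not green-tagged."""
    t1 = phase_duration_days(n_buildings, n_engineers, rate_l1)
    t2 = phase_duration_days(n_buildings * (1 - TAG_PROB["green"]),
                             n_engineers, rate_l2)
    return t1 + t2
```

With 10,000 damaged buildings, 100 engineers, and Level 1/Level 2 rates of 20 and 5 buildings per engineer-day, the two inspection phases together take about 11 days under these assumptions.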

Keywords: assessment, downtime, LOSD, Loss Optimisation Seismic Design, phase length, tag colour

Procedia PDF Downloads 177
5537 Trajectories of Conduct Problems and Cumulative Risk from Early Childhood to Adolescence

Authors: Leslie M. Gutman

Abstract:

Conduct problems (CP) represent a major dilemma, with wide-ranging and long-lasting individual and societal impacts. Children experience heterogeneous patterns of conduct problems, which differ in age of onset, developmental course and associated risk factors from around age 3. Early childhood therefore represents a potential window for intervention efforts aimed at changing the trajectory of early-starting conduct problems. Using the UK Millennium Cohort Study (n = 17,206 children), this study (a) identifies trajectories of conduct problems from ages 3 to 14 years and (b) assesses the cumulative and interactive effects of individual, family and socioeconomic risk factors from 9 months to 14 years. Risk factors were assessed in three domains: child (i.e., low verbal ability, hyperactivity/inattention, peer problems, emotional problems), family (i.e., single-parent family, poor parental physical and mental health, large family size) and socioeconomic (i.e., low family income, low parental education, unemployment, social housing). A cumulative risk score for each of the child, family and socioeconomic domains at each age was calculated, and it was examined how these cumulative risk scores explain variation in the trajectories of conduct problems. Lastly, interactive effects among the different domains of cumulative risk were tested. Using group-based trajectory modelling, four distinct trajectories were found: a ‘low’ problem group and three groups showing childhood-onset conduct problems: ‘school-age onset’; ‘early-onset, desisting’; and ‘early-onset, persisting’. The ‘low’ group (57% of the sample) showed a low probability of conduct problems, close to zero, from 3 to 14 years. The ‘early-onset, desisting’ group (23% of the sample) showed a moderate probability of conduct problems in early childhood, declining from 3 to 5 years, with a low probability thereafter.
The ‘early-onset, persisting’ group (8%) followed a high probability of conduct problems, which declined from 11 years but was still close to 70% at 14 years. The ‘school-age onset’ group (12% of the sample) showed a moderate probability of conduct problems at 3 and 5 years, with a sharp increase by 7 years, rising to 50% at 14 years. In terms of individual risk, all factors increased the likelihood of being in the childhood-onset groups compared to the ‘low’ group. For cumulative risk, the socioeconomic domain at 9 months and 3 years, the family domain at all ages except 14 years, and the child domain at all ages differentiated the childhood-onset groups from the ‘low’ group. Cumulative risk at 9 months and 3 years did not differentiate the ‘school-age onset’ group from the ‘low’ group. Significant interactions were found between the domains for the ‘early-onset, desisting’ group, suggesting that low risk in one domain may buffer the effects of high risk in another. The implications of these findings for preventive interventions are highlighted.

Keywords: conduct problems, cumulative risk, developmental trajectories, early childhood, adolescence

Procedia PDF Downloads 247
5536 Isolation and Culture of Keratinocytes and Fibroblasts to Develop Artificial Skin Equivalent in Cats

Authors: Lavrentiadou S. N., Angelou V., Chatzimisios K., Papazoglou L.

Abstract:

The aim of this study was the isolation and culture of keratinocytes and fibroblasts from feline skin, with the ultimate goal of creating an artificially engineered skin (comprising dermis and epidermis) for the effective treatment of large cutaneous deficits in cats. Epidermal keratinocytes and dermal fibroblasts were freshly isolated from skin biopsies, obtained with an 8 mm biopsy punch, from 8 healthy cats that had undergone ovariohysterectomy; the owners’ consent was obtained. All cats had a complete blood count and serum biochemical analysis and were screened for feline leukaemia virus (FeLV) and feline immunodeficiency virus (FIV) preoperatively. The samples were cut into small pieces and incubated with collagenase (2 mg/ml) for 5-6 hours. Following digestion, cutaneous cells were filtered through a 100 μm cell strainer, washed with DMEM, and grown in DMEM supplemented with 10% FBS. The undigested epidermis was washed with DMEM and incubated with 0.05% trypsin/0.02% EDTA (TE) solution. Keratinocytes recovered in the TE solution were filtered through 100 μm and 40 μm cell strainers and, following washing, were grown on a collagen type I matrix in DMEM:F12 (3:1) medium supplemented with 10% FBS, 1 μM hydrocortisone, 1 μM isoproterenol and 0.1 μM insulin. Both fibroblasts and keratinocytes were grown in a humidified atmosphere with 5% CO2 at 37°C. The medium was changed twice a week and cells were cultured up to passage 4. Cells were grown to 70-85% confluency, at which point they were trypsinized and subcultured at a 1:4 dilution. The majority of the cells in each passage were transferred to a freezing medium and stored at -80°C. Fibroblasts were frozen in DMEM supplemented with 30% FBS and 10% DMSO, whereas keratinocytes were frozen in complete keratinocyte growth medium supplemented with 10% DMSO. Both cell types were thawed and successfully grown as described above.

Therefore, a bank of fibroblasts and keratinocytes can be created, from which cells can be recovered for further culture and used for the generation of a skin equivalent in vitro. In conclusion, cutaneous cell isolation, culture and expansion were successfully established. To the authors’ best knowledge, this is the first study reporting the isolation and culture of keratinocytes and fibroblasts from feline skin. However, these are preliminary results, and the development of autologous engineered feline skin is still in progress.

Keywords: cat, fibroblasts, keratinocytes, skin equivalent, wound

Procedia PDF Downloads 100
5535 A Semi-Markov Chain-Based Model for the Prediction of Deterioration of Concrete Bridges in Quebec

Authors: Eslam Mohammed Abdelkader, Mohamed Marzouk, Tarek Zayed

Abstract:

Infrastructure systems are crucial to every aspect of life on Earth. Existing infrastructure is subject to degradation, while demands are growing for better infrastructure in response to high standards of safety, health, population growth, and environmental protection. Bridges play a crucial role in urban transportation networks, and they are subjected to high levels of deterioration because of variable traffic loading, extreme weather conditions, cycles of freeze and thaw, etc. The development of Bridge Management Systems (BMSs) has become a fundamental imperative, especially in large transportation networks, due to the huge gap between the need for maintenance actions and the funds available to perform them. Deterioration models are a very important component of an effective BMS. This paper presents a probabilistic, time-based model capable of predicting the condition ratings of concrete bridge decks along their service life. The deterioration process of the concrete bridge decks is modelled as a semi-Markov process. One of the main challenges of Markov chain models is the construction of the transition probability matrix; the proposed model overcomes this issue by modelling the sojourn times with probability density functions. The sojourn times of each condition state are fitted to probability density functions based on goodness-of-fit tests such as the Kolmogorov-Smirnov, Anderson-Darling, and chi-squared tests. The parameters of the probability density functions are obtained using maximum likelihood estimation (MLE). Condition ratings obtained from the Ministry of Transportation in Quebec (MTQ) are utilized as a database to construct the deterioration model. Finally, a comparison is conducted between the Markov chain and semi-Markov chain models to select the most feasible prediction model.
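The semi-Markov machinery described above can be sketched in a few lines: fit a sojourn-time distribution per condition state by MLE, then simulate the deck's condition through time. Exponential sojourn times (whose MLE rate is simply the reciprocal of the sample mean) and the sample values are illustrative assumptions; the study fits several candidate densities and screens them with Kolmogorov-Smirnov, Anderson-Darling and chi-squared tests.

```python
import random

# Condition states run from 6 (best) down to 1 (worst); the deck leaves each
# state after a random sojourn time drawn from the fitted distribution.

def mle_exponential_rate(sojourn_samples):
    """Maximum-likelihood rate of an exponential: 1 / sample mean."""
    return len(sojourn_samples) / sum(sojourn_samples)

def simulate_condition(years, rates, state=6, seed=0):
    """Return the condition state reached after `years`, given per-state
    exit rates (a dict keyed by state)."""
    rng = random.Random(seed)
    t = 0.0
    while state > 1:
        t += rng.expovariate(rates[state])  # sojourn time in current state
        if t > years:
            break
        state -= 1
    return state

# Illustrative sojourn-time samples (years observed in each state).
rates = {s: mle_exponential_rate([8.0, 10.0, 12.0]) for s in range(2, 7)}
```

Repeating the simulation many times yields the distribution of condition ratings at any age, which is what the transition probability matrix of a pure Markov chain would otherwise have to encode.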

Keywords: bridge management system, bridge decks, deterioration model, semi-Markov chain, sojourn times, maximum likelihood estimation

Procedia PDF Downloads 204
5534 Semantics of the Word “Nas” in Verse 24 of Surah Al-Baqarah Based on Izutsu’s Semantic Field Theory

Authors: Seyedeh Khadijeh Mirbazel, Masoumeh Arjmandi

Abstract:

Semantics is a linguistic approach and a scientific stream, and like all scientific streams, it is dynamic. The study of meaning is carried out over the broad semantic collections of words that form a discourse; in other words, meaning is not something found in a single word, but rather something formed through a process that takes place in the discourse as a whole. One contemporary semantic theory is Izutsu's semantic field theory, according to which the discovery of meaning depends on the function of words and takes place within the context of the language. The purpose of this research is to identify the meaning of the word "Nas" in the discourse of verse 24 of Surah Al-Baqarah, which introduces "Nas" as the firewood of hell, although translators have rendered it as "people". The present research investigates the semantic structure of the word "Nas" using the aforementioned theory through a descriptive-analytical method. By matching the semantic fields of the Quranic word "Nas", the research concludes that "Nas" refers to those persons who have forgotten God and His covenant in believing in His Oneness. For this reason, God called them "Nas (the forgetful)", the imperfect participle of the triliteral Arabic root /næsiwoɔn/, which means "to forget". Therefore, the intended meaning of "Nas" in the verses containing the word is not equivalent to "people" as a general noun.

Keywords: Nas, people, semantics, semantic field theory

Procedia PDF Downloads 183
5533 Continuous Wave Interference Effects on Global Positioning System Signal Quality

Authors: Fang Ye, Han Yu, Yibing Li

Abstract:

Radio interference is one of the major concerns in using the Global Positioning System (GPS) for civilian and military applications. Interference signals are produced not only by electronic systems but also by illegal jammers. Among the different types of interference, continuous wave (CW) interference has a strong adverse impact on the quality of the received signal. In this paper, we present a more detailed analysis of CW interference effects on GPS signal quality. Based on the C/A code spectrum lines, the influence of CW interference on the acquisition performance of GPS receivers is analysed, and this influence is supported by simulation results using a GPS software receiver. As the most important user-facing performance parameter of GPS receivers, the mathematical expression for the bit error probability is also derived in the presence of CW interference, and the expression is consistent with Monte Carlo simulation results. This research on CW interference provides a theoretical basis and new ideas for monitoring the radio noise environment and improving the anti-jamming ability of GPS receivers.
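The Monte Carlo comparison mentioned above can be sketched for a generic BPSK link: each transmitted bit is corrupted by Gaussian noise plus a CW interferer of fixed amplitude and random phase, and the error rate is counted. This is a simplified sketch, not the paper's GPS C/A-code model; the SNR and interference-to-signal ratio (ISR) values are illustrative.

```python
import math
import random

def ber_bpsk_cw(snr_db, isr_db, n_bits=200_000, seed=1):
    """Monte Carlo bit error rate for BPSK with additive Gaussian noise plus
    a CW interferer of random phase (illustrative sketch)."""
    rng = random.Random(seed)
    # Noise std for unit-energy bits: Eb/N0 = 1 / (2 * sigma^2)
    sigma = math.sqrt(1.0 / (2 * 10 ** (snr_db / 10)))
    a = math.sqrt(10 ** (isr_db / 10))   # CW amplitude relative to the signal
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice((-1.0, 1.0))
        cw = a * math.cos(rng.uniform(0.0, 2 * math.pi))  # random-phase CW term
        r = bit + cw + rng.gauss(0.0, sigma)
        if (r >= 0) != (bit > 0):
            errors += 1
    return errors / n_bits
```

Raising the ISR visibly degrades the error rate at a fixed SNR, which is the qualitative effect the closed-form expression in the paper quantifies.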

Keywords: GPS, CW interference, acquisition performance, bit error probability, Monte Carlo

Procedia PDF Downloads 256
5532 Probabilistic and Stochastic Analysis of a Retaining Wall for C-Φ Soil Backfill

Authors: André Luís Brasil Cavalcante, Juan Felix Rodriguez Rebolledo, Lucas Parreira de Faria Borges

Abstract:

A methodology for the probabilistic analysis of the active earth pressure on a retaining wall with c-Φ soil backfill is described in this paper. The Rosenblueth point estimate method is used to estimate the failure probability of a gravity retaining wall. The basic principle of this methodology is to use two point estimates per variable, i.e., the mean value plus and minus the standard deviation, in the safety analysis. The simplicity of this framework assures its wide application; the calculation requires 2ⁿ evaluations when the system is governed by n variables. In this study, a probabilistic model based on the Rosenblueth approach for computing the overturning probability of failure of a retaining wall is presented. The results obtained show the advantages of this kind of model in comparison with the deterministic solution: in a relatively easy way, the uncertainty in the wall and fill parameters is taken into account, and practical results can be obtained for the design of the retaining structure.
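Rosenblueth's two-point estimate method described above can be sketched generically: evaluate the performance function at every mean-plus/minus-standard-deviation corner (2ⁿ points), then recover the first two moments of the response. The performance function and the statistics of the three variables below are illustrative assumptions, not the paper's wall model.

```python
import itertools

def rosenblueth(g, means, stds):
    """Mean and std of g(x1..xn) from 2**n evaluations at mean +/- std
    (Rosenblueth's two-point estimate, equal weights)."""
    n = len(means)
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        vals.append(g(*x))
    m1 = sum(vals) / len(vals)                 # first moment
    m2 = sum(v * v for v in vals) / len(vals)  # second moment
    return m1, (m2 - m1 * m1) ** 0.5

def safety_factor(phi, c, gamma):
    """Toy overturning safety factor, monotone in all three variables
    (hypothetical stand-in for the wall's performance function)."""
    return (0.05 * phi + 0.02 * c) * 18.0 / gamma

mean_fs, std_fs = rosenblueth(safety_factor,
                              means=[30.0, 10.0, 18.0],  # phi, c, gamma
                              stds=[3.0, 2.0, 1.0])
```

With the response mean and standard deviation in hand, an assumed distribution of the safety factor yields the probability that it falls below one, i.e., the overturning failure probability.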

Keywords: retaining wall, active earth pressure, backfill, probabilistic analysis

Procedia PDF Downloads 411
5531 Application of Granular Computing Paradigm in Knowledge Induction

Authors: Iftikhar U. Sikder

Abstract:

This paper illustrates an application of the granular computing approach, namely rough set theory, in data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinnings of rough set theory, which has been widely used by the data mining and machine learning communities. A real-world application is illustrated, and the classification performance is compared with that of other contending machine learning algorithms. The predictive performance of the rough set rule induction model shows comparable success with respect to the other contending algorithms.
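The core rough-set construction behind the rule induction mentioned above, lower and upper approximations of a target concept under the indiscernibility relation induced by condition attributes, can be sketched in a few lines. The toy records are invented for illustration.

```python
# Toy decision table: object id -> condition attribute values.
records = {
    1: ("clay", "steep"), 2: ("clay", "steep"),
    3: ("sand", "flat"), 4: ("sand", "flat"), 5: ("clay", "flat"),
}
target = {1, 2, 3}   # objects known to belong to the concept

# Equivalence classes of the indiscernibility relation: objects with
# identical attribute values are indistinguishable.
classes = {}
for obj, attrs in records.items():
    classes.setdefault(attrs, set()).add(obj)

# Lower approximation: classes entirely inside the target (certain members).
lower = set().union(*([c for c in classes.values() if c <= target] or [set()]))
# Upper approximation: classes overlapping the target (possible members).
upper = set().union(*([c for c in classes.values() if c & target] or [set()]))
```

The boundary region (upper minus lower) contains objects the available attributes cannot classify with certainty; decision rules are induced only from the certain and possible regions.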

Keywords: concept approximation, granular computing, reducts, rough set theory, rule induction

Procedia PDF Downloads 524
5530 Conflation Methodology Applied to Flood Recovery

Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong

Abstract:

Current flood risk modelling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding, and its long-term effects on communities, are not typically included in risk assessments. An approach was developed that combines the probability of recovering from a severe flooding event with the probability of community performance during a nuisance event. The consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the contribution of the variation of each independent input and generates a weighted output that favours the distribution with minimum variation; this approach is especially useful when the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution lies between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance, or than averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by the individual distribution variances.
When dealing with exponential distributions, such as resilience to severe flooding events and to nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. Combining severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as those in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
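The conflation of two densities, a normalised product of the individual PDFs, takes an especially simple form in the exponential case mentioned above: the product of two exponential densities is again exponential, with the rates summed. The recovery-time values below are illustrative assumptions, not the study's data.

```python
import math

def exp_pdf(x, rate):
    """Exponential density with the given rate."""
    return rate * math.exp(-rate * x) if x >= 0 else 0.0

def conflate_exponentials(r1, r2):
    """Rate of the conflated (product-normalised) density of two
    exponentials: the product exp(-r1*x)*exp(-r2*x) renormalises to an
    exponential with rate r1 + r2."""
    return r1 + r2

severe_rate = 1 / 52.0    # illustrative: mean 52 weeks to recover from a severe event
nuisance_rate = 1 / 2.0   # illustrative: mean 2 weeks to recover from nuisance flooding
conflated_mean = 1.0 / conflate_exponentials(severe_rate, nuisance_rate)
```

Because the conflated rate is the sum of the parent rates, the conflated mean recovery time sits below both parent means, reflecting the weighting toward the lower-variance input.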

Keywords: community resilience, conflation, flood risk, nuisance flooding

Procedia PDF Downloads 96
5529 Stochastic Prioritization of Dependent Actuarial Risks: Preferences among Prospects

Authors: Ezgi Nevruz, Kasirga Yildirak, Ashis SenGupta

Abstract:

Comparing or ranking risks is the main motivating factor behind the human trait of making choices. Cumulative prospect theory (CPT) is a preference-theory approach that evaluates perception and bias in decision making under risk and uncertainty. We aim to investigate the aggregate claims of different risk classes in terms of their comparability and amenability to ordering when the impact of risk perception is considered. To this end, we prioritize aggregate claims, taken as actuarial risks, using stochastic ordering relations such as stochastic dominance and stop-loss dominance, proposed in the framework of partial order theory. We take into account the dependency of individual claims exposed to similar environmental risks. First, we modify the zero-utility premium principle in order to obtain a solution for the stop-loss premium under CPT. Then, we propose a stochastic stop-loss dominance of the aggregate claims and find a relation between stop-loss dominance and first-order stochastic dominance under the dependence assumption, using properties of familiar as well as some emerging multivariate claim distributions.
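An empirical version of the first-order stochastic dominance relation used above can be sketched directly: X dominates Y in the first order when the empirical CDF of X lies at or below that of Y everywhere (X tends to produce larger claims). The samples are illustrative.

```python
def fsd_dominates(x, y):
    """True if sample x first-order stochastically dominates sample y,
    i.e. F_X(t) <= F_Y(t) at every observed value t."""
    grid = sorted(set(x) | set(y))

    def ecdf(sample, t):
        return sum(1 for v in sample if v <= t) / len(sample)

    return all(ecdf(x, t) <= ecdf(y, t) for t in grid)
```

Stop-loss dominance is the weaker, integrated counterpart of this check: it compares expected shortfalls E[(X - d)+] over retentions d rather than pointwise CDF values, which is why first-order dominance implies stop-loss dominance but not conversely.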

Keywords: cumulative prospect theory, partial order theory, risk perception, stochastic dominance, stop-loss dominance

Procedia PDF Downloads 317
5528 Vibration Analysis of Stepped Nanoarches with Defects

Authors: Jaan Lellep, Shahid Mubasshar

Abstract:

A numerical solution is developed for simply supported nanoarches based on the non-local theory of elasticity. The nanoarch under consideration has a step-wise variable cross-section and is weakened by crack-like defects. It is assumed that the cracks are stationary and that the mechanical behaviour of the nanoarch can be modelled by Eringen's non-local theory of elasticity. Physical and thermal properties are sensitive to changes of dimensions at the nano level, and the classical theory of elasticity is unable to describe such changes in material properties, because it was developed without regard to the molecular structure of matter. Therefore, the non-local theory of elasticity is applied to study the vibration of nanostructures, and it has been accepted by many researchers. In the non-local theory of elasticity, the stress state of the body at a given point is assumed to depend on the stress state at every point of the structure, whereas in the classical theory it depends only on the given point. The system of main equations consists of equilibrium equations, geometrical relations and constitutive equations, with boundary and intermediate conditions. The system of equations is solved using the method of separation of variables; consequently, the governing differential equations are converted into a system of algebraic equations whose non-trivial solution exists if the determinant of the coefficient matrix vanishes. The influence of cracks and steps on the natural vibration of the nanoarches is described with the aid of an additional local compliance at the weakened cross-section. An algorithm to determine the eigenfrequencies of the nanoarches is implemented in software, and the effects of various physical and geometrical parameters are recorded and presented graphically.

Keywords: crack, nanoarches, natural frequency, step

Procedia PDF Downloads 126
5527 U.S. Supreme Court Decision-Making and Bounded Rationality

Authors: Joseph Ignagni, Rebecca Deen

Abstract:

In this study, the decision making of the Justices of the United States Supreme Court is considered in terms of constrained maximization and cognitive-cybernetic theory. The paper integrates research from fields including law, psychology, political science, economics and decision-making theory. It is argued that, due to its heavy workload, the Supreme Court may be forced to make decisions in a boundedly rational manner. The ideas and theory put forward here are examined in the area of the Court's decisions involving religion; therefore, cases involving the U.S. Constitution's Free Exercise Clause and Establishment Clause are analysed.

Keywords: bounded rationality, cognitive-cybernetic, U.S. Supreme Court, religion

Procedia PDF Downloads 379
5526 Applying Bowen’s Theory to Intern Supervision

Authors: Jeff A. Tysinger, Dawn P. Tysinger

Abstract:

The aim of this paper is to apply Bowen's understanding of triangulation and triads to school psychology intern supervision, in order to assist in conceptualizing the dynamics of intern supervision and to provide some key methods for addressing common issues. The school psychology internship is the capstone experience for the school psychologist in training. It involves three key participants whose relationships will determine the success of the internship. To understand the potential effects, Bowen's family systems theory can be applied to the supervision relationship. Bowen describes how stress between two people is resolved by triangulating, or bringing in, a third person. He applies this to the nuclear family, but school psychology intern supervision likewise brings together an intern, a field supervisor, and a university supervisor, setting all three up for possible triangulation. The consequences of triangulation can apply to standards and requirements, direct supervision, and intern evaluation. Strategies from family systems theory to decrease the negative impact of supervision triangulation are presented.

Keywords: family systems theory, intern supervision, school psychology training, triangulation

Procedia PDF Downloads 119
5525 Analytical Downlink Effective SINR Evaluation in LTE Networks

Authors: Marwane Ben Hcine, Ridha Bouallegue

Abstract:

The aim of this work is to provide an original analytical framework for downlink effective SINR evaluation in LTE networks. The classical single-carrier SINR performance evaluation is extended to multi-carrier systems operating over frequency-selective channels. The extension is achieved by expressing the link outage probability in terms of the statistics of the effective SINR. For effective SINR computation, the exponential effective SINR mapping (EESM) method is used in this work. A closed-form expression for the link outage probability is obtained by assuming a log skew normal approximation for the single-carrier case. We then rely on the lognormal approximation to express the exponential effective SINR distribution as a function of the mean and standard deviation of the SINR of a generic subcarrier. The resulting formulas are easily computable and can be evaluated for a user equipment (UE) located at any distance from its serving eNodeB. Simulations show that the proposed framework provides results accurate to within 0.5 dB.
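The EESM step referenced above compresses the per-subcarrier SINRs of a frequency-selective channel into one effective value: gamma_eff = -beta * ln((1/N) * sum_k exp(-gamma_k / beta)), with the gamma_k in linear scale. The sketch below implements this mapping; the calibration constant beta is modulation-and-coding specific, and the value used here is an illustrative placeholder.

```python
import math

def eesm(sinr_db_per_subcarrier, beta=4.0):
    """Exponential effective SINR mapping over a list of per-subcarrier
    SINRs in dB; beta is a link-adaptation calibration constant
    (illustrative value). Returns the effective SINR in dB."""
    lin = [10 ** (s / 10.0) for s in sinr_db_per_subcarrier]
    avg = sum(math.exp(-g / beta) for g in lin) / len(lin)
    eff = -beta * math.log(avg)
    return 10.0 * math.log10(eff)  # back to dB
```

On a flat channel the mapping returns the common subcarrier SINR unchanged, while on a selective channel the exponential weighting pulls the effective value below the arithmetic mean, penalising deeply faded subcarriers.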

Keywords: LTE, OFDMA, effective SINR, log skew normal approximation

Procedia PDF Downloads 359
5524 Analyzing Soviet and Post-Soviet Contemporary Russian Foreign Policy by Applying the Theory of Political Realism

Authors: Simon Tsipis

Abstract:

In this study, we propose to analyze Russian foreign policy conduct by applying the theory of political realism and the qualitative comparative method of analysis. We find that the paradigm of political realism supplies significant insights into the sources of contemporary Russian foreign policy conduct, since the power factor was and remains an integral element of Russian foreign policy, especially when we apply comparative analysis and compare it with the behaviour of its Soviet predecessor. Through the lens of realist theory, much of Russian foreign policy-making becomes clearer and more comprehensible.

Keywords: realism, Russia, cold war, Soviet Union, European security

Procedia PDF Downloads 108
5523 Contested Visions of Exploration in IR: Theoretical Engagements, Reflections and New Agendas on the Dynamics of Global Order

Authors: Ananya Sharma

Abstract:

International Relations is a discipline of paradoxes. The State is the dominant political institution, with mainstream analysis theorizing the State, but theory remains at best a reactionary monolith. Critical Theorists have been pushing the envelope, and to that extent there has been a clear shift in the dominant discourse away from State-centrism to individual- and group-level behaviour. This paradigm shift has been accompanied by more nuanced conceptualizations of other variables at play: power, security, and trust, to name a few. Yet the ambit of "what is discussed" remains primarily embedded in realist conceptualizations. With this background in mind, this paper attempts to understand, juxtapose and evaluate how "order" has been conceptualized in International Relations theory. This paper is a tentative attempt to present a "state of the art" and, in the process, set the stage for a deeper study to draw attention to what the author feels is a gaping lacuna in IR theory. The paper looks at how different branches of international relations theory envisage world order and the silences embedded therein. Further, by locating order and disorder inhabiting the same reality along a continuum, alternative readings of world order are drawn from the critical theoretical traditions, in which various articulations of justice impart the key normative pillar to the world order.

Keywords: global justice, international relations theory, legitimacy, world order

Procedia PDF Downloads 340
5522 Leadership Strategies in Social Enterprises through Reverse Accountability: Analysis of Social Control for Pragmatic Organizational Design

Authors: Ananya Rajagopal

Abstract:

The study is based on an analysis of qualitative data on the business performance of entrepreneurs in emerging markets, focusing on core variables such as collective leadership in social entrepreneurship and the reverse accountability attributes of stakeholders. In-depth interviews were conducted with 25 emerging enterprises in Mexico across five industrial segments. The study was organized around five major research questions, which helped in developing the grounded theory related to reverse accountability. The results of the study revealed that the traditional entrepreneurship model based on an individualistic leadership style is being replaced by a collective leadership model. The study focuses on leadership styles within social enterprises aimed at enhancing managerial capabilities and competencies, stakeholder values, and entrepreneurial growth. The theoretical motivation of this study is derived from stakeholder theory and agency theory.

Keywords: reverse accountability, social enterprises, collective leadership, grounded theory, social governance

Procedia PDF Downloads 112
5521 Tax Evasion in Brazil: The Case of Specialists

Authors: Felippe Clemente, Viviani S. Lírio

Abstract:

Brazilian tax evasion is very high. It causes many problems for the economy, such as poor budget realization, distorted income distribution and misallocation of productive resources. The purpose of this article is therefore to use instrumental game theory to understand the interaction between tax-evading agents and the tax authority in Brazil (Federal Revenue and Federal Police). By means of game-theoretic approaches, the main results from considering cases both with and without specialists show that, in a high-evasion situation, penalizing taxpayers with either high fines or deprivation of liberty may not be very effective. The analysis also shows that audit and inspection costs play an important role in driving the equilibrium of the system. This suggests that a policy of investing in tax inspectors would be a more effective tool in combating non-compliance with tax obligations than penalties or fines.
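The role of inspection costs in driving the equilibrium can be illustrated with a stylized taxpayer-versus-authority inspection game (a textbook sketch under assumed payoffs, not the authors' model): the taxpayer gains `tax_gain` by evading if unaudited and pays `fine` if caught, while auditing costs the authority `audit_cost` and recovers the fine from an evader.

```python
def inspection_game_equilibrium(tax_gain, fine, audit_cost):
    """Mixed-strategy equilibrium of a stylized inspection game.

    Each probability is set so the opponent is indifferent between
    its two pure strategies (the defining property of the equilibrium)."""
    p_evade = audit_cost / fine             # authority indifferent: p*fine - audit_cost = 0
    q_audit = tax_gain / (tax_gain + fine)  # taxpayer indifferent: (1-q)*tax_gain - q*fine = 0
    return p_evade, q_audit
```

In this sketch, higher audit costs raise the equilibrium evasion probability directly, consistent with the abstract's emphasis on inspection costs as a driver of the system's equilibrium.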

Keywords: tax evasion, Brazil, game theory, specialists

Procedia PDF Downloads 321
5520 Individualism/Collectivism and Extended Theory of Planned Behavior

Authors: Ela Ari, Aysima Findikoglu

Abstract:

Consumers switching between GSM operators has been an important research issue since the rise of competitive offers. Recent research has looked at consumer switching behavior through the theory of planned behavior, but has not yet extended the theory with identity, psycho-social and cultural influences within the service context. This research explores an extended version of the theory of planned behavior that includes social and financial risks and brand loyalty. Moreover, the role of individualism and collectivism at the individual level is investigated in a collectivistic culture that is moving toward individualism due to changing family relationships, use of technology and education. Our preliminary analysis showed that financial risk and vertical individualism prove to be significant determinants of the intention to switch. The study also investigates the relationships between social risk and intention, subjective norm, and perceived behavioral control. The relationship between individualism/collectivism and attitudes has also been examined within a service industry. Implications for marketing managers and scholars are discussed.

Keywords: attitude, individualism, intention, subjective norm

Procedia PDF Downloads 449
5519 The Postcognitivist Era in Cognitive Psychology

Authors: C. Jameke

Abstract:

During the cognitivist era in cognitive psychology, a theory of internal rules and symbolic representations was posited as an account of human cognition. This type of cognitive architecture had its heyday during the 1970s and 80s, but it has now been largely abandoned in favour of subsymbolic architectures (e.g. connectionism), non-representational frameworks (e.g. dynamical systems theory), and statistical approaches such as Bayesian theory. In this presentation, I describe this changing landscape of research and comment on the increasing influence of neuroscience on cognitive psychology. I then briefly review a few recent developments in connectionism and neurocomputation relevant to cognitive psychology, and critically discuss the assumption made by some researchers in these frameworks that higher-level aspects of human cognition are simply emergent properties of massively large distributed neural networks.

Keywords: connectionism, emergentism, postcognitivism, representations, subsymbolic architecture

Procedia PDF Downloads 567
5518 Free Vibration of Functionally Graded Smart Beams Based on the First Order Shear Deformation Theory

Authors: A. R. Nezamabadi, M. Veiskarami

Abstract:

This paper studies the free vibration of simply supported functionally graded beams with piezoelectric layers, based on the first order shear deformation theory. The Young's modulus of the beam is assumed to be graded continuously across the beam thickness. The governing equation is established, and the resulting equation is solved using Euler's equation. The effects of the constituent volume fractions and the influence of the applied voltage on the vibration frequency are presented. To investigate the accuracy of the present analysis, a comparison study is carried out with known data.
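The abstract does not state its grading law; a power-law variation of Young's modulus through the thickness, commonly assumed for functionally graded beams, can be sketched as follows (the constituent moduli `e_metal`, `e_ceramic` and the exponent `k` are illustrative parameters, not the paper's values):

```python
def youngs_modulus(z, h, e_metal, e_ceramic, k=1.0):
    """Power-law through-thickness gradation for a functionally graded beam.

    z is the coordinate measured from the mid-plane, z in [-h/2, h/2];
    the ceramic volume fraction varies from 0 at the bottom to 1 at the top."""
    vc = (z / h + 0.5) ** k                  # ceramic volume fraction
    return (e_ceramic - e_metal) * vc + e_metal
```

At the bottom face the modulus reduces to the pure metal value and at the top face to the pure ceramic value, so the gradation is continuous across the thickness as the abstract assumes.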

Keywords: mechanical buckling, functionally graded beam, first order shear deformation theory, free vibration

Procedia PDF Downloads 471
5517 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground

Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane

Abstract:

Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to account for the uncertainties in the properties of the materials used and in the applied loads. However, the use of these safety factors in the design process does not assure an optimal and reliable solution and can sometimes lead to a lack of robustness in the structure. Reliability theory, based on a probabilistic formulation of structural safety, can respond in a better-adapted manner. It allows constructing a model in which uncertain data are represented by random variables, and therefore allows a better appreciation of safety margins with confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the concrete tank by considering the seismic acceleration as a random variable.
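The Monte Carlo scheme described above can be sketched minimally: draw the seismic acceleration from an assumed distribution and count how often demand exceeds capacity. The lognormal distribution and the simple threshold limit state below are illustrative assumptions; the paper's calibrated distribution and mechanical model are not given in the abstract.

```python
import random

def failure_probability(capacity, n_samples=100_000, seed=1):
    """Crude Monte Carlo estimate of a failure probability: sample a
    random seismic acceleration and fail when it exceeds the capacity."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        accel = rng.lognormvariate(0.0, 0.5)  # illustrative acceleration model
        if accel > capacity:                  # limit state violated
            failures += 1
    return failures / n_samples
```

The estimator converges slowly (error shrinks like the inverse square root of the sample count), which is why Monte Carlo reliability studies typically use large sample sizes for the small failure probabilities of real tanks.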

Keywords: reliability approach, storage tanks, Monte Carlo simulation, seismic acceleration

Procedia PDF Downloads 304
5516 The Economics of Justice as Fairness

Authors: Antonio Abatemarco, Francesca Stroffolini

Abstract:

In the economic literature, Rawls' Theory of Justice is usually interpreted in a two-stage setting, where a priority to the worst-off individual is imposed as a distributive value judgment. In this paper, instead, we model Rawls' Theory in a three-stage setting, that is, a separating line is drawn between the original position, the educational stage, and working life. Hence, in this paper, we challenge the common interpretation of Rawls' Theory of Justice as Fairness by showing that this Theory goes well beyond the definition of a distributive value judgment, in such a way as to embrace efficiency issues as well. In our model, inequalities are shown to be permitted insofar as they stimulate greater effort in education in the population, and hence economic growth. To our knowledge, this is the only way for the inequality to be 'bought' by both the most-, and above all, the least-advantaged individual, as suggested by the Difference Principle. Finally, by recalling the old tradition of 'universal ex-post efficiency', we show that a unique optimal social contract does not exist behind the veil of ignorance; more precisely, only the set of potentially Rawls-optimal social contracts can be identified a priori, and partial justice orderings derived accordingly.

Keywords: justice, Rawls, inequality, social contract

Procedia PDF Downloads 217
5515 Beyond Adoption: Econometric Analysis of Impacts of Farmer Innovation Systems and Improved Agricultural Technologies on Rice Yield in Ghana

Authors: Franklin N. Mabe, Samuel A. Donkoh, Seidu Al-Hassan

Abstract:

In order to increase rice yield and bridge yield differences, many farmers have resorted to adopting Farmer Innovation Systems (FISs) and Improved Agricultural Technologies (IATs). This study econometrically analysed the impacts of the adoption of FISs and IATs on rice yield using multinomial endogenous switching regression (MESR). Nine hundred and seven (907) rice farmers from the Guinea Savannah Zone (GSZ), Forest Savannah Transition Zone (FSTZ) and Coastal Savannah Zone (CSZ) were used for the study, which drew on both primary and secondary data. FBO advice, rice farming experience and distance from farming communities to input markets increase farmers' adoption of only FISs. Factors that increase farmers' probability of adopting only IATs are access to extension advice, credit, improved seeds and contract farming. Farmers located in the CSZ have a higher probability of adopting only IATs than their counterparts in other agro-ecological zones. Age and access to input subsidies increase the probability of jointly adopting FISs and IATs. FISs and IATs have heterogeneous impacts on rice yield, with the adoption of only IATs having the highest impact, followed by the joint adoption of FISs and IATs. It is important for stakeholders in the rice subsector to champion the provision of improved rice seeds, the intensification of agricultural extension services and the contract farming concept. Researchers should endeavour to research FISs further.

Keywords: farmer innovation systems, improved agricultural technologies, multinomial endogenous switching regression, treatment effect

Procedia PDF Downloads 415
5514 Enhancement of Primary User Detection in Cognitive Radio by Scattering Transform

Authors: A. Moawad, K. C. Yao, A. Mansour, R. Gautier

Abstract:

Detecting an occupied frequency band is a major issue in cognitive radio systems. The detection process becomes difficult if the signal occupying the band of interest has a faded amplitude due to multipath effects, which make it hard for an occupying user to be detected. This work mitigates the missed-detection problem in the context of cognitive radio in a frequency-selective fading channel by proposing a blind channel estimation method based on the scattering transform. Conventional energy detection is initially applied and the missed-detection probability is evaluated; if it is greater than or equal to 50%, channel estimation is applied to the received signal, followed by channel equalization to reduce the channel effects. In the proposed channel estimator, we modify the Morlet wavelet by using its first derivative for better frequency resolution. A mathematical description of the modified function and its frequency resolution is formulated in this work. The improved frequency resolution is required to follow the spectral variation of the channel. The channel estimation error is evaluated in the mean-square sense for different channel settings, and energy detection is applied to the equalized received signal. The simulation results show an improvement in the missed-detection probability compared to detection based on principal component analysis. This improvement is achieved at the expense of increased estimator complexity, which depends on the number of wavelet filters as related to the channel taps. The detection performance also shows an improvement in detection probability for low signal-to-noise scenarios over principal component analysis-based energy detection.
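The conventional energy detection stage that the pipeline starts from can be sketched in a few lines (a minimal sketch of the generic detector only, not the scattering-transform estimator or equalizer; the threshold would in practice be set from the target false-alarm probability):

```python
def energy_detect(samples, threshold):
    """Conventional energy detector: declare the band occupied when the
    average per-sample energy of the received signal exceeds a threshold."""
    energy = sum(abs(s) ** 2 for s in samples) / len(samples)
    return energy > threshold
```

Multipath fading lowers the received energy of an occupying user, pushing it below the threshold; that is the missed-detection mechanism the proposed channel estimation and equalization are meant to counteract.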

Keywords: channel estimation, cognitive radio, scattering transform, spectrum sensing

Procedia PDF Downloads 190
5513 Passenger Preferences on Airline Check-In Methods: Traditional Counter Check-In Versus Common-Use Self-Service Kiosk

Authors: Cruz Queen Allysa Rose, Bautista Joymeeh Anne, Lantoria Kaye, Barretto Katya Louise

Abstract:

The study presents passengers' preferences regarding the quality of service provided by the two airline check-in methods currently present in airports: traditional counter check-in and common-use self-service kiosks. One study has shown that airlines perceive self-service kiosks alone as sufficient to ensure adequate service and customer satisfaction, while in contrast, agents and passengers stated that kiosks alone are not enough and that human interaction is essential. With reference to former studies that established opposing ideas about which airline check-in method is more favourable, the purpose of this study is to present a recommendation that fills the gap between these conflicting ideas by comparing the perceived quality of service through the RATER model. Furthermore, this study discusses the major competencies present in each method, supported by two theories: the FIRO Theory of Needs, which upholds the importance of inclusion, control and affection, and Queueing Theory, which points out the discipline of passengers and the length of the queue line as important factors affecting service quality. The findings of the study were based on data gathered by the researchers from selected Thomasian third- and fourth-year college students enrolled in the first semester of academic year 2014-2015 who had already experienced both airline check-in methods, selected through stratified probability sampling. The statistical treatments applied to interpret the data were mean, frequency, standard deviation, t-test, logistic regression and the chi-square test. The study revealed a greater effect on passenger preference from the satisfaction experienced with common-use self-service kiosks compared with traditional counter check-in.
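The queue-length factor invoked from Queueing Theory can be illustrated with the textbook M/M/1 formulas (a standard single-server illustration under assumed Poisson arrivals and exponential service, not the paper's analysis):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 queue metrics for one check-in position."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = arrival_rate / service_rate    # server utilisation
    l_q = rho ** 2 / (1 - rho)           # mean number of passengers waiting
    w_q = l_q / arrival_rate             # mean waiting time (Little's law)
    return rho, l_q, w_q
```

Because the mean queue length blows up as utilisation approaches one, even modest differences in per-passenger service time between counters and kiosks translate into large differences in perceived waiting, which is why queue length matters for service quality.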

Keywords: traditional counter check-in, common-use self-service kiosks, airline check-in methods

Procedia PDF Downloads 402
5512 Internet Shopping: A Study Based On Hedonic Value and Flow Theory

Authors: Pui-Lai To, E-Ping Sung

Abstract:

With the flourishing development of online shopping, an increasing number of customers see online shopping as an entertaining experience. Because the online consumer has a double identity as a shopper and an Internet user, online shopping should offer hedonic values of shopping and Internet usage. The purpose of this study is to investigate hedonic online shopping motivations from the perspectives of traditional hedonic value and flow theory. The study adopted a focus group interview method, including two online and two offline interviews. Four focus groups of shoppers consisted of online professionals, online college students, offline professionals and offline college students. The results of the study indicate that traditional hedonic values and dimensions of flow theory exist in the online shopping environment. The study indicated that online shoppers seem to appreciate being able to learn things and grow to become competitive achievers online. Comparisons of online hedonic motivations between groups are conducted. This study serves as a basis for the future growth of Internet marketing.

Keywords: flow theory, hedonic motivation, internet shopping

Procedia PDF Downloads 273