Search results for: expectation theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4839

3399 The Effect of Socio-Affective Variables in the Relationship between Organizational Trust and Employee Turnover Intention

Authors: Paula A. Cruise, Carvell McLeary

Abstract:

Employee turnover leads to lowered productivity, decreased morale and work quality, and psychological effects associated with employee separation and replacement. Yet, it remains unknown why talented employees willingly withdraw from organizations. This uncertainty is worsened as studies: a) prioritize organizational over individual predictors, resulting in restriction of range in turnover measurement; b) focus on actual rather than intended turnover, thereby limiting conceptual understanding of the turnover construct and its relationship with other variables; and c) produce inconsistent findings across cultures, contexts and industries despite a clear need for a unified perspective. The current study addressed these gaps by adopting the theory of planned behavior (TPB) framework to examine socio-cognitive factors in organizational trust and individual turnover intentions among bankers and energy employees in Jamaica. In a comparative study of n = 369 [n_bank = 264, male = 57 (22.73%); n_energy = 105, male = 45 (42.86%)], it was hypothesized that organizational trust was a predictor of employee turnover intention and that the effect of individual, group, cognitive and socio-affective variables varied across industry. Findings from structural equation modelling confirmed the hypothesis, with a model of both cognitive and socio-affective variables being a better fit [CMIN (χ2) = 800.067, df = 364, p ≤ .000; CFI = 0.950; RMSEA = 0.057 with 90% C.I. (0.052 - 0.062); PCLOSE = 0.016; PNFI = 0.818] in predicting turnover intention. The findings are discussed in relation to socio-cognitive components of trust models and predicting negative employee behaviors across cultures and industries.
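
For readers who want to sanity-check the reported fit indices, the short sketch below shows how RMSEA follows from the chi-square statistic, degrees of freedom and sample size quoted in the abstract. It is an illustrative calculation only (Python, assuming the standard single-group formula), not part of the authors' analysis; for the quoted figures it reproduces the reported 0.057.

```python
from math import sqrt

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation for a single-group model."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

print(round(rmsea(800.067, 364, 369), 3))  # 0.057, matching the reported fit index
```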

Keywords: context-specific organizational trust, cross-cultural psychology, theory of planned behavior, employee turnover intention

Procedia PDF Downloads 220
3398 Effect of Gas Boundary Layer on the Stability of a Radially Expanding Liquid Sheet

Authors: Soumya Kedia, Puja Agarwala, Mahesh Tirumkudulu

Abstract:

Linear stability analysis is performed for a radially expanding liquid sheet in the presence of a gas medium. A liquid sheet can break up because of the aerodynamic effect as well as its thinning. However, these effects are usually studied separately, as the combined formulation becomes complicated and is difficult to solve. The present work combines both the aerodynamic effect and the thinning effect, ignoring non-linearity in the system. This is done by taking into account the formation of the gas boundary layer whilst neglecting viscosity in the liquid phase. Axisymmetric flow is assumed for simplicity. The base state analysis results in a Blasius-type system which can be solved numerically. Perturbation theory is then applied to study the stability of the liquid sheet, where the gas-liquid interface is subjected to small deformations. The linear model derived here can be applied to investigate the instability for sinuous as well as varicose modes, where the former represents displacement of the centerline of the sheet and the latter represents modulation in sheet thickness. Temporal instability analysis is performed for sinuous modes, which are significantly more unstable than varicose modes, at a fixed radial distance, implying a local stability analysis. The growth rates, measured for fixed wavenumbers, predicted by the present model are significantly lower than those obtained from the inviscid Kelvin-Helmholtz instability and compare better with experimental results. Thus, the present theory gives better insight into the stability of a thin liquid sheet.
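
As an illustration of how a Blasius-type base state is typically obtained numerically, the sketch below solves the classical flat-plate Blasius boundary-layer equation by a shooting method (Python with SciPy). It is a generic example under that assumption, not the specific coupled gas-liquid system derived in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def blasius_rhs(eta, y):
    # y = [f, f', f''];  Blasius equation: f''' + 0.5 * f * f'' = 0
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def shoot(fpp0, eta_max=10.0):
    sol = solve_ivp(blasius_rhs, [0.0, eta_max], [0.0, 0.0, fpp0],
                    rtol=1e-8, atol=1e-8)
    return sol.y[1, -1] - 1.0   # residual of the far-field condition f'(inf) = 1

fpp0 = brentq(shoot, 0.1, 1.0)  # wall value f''(0) that satisfies the far field
print(round(fpp0, 3))           # approx. 0.332 for this scaling of the equation
```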

Keywords: boundary layer, gas-liquid interface, linear stability, thin liquid sheet

Procedia PDF Downloads 205
3397 CFD Simulation Approach for Developing New Powder Dispensing Device

Authors: Revanth Rallapalli

Abstract:

Manually dispensing powders can be difficult, as it requires gradually pouring and checking the amount on the scale to be dispensed. Current systems are manual and non-continuous in nature, are user-dependent, and make powder dispensation difficult to control. Recurrent dosing of powdered medicines in precise amounts, quickly and accurately, has been a long-standing challenge. Various new powder dispensing mechanisms are being designed to overcome these challenges. A battery-operated screw conveyor mechanism is being developed to overcome the problems described above. Such inventions are numerically evaluated at the concept development level by employing Computational Fluid Dynamics (CFD) of gas-solids multiphase flow systems. CFD has been very helpful in the development of such devices, saving time and money by reducing the number of prototypes and tests. This paper describes a simulation of powder dispensation from the trocar's end, in which the powder is treated as a secondary phase in air, using the Dense Discrete Phase Model incorporated with the Kinetic Theory of Granular Flow (DDPM-KTGF). Taking the volume fraction of powder as 50%, the transportation of powder from the inlet side to the trocar's end is driven by rotation of the screw conveyor. The performance is calculated over a 1 s time frame in an unsteady (transient) computation. This methodology will help designers develop design concepts to improve dispensation and the effective area within a quick turnaround time frame.

Keywords: multiphase flow, screw conveyor, transient, dense discrete phase model (DDPM), kinetic theory of granular flow (KTGF)

Procedia PDF Downloads 127
3396 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as a 'Quantum Leap'

Authors: Anthony Coogan

Abstract:

Born's probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author's suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his Particle in the Box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: part of Schrödinger's 'Particle in the Box' theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate, rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived from this theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that such waveforms did not exist. Complex wave forms representing a particle are usually assumed to be continuous. The actual observations made were of x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events. Born's interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer's direction after the electron had moved away. Astronomers may say that they 'look out into the universe' but are actually using logic opposed to the views of Newton and Hooke and of many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the nature of complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.

Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle

Procedia PDF Downloads 180
3395 Non-zero θ_13 and δ_CP Phase with A_4 Flavor Symmetry and Deviations from Tri-Bi-Maximal Mixing via Z_2 × Z_2 Invariant Perturbations in the Neutrino Sector

Authors: Gayatri Ghosh

Abstract:

In this work, a flavour theory of a neutrino mass model based on A_4 symmetry is considered to explain the phenomenology of neutrino mixing. The spontaneous breaking of the A_4 symmetry in this model leads to tribimaximal mixing in the neutrino sector at leading order. We consider the effect of Z_2 × Z_2 invariant perturbations in the neutrino sector and find the allowed region of correction terms in the perturbation matrix that is consistent with the 3σ ranges of the experimental values of the mixing angles. We study the implications of this formalism for other phenomenological observables, such as the δ_CP phase, the neutrino oscillation probability P(νµ → νe), the effective Majorana mass |m_ee| and the effective electron-neutrino mass |m^eff_νe|. The Z_2 × Z_2 invariant perturbations introduced in the neutrino sector lead to testable predictions for θ_13 and CP violation. By changing the magnitudes of the perturbations in the neutrino sector, one can generate viable values of δ_CP and the neutrino oscillation parameters. Next, we investigate the feasibility of charged lepton flavour violation in type-I seesaw models with leptonic flavour symmetries at high energy that lead to tribimaximal neutrino mixing. We consider an effective theory with an A_4 × Z_2 × Z_2 symmetry which, after spontaneous symmetry breaking at a scale much higher than the electroweak scale, leads to charged lepton flavour violation processes once the heavy Majorana neutrino mass degeneracy is lifted, either by renormalization group effects or by a soft breaking of the A_4 symmetry. In this context, the implications for charged lepton flavour violation processes such as µ → eγ, τ → eγ and τ → µγ are discussed.
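
For reference, the tribimaximal (TBM) pattern that the unperturbed A_4 model reproduces is the standard Harrison-Perkins-Scott mixing matrix shown below; it fixes sin²θ_12 = 1/3 and sin²θ_23 = 1/2 and, crucially, θ_13 = 0, which is why Z_2 × Z_2 invariant perturbations are needed to generate the observed non-zero θ_13.

```latex
U_{\mathrm{TBM}} =
\begin{pmatrix}
\sqrt{2/3} & 1/\sqrt{3} & 0 \\
-1/\sqrt{6} & 1/\sqrt{3} & -1/\sqrt{2} \\
-1/\sqrt{6} & 1/\sqrt{3} & 1/\sqrt{2}
\end{pmatrix},
\qquad
\sin^{2}\theta_{12}=\tfrac{1}{3},\quad
\sin^{2}\theta_{23}=\tfrac{1}{2},\quad
\theta_{13}=0 .
```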

Keywords: Z2 × Z2 invariant perturbations, CLFV, delta CP phase, tribimaximal neutrino mixing

Procedia PDF Downloads 60
3394 Understanding the Challenges of Lawbook Translation via the Framework of Functional Theory of Language

Authors: Tengku Sepora Tengku Mahadi

Abstract:

Where the speed of book writing lags behind the high need for such material for tertiary studies, translation offers a way to enhance the equilibrium in this demand-supply equation. Nevertheless, translation is confronted by obstacles that threaten its effectiveness. The primary challenge to the production of efficient translations may well be related to the text-type and its complexity. A text that is intricately written with unique rhetorical devices, subject-matter foundation and cultural references will undoubtedly challenge the translator. Longer time and greater effort would be the consequence. To understand these text-related challenges, the present paper set out to analyze a lawbook entitled Learning the Law by David Melinkoff. The book is chosen because it has often been used as a textbook or for reference in many law courses in the United Kingdom and has seen over thirteen editions; therefore, it can be said to be a worthy book for studies in law. Another reason is the existence of a ready translation in Malay. Reference to this translation enables confirmation, to some extent, of the potential problems that might occur in its translation. Understanding the organization and the language of the book will help translators to prepare themselves better for the task. They can anticipate the research and time that may be needed to produce an effective translation. Another premise here is that this text-type implies certain ways of writing and organization. Accordingly, it seems practicable to adopt the functional theory of language suggested by Michael Halliday as the theoretical framework. Concepts of the context of culture and the context of situation, together with the measures of field, tenor and mode, form the instruments for analysis. Additional examples from similar materials can also be used to validate the findings. Some interesting findings include the presence of several other text-types or sub-text-types in the book and the dependence on literary discourse and devices to capture the meanings better or add color to the dry field of law. In addition, many elements of culture can be seen, for example, the use of familiar alternatives, allusions, and even terminology and references that date back to various periods of time and languages. Also found are parts which discuss the origins of words and terms that may be relevant to readers within the United Kingdom but make little sense to readers of the book in other languages. In conclusion, the textual analysis, in terms of its functions and the linguistic and textual devices used to achieve them, can then be applied as a guide to determine the effectiveness of the translation that is produced.

Keywords: functional theory of language, lawbook text-type, rhetorical devices, culture

Procedia PDF Downloads 126
3393 Adding a Few Language-Level Constructs to Improve OOP Verifiability of Semantic Correctness

Authors: Lian Yang

Abstract:

Object-oriented programming (OOP) is the dominant programming paradigm in today's software industry, and it has enabled average software developers to build millions of commercial-strength software applications during the Internet revolution of the past three decades. On the other hand, the lack of a strict mathematical model and of domain constraint features at the language level has long perplexed the computer science academia and the OOP engineering community. This situation has resulted in inconsistent system qualities and hard-to-understand designs in some OOP projects. The difficulties involved in fixing the current situation are also well known. Although the power of OOP lies in its unbridled flexibility and enormously rich data modeling capability, we argue that the ambiguity and the implicit facade surrounding the conceptual model of a class and an object should be eliminated as much as possible. We list five major usages of the class and propose to separate them through new language constructs. Using the well-established theories of sets and finite state machines (FSM), we propose to apply certain simple, generic, and yet effective constraints at the OOP language level in an attempt to find a possible solution to the above-mentioned issues regarding OOP. The goal is to make OOP more theoretically sound, as well as to help programmers uncover warning signs of irregularities and domain-specific issues early in the development stage and catch semantic mistakes at runtime, improving the correctness verifiability of software programs. At the same time, the aim of this paper is more practical than theoretical.
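
To make the flavour of such constraints concrete, the following hypothetical Python sketch shows how a user-defined value type with a declared check-constraint might behave: the invariant is verified at construction time, so a semantic violation surfaces at runtime instead of propagating silently. The class name, field and range are illustrative assumptions, not the authors' actual proposed syntax.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Percentage:
    """A value type whose invariant acts like a check-constraint (CC)."""
    value: float

    def __post_init__(self):
        if not 0.0 <= self.value <= 100.0:   # the check-constraint
            raise ValueError(f"Percentage out of range: {self.value}")

discount = Percentage(15.0)    # accepted
# Percentage(130.0)            # would raise ValueError at runtime
```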

Keywords: new language constructs, set theory, FSM theory, user defined value type, function groups, membership qualification attribute (MQA), check-constraint (CC)

Procedia PDF Downloads 222
3392 Kantian Epistemology in Examination of the Axiomatic Principles of Economics: The Synthetic a Priori in the Economic Structure of Society

Authors: Mirza Adil Ahmad Mughal

Abstract:

Transcendental analytics, in the Critique of Pure Reason, combines space and time, as conditions of the possibility of the phenomenon from the transcendental aesthetic, with the pure notion of magnitude-intuition. The property of continuity, as a qualitative result of additive magnitude, brings the possibility of connecting with experience, even though only as a potential, because of the a priori necessity from assumption, the syntheticity of the a priori being the task of a scientific method of philosophy given by Kant, which precludes the application of categories to something not empirically reducible to the content of such a category's corresponding possible object. This continuity, as the qualitative result of the a priori constructed notion of magnitude, lies as a fundamental assumption and property of what microeconomic theory calls 'choice rules', which combine the potentially empirical and practical budget-price pairs with preference relations. This latter result is the purest qualitative side of the otherwise autonomously quantitative nature of choice rules. The theoretical, as opposed to empirical, nature of this qualitative result is a synthetic a priori truth, which it should be if the axiomatic structure of economic theory is held to be correct. It has a potentially verifiable content as its possible object in the form of quantitative price-budget pairs. Yet, the object that serves the respective Kantian category is qualitative itself, namely utility. This article explores the validity of Kantian qualifications for this application of 'categories' to the economic structure of society.
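
As a point of reference for the economic side of the argument, the choice rules mentioned above are conventionally formalised as follows (a standard textbook definition stated here for orientation, not taken from the article): the budget set collects the affordable bundles for given prices and wealth, and the choice rule picks out the bundles weakly preferred to every affordable alternative.

```latex
B(p, w) = \{\, x \in X : p \cdot x \le w \,\},
\qquad
C(p, w) = \{\, x \in B(p, w) : x \succsim y \ \text{for all } y \in B(p, w) \,\}.
```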

Keywords: categories of understanding, continuity, convexity, psyche, revealed preferences, synthetic a priori

Procedia PDF Downloads 78
3391 Production of New Hadron States in Effective Field Theory

Authors: Qi Wu, Dian-Yong Chen, Feng-Kun Guo, Gang Li

Abstract:

In the past decade, a growing number of new hadron states have been observed, which are dubbed XYZ states in the heavy quarkonium mass regions. In this work, we present our study on the production of some new hadron states. In particular, we investigate the processes Υ(5S,6S)→ Zb (10610)/Zb (10650)π, Bc→ Zc (3900)/Zc (4020)π and Λb→ Pc (4312)/Pc (4440)/Pc (4457)K. (1) For the production of Zb (10610)/Zb (10650) from Υ(5S,6S) decay, two types of bottom-meson loops were discussed within a nonrelativistic effective field theory. We found that the loop contributions with all intermediate states being the S-wave ground state bottom mesons are negligible, while the loops with one bottom meson being the broad B₀* or B₁' resonance could provide the dominant contributions to Υ(5S)→ Zb⁽'⁾ π. (2) For the production of Zc (3900)/Zc (4020) from Bc decay, the branching ratios of Bc⁺→ Zc (3900)⁺ π⁰ and Bc⁺→ Zc (4020)⁺ π⁰ are estimated to be of order 10⁻⁴ and 10⁻⁷, respectively, in an effective Lagrangian approach. The large production rate of Zc (3900) could provide an important source of the production of Zc (3900) from the semi-exclusive decay of b-flavored hadrons reported by the D0 Collaboration, which can be tested by exclusive measurements at LHCb. (3) For the production of Pc (4312), Pc (4440) and Pc (4457) from Λb decay, the ratio of the branching fractions of Λb→ Pc K was predicted in a molecular scenario using an effective Lagrangian approach, and it is weakly dependent on our model parameter. We also find that the ratios of the branching fractions of Λb→ Pc K and Pc→ J/ψ p can be well interpreted in the molecular scenario. Moreover, the estimated branching fractions of Λb→ Pc K are of order 10⁻⁶, which could be tested by further measurements by the LHCb Collaboration.

Keywords: effective Lagrangian approach, hadron loops, molecular states, new hadron states

Procedia PDF Downloads 108
3390 A Theoretical Framework of Patient Autonomy in a High-Tech Care Context

Authors: Catharina Lindberg, Cecilia Fagerstrom, Ania Willman

Abstract:

Patients in high-tech care environments are usually dependent on both formal/informal caregivers and technology, highlighting their vulnerability and challenging their autonomy. Autonomy presumes that a person has education, experience, self-discipline and decision-making capacity. Reference to autonomy in relation to patients in high-tech care environments could, therefore, be considered paradoxical, as in most cases these persons have impaired physical and/or metacognitive capacity. Therefore, to understand the prerequisites for patients to experience autonomy in high-tech care environments and to support them, there is a need to enhance knowledge and understanding of the concept of patient autonomy in this care context. The development of concepts and theories in a practice discipline such as nursing helps to improve both nursing care and nursing education. Theoretical development is important when clarifying a discipline; hence, a theoretical framework could be of use to nurses in high-tech care environments to support and defend the patient's autonomy. A meta-synthesis was performed with the intention of being interpretative rather than aggregative in nature. An amalgamation was made of the results from three previous studies, carried out by members of the same research group, focusing on the phenomenon of patient autonomy from a patient perspective within a caring context. Three basic approaches to theory development, namely derivation, synthesis, and analysis, provided an operational structure that permitted the researchers to move back and forth between these approaches during their work in developing a theoretical framework. The results from the synthesis delineated that patient autonomy in a high-tech care context is: to be in control through trust, co-determination, and transition in everyday life. The theoretical framework contains several components creating the prerequisites for patient autonomy. Assumptions and propositional statements that guide theory development were also outlined, as were guiding principles for use in day-to-day nursing care. Four strategies used by patients to remain or become autonomous in high-tech care environments were revealed: the strategy of control, the strategy of partnership, the strategy of trust, and the strategy of transition. This study suggests an extended knowledge base founded on theoretical reasoning about patient autonomy, providing an understanding of the strategies used by patients to achieve autonomy in the role of patient in high-tech care environments. When possessing knowledge about the patient perspective of autonomy, the nurse/carer can avoid adopting a paternalistic or maternalistic approach. Instead, the patient can be considered to be a partner in care, allowing care to be provided that supports him/her in remaining/becoming an autonomous person in the role of patient.

Keywords: autonomy, caring, concept development, high-tech care, theory development

Procedia PDF Downloads 187
3389 Taleb's Complexity Theory Concept of 'Antifragility' Has a Significant Contribution to Make to Positive Psychology as Applied to Wellbeing

Authors: Claudius Peter Van Wyk

Abstract:

Given the increasingly manifest phenomena, as described in complexity theory, of volatility, uncertainty, complexity and ambiguity (VUCA), Taleb's notion of 'antifragility' has a significant contribution to make to positive psychology as applied to wellbeing. Antifragility is argued to be fundamentally different from the concepts of resiliency, the ability to recover from failure, and robustness, the ability to resist failure. Rather, it describes the capacity to reorganise in the face of stress in such a way as to cope more effectively with systemic challenges. The concept, which has been applied in disciplines ranging from physics, molecular biology, planning, and engineering to computer science, can now be considered for its application to individual human and social wellbeing. There are strong correlations with Antonovsky's model of 'salutogenesis', in which an attitude and competencies are developed for transforming burdening factors into greater resourcefulness. We demonstrate, from the perspective of neuroscience, how technology measuring nervous system coherence can be coupled with acquired psychodynamic approaches not only to identify contextual stressors and utilise biofeedback instruments for facilitating greater coherence, but also to apply these insights to specific life stressors that compromise well-being. Employing an on-going case study with BMW South Africa, the neurological mapping is demonstrated together with 'reframing' and emotional anchoring techniques from neurolinguistic programming. The argument is contextualised in the discipline of psychoneuroimmunology, which describes the stress pathways from the CNS and endocrine systems and their impact on immune function and the capacity to restore homeostasis.

Keywords: antifragility, complexity, neuroscience, psychoneuroimmunology, salutogenesis, volatility

Procedia PDF Downloads 348
3388 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model

Authors: Donatella Giuliani

Abstract:

In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm is applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique centered on the flashing characteristics of fireflies. In this context, it is used to determine the number of clusters and the related cluster means in a histogram-based segmentation approach. These means are then used in the initialization step for the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as prior probabilities of each component. Applying Bayes' rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. The validation has been performed using different standard measures, more precisely: the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK) and the Davies-Bouldin (DB) index. The results achieved strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a considerable reduction of the computational costs.
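
A minimal sketch of the second stage described above is given below (Python with scikit-learn, an assumption; the original work does not necessarily use this library): the cluster means found by the histogram search seed a Gaussian Mixture Model, EM refines the mixture parameters, and each pixel is assigned to the component with the highest posterior responsibility. The firefly search itself is omitted and replaced by given initial means.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def segment(image, init_means):
    """image: 2-D grayscale array; init_means: cluster means found beforehand."""
    x = image.reshape(-1, 1).astype(float)
    k = len(init_means)
    gmm = GaussianMixture(n_components=k,
                          means_init=np.asarray(init_means).reshape(-1, 1))
    gmm.fit(x)                  # EM estimation of weights, means, variances
    labels = gmm.predict(x)     # maximum posterior responsibility per pixel
    return labels.reshape(image.shape)
```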

Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation

Procedia PDF Downloads 199
3387 Narcissism and Kohut's Self-Psychology: Self Practices in Service of Self-Transcendence

Authors: Noelene Rose

Abstract:

The DSM has been plagued with conceptual issues since its inception, not least discriminant validity and comorbidity issues. An attempt to remain a-theoretical in the divide between the psycho-dynamicists and the behaviourists contributed to much of this, in particular relating to the Personality Disorders. With the DSM-5, although the criteria have remained unchanged, major conceptual and structural directions have been flagged and proposed in Section III. The biggest changes concern the Personality Disorders. While Narcissistic Personality Disorder (NPD) was initially tagged for removal, the addition of Section III instead proposes a move away from a categorical approach to a more dimensional approach, with a measure of Global Function of Personality. This global measure is an assessment of impairment of self-other relations; a measure of trait narcissism. In the same way that mainstream psychology has struggled with the diagnosis of narcissism, so too has it struggled with its treatment. Kohut's self psychology represents the most significant inroad in theory and treatment for the narcissistic disorders. Kohut had moved away from a categorical system, towards disorders of the self. According to this theory, disorders of the self are the result of childhood trauma (impaired attunement) resulting in a developmental arrest. Self-psychological, psychodynamic treatment of narcissism, however, is expensive in time and money and outside the awareness or access of most people. There is more than a suggestion that narcissism is on the increase, created in trauma and worsened by a fearful world climate. A dimensional model of narcissism, from mild to severe, requires cut-off points for diagnosis. But where do we draw the line? Mainstream psychology is inclined to set it high, when there is some degree of impairment in functioning in daily life. Transpersonal Psychology is inclined to set it low, with the concept that we all have some degree of narcissism and that it is the point and the path of our life journey to transcend our focus on our selves. Mainstream psychology stops its focus on trait narcissism at a healthy level of self-esteem, but it is at this point that Transpersonal Psychology can complement the discussion. From a Transpersonal point of view, failure to begin the process of self-transcendence will also create emotional symptoms around meaning or purpose, often later in our lives, and is also conceived of as a developmental arrest. The maps for this transcendence are hidden in plain sight: in the chakras of kundalini yoga, in the sacraments of the Catholic Church, in the Kabbalah tree of life of Judaism, in Maslow's hierarchy of needs, to name a few. This paper outlines some proposed research exploring the use of daily practices that can be incorporated into the therapy room; practices that utilise meditation, visualisation and imagination, that are informed by spiritual technology, and that are guided by the psychodynamic theory of Self Psychology.

Keywords: narcissism, self-psychology, self-practice, self-transcendence

Procedia PDF Downloads 240
3386 Moving from Practice to Theory

Authors: Maria Lina Garrido

Abstract:

This paper aims to reflect upon instruction in English classes with the specific purpose of reading comprehension development, taking as its paradigm the considerations presented by William Grabe in his book Reading in a Second Language: Moving from Theory to Practice. His concerns regarding the connection between research findings and instructional practices have stimulated the present author to re-evaluate both her long practice as an English reading teacher and her work as the author of two reading textbooks for graduate students. Elements of the reading process such as linguistic issues, prior knowledge, reading strategies, critical evaluation, and motivation are the main foci of this analysis as far as the activities developed in the classroom are concerned. The experience with university candidates for postgraduate courses with different levels of English knowledge in Bahia, Brazil, has definitely demanded certain adjustments to this author's classroom setting. Word recognition based on cognates, for example, has been emphasized, given the fact that academic texts use many Latin words which have the same roots as the Brazilian Portuguese lexicon. Concerning syntactic parsing, the tenses/verbal aspects, modality and linking words are included in the curriculum, but not with the same depth as in general English curricula. Reading strategies, another essential predictor for developing reading skills, have been strongly encouraged in L2 classes in order to compensate for a lack of appropriate knowledge of the foreign language. This paper presents results that demonstrate that this author's teaching practice is compatible with the implications and instruction concerning the reading process outlined by Grabe; however, it admits that each class demands specific instructions to meet the needs of that particular group.

Keywords: classroom practice, instructional activities, reading comprehension, reading skills

Procedia PDF Downloads 431
3385 Carbonation of Wollastonite (001) Competing with Hydration: Microscopic Insights from Ion Spectroscopy and Density Functional Theory

Authors: Peter Thissen

Abstract:

In this work, we report on the influence of the chemical potential of water on the carbonation reaction of wollastonite (CaSiO3) as a model surface for cement and concrete. Total energy calculations based on density functional theory (DFT), combined with kinetic barrier predictions based on the nudged elastic band (NEB) method, show that exposure of the water-free wollastonite surface to CO2 results in barrier-less carbonation. CO2 reacts with the surface oxygen and forms carbonate (CO32-) complexes, together with a major reconstruction of the surface. The reaction comes to a standstill after one carbonate monolayer has been formed. When one water monolayer covers the wollastonite surface, the carbonation is no longer barrier-less, yet it still ends in a localized monolayer. Covered with multilayers of water, the thermodynamic ground state of the wollastonite changes completely due to a metal-proton exchange reaction (MPER, also called early-stage hydration), and Ca2+ ions are partially removed from the solid phase into the H2O/wollastonite interface. Mobile Ca2+ ions react again with CO2 and form carbonate complexes, ending in a delocalized layer. By means of high-resolution time-of-flight secondary-ion mass-spectrometry (ToF-SIMS) images, we confirm that hydration can lead to a partial delocalization of Ca2+ ions on wollastonite surfaces. Finally, we evaluate the impact of our model surface results by means of Low Energy Ion Scattering (LEIS) spectroscopy, combined with a careful discussion of the competing reactions of carbonation vs. hydration.

Keywords: Calcium-silicate, carbonation, hydration, metal-proton exchange reaction

Procedia PDF Downloads 341
3384 Intensive Intercultural English Language Pedagogy among Parents from Culturally and Linguistically Diverse Backgrounds (CALD)

Authors: Ann Dashwood

Abstract:

Using Standard Australian English with confidence is a cultural expectation of parents of primary school aged children who want to engage effectively with their children's teachers and school administration. That confidence in support of their children's learning at school is seldom experienced by parents whose first language is not English. Sharing language with competence in an intercultural environment is the common denominator for meaningful communication and engagement to occur in a school community. Experience in relevant, interactive sessions is known to enhance engagement and participation. The purpose of this paper is to identify a pedagogy for parents, otherwise isolated from daily use of functional Australian cultural language, to learn to engage effectively in their children's learning at school. The outcomes measure parents' intercultural engagement with classroom teachers and attention to the school's administrative procedures using quantitative and qualitative methods. A principled communicative task-based language learning approach, combined with intercultural communication strategies, provides the theoretical base for intensive English inquiry-based learning and engagement. The quantitative analysis examines data samples collected by classroom teachers and administrators, together with parents' writing samples. Interviews and observations qualitatively inform the study. Currently, significant numbers of projects are active in community centers and schools to enhance the English language knowledge of parents from Language Backgrounds Other Than English (LBOTE). The study is significant in that it explores the effects of an intensive English pedagogy with parents of varied English language backgrounds, by targeting inquiry-based language use for social interactions in the school and wider community, and specific engagement and cultural interaction with teachers and school activities and procedures.

Keywords: engagement, intercultural communication, language teaching pedagogy, LBOTE, school community

Procedia PDF Downloads 100
3383 Determining the Factors Affecting Social Media Addiction (Virtual Tolerance, Virtual Communication), Phubbing, and Perception of Addiction in Nurses

Authors: Fatima Zehra Allahverdi, Nukhet Bayer

Abstract:

Objective: Three questions were formulated to examine stressful working units (intensive care units and emergency units) by utilizing self-perception theory and social support theory. This study provides a distinctive input by inspecting a combination of variables regarding stressful working environments. Method: The descriptive research was conducted with the participation of 400 nurses working at Ankara City Hospital. The study used Multivariate Analysis of Variance (MANOVA), regression analysis, and a mediation model. Hypothesis one used MANOVA followed by a Scheffe post hoc test. Hypothesis two utilized regression analysis using a hierarchical linear regression model. Hypothesis three used a mediation model. Result: The study utilized mediation analyses. Findings supported the hypotheses that intensive care units have significantly higher scores in virtual communication and virtual tolerance. The number of years on the job, virtual communication, virtual tolerance, and phubbing significantly predicted 51% of the variance in perception of addiction. Interestingly, the number of years on the job, while significant, was negatively related to perception of addiction. Conclusion: The reasoning behind these findings and the lack of significance in the emergency unit is discussed. Around 7% of the variance in phubbing was accounted for by working in intensive care units. The model accounted for 26.80% of the differences in the perception of addiction.
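
A minimal sketch of the hierarchical regression step (hypothesis two) is shown below in Python with statsmodels; the variable names are illustrative assumptions, not the study's actual coding. The increase in R² from the first to the second block indicates how much explanatory power the social-media variables add beyond years on the job.

```python
import statsmodels.api as sm

def hierarchical_r2(df):
    """df: DataFrame holding the study variables (column names illustrative)."""
    y = df["addiction_perception"]
    step1 = sm.OLS(y, sm.add_constant(df[["years_on_job"]])).fit()
    step2 = sm.OLS(y, sm.add_constant(
        df[["years_on_job", "virtual_tolerance",
            "virtual_communication", "phubbing"]])).fit()
    return step1.rsquared, step2.rsquared   # R-squared before and after the added block
```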

Keywords: phubbing, social media, working units, years on the job, stress

Procedia PDF Downloads 30
3382 The Discussion on the Composition of Feng Shui by the Environmental Planning Viewpoint

Authors: Jhuang Jin-Jhong, Hsieh Wei-Fan

Abstract:

Climate change causes natural disasters persistently. Therefore, nowadays the objectives of environmental planning tend toward respecting nature and coexisting with nature. As a result, natural environment analysis, e.g., the analysis of topography, soil, hydrology, climate, and vegetation, is highly emphasized. On the other hand, Feng Shui has been a criterion of site selection for residences in the East since ancient times and has had further influence on site selection for castles and even for temples and tombs. The primary criterion of site selection is judging the quality of Long (mountain range), Sha (nearby mountains), Shui (hydrology), Xue (foundation), and Xiang (aspect), which are similar to the environmental variables of mountain range, topography, hydrology and aspect. For this reason, many researchers have attempted to probe the connection between the criteria of Feng Shui and environmental planning factors. Most research has only discussed the composition and the theory of space of Feng Shui, and no research has explained Feng Shui from an environmental perspective. Consequently, this study reviewed the theory of Feng Shui from an environmental planning viewpoint and assembled the essential composition factors of Feng Shui. From the literature review and the comparison of theoretical meanings, we find that the ideal principles for planning the Feng Shui environment can also be used for environmental planning. Therefore, this article uses 12 ideal environmental features used in Feng Shui to contrast the natural aspects of the environment, makes comparisons with previous research, and classifies the environmental factors into climate, topography, hydrology, vegetation, and soil.

Keywords: the composition of Feng Shui, environmental planning, site selection, main components of the Feng Shui environment

Procedia PDF Downloads 488
3381 A Constructivist Grounded Theory Study on the Impact of Automation on People and Gardening

Authors: Hamilton V. Niculescu

Abstract:

Following a three-year study conducted with eighteen Irish people who are involved in growing vegetables in various community gardens around Dublin, Republic of Ireland, it was revealed that the addition of some automated features aimed at improving agricultural practices was regarded as potentially beneficial and as a great tool to closely monitor climate conditions inside the greenhouses. The participants were provided with a free custom-built mobile app through which they could remotely monitor and control features such as irrigation, air ventilation, and windows to ensure optimal growing conditions for vegetables growing inside purpose-built greenhouses. While the initial interest was generally high, within weeks the participants' level of interaction with the enclosures slowly declined. By employing a constructivist grounded theory methodology, following focus group discussions, in-depth semi-structured interviews, and observations, it was revealed that participants' trust in newer technologies, and renewables in particular, was low. There are various reasons for this, but because the participants in this study consist mainly of working-class people, it can be argued that lack of education and knowledge are the main barriers acting against the adoption of innovations. Consequently, it was revealed that most participants eventually decided to "set and forget" the systems in automatic working mode, indicating that the immediate effect of introducing people to assisting technologies also introduced some unintended consequences into their lifestyle. It is argued that this occurrence also indicates that people initially "read" newer technologies and only adopt those features that they find useful and less intrusive with regard to their current lifestyle.

Keywords: automation, communication, greenhouse, sustainable

Procedia PDF Downloads 103
3380 Health Care Using Queuing Theory

Authors: S. Vadivukkarasi, K. Karthi, M. Karthick, C. Dinesh, S. Santhosh, A. Yogaraj

Abstract:

The appointment system was designed to minimize idle time, overlooking patients' waiting time in hospitals. This is no longer valid in today's consumer-oriented society. Long waiting times for treatment in the outpatient department followed by short consultations have long been a complaint. Nowadays, customers use waiting time as a decisive factor in choosing a service provider. Queuing theory constitutes a very powerful tool because queuing models require relatively little data and are simple and fast to use. Because of this simplicity and speed, models can be used to quickly evaluate and compare various alternatives for providing service. The application of queuing models in the analysis of health care systems is increasingly accepted by health care decision makers. Timely access to care is a key component of high-quality health care. However, patient delays are prevalent throughout health care systems, resulting in dissatisfaction and adverse clinical consequences for patients, as well as potentially higher costs and wasted capacity for providers. Arguably, the most critical delays for health care are the ones associated with health care emergencies. The allocation of resources can be divided into three general areas: bed management, staff management, and room facility management. Effective and efficient patient flow is indicated by high patient throughput, low patient waiting times, a short length of stay at the hospital, and low overtime, while simultaneously maintaining adequate staff utilization rates and low patient idle times.
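
As a concrete illustration of the kind of calculation queuing models make cheap, the sketch below (Python) evaluates the Erlang C delay probability and the mean wait before service for an M/M/c queue. The arrival rate, service rate, and number of servers in the example are made-up figures, not data from any hospital.

```python
from math import factorial

def mmc_wait(lmbda, mu, c):
    """Erlang C delay probability and mean queueing delay Wq for an M/M/c system."""
    a = lmbda / mu                       # offered load
    rho = a / c
    assert rho < 1, "system must be stable"
    erlang_c = (a**c / (factorial(c) * (1 - rho))) / (
        sum(a**k / factorial(k) for k in range(c))
        + a**c / (factorial(c) * (1 - rho)))
    wq = erlang_c / (c * mu - lmbda)     # expected wait before service starts
    return erlang_c, wq

# e.g. 10 arrivals/hour, 4 patients/hour per server, 3 servers
print(mmc_wait(10, 4, 3))
```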

Keywords: appointment system, patient scheduling, bed management, queueing calculation, system analysis

Procedia PDF Downloads 280
3379 An Analysis of the Impact of Government Budget Deficits on Economic Performance: A Zimbabwean Perspective

Authors: Tafadzwa Shumba, Rose C. Nyatondo, Regret Sunge

Abstract:

This research analyses the impact of budget deficits on the economic performance of Zimbabwe. The study employs the autoregressive distributed lag (ARDL) bounds testing approach to co-integration and long-run estimation, using time series data from 1980-2018. The Augmented Dickey-Fuller (ADF) test and the Granger approach were used to test for stationarity and causality among the factors. Co-integration test results affirm a long-term association between the GDP growth rate and the explanatory factors. Causality test results show unidirectional causality from the budget deficit to GDP growth and bi-directional causality between debt and the budget deficit. This study also found unidirectional causality from debt to the GDP growth rate. ARDL estimates indicate a significantly positive long-term and a significantly negative short-term impact of the budget deficit on GDP. This suggests that budget deficits have a short-run growth-retarding effect and a long-run growth-inducing effect. The long-run results follow the Keynesian theory, which posits that fiscal deficits result in an increase in GDP growth. The short-run outcomes follow the neoclassical theory. In light of these findings, the government is recommended to minimize the financing of recurrent expenditure through a budget deficit. To achieve sustainable growth and development, the government needs to keep the budget deficit at an absorbable level, focusing on capital projects such as the development of human capital and infrastructure.
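
A minimal sketch of the stationarity step is given below in Python with statsmodels (an illustrative choice of tool, not necessarily the software used in the study): the Augmented Dickey-Fuller test is run on each series, and the unit-root null is rejected when the p-value falls below the chosen significance level, which is the precondition usually checked before the ARDL bounds procedure.

```python
from statsmodels.tsa.stattools import adfuller

def is_stationary(series, alpha=0.05):
    """ADF unit-root test: True means the null of a unit root is rejected."""
    stat, pvalue, *_ = adfuller(series, autolag="AIC")
    return pvalue < alpha

# e.g. run on the GDP growth series and the deficit series before ARDL estimation
```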

Keywords: ARDL, budget deficit, economic performance, long run

Procedia PDF Downloads 66
3378 Legacy of Colonialism in Canada’s Immigration Policy: Experiences of Skilled, Racialized Immigrants in the Canadian Labour Market

Authors: Karun K. Karki

Abstract:

Globalization has intensified the transnational movement of people, mainly from the Global South to the Global North. In this context of transnationalism, migration is framed within the national interests required for economic prosperity. More specifically, the competition for the 'best and the brightest' of highly educated immigrants from around the world can be perceived as evidence that countries in the North are competing in the knowledge-based global economy. Canada is not an exception. Since the early 1970s, Canada has successfully admitted, on average, 200,000 to 280,000 immigrants annually for permanent residency, primarily for economic development, family reunification and humanitarian affairs. Among these three components, economic class immigrants are the highest priority in its immigration policy. Although Canada admits highly qualified immigrant professionals with the expectation of easily integrating them, many highly skilled immigrants are marginalized in the labour market due to a myriad of layered structural and institutional barriers that prevent them from working in the professions for which they were trained in their country of origin. More than 67% of highly skilled immigrants are likely to be in jobs for which they are formally overqualified. The deteriorating employment situation of highly educated immigrants, particularly immigrants from racialized groups, needs analytical scrutiny of Canada's immigration policy. In this paper, the author examines how the historical legacy of colonialism continues in Canada's immigration policymaking and how this legacy has impacted developing countries in the Global South. The author argues that Canadian immigration policy is based on the notion of exploiting and dominating smaller countries and immigrants from these countries. Such colonial policies have systematically 'Othered' immigrants based on their race, ethnicity, gender, culture, and linguistic characteristics. Recommendations are made to revisit contemporary immigration and settlement policies to effectively integrate immigrants into Canadian society.

Keywords: colonialism, Canadian immigration policy, racialized immigrants, skilled immigrants

Procedia PDF Downloads 31
3377 Theoretical Analysis of the Exiting Sheet Thickness in the Calendering of Pseudoplastic Material

Authors: Muhammad Zahid

Abstract:

The mechanical process of smoothing and compressing a molten material by passing it through a number of pairs of heated rolls in order to produce a sheet of desired thickness is called calendering. The rolls used in combination are called calenders, a term derived from kylindros, the Greek word for cylinder. In effect, it is the finishing process used on cloth, paper, textiles, leather cloth, plastic film and so on. It is a mechanism used to strengthen surface properties, minimize sheet thickness, and yield special effects such as a glaze or polish. It has a wide variety of applications in industry, in the manufacturing of textile fabrics, coated fabrics, and plastic sheeting, to provide the desired surface finish and texture. An analysis is presented for the calendering of a pseudoplastic material. The lubrication approximation theory (LAT) has been used to simplify the equations of motion. To investigate the nature of the steady solutions that exist, we make use of a combination of exact solutions and numerical methods. The expressions for the velocity profile, rate of volumetric flow and pressure gradient are found in the form of exact solutions. Furthermore, the quantities of interest from an engineering point of view, such as the pressure distribution, roll-separating force, and power transmitted to the fluid by the rolls, are also computed. Some results are shown graphically while others are given in tabulated form. It is found that the non-Newtonian parameter and the Reynolds number serve as the controlling parameters for the calendering process.
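
For orientation, a pseudoplastic material is usually described by the Ostwald-de Waele power-law model shown below (a standard constitutive relation quoted here as background, not taken from the paper); shear-thinning corresponds to a power-law index n < 1, and n = 1 recovers the Newtonian case.

```latex
\tau = m \left| \frac{du}{dy} \right|^{\,n-1} \frac{du}{dy},
\qquad n < 1 \ \text{(pseudoplastic)}, \quad n = 1 \ \text{(Newtonian)} .
```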

Keywords: calendering, exact solutions, lubrication approximation theory, numerical solutions, pseudoplastic material

Procedia PDF Downloads 121
3376 An Interactive Institutional Framework for Evolution of Enterprise Technological Innovation Capabilities System: A Complex Adaptive Systems Approach

Authors: Sohail Ahmed, Ke Xing

Abstract:

This research theoretically explores the evolution mechanism of the enterprise technological innovation capability system (ETICS) from the perspective of complex adaptive systems (CAS). It proposes an analytical framework for ETICS, together with its concepts and theory, by integrating the CAS methodology into the management of the technological innovation capability of enterprises, and discusses how to use the principles of complexity to analyze the composition, evolution, and realization of technological innovation capabilities in complex dynamic environments. This paper introduces the concept and interaction of multi-agents and the theoretical background of CAS, and summarizes the sources of technological innovation, the elements of each subject, and the main clusters of adaptive interactions and innovation activities. The concept of multi-agents is applied through the linkages of enterprises, research institutions, and government agencies with the leading enterprises in industrial settings. The study is exploratory and based on CAS theory. The theoretical model is built by considering the technological innovation literature, from foundational work to state-of-the-art projects of technological enterprises. On this basis, the theoretical model is developed to measure the evolution mechanism of the enterprise's technological innovation capability system. This paper concludes that the enterprise's research and development personnel, investments in technological processes, and innovation resources are the main factors responsible for the evolution of enterprise technological innovation performance. The research specifically enriches the application process of technological innovation in institutional networks related to enterprises.

Keywords: complex adaptive system, echo model, enterprise technological innovation capability system, research institutions, multi-agents

Procedia PDF Downloads 111
3375 Financial Liberalization, Exchange Rates and Demand for Money in Developing Economies: The Case of Nigeria, Ghana and Gambia

Authors: John Adebayo Oloyhede

Abstract:

This paper examines the effect of financial liberalization on the stability of the demand for money function and its implications for the exchange rate behaviour of three African countries. As the demand for money function is regarded as one of the two main building blocks of most exchange rate determination models, the other being purchasing power parity, its stability is required for the monetary models of exchange rate determination to hold. To what extent has the liberalization policy of these countries, for instance liberalized interest rates, affected the demand for money function, and what has been the consequence for the validity and relevance of floating exchange rate models? The study adopts the Autoregressive Instrumental Variable (AIV) multiple regression technique and follows the Almon polynomial procedure with zero-end constraint. Data for the period 1986 to 2011 were drawn from three developing countries of Africa, namely the Gambia, Ghana and Nigeria, which not only started liberalization and the floating system at almost the same period but also share both similar and diverse economic and financial structures. The findings show that the demand for money was a stable function of income and interest rates at home and abroad. Other factors, such as the exchange rate and the foreign interest rate, exerted some significant effect on domestic money demand. The short-run and long-run elasticities with respect to income, interest rates, the expected inflation rate and exchange rate expectations are not greater than zero. This evidence conforms to some extent to the expected behaviour of the domestic money demand function and underscores its ability to serve as a good building block or assumption of the monetary model of exchange rate determination. This will, therefore, assist the appropriate monetary authorities in the design and implementation of further financial liberalization policy packages in developing countries.
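
A conventional specification of the kind of money demand function examined here, written out for clarity (the exact functional form and variable set used in the paper may differ), relates real money balances to domestic income, home and foreign interest rates, expected inflation, and the expected exchange rate:

```latex
\ln\!\left(\frac{M}{P}\right)_t
= \beta_0 + \beta_1 \ln Y_t + \beta_2\, i_t + \beta_3\, i^{*}_t
+ \beta_4\, \pi^{e}_t + \beta_5\, \ln E^{e}_t + \varepsilon_t .
```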

Keywords: financial liberalisation, exchange rates, demand for money, developing economies

Procedia PDF Downloads 352
3374 Laboratory-Based Monitoring of Hepatitis B Virus Vaccination Status in North Central Nigeria

Authors: Nwadioha Samuel Iheanacho, Abah Paul, Odimayo Simidele Michael

Abstract:

Background: The World Health Assembly, through the Global Health Sector Strategy on viral hepatitis, calls for the elimination of viral hepatitis as a public health threat by 2030. All hands are on deck to actualize this goal through effective and active vaccination and monitoring tools. Aim: To combine epidemiologic and laboratory Hepatitis B virus vaccination monitoring tools. Method: Laboratory results of subjects recruited during World Hepatitis Week, from July 2020 to July 2021, were analysed after obtaining their epidemiologic data on Hepatitis B virus risk factors, in the Medical Microbiology Laboratory of Benue State University Teaching Hospital, Nigeria. Result: A total of 500 subjects comprising males 60.0% (n=300/500) and females 40.0% (n=200/500) were recruited. A fifty-three percent majority were in the age range of 26 to 36 years. Serologic profiles were as follows: 15.0% (n=75/500) HBsAg; 7.0% (n=35/500) HBeAg; 8.0% (n=40/500) Anti-HBe; 20.0% (n=100/500) Anti-HBc and 38.0% (n=190/500) Anti-HBs. Immune responses to vaccination were as follows: 47.0% (n=235/500) immune naïve {no serologic marker + normal ALT}; 33.0% (n=165/500) immunity by vaccination {Anti-HBs + normal ALT}; 5.0% (n=25/500) immunity due to previous infection {Anti-HBs, Anti-HBc, +/- Anti-HBe + normal ALT}; 8.0% (n=40/500) carriers {HBsAg, Anti-HBc, Anti-HBe + normal ALT} and 7.0% (n=35/500) Anti-HBe serum-negative infections {HBsAg, HBeAg, Anti-HBc + elevated ALT}. Conclusion: The present 33.0% immunity-by-vaccination coverage in Central Nigeria is much lower than the 41.0% national peak in 2013 and falls far short of the Global Health Sector Strategy expectation of eliminating viral hepatitis as a public health threat by 2030. Therefore, more creative ideas and collective effort are needed to attain this goal of the World Health Assembly.

Keywords: Hepatitis B, vaccination status, laboratory tools, resource-limited settings

Procedia PDF Downloads 50
3373 Application of Metaverse Service to Construct Nursing Education Theory and Platform in the Post-pandemic Era

Authors: Chen-Jung Chen, Yi-Chang Chen

Abstract:

While traditional virtual reality and augmented reality allow only limited movement during learning and cannot provide a truly immersive teaching experience that creates the illusion of movement, the metaverse's new combination of content creation and immersive interactive simulation can come infinitely close to a natural teaching situation. However, the theory underlying the metaverse's mixed reality virtual classroom has not yet been explored, and it is rarely implemented in situational simulation teaching in nursing education. Therefore, in the first year, the study intends to use grounded theory and case study methods, with in-depth interviews of nursing education and information experts. The interview data will be analysed to investigate the uniqueness of metaverse development, and the proposed analysis will lead to alternative theories and methods for the development of nursing education. In the second year, the study plans to integrate metaverse virtual situation simulation technology into an alternative teaching strategy in the pediatric nursing technology course and to explore nursing students' use of this teaching method as a way of constructing personal technique and experience. By leveraging the unique features of distinct teaching platforms and developing processes to deliver alternative teaching strategies in a nursing technology teaching environment, the aim is to increase learning achievement without compromising teaching quality and teacher-student relationships in the post-pandemic era. A descriptive and convergent mixed methods design will be employed. Sixty third-year nursing students will be recruited to participate in the research and complete a pre-test. Students in the experimental group (N=30) will participate in four real-time mixed-reality virtual situation simulation courses as self-practice after class, with qualitative interviews conducted after every two virtual situation courses; the control group (N=30) will adopt traditional self-study practice methods after class. Both groups of students will take a post-test after the course. Data analysis will adopt descriptive statistics, paired t-tests, one-way analysis of variance, and qualitative content analysis. This study addresses key issues in the virtual reality environment for teaching and learning within the metaverse, providing valuable lessons and insights for enhancing the quality of education. The findings of this study are expected to contribute useful information for the future development of digital teaching and learning in nursing and other practice-based disciplines.
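The quantitative part of the planned analysis (paired t-tests within groups and a one-way analysis of variance between groups) can be sketched as follows; the data here are simulated purely for illustration, and the variable names are assumptions rather than the study's actual instruments.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# simulated pre- and post-test scores for the two groups of 30 students each
pre_exp, post_exp = rng.normal(70, 8, 30), rng.normal(78, 8, 30)   # experimental group
pre_ctl, post_ctl = rng.normal(70, 8, 30), rng.normal(73, 8, 30)   # control group

# within-group change: paired t-test on pre vs post scores
t_exp, p_exp = stats.ttest_rel(post_exp, pre_exp)

# between-group comparison of gain scores: one-way ANOVA
f_stat, p_anova = stats.f_oneway(post_exp - pre_exp, post_ctl - pre_ctl)

print(f"paired t (experimental): t={t_exp:.2f}, p={p_exp:.3f}")
print(f"one-way ANOVA on gains: F={f_stat:.2f}, p={p_anova:.3f}")
```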

Keywords: metaverse, post-pandemic era, online virtual classroom, immersive teaching

Procedia PDF Downloads 39
3372 Organizational Culture and Its Internalization of Change in the Manufacturing and Service Sector Industries in India

Authors: Rashmi Uchil, A. H. Sequeira

Abstract:

The post-liberalization era in India has seen an unprecedented growth of mergers, both domestic and cross-border deals. Indian organizations have slowly begun appreciating this inorganic method of growth. However, all is not well, as is evidenced in the lower value creation of organizations after mergers. Several studies have identified organizational culture as one of the key factors that affect the success of mergers, but very few studies have been attempted in this realm in India. The current study attempts to identify the factors in the organizational culture variable that may be unique to India. It also focuses on the difference in the impact of organizational culture on the merger of organizations in the manufacturing and service sectors in India. The study uses a mixed research approach. An exploratory approach is adopted to identify the variables that constitute organizational culture specifically in the Indian scenario. A few hypotheses were developed from the identified variables and tested to arrive at a grounded theory. The grounded theory approach used in the study attempts to integrate the variables related to organizational culture. A descriptive approach is used to validate the developed grounded theory with a new empirical data set and thus test the relationship between the organizational culture variables and the success of mergers. Empirical data were captured from merged organizations situated in major cities of India. These organizations represent a significant proportion of the total number of organizations that have adopted mergers. The mix of industries included software, banking, manufacturing, pharmaceutical and financial services. A mixed sampling approach was adopted for this study: the first phase of sampling was conducted using the probability method of stratified random sampling, and the study further used the non-probability method of judgmental sampling. An adequate sample size was identified, representing the top, middle and junior management levels of the organizations that had adopted mergers. The validity and reliability of the research instrument were ensured with appropriate tests. Statistical tools such as regression analysis, correlation analysis and factor analysis were used for data analysis. The results of the study revealed a strong relationship between organizational culture and the success of mergers. The results were also unique in that they highlighted a marked difference in the manner in which organizations in the manufacturing sector internalized changes in organizational culture after a merger. Further, the study reveals that organizations in the service sector internalized the changes at a slower rate, and it portrays the industries in the manufacturing sector as more proactive, which may contribute to a change in the perception of these organizations.
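As a rough illustration of the statistical tools named above (correlation, factor analysis and regression), the following sketch uses simulated Likert-type culture items and a hypothetical merger-success score; the column names, factor count and data are invented and do not reproduce the study's instrument.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
# six simulated Likert items (1-5) on organizational culture for 200 respondents
items = pd.DataFrame(rng.integers(1, 6, size=(200, 6)),
                     columns=[f"culture_item_{i}" for i in range(1, 7)])
success = items.mean(axis=1) + rng.normal(scale=0.5, size=200)   # hypothetical merger-success proxy

print(items.corr().round(2))                                     # correlation analysis

fa = FactorAnalysis(n_components=2, random_state=0).fit(items)   # exploratory factor analysis
factors = fa.transform(items)                                    # factor scores for culture dimensions

ols = sm.OLS(success, sm.add_constant(factors)).fit()            # regress success on culture factors
print(ols.summary())
```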

Keywords: manufacturing industries, mergers, organizational culture, service industries

Procedia PDF Downloads 274
3371 Social Media Diffusion and Implications for Opinion Leadership in Northcentral Nigeria

Authors: Chuks Odiegwu-Enwerem

Abstract:

The classical notion of opinion leadership presupposes that the media are at the center of effective and successful opinion leadership. Under this idea, an opinion leader is an active media user who consumes, understands, digests and interprets messages for understanding and acceptance or adoption by lower-end media users, whose access to and understanding of media content are supposedly low. Because of their unique access to and presumed understanding of media functions and content, opinion leaders are typically esteemed by those who look forward to and accept their opinions. Lazarsfeld and Katz's two-step flow of communication theory is the basis of opinion leadership, propelled by limited access to the media. With the emergence and spread of social media and its unlimited access by all and sundry, however, this study interrogates the relevance and application of opinion leaders and, by implication, the two-step flow theory in Nigeria's Northcentral region. It seeks to determine whether opinion leaders still exist in the picture and whether they still exert considerable influence, especially in matters of political conversation and decision-making, among the citizens of this area. It further explores whether the diffusion of social media is a reality, how the 'low-end' media users react to their new-found freedom of access to the media, how they are using it to inform their decisions on important matters, and whether they are still glued to their opinion leaders. The study explores the empirical dimensions of the two-step flow hypothesis in relation to the activities of social media to determine whether a change has occurred and in what direction, using mixed methods of survey and in-depth interviews. Our understanding of, and belief in, some theoretical assumptions may be enhanced or challenged by the study outcome.

Keywords: opinion leadership, active media user, two-step flow, social media, Northcentral Nigeria

Procedia PDF Downloads 49
3370 Appropriation of Cryptocurrencies as a Payment Method by South African Retailers

Authors: Neliswa Dyosi

Abstract:

Purpose – Using an integrated Technology-Organization-Environment (TOE) framework and the model of technology appropriation (MTA) as a theoretical lens, this interpretive qualitative study seeks to understand and explain the factors that influence the appropriation, non-appropriation, and disappropriation of bitcoin as a payment method by South African retailers. Design/methodology/approach – The study adopts the interpretivist philosophical paradigm, with multiple case studies as the research strategy. For data collection, the study follows a qualitative approach: qualitative data will be collected from six retailers in various industries, using semi-structured interviews and documents as the data collection techniques. Purposive and snowball sampling techniques will be used to identify participants within the organizations, and data will be analyzed using thematic analysis. Originality/value – Using a deductive approach, the study seeks to provide a descriptive and explanatory contribution to theory. It contributes to theory development by integrating the MTA and TOE frameworks as a means to understand the technology adoption behaviour of organizations, in this case retailers. It is also the first study to take an integrated TOE-MTA approach to understanding the adoption and use of a payment method. South Africa is ranked amongst the top ten countries in the world for cryptocurrency adoption; there is, however, still a dearth of literature on the current state of adoption and usage of bitcoin as a payment method in South Africa. The study will therefore contribute to the existing literature as bitcoin gains popularity as an alternative payment method across the globe.

Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation

Procedia PDF Downloads 113