Search results for: option pricing theory
3158 Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm
Authors: Majid Pourahmadi
Abstract:
The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other). This is due to errors in determining the number of scatterers. The present paper provides a new approach to alleviate this problem using an information-theoretic criterion referred to as minimum description length (MDL). The merits of the approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations even with considerable noise and multiple scattering in the received signals.

Keywords: microwave imaging, time reversal, MUSIC algorithm, minimum description length (MDL)
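The abstract does not give the exact MDL formulation used; as an illustration, here is a minimal sketch of the classic Wax-Kailath MDL criterion, which estimates the number of sources (here, scatterers) from the eigenvalues of the sample covariance or multistatic data matrix. All numbers below are invented.

```python
import numpy as np

def mdl_num_sources(eigvals, n_snapshots):
    """Wax-Kailath MDL estimate of the number of sources (scatterers)."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]  # descending order
    m = lam.size
    scores = []
    for k in range(m):                        # candidate model orders 0 .. m-1
        tail = lam[k:]                        # presumed noise-subspace eigenvalues
        geo = np.exp(np.mean(np.log(tail)))   # geometric mean of the tail
        ari = np.mean(tail)                   # arithmetic mean of the tail
        loglik = -n_snapshots * (m - k) * np.log(geo / ari)
        penalty = 0.5 * k * (2 * m - k) * np.log(n_snapshots)
        scores.append(loglik + penalty)
    return int(np.argmin(scores))             # order minimizing the MDL score

# Two dominant "signal" eigenvalues above a flat noise floor.
est = mdl_num_sources([9.0, 6.0, 1.05, 1.0, 0.98, 1.02], n_snapshots=200)
```

The criterion trades the likelihood term (how well the smallest eigenvalues look equal, as pure noise would be) against a complexity penalty, so no subjective threshold is needed.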
Procedia PDF Downloads 337

3157 Thulium Laser Vaporisation and Enucleation of Prostate in Patients on Anticoagulants and Antiplatelet Agents
Authors: Abdul Fatah, Naveenchandra Acharya, Vamshi Krishna, T. Shivaprasad, Ramesh Ramayya
Abstract:
Background: A significant number of patients with bladder outlet obstruction due to BPH are on antiplatelets and anticoagulants. Prostate surgery in this group of patients, whether TURP or open prostatectomy, is associated with an increased risk of bleeding complications requiring transfusions, packing of the prostatic fossa, or ligation or embolization of the internal iliac arteries. Withholding antiplatelets and anticoagulants may be associated with cardiac and other complications. The efficacy of the thulium laser in this group of patients was evaluated in terms of peri-operative, postoperative and delayed bleeding complications, as well as cardiac events in the peri-operative and immediate postoperative period. Methods: 217 patients with a mean age of 68.8 years were enrolled between March 2009 and March 2013 and treated for BPH with ThuLEP. Every patient was evaluated at baseline with: digital rectal examination (DRE), prostate volume, post-void residual volume (PVR), International Prostate Symptom Score (I-PSS), PSA values, urine analysis and urine culture, and uroflowmetry. Postoperative complications were recorded in the form of a drop in hemoglobin level, transfusion rates, postoperative cardiac events within a period of 30 days, delayed hematuria, and events such as deep vein thrombosis and pulmonary embolism. Results: Our data showed a better postoperative outcome in terms of postoperative bleeding requiring intervention, 7 (3.2%); transfusion rate, 4 (1.8%); cardiac events within 30 days, 4 (1.8%); and delayed hematuria within 6 months, 2 (0.9%), compared with other series of prostatectomies. Conclusion: Thulium laser prostatectomy is a safe and effective option for patients with cardiac comorbidities and those on antiplatelet agents and anticoagulants. The complication rate is lower than that reported in larger series of open and transurethral prostatectomies.

Keywords: thulium laser, prostatectomy, antiplatelet agents, bleeding
Procedia PDF Downloads 393

3156 Innovating and Disrupting Higher Education: The Evolution of Massive Open Online Courses
Authors: Nabil Sultan
Abstract:
A great deal has been written on Massive Open Online Courses (MOOCs) since 2012 (considered by some as the year of the MOOC). The emergence of MOOCs generated great interest among academics and technology experts as well as ordinary people. Some authors perceived MOOCs as the next big thing that would disrupt education. Others saw them as another fad that would go away once it ran its course (as most fads do). But MOOCs did not turn out to be a fad, and they are still around. Most importantly, they have evolved into something that is beginning to look like a viable business model. This paper explores this phenomenon within the theoretical frameworks of disruptive innovations and jobs to be done, as developed by Clayton Christensen and his colleagues, and its implications for the future of higher education (HE).

Keywords: MOOCs, disruptive innovations, higher education, jobs theory
Procedia PDF Downloads 270

3155 Budget Impact Analysis of a Stratified Treatment Cascade for Hepatitis C Direct Acting Antiviral Treatment in an Asian Middle-Income Country through the Use of Compulsory and Voluntary Licensing Options
Authors: Amirah Azzeri, Fatiha H. Shabaruddin, Scott A. McDonald, Rosmawati Mohamed, Maznah Dahlui
Abstract:
Objective: A scaled-up treatment cascade with direct-acting antiviral (DAA) therapy is necessary to achieve the global WHO targets for hepatitis C virus (HCV) elimination in Malaysia. Recently, limited access to Sofosbuvir/Daclatasvir (SOF/DAC) has become available through compulsory licensing, with future access to Sofosbuvir/Velpatasvir (SOF/VEL) expected through voluntary licensing due to recent agreements. SOF/VEL has superior clinical outcomes, particularly for cirrhotic stages, but higher drug acquisition costs than SOF/DAC. It has been proposed that a stratified treatment cascade might be the most cost-efficient approach for Malaysia, whereby all HCV patients are treated with SOF/DAC except for patients with cirrhosis, who are treated with SOF/VEL. This study aimed to conduct a five-year budget impact analysis, from the provider perspective, of the proposed stratified treatment cascade for HCV treatment in Malaysia. Method: A disease progression model developed from model-predicted HCV epidemiology data in Malaysia was used for the analysis. In scenario A, all HCV patients were treated with SOF/DAC at all disease stages, while in scenario B, SOF/DAC was used only for non-cirrhotic patients and SOF/VEL was used for cirrhotic patients. The model projections estimated the annual numbers of patients in care and the numbers of patients to be initiated on DAA treatment nationally. Healthcare costs associated with DAA therapy and disease stage monitoring were included to estimate the downstream cost implications. For scenario B, the estimated treatment uptake of SOF/VEL among cirrhotic patients was 25%, 50%, 75%, 100% and 100% for 2018, 2019, 2020, 2021 and 2022, respectively. Healthcare costs were estimated based on the standard clinical pathways for DAA treatment described in recent guidelines. All costs are reported in US dollars (conversion rate US$1 = RM4.09, price year 2018).
Scenario analysis was conducted for 5% and 10% reductions in the SOF/VEL acquisition cost anticipated from competitive market pricing of generic DAAs in Malaysia. Results: The stratified treatment cascade with SOF/VEL in scenario B was found to be cost-saving compared to scenario A. A substantial portion of the cost reduction was due to the costs associated with DAA therapy, which resulted in savings of USD 40 thousand (year 1) to USD 443 thousand (year 5) annually, with cumulative savings of USD 1.1 million after 5 years. Cost reductions for disease stage monitoring were seen from year three onwards and resulted in cumulative savings of USD 1.1 thousand. Scenario analysis estimated cumulative savings of USD 1.24 to USD 1.35 million when the acquisition cost of SOF/VEL was reduced. Conclusion: A stratified treatment cascade with SOF/VEL is expected to be cost-saving and can result in a reduction in overall healthcare expenditure in Malaysia compared to treatment with SOF/DAC alone. The better clinical efficacy of SOF/VEL is expected to halt patients' HCV disease progression and may reduce the downstream costs of treating advanced disease stages. The findings of this analysis may be useful to inform healthcare policies for HCV treatment in Malaysia.

Keywords: Malaysia, direct acting antiviral, compulsory licensing, voluntary licensing
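The study's actual cost inputs are not reproduced in the abstract, but the stratified-cascade arithmetic can be sketched with invented unit costs and cohort sizes. The SOF/VEL uptake schedule is the one stated above; the assumption that a cirrhotic SOF/DAC course costs more because it requires an extended duration is ours, made only so the sketch reproduces the direction of the reported result.

```python
# Hypothetical per-course drug acquisition costs (USD); NOT the study's figures.
SOF_DAC_12W = 300.0   # standard 12-week SOF/DAC course, non-cirrhotic
SOF_DAC_24W = 600.0   # assumed extended 24-week SOF/DAC course for cirrhosis
SOF_VEL_12W = 500.0   # 12-week SOF/VEL course

# Hypothetical patients initiated on DAA therapy per year: (non-cirrhotic, cirrhotic).
cohorts = [(1000, 200), (1100, 210), (1200, 220), (1300, 230), (1400, 240)]
# SOF/VEL uptake among cirrhotic patients, per the schedule in the abstract.
vel_uptake = [0.25, 0.50, 0.75, 1.00, 1.00]

# Scenario A: SOF/DAC for every patient at every disease stage.
cost_a = sum(nc * SOF_DAC_12W + c * SOF_DAC_24W for nc, c in cohorts)

# Scenario B: stratified cascade, SOF/VEL phased in for cirrhotic patients.
cost_b = 0.0
for (nc, c), up in zip(cohorts, vel_uptake):
    cost_b += nc * SOF_DAC_12W
    cost_b += c * (up * SOF_VEL_12W + (1 - up) * SOF_DAC_24W)

savings = cost_a - cost_b   # positive when the stratified cascade is cheaper
```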
Procedia PDF Downloads 164

3154 Use of Hierarchical Temporal Memory Algorithm in Heart Attack Detection
Authors: Tesnim Charrad, Kaouther Nouira, Ahmed Ferchichi
Abstract:
In order to reduce the number of deaths due to heart problems, we propose the use of the Hierarchical Temporal Memory (HTM) algorithm, a real-time anomaly detection algorithm. HTM is a cortical learning algorithm modeled on the neocortex; in other words, it is based on a conceptual theory of how the human brain works. It is powerful at predicting unusual patterns, anomaly detection and classification. In this paper, HTM has been implemented and tested on ECG datasets in order to detect cardiac anomalies. Experiments showed good performance in terms of specificity, sensitivity and execution time.

Keywords: cardiac anomalies, ECG, HTM, real time anomaly detection
Procedia PDF Downloads 228

3153 Spectral Efficiency Improvement in 5G Systems by Polyphase Decomposition
Authors: Wilson Enríquez, Daniel Cardenas
Abstract:
This article proposes a filter bank format that combines the mathematical tool called polyphase decomposition with the discrete Fourier transform (DFT), with the purpose of improving the performance of fifth-generation (5G) communication systems. We start with a review of the literature and a study of filter bank theory and its combination with the DFT, which improves the performance of wireless communications by reducing the computational complexity of these systems. Several experiments were carried out with the proposed technique in order to evaluate the structures in 5G systems. Finally, the results are presented graphically as bit error rate against the ratio of bit energy to noise power spectral density (BER vs. Eb/N0).

Keywords: multi-carrier system (5G), filter bank, polyphase decomposition, FIR equalizer
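Polyphase decomposition itself can be illustrated independently of the 5G filter-bank structure: an FIR decimator split into M subfilters produces exactly the same output as filtering at the full rate and then downsampling, while computing only the samples that are kept. A minimal numpy sketch (the signal and filter are random stand-ins, not taken from the paper):

```python
import numpy as np

M = 4                              # decimation factor = number of branches
rng = np.random.default_rng(0)
x = rng.standard_normal(64)        # input signal (stand-in)
h = rng.standard_normal(16)        # FIR prototype filter (length a multiple of M)

# Reference: filter at the full rate, then keep every M-th output sample.
y_ref = np.convolve(x, h)[::M]

# Polyphase: subfilters e_p[m] = h[mM + p] fed by u_p[m] = x[mM - p]
# (zero for negative indices), so work is done only at the low rate.
x_pad = np.concatenate([np.zeros(M - 1), x])   # makes the p-sample advance causal
y_poly = np.zeros(len(y_ref))
for p in range(M):
    e_p = h[p::M]                  # p-th polyphase component of h
    u_p = x_pad[M - 1 - p::M]      # x[mM - p] for m = 0, 1, ...
    c = np.convolve(u_p, e_p)      # one low-rate branch convolution
    y_poly[:len(c)] += c[:len(y_poly)]
```

Each branch runs at 1/M of the input rate, which is the complexity saving the abstract refers to; the DFT enters when the branches are combined into a full filter bank.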
Procedia PDF Downloads 201

3152 Decision Making under Strict Uncertainty: Case Study in Sewer Network Planning
Authors: Zhen Wu, David Lupien St-Pierre, Georges Abdul-Nour
Abstract:
In decision making under strict uncertainty, decision makers have to choose a decision without any information about the states of nature. The classic criteria of Laplace, Wald, Savage, Hurwicz and Starr are introduced and compared in a case study of sewer network planning, and the results from the different criteria are discussed and analyzed. Moreover, this paper discusses the idea that decision making under strict uncertainty (DMUSU) can be viewed as a two-player game and thus be solved by a solution concept from game theory: the Nash equilibrium.

Keywords: decision criteria, decision making, sewer network planning, strict uncertainty
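Most of the classic criteria named above (Starr's criterion aside) are straightforward to compute on a payoff matrix. A minimal sketch with an invented payoff table for three hypothetical sewer-network designs under three states of nature:

```python
import numpy as np

# Rows: candidate sewer-network designs; columns: states of nature.
# Entries are payoffs (higher is better); all values are illustrative.
payoff = np.array([
    [30.0, 20.0, 10.0],
    [25.0, 25.0, 15.0],
    [40.0, 10.0,  5.0],
])

laplace = payoff.mean(axis=1)                  # equal-probability average payoff
wald = payoff.min(axis=1)                      # pessimistic: worst case per design
regret = payoff.max(axis=0) - payoff           # opportunity loss in each state
savage = regret.max(axis=1)                    # worst regret per design (minimize)
alpha = 0.6                                    # Hurwicz optimism coefficient
hurwicz = alpha * payoff.max(axis=1) + (1 - alpha) * payoff.min(axis=1)

best = {
    "laplace": int(np.argmax(laplace)),
    "wald": int(np.argmax(wald)),
    "savage": int(np.argmin(savage)),
    "hurwicz": int(np.argmax(hurwicz)),
}
```

With this table the criteria disagree (Laplace and Wald pick design 1, Savage picks design 0, Hurwicz at alpha = 0.6 picks design 2), which is precisely why comparing them on a real planning problem is informative.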
Procedia PDF Downloads 560

3151 The Psychological Effects of Nature on Subjective Well-Being: An Experimental Approach
Authors: Tatjana Kochetkova
Abstract:
This paper explores the pivotal role of environmental education, specifically outdoor education, in facilitating a psychological connection to nature among young adults. The research aims to contribute to building an empirical and conceptual basis for ecopsychology by providing a picture of psyche-nature interaction. It presents the results of a four-day connection-to-nature workshop and seeks to determine the effects of awareness of nature on subjective well-being and on the perception of the meaning of life. This led to the finding of a battery-recharging effect of nature and of nature's influence at four levels of awareness: external physical perception, internal (bodily) sensation, emotions, and existential meaning. Research on the psychological bond of humans with the natural environment, the subject of ecopsychology, is still in its infancy. Despite several courageous and fruitful attempts, there are still no direct answers to the fundamental questions about the way in which the natural environment influences humans and the specific role of nature in the human psyche. The urge to address this question was the primary reason for the current experiment. The methodology of this study was adapted from Patterson and from White and Hendee, and included a series of assignments on the perception of nature (the exercises are described in the attachment). Experiences were noted in a personal diary, which was later used for analysis. There are many trustworthy claims that contact with nature has positive effects on human subjective well-being and that it is of essential psychological and spiritual value, but these claims need more support and theoretical explanation. As a contribution to filling these gaps, this qualitative study was conducted. Its aim is to explore the psychological effects of short-term awareness of wilderness on subjective well-being and on one's sense of the meaning of life.
This specific study is based on the more general hypothesis that there are positive relationships between the experience of wilderness and the development of the self, feelings of community, and spiritual development. It restricts the study of the psychological effects of a short-term stay in nature to two variables: subjective well-being and the sense of meaning of life. The study aimed at (i) testing the hypothesis that awareness of wilderness has positive effects on the subjective sense of well-being and meaning in life, and (ii) understanding the nature of the psychological need for wilderness. Although there is a substantial amount of data on the psychological benefits of nature, we still lack a theory that explains the findings; the present research aims to contribute to such a theory through an experiment aimed specifically at the effects of nature on the sense of well-being and meaning in life.

Keywords: environmental education, psychological connection to nature, subjective well-being, symbolic meaning of nature, emotional reaction to nature, meaning of life
Procedia PDF Downloads 72

3150 Effect of Windrow Management on Ammonia and Nitrous Oxide Emissions from Swine Manure Composting
Authors: Nanh Lovanh, John Loughrin, Kimberly Cook, Phil Silva, Byung-Taek Oh
Abstract:
In the era of sustainability, the utilization of livestock wastes as a soil amendment to provide micronutrients for crops is economical and sustainable. Livestock wastes are well understood to be nutrient sources for crops comparable, if not superior, to chemical fertilizers. However, the large concentrated volumes of animal manure produced by livestock operations and the limited nearby agricultural land available necessitate volume reduction of these wastes. Composting these animal manures is a viable option for biomass and pathogen reduction. Nevertheless, composting also increases the potential loss of available nutrients for crop production, as well as unwanted emissions of anthropogenic air pollutants, through the volatilization of ammonia and other compounds. In this study, we examine the emission of ammonia and nitrous oxide from swine manure windrows to weigh the benefit of biomass reduction against the potential loss of available nutrients. The feedstock for the windrows was obtained from a swine farm in Kentucky, where swine manure was mixed with wood shavings as absorbent material. Static flux chambers along with a photoacoustic gas analyzer were used to monitor ammonia and nitrous oxide concentrations during the composting process. The results show that ammonia and nitrous oxide fluxes were quite high during the initial composting process and after each turning of the compost piles. Over roughly three months of composting, the biochemical oxygen demand (BOD) decreased by about 90%. Although composting of animal waste is beneficial for biomass reduction, it may not be economically feasible from an agronomic point of view due to time, nutrient (N) loss, and potential environmental pollution (ammonia and greenhouse gas emissions).
Therefore, additional studies are needed to assess and validate the economics and environmental impact of animal (swine) manure composting (e.g., crop yield or impact on climate change).

Keywords: windrow, swine manure, ammonia, nitrous oxide, fluxes, management
Procedia PDF Downloads 357

3149 Structural, Electronic and Magnetic Properties of Co and Mn Doped CdTe
Authors: A. Zitouni, S. Bentata, B. Bouadjemi, T. Lantri, W. Benstaali, A. Zoubir, S. Cherid, A. Sefir
Abstract:
The structural, electronic, and magnetic properties of the transition metals Co and Mn doped into the zinc-blende semiconductor CdTe were calculated using density functional theory (DFT) with the generalized gradient approximation (GGA). We have analyzed the structural parameters, charge and spin densities, and total and partial densities of states. We find that Co- and Mn-doped zinc-blende CdTe shows half-metallic behavior, with total magnetic moments of 6.0 and 10.0 µB, respectively. These results make Co- and Mn-doped CdTe promising candidates for applications in spintronics.

Keywords: first-principles, half-metallic, diluted magnetic semiconductor, magnetic moment
Procedia PDF Downloads 459

3148 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures
Authors: Rui Teixeira, Alan O’Connor, Maria Nogal
Abstract:
The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes currently experienced by the world emphasize the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. Alternatively, accurate modeling of the tails of statistical distributions and characterization of low-occurrence events can be achieved with the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a predefined minimum threshold u. The Generalised Pareto distribution is widely used to approximate the tail of the empirical distribution mathematically, although, in the case of exceedances of significant wave data (H_s), the two-parameter Weibull and the Exponential distribution, the latter a special case of the Generalised Pareto, are frequently used as alternatives. The Generalised Pareto, despite the practical cases where it is applied, is not universally recognized as the adequate model for exceedances over a threshold u, and references that treat it as a secondary option for significant wave data can be found in the literature. In this framework, the current study tackles the discussion of which statistical models best characterize exceedances of wave data. The Generalised Pareto, two-parameter Weibull and Exponential distributions are compared for different values of the threshold u.
Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. Results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully: in each particular case, one of the statistical models fits the data better than the others, and different results are obtained depending on the value of the threshold u. Other variables of the fit, such as the number of points and the estimation of the model parameters, are analyzed, and the respective conclusions are drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, in the present case, a highly non-linear task and, given its growing importance, should be addressed carefully for an efficient estimation of very-low-occurrence events.

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data
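The POT workflow described above (choose a threshold u, keep the excesses, fit candidate tail models) can be sketched with scipy on synthetic data standing in for the buoy records; the Weibull-shaped record, the threshold choice and the log-likelihood comparison below are illustrative, not the paper's procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic significant-wave-height record (metres); a stand-in for buoy data.
h_s = rng.weibull(0.8, size=5000) * 2.0

u = np.quantile(h_s, 0.95)          # threshold u: here, the 95th percentile
exceed = h_s[h_s > u] - u           # peak-over-threshold excesses

# Candidate tail models fitted to the excesses (location pinned at 0).
gpd = stats.genpareto.fit(exceed, floc=0)       # Generalised Pareto
wbl = stats.weibull_min.fit(exceed, floc=0)     # two-parameter Weibull
expo = stats.expon.fit(exceed, floc=0)          # Exponential

# Compare the fits by log-likelihood at this threshold (higher is better).
ll = {
    "genpareto": stats.genpareto.logpdf(exceed, *gpd).sum(),
    "weibull": stats.weibull_min.logpdf(exceed, *wbl).sum(),
    "exponential": stats.expon.logpdf(exceed, *expo).sum(),
}
```

Repeating the fit over a range of u values, as the study does, reveals how sensitive the preferred model is to the threshold choice.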
Procedia PDF Downloads 273

3147 Using Wavelet Uncertainty Relations in Quantum Mechanics: From Trajectories Foam to Newtonian Determinism
Authors: Paulo Castro, J. R. Croca, M. Gatta, R. Moreira
Abstract:
Following the development of quantum mechanics, we contextualize the foundations of the theory within the Fourier analysis framework, thus stating the philosophical conclusions that Niels Bohr drew as unavoidable within it. We then introduce an alternative way of describing the undulatory aspects of quantum entities using Gaussian Morlet wavelets. The description has its roots in de Broglie's realistic program for quantum physics. Using wavelets, it is possible to formulate a more general set of uncertainty relations, a set from which it is possible to theoretically describe both ends of the behavioral spectrum found in reality: the indeterministic quantum trajectorial foam and the perfectly drawn Newtonian trajectories.

Keywords: philosophy of quantum mechanics, quantum realism, Morlet wavelets, uncertainty relations, determinism
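The Gaussian Morlet wavelet mentioned above is a plane wave under a Gaussian envelope; since Gaussian envelopes saturate the Heisenberg-Gabor bound, its time-frequency uncertainty product sits at the minimum value of 1/2. A numerical check (the central frequency omega0 = 6 is a conventional choice, not taken from the paper):

```python
import numpy as np

omega0 = 6.0                                # wavelet central frequency (rad/s)
t = np.linspace(-10.0, 10.0, 4001)
dt = t[1] - t[0]

# Gaussian (Morlet) wavelet: a plane wave modulated by a Gaussian envelope.
psi = np.pi ** -0.25 * np.exp(1j * omega0 * t) * np.exp(-t ** 2 / 2)

# Time spread of |psi|^2 (the envelope is centred, so <t> = 0).
w_t = np.abs(psi) ** 2
w_t /= w_t.sum()
var_t = (t ** 2 * w_t).sum()

# Frequency spread from the power spectrum (the FFT phase is irrelevant here).
omega = 2 * np.pi * np.fft.fftfreq(t.size, d=dt)
w_f = np.abs(np.fft.fft(psi)) ** 2
w_f /= w_f.sum()
mean_w = (omega * w_f).sum()
var_w = ((omega - mean_w) ** 2 * w_f).sum()

# Gaussian envelopes sit exactly at the Heisenberg-Gabor lower bound of 1/2.
uncertainty_product = np.sqrt(var_t * var_w)
```

The more general wavelet uncertainty relations the abstract refers to relax this Fourier-based bound; the sketch only fixes the baseline case.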
Procedia PDF Downloads 171

3146 Decolonial Theorization of Epistemic Agency in Language Policy Management: Case of Plurinational Ecuador
Authors: Magdalena Madany-Saá
Abstract:
This paper compares the language management of two language policies in plurinational Ecuador: (1) mandatory English language teaching based on Western standards of quality, and (2) indigenous educación intercultural bilingüe, which promotes ancestral knowledge and the indigenous languages of Ecuador. The data come from a comparative institutional ethnography conducted between 2018 and 2022 in the English and Kichwa teacher preparation programs of an Ecuadorian teachers' college. Specifically, the paper explores the frameworks of knowledge promoted by different educational actors in both teacher education programs and the ways in which the Ecuadorian transformation towards a knowledge-based economy is intertwined with the country's linguistic policies. Focusing on the specific role of language advocates and their discursive role in knowledge production, the paper elaborates on the notion of agency in Language Policy and Planning (LPP), referred to here as epistemic agency. Epistemic agency is conceptualized through the analysis of English language epistemic advocates who participate in empowering English language policies and endorse knowledge production in that language. In proposing epistemic agency, this paper argues that in the context of knowledge-based societies, advocates are key in transferring policies from the political to the epistemic realm, where decisions about what counts as legitimate knowledge are made. The study uses the decolonial option as its analytical framework for critiquing the hegemonic perpetuation of modernity and its knowledge-based models in Latin America, derived from the colonial matrix of power. Through this theoretical approach, it is argued that if indigenous stakeholders are viewed only as political actors and not as knowledge producers, the hegemony of Global English will reinforce a knowledge-based society constructed upon Global North modernity.
In the absence of strong epistemic advocates for indigenous language policies, powerful Global English advocates occupy such vacancies at the language management level, thus dominating the ecology of knowledge in a plurinational and plurilingual Ecuador.

Keywords: educación intercultural bilingüe, English language teaching, epistemic agency, language advocates, plurinationality
Procedia PDF Downloads 36

3145 Study and Analysis of a Susceptible Infective Susceptible Mathematical Model with Density Dependent Migration
Authors: Jitendra Singh, Vivek Kumar
Abstract:
In this paper, a susceptible-infective-susceptible (SIS) mathematical model is proposed and analyzed in which the migration of the human population is described by a migration function. It is assumed that the disease is transmitted by direct contact between the susceptible and infective populations at a constant contact rate. The equilibria and their stability are studied using the stability theory of ordinary differential equations and computer simulation. The model analysis shows that the spread of the infectious disease increases when immigration into the habitat increases, but decreases if emigration increases.

Keywords: SIS (Susceptible Infective Susceptible) model, migration function, susceptible, stability
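The abstract does not give the model equations. A minimal sketch of an SIS system with a density-dependent net migration term M(N) = mN(1 - N/K), where this functional form and all parameter values are hypothetical, shows the kind of endemic equilibrium such an analysis studies:

```python
from scipy.integrate import solve_ivp

# Hypothetical parameters: contact rate, recovery rate, natural death rate,
# migration intensity, and habitat carrying capacity.
beta, gamma, mu, m, K = 0.002, 0.1, 0.02, 0.05, 1000.0

def sis_rhs(t, y):
    s, i = y
    n = s + i
    migration = m * n * (1 - n / K)           # density-dependent net migration M(N)
    ds = migration - beta * s * i + gamma * i - mu * s   # susceptibles
    di = beta * s * i - gamma * i - mu * i               # infectives
    return [ds, di]

sol = solve_ivp(sis_rhs, (0.0, 400.0), [500.0, 10.0])
s_end, i_end = sol.y[:, -1]
# With these numbers the total population settles at N* = 600 and the
# disease persists at the endemic equilibrium (S*, I*) = (60, 540).
```

Raising m (more immigration) raises N* and the endemic infective level, while making the net term emigration-dominated shrinks it, matching the qualitative conclusion of the abstract.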
Procedia PDF Downloads 261

3144 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine
Authors: Hira Lal Gope, Hidekazu Fukai
Abstract:
The aim of this study is to develop a system that can identify and sort peaberries automatically and at low cost for coffee producers in developing countries. This paper focuses on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. A peaberry is neither a defective bean nor a normal one: it forms when a coffee cherry develops only a single, relatively round seed instead of the usual flat-sided pair of beans, and it has a distinct value and flavor. To improve the taste of the coffee, peaberries and normal beans must be separated before the green beans are roasted; otherwise, the flavors of the mixed beans blend and quality suffers. During roasting, the beans' shape, size, and weight should be uniform; otherwise, larger beans take longer to roast through. A peaberry has a different size and shape even though it has the same weight as a normal bean, and it roasts more slowly, so sorting by size or weight alone is not a good option for selecting peaberries. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually, but picking out peaberries is very difficult even for trained specialists because the shape and color of a peaberry are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate normal beans from peaberries as part of the sorting system. As a first step, we applied deep Convolutional Neural Networks (CNN) and the Support Vector Machine (SVM) as machine learning techniques, and better performance was obtained with the CNN than with the SVM for the discrimination of peaberries. The artificial neural network, trained in this work on a high-performance CPU and GPU, will then simply be installed on an inexpensive Raspberry Pi system with limited computing power.
We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.

Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine
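As an illustration of the SVM half of the pipeline only (the CNN operates on raw images and is not sketched here), a support vector classifier can separate beans using simple shape descriptors. The features and their distributions below are invented stand-ins for what would be extracted from green-bean images:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Invented shape descriptors per bean: (aspect ratio, roundness).
# Peaberries are rounder and less elongated than flat-sided normal beans.
rng = np.random.default_rng(7)
n = 400
normal = np.column_stack([rng.normal(1.6, 0.15, n), rng.normal(0.70, 0.05, n)])
peaberry = np.column_stack([rng.normal(1.1, 0.10, n), rng.normal(0.90, 0.04, n)])
X = np.vstack([normal, peaberry])
y = np.array([0] * n + [1] * n)          # 0 = normal bean, 1 = peaberry

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)         # held-out classification accuracy
```

On real images the hard part is that peaberry and normal shape distributions overlap far more than in this toy setup, which is why the study found the CNN, working directly on pixels, outperformed the feature-based SVM.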
Procedia PDF Downloads 144

3143 Disparities in Language Competence and Conflict: The Moderating Role of Cultural Intelligence in Intercultural Interactions
Authors: Catherine Peyrols Wu
Abstract:
Intercultural interactions are becoming increasingly common in organizations and in life. These interactions are often a stage for miscommunication and conflict. In management research, these problems are commonly attributed to cultural differences in values and interactional norms; as a result, the notion that intercultural competence can minimize these challenges is widely accepted. Cultural differences, however, are not the only source of challenge during intercultural interactions. The need to rely on a lingua franca, a common language between people who have different mother tongues, is another important one. In theory, a lingua franca can improve communication and ease coordination. In practice, however, disparities in people's ability and confidence to communicate in the language can exacerbate tensions and generate inefficiencies. In this study, we draw on power theory to develop a model of disparities in language competence and conflict in a multicultural work context. Specifically, we hypothesized that differences in language competence between interaction partners would be positively related to conflict, such that people would report greater conflict with partners whose levels of language competence are more dissimilar to their own and lesser conflict with partners whose levels are more similar. Furthermore, we proposed that cultural intelligence (CQ), an intercultural competence that denotes an individual's capability to be effective in intercultural situations, would weaken the relationship between disparities in language competence and conflict, such that people would report less conflict with partners of dissimilar language competence when the partner is high in CQ and more conflict when the partner is low in CQ. We tested this model with a sample of 135 undergraduate students working in multicultural teams for 13 weeks, using a round-robin design to examine conflict in 646 dyads nested within 21 teams.
Results of analyses using social relations modeling supported our hypotheses. Specifically, we found that in intercultural dyads with large disparities in language competence, the partner with the lower level of language competence reported higher levels of interpersonal conflict. However, this relationship disappeared when the partner with higher language competence was also high in CQ. These findings suggest that communication in a lingua franca can be a source of conflict in intercultural collaboration when partners differ in their level of language competence, and that CQ can alleviate these effects during collaboration with partners who have relatively lower levels of language competence. Theoretically, this study underscores the benefits of CQ as a complement to language competence for intercultural effectiveness. Practically, these results further attest to the benefits of investing resources to develop both language competence and CQ in employees engaged in multicultural work.

Keywords: cultural intelligence, intercultural interactions, language competence, multicultural teamwork
Procedia PDF Downloads 165

3142 A Case Study on Experiences of Clinical Preceptors in the Undergraduate Nursing Program
Authors: Jacqueline M. Dias, Amina A Khowaja
Abstract:
Clinical education is one of the most important components of a nursing curriculum, as it develops students' cognitive, psychomotor and affective skills and ensures the integration of knowledge into practice. As student numbers in nursing increase and faculty shortages persist, clinical preceptors are the best choice for ensuring student learning in clinical settings. In Pakistan, the clinical preceptor role was introduced into the undergraduate nursing programme in response to a faculty shortage; initially, two clinical preceptors were hired. This study explores clinical preceptors' views and experiences of precepting Bachelor of Science in Nursing (BScN) students in an undergraduate program. A case study design was used. As case studies explore a single unit of study, such as a person or a very small number of subjects, the two clinical preceptors were fundamental to the study and served as a single case. Qualitative data were obtained through an iterative process using in-depth interviews and written accounts from reflective journals kept by the clinical preceptors. The findings revealed that the clinical preceptors were dedicated to their roles and responsibilities. Another key finding was that the clinical preceptors' prior knowledge and clinical experience were valuable assets for performing their role effectively. The clinical preceptors found their new role innovative and challenging, but stressful at the same time. Findings also revealed unclear expectations and role ambiguity in the clinical agencies. Furthermore, the clinical preceptors had difficulty integrating theory into practice in the clinical area and difficulty giving feedback to the students. Although this study is localized to one university, generalizations can be drawn from the results. The key findings indicate that the role of a clinical preceptor is demanding and stressful.
Clinical preceptors need preparation prior to precepting students on clinicals. Also, institutional support is fundamental for their acceptance. This paper focuses on the views and experiences of clinical preceptors undertaking a newly established role and resonates with the literature. The following recommendations are drawn to strengthen the role of the clinical preceptors: A structured program for clinical preceptors is needed along with mentorship. Clinical preceptors should be provided with formal training in teaching and learning with emphasis on clinical teaching and giving feedback to students. Additionally, for improving integration of theory into practice, clinical modules should be provided ahead of the clinical. In spite of all the challenges, ten more clinical preceptors have been hired as the faculty shortage continues to persist.Keywords: baccalaureate nursing education, clinical education, clinical preceptors, nursing curriculum
Procedia PDF Downloads 174
3141 Applying the Quad Model to Estimate the Implicit Self-Esteem of Patients with Depressive Disorders: Comparing the Psychometric Properties with the Implicit Association Test Effect
Authors: Yi-Tung Lin
Abstract:
Researchers commonly assess implicit self-esteem with the Implicit Association Test (IAT). The IAT’s measure, often referred to as the IAT effect, indicates the strength of automatic preferences for the self relative to others, which is often considered an index of implicit self-esteem. However, based on dual-process theory, the IAT does not rely entirely on the automatic process; it is also influenced by a controlled process. The present study, therefore, analyzed the IAT data with the Quad model, separating four processes underlying IAT performance: the likelihood that an automatic association is activated by the stimulus in the trial (AC); that a correct response is discriminated in the trial (D); that the automatic bias is overcome in favor of a deliberate response (OB); and that, when the association is not activated and the individual fails to discriminate a correct answer, a guessing or response bias drives the response (G). The AC and G processes are automatic, while the D and OB processes are controlled. The AC parameter is considered the strength of the association activated by the stimulus, which reflects what implicit measures of social cognition aim to assess. The stronger the automatic association between the self and positive valence, the more likely it will be activated by a relevant stimulus. Therefore, the AC parameter was used as the index of implicit self-esteem in the present study. Meanwhile, the relationship between implicit self-esteem and depression has not been fully investigated. The cognitive theory of depression assumes that the negative self-schema is crucial in depression. From this point of view, implicit self-esteem would be negatively associated with depression. However, the results among empirical studies are inconsistent.
The aims of the present study were to examine the psychometric properties of the AC parameter (i.e., its test-retest reliability and its correlations with explicit self-esteem and depression) and compare them with those of the IAT effect. In the present study, 105 patients with depressive disorders completed the Rosenberg Self-Esteem Scale, the Beck Depression Inventory-II and the IAT at pretest. After at least 3 weeks, the participants completed the second IAT. The data were analyzed with the latent-trait multinomial processing tree model (latent-trait MPT) using the TreeBUGS package in R. The results showed that the latent-trait MPT had a satisfactory model fit. The effect sizes of the test-retest reliability of the AC parameter and the IAT effect were medium (r = .43, p < .0001) and small (r = .29, p < .01), respectively. Only the AC parameter showed a significant correlation with explicit self-esteem (r = .19, p < .05). Neither of the two indexes was correlated with depression. Collectively, the AC parameter was a more satisfactory index of implicit self-esteem than the IAT effect. The present study also supports previous findings that implicit self-esteem is not correlated with depression.
Keywords: cognitive modeling, implicit association test, implicit self-esteem, quad model
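The four Quad model processes named in the abstract (AC, D, OB, G) combine multiplicatively along the branches of a processing tree. A minimal sketch of the standard tree equations follows; the parameter values are purely hypothetical illustrations, not the study's estimates.

```python
# Sketch of the Quad model's processing-tree equations, using the four
# parameters from the abstract: AC (association activation), D (detection/
# discrimination), OB (overcoming bias), G (guessing bias).
# All parameter values below are hypothetical, for illustration only.

def p_correct_compatible(ac, d, g):
    # Trial where the activated association and the correct response agree:
    # the association drives a correct answer, or detection succeeds,
    # or (association inactive, detection failed) a lucky guess is made.
    return ac + (1 - ac) * d + (1 - ac) * (1 - d) * g

def p_correct_incompatible(ac, d, ob, g):
    # Trial where the association conflicts with the correct response:
    # the bias must be overcome (AC * D * OB), or the association is
    # inactive and detection or guessing yields the right answer.
    return ac * d * ob + (1 - ac) * d + (1 - ac) * (1 - d) * g

ac, d, ob, g = 0.5, 0.8, 0.7, 0.5
print(p_correct_compatible(ac, d, g))        # 0.95: high accuracy when bias helps
print(p_correct_incompatible(ac, d, ob, g))  # 0.73: lower when bias must be overcome
```

Fitting the model means finding the parameter values whose predicted accuracies best match the observed trial counts, which is what the latent-trait MPT estimation in TreeBUGS does hierarchically across participants.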
Procedia PDF Downloads 127
3140 Variation in N₂ Fixation and N Contribution by 30 Groundnut (Arachis hypogaea L.) Varieties Grown in Blesbokfontein Mpumalanga Province, South Africa
Authors: Titus Y. Ngmenzuma, Cherian. Mathews, Feilx D. Dakora
Abstract:
In Africa, poor nutrient availability, particularly of N and P, coupled with low soil moisture due to erratic rainfall, constitutes the major crop production constraint. Although inorganic fertilizers are an option for meeting crop nutrient requirements for increased grain yield, the high cost and scarcity of inorganic inputs make them inaccessible to resource-poor farmers in Africa. Because crops grown on such nutrient-poor soils are micronutrient-deficient, incorporating N₂-fixing legumes into cropping systems can sustainably improve crop yield and nutrient accumulation in the grain. In Africa, groundnut can easily form an effective symbiosis with native soil rhizobia, leading to a marked N contribution in cropping systems. In this study, field experiments were conducted at Blesbokfontein in Mpumalanga Province to assess N₂ fixation and N contribution by 30 groundnut varieties during the 2018/2019 planting season using the ¹⁵N natural abundance technique. The results revealed marked differences in shoot dry matter yield, symbiotic N contribution, soil N uptake and grain yield among the groundnut varieties. The percent N derived from fixation ranged from 37 to 44% for varieties ICGV131051 and ICGV13984. The amount of N-fixed ranged from 21 to 58 kg/ha for varieties Chinese and IS-07273, soil N uptake from 31 to 80 kg/ha for varieties IS-07947 and IS-07273, and grain yield from 193 to 393 kg/ha for varieties ICGV15033 and ICGV131096, respectively. Compared to earlier studies on groundnut in South Africa, this study has shown low N₂ fixation and N contribution to the cropping systems, possibly due to environmental factors such as low soil moisture.
Because the groundnut varieties differed in their growth, symbiotic performance and grain yield, more field testing is required over a range of differing agro-ecologies to identify genotypes suitable for different cropping environments.
Keywords: ¹⁵N natural abundance, percent N derived from fixation, amount of N-fixed, grain yield
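The ¹⁵N natural abundance technique used above estimates the percent N derived from fixation (%Ndfa) from the δ¹⁵N of the legume, a non-fixing reference plant, and the B value; the amount of N-fixed then follows from shoot N content. A minimal sketch of these standard calculations, using illustrative numbers rather than the paper's measured data:

```python
# Sketch of the ¹⁵N natural abundance calculations. The δ¹⁵N values (in ‰),
# the B value, and the shoot data below are hypothetical illustrations,
# not the study's measurements.

def percent_ndfa(delta_ref, delta_legume, b_value):
    """Percent N derived from atmospheric fixation (%Ndfa)."""
    return 100.0 * (delta_ref - delta_legume) / (delta_ref - b_value)

def n_fixed(shoot_dm_kg_ha, shoot_n_percent, pndfa):
    """Amount of N-fixed (kg/ha) from shoot dry matter and shoot %N."""
    shoot_n = shoot_dm_kg_ha * shoot_n_percent / 100.0
    return shoot_n * pndfa / 100.0

pndfa = percent_ndfa(delta_ref=5.0, delta_legume=2.0, b_value=-1.5)
print(round(pndfa, 1))                      # %Ndfa for the example values
print(round(n_fixed(3000, 3.0, pndfa), 1))  # kg N/ha fixed for a 3 t/ha shoot crop
```

The B value (δ¹⁵N of the legume when fully dependent on fixation) is species-specific and taken from calibration studies, which is why soil N uptake can be partitioned from fixed N per variety.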
Procedia PDF Downloads 188
3139 Simulation of Scaled Model of Tall Multistory Structure: Raft Foundation for Experimental and Numerical Dynamic Studies
Authors: Omar Qaftan
Abstract:
Earthquakes can cause tremendous loss of human life and severe damage to several types of civil engineering structures, especially tall buildings. The response of a multistory structure subjected to earthquake loading is a complex problem, and it needs to be studied by physical and numerical modelling. In many circumstances, scale models on a shaking table may be a more economical option than similar full-scale tests. A shaking table apparatus is a powerful tool that offers the possibility of understanding the actual behaviour of structural systems under earthquake loading. A set of scaling relations is required to predict the behaviour of the full-scale structure. Selecting the scale factors is the most important step in the simulation of the prototype by the scaled model. In this paper, the principles of the scale modelling procedure are explained in detail, and the simulation of a scaled multi-storey concrete structure for dynamic studies is investigated. A procedure for a complete dynamic simulation analysis is investigated experimentally and numerically with a scale factor of 1/50. The frequency-domain response and lateral displacements for both the numerical and experimental scaled models are determined. The procedure allows accounting for the actual dynamic behaviour of the full-size prototype structure and the scaled model. The procedure is adapted to determine the effects of a tall multi-storey structure on a raft foundation. Four generated accelerograms complying with EC8 were used as inputs for the time history motions. The output results of the experimental work, expressed in terms of displacements and accelerations, are compared with those obtained from a conventional fixed-base numerical model. Four time histories were applied in both the experimental and numerical models, and the experimental model showed acceptable output accuracy compared with the numerical model.
Therefore, this modelling methodology is valid and qualified for different shaking table experiments.
Keywords: structure, raft, soil, interaction
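For a gravity-dominated shaking table test like this one, the scale factors for time, frequency and velocity follow from the geometric scale by standard similitude (Froude) relations. A short sketch for the paper's 1/50 factor, under the usual assumption that acceleration is not scaled (gravity acts equally on model and prototype); these are the generic dimensional-analysis results, not factors reported by the paper:

```python
import math

# Derived scale factors for a gravity-consistent (Froude-scaled) shaking
# table model with geometric scale lambda = model / prototype.

def froude_scale_factors(length_scale):
    return {
        "length": length_scale,
        "acceleration": 1.0,                        # gravity cannot be scaled
        "time": math.sqrt(length_scale),            # from a ~ L / t^2 with a = 1
        "frequency": 1.0 / math.sqrt(length_scale),
        "velocity": math.sqrt(length_scale),
    }

factors = froude_scale_factors(1 / 50)
# e.g. a 2 Hz prototype mode appears at 2 * factors["frequency"] ≈ 14.1 Hz
# in the model, so input accelerograms must be time-compressed accordingly.
print(factors["time"], factors["frequency"])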
Procedia PDF Downloads 136
3138 A Long Tail Study of eWOM Communities
Authors: M. Olmedilla, M. R. Martinez-Torres, S. L. Toral
Abstract:
Electronic Word-Of-Mouth (eWOM) communities today represent an important source of information on which more and more customers base their purchasing decisions. They include thousands of reviews concerning very different products and services, posted by many individuals geographically distributed all over the world. Due to their massive audience, eWOM communities can help users find the products they are looking for even if those products are less popular or rare. This is known as the long tail effect, which leads to a larger number of lower-selling niche products. This paper analyzes the long tail effect in a well-known eWOM community and defines a tool for finding niche products unavailable through conventional channels.
Keywords: eWOM, online user reviews, long tail theory, product categorization, social network analysis
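The long tail effect described above can be illustrated with a toy Zipf-like popularity distribution: even though each niche product individually attracts few reviews, the tail collectively accounts for a non-negligible share of all activity. The 1/rank form and the 20% head cutoff are illustrative assumptions, not the community's actual data:

```python
# Toy illustration of the long tail: product popularity ~ 1/rank (Zipf-like),
# and we measure what share of all reviews falls on the products outside the
# most popular head fraction. Distribution and cutoff are assumptions.

def tail_share(n_products, head_fraction=0.2):
    counts = [1.0 / rank for rank in range(1, n_products + 1)]
    head_n = int(n_products * head_fraction)
    total = sum(counts)
    return sum(counts[head_n:]) / total

share = tail_share(1000)
print(round(share, 2))  # share of reviews on the 80% least popular products
```

With a steeper exponent the head dominates more; empirical eWOM studies estimate the exponent from the observed review counts per product.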
Procedia PDF Downloads 422
3137 A Syntactic Approach to Applied and Socio-Linguistics in Arabic Language in Modern Communications
Authors: Adeyemo Abduljeeel Taiwo
Abstract:
This research attempts to create a phonological and morphological compendium of Arabic in Modern Standard Arabic (MSA) for modern-day communications. The research is carried out with the chief aim of a grammatical analysis of two broad fields of Arabic linguistics, namely applied and socio-linguistics. It draws a pictorial record of applied and socio-linguistics in Arabic phonology and morphology. Thematically, it postulates and contemplates, to a large degree, the theory of concord in contemporary modern Arabic language acquisition. It utilizes an analytical method while portraying Arabic as a Semitic language that promotes linguistics and syntax among scholars of these fields.
Keywords: Arabic language, applied linguistics, socio-linguistics, modern communications
Procedia PDF Downloads 332
3136 Autophagy Suppresses Bladder Tumor Formation in a Mouse Orthotopic Bladder Tumor Formation Model
Authors: Wan-Ting Kuo, Yi-Wen Liu, Hsiao-Sheng Liu
Abstract:
The annual incidence of bladder cancer is increasing worldwide, and the disease occurs more frequently in males. The most common type is transitional cell carcinoma (TCC), which is treated by transurethral resection followed by intravesical administration of agents. In the clinical treatment of bladder cancer, chemotherapeutic drug-induced apoptosis is commonly used. However, cancers usually develop resistance to chemotherapeutic drugs, which often leads to aggressive tumors with worse clinical outcomes. Approximately 70% of TCCs recur, and 30% of recurrent tumors progress to high-grade invasive tumors, indicating that new therapeutic agents are urgently needed to improve the success rate of overall treatment. Nonapoptotic programmed cell death may help overcome these worse clinical outcomes. Autophagy, one of the nonapoptotic pathways, provides another option for bladder cancer patients and has been reported as a potent anticancer therapy in some cancers. First, we established a mouse orthotopic bladder tumor formation model in order to create a tumor microenvironment similar to the human disease. The IVIS system and micro-ultrasound were utilized to noninvasively monitor tumor formation. In addition, we carried out intravesical treatment in our animal model to be consistent with human clinical treatment. We performed intravesical instillation of an autophagy inducer in the mouse orthotopic bladder tumor and observed tumor formation with the noninvasive IVIS system and micro-ultrasound. Our results showed that bladder tumor formation is suppressed by the autophagy inducer, with no significant side effects on the physiology of the mice. Furthermore, upregulation of autophagy by the inducer in the bladder tissues of the treated mice was confirmed by Western blot, immunohistochemistry, and immunofluorescence.
In conclusion, we reveal that a novel autophagy inducer with low side effects suppresses bladder tumor formation in our mouse orthotopic bladder tumor model, providing another therapeutic approach for bladder cancer patients.
Keywords: bladder cancer, transitional cell carcinoma, orthotopic bladder tumor formation model, autophagy
Procedia PDF Downloads 177
3135 Investigating the Invalidity of the Law of Energy Conservation Based on Waves Interference Phenomenon Inside a Ringed Waveguide
Authors: M. Yusefzad
Abstract:
The law of energy conservation is one of the fundamental laws of physics: energy is conserved, and the total amount of energy is constant. It can be transferred from one object to another and changed from one state to another. However, in the case of wave interference, this law faces important contradictions. Based on the mathematical relationship presented in this paper, it seems that the validity of this law depends on the path along which the energy wave, such as light, propagates. In this paper, by using some fundamental concepts of physics, such as the constancy of the electromagnetic wave speed in a specific medium and the wave theory of light, it will be shown that the law of energy conservation is not valid in every condition and that, in some circumstances, it is possible to increase the energy of a system with a determined amount of energy without any input.
Keywords: power, law of energy conservation, electromagnetic wave, interference, Maxwell’s equations
Procedia PDF Downloads 265
3134 Structural Stress of Hegemon’s Power Loss: A Pestle Analysis for Pacification and Security Policy Plan
Authors: Sehrish Qayyum
Abstract:
Active military power contention is shifting to economic war and cyberwar as means of retaining hegemony. An attuned Pestle analysis confirms that the structural stress of the hegemon’s power loss drives a containment approach toward caging actions. Ongoing diplomatic, asymmetric, proxy and direct wars are increasing the stress of the hegemon’s power retention due to tangled military and economic alliances. This creates a condition of catalepsy with defective reflexive control, which affects core warfare operations. When one’s own power is doubted, that doubt can ruin all planning, even planning done with superlative cost-benefit analysis. A strategically calculated estimation of the hegemon’s power game across three chronological periods, from early WWI to WWII, from WWII to the Cold War, and from the Cold War to the current era, shows that the Thucydides trap became the reason wars broke out. Thirst for power is the demise of imagination and of the cooperation needed for better sense to prevail; instead, it drives ashes to dust. Pestle analysis is a wide array of evaluation, from the political and economic to the legal dimensions of state matters. It helps to develop the Pacification and Security Policy Plan (PSPP) to avoid the hegemon’s structural stress of power loss and, in turn, to create alliances with maximally amicable outputs. The PSPP may serve to regulate and pause the hurricane of power clashes. The PSPP, along with a strategic work plan, is based on Pestle analysis to deal with any conceivable war condition and is an approach to saving international peace. Getting tangled in self-imposed epistemic dilemmas results in regret becoming the only outcome of performance. This is a generic application of probability tests to find the best possible options and conditions to develop a PSPP for any adversity possible so far.
Innovation in expertise begets innovation in planning and action plans, serving as a rheostat approach to deal with any plausible power clash.
Keywords: alliance, hegemon, pestle analysis, pacification and security policy plan, security
Procedia PDF Downloads 106
3133 Label Survey in Romania: A Study on How Consumers Use Food Labeling
Authors: Gabriela Iordachescu, Mariana Cretu Stuparu, Mirela Praisler, Camelia Busila, Doina Voinescu, Camelia Vizireanu
Abstract:
The aim of the study was to evaluate consumers’ degree of confidence in food labeling and how they use and understand the label and the food labeling elements. The label is a bridge between producers, suppliers, and consumers. It has to offer enough information in terms of public health and food safety: statement of ingredients, nutritional information, warnings and advisory statements, production date and shelf-life, and instructions for storage and preparation (if required). The survey was conducted on a group of 500 consumers in Romania, aged 15+, male and female, from urban and rural areas and with different education levels. The questionnaire was distributed face to face and online. It had single- and multiple-choice questions and label images for efficiency and the best understanding of each question. Regulation 1169/2011, applied to food products from 13 December 2016, improved and adapted the requirements for labeling in a clear manner. The questions were divided into the following topics: interest and general trust in labeling, use and understanding of label elements, understanding of the ingredient list and safety information, nutrition information, advisory statements, serving sizes, best before/use by meanings, intelligent labeling, and demographic data. Three choice-selection exercises were also included, in which consumers had to choose between two similar products and evaluate which label element is most important in product choice. The data were analysed using MINITAB 17 and PCA analysis. Most of the respondents trust the food label, taking some elements into account especially when they buy a product for the first time. They usually check the sugar content and type of sugar and the saturated fat, and they use the mandatory label elements and the nutrition information panel. Consumers also pay attention to advisory statements, especially if one of the items is relevant to them or their family. Intelligent labeling is a challenging option.
In addition, the paper underlines that consumers are more careful and selective with their food consumption and that the label is their main aid in this.
Keywords: consumers, food safety information, labeling, labeling nutritional information
Procedia PDF Downloads 218
3132 Nanofocusing of Surface Plasmon Polaritons by Partially Metal-Coated Dielectric Conical Probe: Optimal Asymmetric Distance
Authors: Ngo Thi Thu, Kazuo Tanaka, Masahiro Tanaka, Dao Ngoc Chien
Abstract:
Nanometric superfocusing of the optical intensity of a convergent surface plasmon polariton wave near the tip of a partially metal-coated dielectric conical probe is investigated by the volume integral equation method. It is possible to perform nanofocusing with this probe using both linearly and radially polarized Gaussian beams as the incident waves. Strongly localized and enhanced optical near-fields can be created at the tip of the probe for both incident Gaussian beams. However, the intensity distribution near the probe tip was found to be very sensitive to the shape of the probe tip.
Keywords: waveguide, surface plasmons, electromagnetic theory
Procedia PDF Downloads 477
3131 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches
Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez
Abstract:
Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels; many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. Simulation methods are computationally intensive and time-consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics may be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first-order reliability method, FORM).
The results in the present study are in good agreement with those computed with the MCS. Therefore, the alternative of mixing reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or if numerical schemes, the FEM or computational mechanics are employed.
Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, Monte Carlo simulation
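The mixed PEM-plus-FORM idea can be sketched for a toy linear limit state function g = R − S (resistance minus load): Rosenblueth's point estimates supply the first two moments of g, a normal distribution is fitted to them, and the reliability index β and failure probability follow. The resistance and load statistics below are hypothetical stand-ins, not the bridge variables of the study:

```python
import math

# Rosenblueth's 2^n point estimate method: evaluate g at all combinations of
# mu_i +/- sigma_i with equal weights, then form the mean and standard
# deviation of g. A normal fit to these moments gives a FORM-style beta.
# Variable statistics are hypothetical illustrations.

def pem_moments(g, means, stds):
    n = len(means)
    m1 = m2 = 0.0
    for mask in range(2 ** n):
        x = [means[i] + (1 if (mask >> i) & 1 else -1) * stds[i] for i in range(n)]
        val = g(*x)
        m1 += val
        m2 += val ** 2
    m1 /= 2 ** n
    m2 /= 2 ** n
    return m1, math.sqrt(m2 - m1 ** 2)

g = lambda r, s: r - s                       # failure when g < 0
mu_g, sigma_g = pem_moments(g, means=[300.0, 150.0], stds=[30.0, 30.0])
beta = mu_g / sigma_g                        # reliability index under normal fit
pf = 0.5 * math.erfc(beta / math.sqrt(2))    # Phi(-beta)
print(round(beta, 2), pf)
```

In the paper's setting, g would be evaluated by the FEM run at each PEM point instead of a closed-form expression, which is exactly why only a few deterministic analyses are needed rather than thousands of Monte Carlo samples.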
Procedia PDF Downloads 346
3130 The Determinants and Effects of R&D Outsourcing in Korean Manufacturing Firm
Authors: Sangyun Han, Minki Kim
Abstract:
R&D outsourcing is a strategy for acquiring firm competitiveness, in the form of an open innovation strategy. As firms’ total R&D investment has increased, the share of R&D outsourcing in it has also increased in Korea. In this paper, we investigate the determinants and effects of firms’ R&D outsourcing. By analyzing the determinants of R&D outsourcing and its effect on firm performance, we can address some academic and policy issues. First, from an academic point of view, identifying the determinants of R&D outsourcing is linked to why firms pursue open innovation; this can be explained by the resource-based view, core competence theory, and related frameworks. Second, we can derive science and technology policy implications for transferring public intellectual property to the private sector. In particular, for supporting more SMEs and ventures, the government can establish the basis for, and the reasons why and how, to design such policies.
Keywords: determinants, effects, R&D, outsourcing
Procedia PDF Downloads 506
3129 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol
Authors: Inkyu Kim, SangMan Moon
Abstract:
The IEEE 802.11b protocol provides a data rate of up to 11 Mbps, whereas the aerospace industry seeks higher-data-rate COTS data link systems for UAVs. The total maximum throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the number of UAVs in formation flight is bounded by the performance limitations of the data link protocol.
Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application
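The theoretical maximum throughput of 802.11b for a single DCF sender (no RTS/CTS, no losses) can be sketched from the nominal timing constants, in the spirit of the classic TMT analyses: every payload costs a DIFS, a mean backoff, the PLCP preamble/header, the MAC frame at 11 Mbps, a SIFS and an ACK. The overhead choices below (long preamble, 1500-byte payload, 28-byte MAC overhead, ACK at 2 Mbps) are common illustrative assumptions, not the paper's exact parameters:

```python
# Sketch of the 802.11b theoretical maximum throughput (TMT) for one sender
# under DCF, basic access, no errors. Timing constants are nominal 802.11b
# values with the long PLCP preamble; frame sizes are illustrative assumptions.

SLOT, SIFS, DIFS = 20e-6, 10e-6, 50e-6     # seconds
PLCP = 192e-6                               # long preamble + PLCP header (1 Mbps)
CW_MIN = 31                                 # mean backoff = CW_MIN / 2 slots

def tmt_bps(payload_bytes=1500, data_rate=11e6, ack_rate=2e6,
            mac_overhead_bytes=28, ack_bytes=14):
    t_data = PLCP + 8 * (payload_bytes + mac_overhead_bytes) / data_rate
    t_ack = PLCP + 8 * ack_bytes / ack_rate
    backoff = CW_MIN / 2 * SLOT
    cycle = DIFS + backoff + t_data + SIFS + t_ack
    return 8 * payload_bytes / cycle

print(tmt_bps() / 1e6)  # well below the 11 Mbps nominal rate (roughly 6 Mbps)
```

This overhead-dominated ceiling, shared among all contending UAVs in the formation, is why the achievable per-UAV rate shrinks as the number of quadcopters grows.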
Procedia PDF Downloads 392