Search results for: superficial temporal artery
893 Detection of Atrial Fibrillation Using Wearables via Attentional Two-Stream Heterogeneous Networks
Authors: Huawei Bai, Jianguo Yao, Fellow, IEEE
Abstract:
Atrial fibrillation (AF) is the most common form of heart arrhythmia and is closely associated with mortality and morbidity in heart failure, stroke, and coronary artery disease. The development of single-spot optical sensors enables widespread photoplethysmography (PPG) screening, especially for AF, since it represents a more convenient and noninvasive approach. To our knowledge, most existing studies, based on public and unbalanced datasets, can barely handle the multiple noise sources in the real world and also lack interpretability. In this paper, we construct a large-scale PPG dataset using measurements collected from PPG wristwatch devices worn by volunteers and propose an attention-based two-stream heterogeneous neural network (TSHNN). The first stream is a hybrid neural network consisting of a three-layer one-dimensional convolutional neural network (1D-CNN) and a two-layer attention-based bidirectional long short-term memory (Bi-LSTM) network to learn representations from temporally sampled signals. The second stream extracts latent representations from the PPG time-frequency spectrogram using a five-layer CNN. The outputs from both streams are fed into a fusion layer to produce the outcome. Visualization of the learned attention weights demonstrates the effectiveness of the attention mechanism against noise. The experimental results show that the TSHNN outperforms all the competitive baseline approaches and, with 98.09% accuracy, achieves state-of-the-art performance.
Keywords: PPG wearables, atrial fibrillation, feature fusion, attention mechanism, hybrid network
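For readers who want a concrete picture of the two-stream idea described above, the sketch below is a minimal PyTorch model, not the authors' TSHNN: the layer sizes, the 1920-sample window, the 64x64 spectrogram, and the fusion layer are placeholder assumptions.

```python
# Minimal two-stream PPG classifier sketch (illustrative only, not the TSHNN).
import torch
import torch.nn as nn

class TwoStreamSketch(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        # Stream 1: 1D-CNN front end + bidirectional LSTM over time
        self.cnn1d = nn.Sequential(
            nn.Conv1d(1, 16, 7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.bilstm = nn.LSTM(64, 64, num_layers=2,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(128, 1)           # additive attention weights
        # Stream 2: 2D CNN over the time-frequency spectrogram
        self.cnn2d = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fusion = nn.Linear(128 + 32, n_classes)

    def forward(self, ppg, spec):
        h = self.cnn1d(ppg).transpose(1, 2)      # (B, T, 64)
        h, _ = self.bilstm(h)                    # (B, T, 128)
        w = torch.softmax(self.attn(h), dim=1)   # attention over time steps
        s1 = (w * h).sum(dim=1)                  # attended temporal summary
        s2 = self.cnn2d(spec).flatten(1)         # spectrogram summary
        return self.fusion(torch.cat([s1, s2], dim=1))

model = TwoStreamSketch()
logits = model(torch.randn(4, 1, 1920), torch.randn(4, 1, 64, 64))
```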
Procedia PDF Downloads 121
892 A Comparative Analysis of (De)legitimation Strategies in Selected African Inaugural Speeches
Authors: Lily Chimuanya, Ehioghae Esther
Abstract:
Language, a versatile and sophisticated tool, is fundamentally sacrosanct to mankind, especially within the realm of politics. In this dynamic world, political leaders adroitly use language in a strategic performance aimed at manipulating or mechanising the opinion of discerning people. This nuanced interplay is marked by different rhetorical strategies, meticulously aligned with cultural, ideological, and political contextual factors to achieve multifaceted persuasive objectives. This study investigates the (de)legitimation strategies inherent in African presidential inaugural speeches, as African leaders not only state their policy agenda through inaugural speeches but also subtly engage in a dance of legitimation and delegitimation, pursuing the twofold objective of strengthening the credibility of their administration and, at times, undermining the performance of the past administration. Drawing insights from two different legitimation models and a dataset of four African presidential inaugural speeches obtained from authentic websites, the study describes the roles of authorisation, rationalisation, moral evaluation, altruism, and mythopoesis in unmasking the structure of political discourse. The analysis takes a mixed-method approach to unpack the (de)legitimation strategies embedded in the carefully chosen speeches. The focus extends beyond a superficial exploration and delves into the linguistic elements that form the basis of presidential discourse. In conclusion, the examination highlights the nuanced use of language as a potent tool in politics, with each strategy contributing to the overall rhetorical impact and shaping the narrative. From this perspective, the study argues that presidential inaugural speeches are not only linguistic exercises but also viable weapons that influence perceptions and legitimise authority.
Keywords: CDA, legitimation, inaugural speeches, delegitimation
Procedia PDF Downloads 69
891 Application of MRI in Radioembolization Imaging and Dosimetry
Authors: Salehi Zahabi Saleh, Rajabi Hosaien, Rasaneh Samira
Abstract:
Yttrium-90 (90Y) radioembolisation (RE) is increasingly used for the treatment of patients with unresectable primary or metastatic liver tumours. Image-based approaches to assess microsphere distribution after RE have gained interest but are mostly hampered by the limited imaging possibilities of the isotope 90Y. Quantitative 90Y-SPECT imaging has limited spatial resolution because it is based on 90Y bremsstrahlung, whereas 90Y-PET has better spatial resolution but low sensitivity. As a consequence, new alternative methods of visualizing the microspheres have been investigated, such as MR imaging of iron-labelled microspheres. It has also been shown that MRI combines high sensitivity with high spatial and temporal resolution and with superior soft-tissue contrast, and thus can be used to cover a broad range of clinically interesting imaging parameters. The aim of the study in this article was to investigate the capability of MRI to measure the intrahepatic microsphere distribution in order to quantify the absorbed radiation dose in RE.
Keywords: radioembolisation, MRI, imaging, dosimetry
Procedia PDF Downloads 320
890 OpenMP Parallelization of Three-Dimensional Magnetohydrodynamic Code FOI-PERFECT
Authors: Jiao F. Huang, Shi Chen, Shu C. Duan, Gang H. Wang
Abstract:
Due to its complex spatial structure as well as its dynamic temporal evolution, an analytic solution of an X-pinch process is out of the question, and numerical simulation becomes an important tool in X-pinch studies. Intrinsically, simulations of X-pinch are three-dimensional (3D) because of the specific structure of its load. Furthermore, in order to resolve both its μm scales and ns durations, fine spatial mesh grids and short time steps are usually adopted. The resulting large computational scales make the parallelization of codes a vital problem to be solved if any practical simulations are to be carried out. In this work, we report the OpenMP parallelization of our 3D magnetohydrodynamic (MHD) code FOI-PERFECT. Results of test runs confirm that computational efficiency has been improved after parallelization, and both the sequential and parallel versions give the same physical results under the same initial conditions.
Keywords: MHD simulation, OpenMP, parallelization, X-pinch
Procedia PDF Downloads 340
889 Free Secondary Education in Tanzania: Prospects, Challenges, and Proposals
Authors: Yazidu Saidi Mbalamula
Abstract:
Free Basic Education (FBE) policy implementation in secondary schools has been a thrilling undertaking for both the government and households in Tanzania. On the one hand, the government has gained the citizenry's acceptance of its responsibility and accountability; on the other hand, households have been relieved of social costs that were unbearable and deprived many Tanzanians of access to basic education, and to secondary education in particular. Specifically, this study presents a descriptive survey conducted in two districts of the Kagera region, located in the northern part of Tanzania. Three objectives were pursued: to identify the achievements realized and the challenges in FBE implementation, and to explore stakeholders' proposals on how to improve FBE implementation. A sample of 91 respondents, including school managers, teachers, students, and parents, was involved in the study. Both questionnaires and interviews were used; the quantitative data were analyzed using the Statistical Package for the Social Sciences (SPSS), and content analysis was used to analyze the qualitative data. The results show that implementation of the free education policy in secondary schools has had a far-reaching positive impact on the improvement of school management and school attendance, reduced school drop-out, reduced conflicts between parents and school managers, and increased enrollment rates. Notwithstanding that, the political machinery remains instrumental in instigating policy reforms in the education sector. Nevertheless, the alienating interests of the politburo, often top-down and blanketed by superficial government readiness, can hardly sustain such a huge programme, given stakeholders' faltering awareness of the actual requirements and of the resources that must be unlocked to back up policy implementation. The study recommends further studies on stakeholders' conceptions of FBE and on the equity of financing of basic education in Tanzania.
Keywords: capitation grant, CCM, free basic education, Kagera, education policy
Procedia PDF Downloads 72
888 Complex Network Approach to International Trade of Fossil Fuel
Authors: Semanur Soyyigit Kaya, Ercan Eren
Abstract:
Energy has a prominent role in the development of nations. Countries which have energy resources also have strategic power in the international trade of energy, since energy is essential for all stages of production in the economy. Thus, it is important for countries to analyze the weaknesses and strengths of the system. On the other hand, it is commonly believed that international trade has complex network properties. Complex network analysis is a tool for studying complex systems with heterogeneous agents and the interactions between them. A complex network consists of nodes and the interactions between these nodes. The total properties that emerge as a result of these interactions are distinct from the sum of the small parts (more or less) in complex systems. Thus, standard approaches to international trade are too superficial to analyze such systems. Network analysis provides a new approach, treating international trade as a network in which countries constitute nodes and trade relations (exports or imports) constitute edges. It then becomes possible to analyze the international trade network in terms of indicators that are specific to complex systems, such as connectivity, clustering, assortativity/disassortativity, centrality, etc. In this study, the international trade of crude oil and coal, two types of fossil fuel, is analyzed from 2005 to 2014 via network analysis. First, the networks are analyzed in terms of topological parameters such as density, transitivity, and clustering. Afterwards, fitness to the Pareto distribution is analyzed. Finally, the weighted HITS algorithm is applied to the data as a centrality measure to determine the real prominence of countries in these trade networks. The weighted HITS algorithm is a strong tool for analyzing the network by ranking countries with regard to the prominence of their trade partners. We calculate both an export centrality and an import centrality by applying the w-HITS algorithm to the data.
Keywords: complex network approach, fossil fuel, international trade, network theory
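As an illustration of the weighted HITS (hub/authority) iteration mentioned above, the following NumPy sketch runs the power iteration on a small, entirely hypothetical trade matrix; it is not the study's dataset or code.

```python
# Weighted HITS on a directed, weighted trade network (illustrative sketch).
# W[i, j] = export volume from country i to country j (hypothetical values).
import numpy as np

W = np.array([[0.0, 5.0, 2.0],
              [1.0, 0.0, 4.0],
              [3.0, 0.5, 0.0]])       # 3 hypothetical countries

hub = np.ones(3)
auth = np.ones(3)
for _ in range(100):                  # power iteration
    auth = W.T @ hub                  # authority: weighted in-links (imports)
    hub = W @ auth                    # hub: weighted out-links (exports)
    auth /= np.linalg.norm(auth, 1)   # normalise to avoid overflow
    hub /= np.linalg.norm(hub, 1)

print("authority (import prominence):", np.round(auth, 3))
print("hub (export prominence):     ", np.round(hub, 3))
```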
Procedia PDF Downloads 336
887 Layer-By-Layer Deposition of Poly(Ethylene Imine) Nanolayers on Polypropylene Nonwoven Fabric: Electrostatic and Thermal Properties
Authors: Dawid Stawski, Silviya Halacheva, Dorota Zielińska
Abstract:
The surface properties of many materials can be readily and predictably modified by the controlled deposition of thin layers containing appropriate functional groups, and this research area is now a subject of widespread interest. The layer-by-layer (lbl) method involves depositing oppositely charged layers of polyelectrolytes onto the substrate material, which are stabilized by strong electrostatic forces between adjacent layers. This type of modification affords products that combine the properties of the original material with the superficial parameters of the new external layers. Through an appropriate selection of the deposited layers, the surface properties can be precisely controlled and readily adjusted in order to meet the requirements of the intended application. In the present paper, a variety of anionic (poly(acrylic acid)) and cationic (linear poly(ethylene imine)) polymers were successfully deposited onto the polypropylene nonwoven using the lbl technique. The chemical structure of the surface before and after modification was confirmed by reflectance FTIR spectroscopy, volumetric analysis, and selective dyeing tests. As a direct result of this work, new materials with greatly improved properties have been produced. For example, following the modification process, significant changes in the electrostatic activity of a range of novel nanocomposite materials were observed. The deposition of polyelectrolyte nanolayers was found to strongly accelerate the loss of electrostatically generated charges and to considerably increase the thermal resistance of the modified fabric (the difference in T50% is over 20°C). From our results, a clear relationship between the type of polyelectrolyte layer deposited onto the flat fabric surface and the properties of the modified fabric was identified.
Keywords: layer-by-layer technique, polypropylene nonwoven, surface modification, surface properties
Procedia PDF Downloads 435
886 Efficacy of Erector Spinae Plane Block for Postoperative Pain Management in Coronary Artery Bypass Graft Patients
Authors: Santosh Sharma Parajuli, Diwas Manandhar
Abstract:
Background: Perioperative pain management plays an integral part in the care of patients undergoing cardiac surgery. We studied the effect of Erector Spinae Plane (ESP) block on acute postoperative pain reduction and 24-hour opioid consumption in adult cardiac surgical patients. Methods: Twenty-five adult cardiac surgical patients who underwent cardiac surgery with sternotomy and in whom ESP catheters were placed preoperatively formed group E; another 25 patients who underwent cardiac surgery without an ESP catheter, with pain managed by conventional opioid injection, formed group C. Fentanyl was used for pain management. The primary study endpoint was to compare fentanyl consumption and to assess the numeric rating scale (NRS) in both groups during the first 24 postoperative hours. Results: Postoperative 24-hour fentanyl consumption was 43.00±51.29 micrograms in the ESP catheter group and 147.00±60.94 micrograms in the control group, a statistically significant difference (p < 0.001). The numeric rating scale was also significantly reduced in the ESP group compared to the control group in the first 24 hours postoperatively. Conclusion: ESP block is superior to the conventional opioid injection method for postoperative pain management in CABG patients. ESP block decreases not only overall opioid consumption but also the NRS score in these patients.
Keywords: erector, spinae, plane, numerical rating scale
Procedia PDF Downloads 66
885 X-Ray Detector Technology Optimization in CT Imaging
Authors: Aziz Ikhlef
Abstract:
Most multi-slice CT scanners are built with detectors composed of scintillator–photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for detectors under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the challenge of the higher number of runs and connections required by front-illuminated diodes. In backlit diodes, the electronic noise has already been improved because of the reduction of the load capacitance due to the reduced routing. This translates into better image quality in low-signal applications, improving low-dose imaging in large patient populations. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, significant concerns have been raised in both the medical and regulatory communities about the radiation dose received by the patient. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP), which suggests that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is trying to implement strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral CT or dual-energy CT, in which projection data at two different tube potentials are collected. One approach utilizes a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce the cross-contamination of signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples. In addition, this paper presents an overview of detector technologies and image-chain improvements that have been investigated in the last few years to improve the signal-to-noise ratio and the dose efficiency of CT scanners in regular examinations and in energy-discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We go through the properties of the post-patient collimation to improve the scatter-to-primary ratio; the scintillator material properties, such as light output, afterglow, primary speed, and crosstalk, to improve spectral imaging; the photodiode design characteristics; and the data acquisition system (DAS), to optimize for crosstalk, noise, and temporal/spatial resolution.
Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts
Procedia PDF Downloads 271
884 Teaching Critical Thinking in Post-Conflict Countries: The University of Liberia
Authors: Kamille Beye
Abstract:
Critical thinking is a topic that has been disputed in the field of education for decades, but many of the resulting debates have centered on strengthening critical thinking capabilities in the societies, workforces, and educational centers of the global North. In contrast, this paper provides an analysis of the teaching of critical thinking in Liberia, which has been ravaged by years of war and a recent Ebola outbreak. These crises have decimated the Liberian education sector, leading to a loss of the teaching capacities that are essential to providing critical thinking education. Until recently, critical thinking had no seat at the table when the future needs of the country were discussed by the government and non-governmental agencies. Now, the University of Liberia has a bold goal to become one of the top twenty universities in West Africa in the next seven years, which has led to a focus on teaching critical thinking skills to improve learning. This paper argues that critical thinking is essential not only to strengthening the Liberian education system but also to promoting peace amongst community members, yet it suggests that commitments to the teaching of critical thinking in Liberia have hitherto been overly superficial. Based on an initial scoping study, this paper examines the potential impacts of teaching critical thinking skills to undergraduate students in the William V. S. Tubman School of Education at the University of Liberia on the continued peacebuilding and reconstruction efforts of the country. The research contends that if critical thinking skills are taught, practiced, and continually utilized, teachers and students will have the ability to engage with information and negotiate challenges and solutions in ways that are beneficial to the communities in which they live. The research will use a variety of methods, including the California Critical Thinking Disposition Inventory. This research will demonstrate that critical thinking skills are not only needed for entering the workforce but are also necessary for negotiating and expressing the needs and desires of local communities in a peaceful way.
Keywords: critical thinking, higher education, Liberia, peacebuilding, post-conflict
Procedia PDF Downloads 135
883 Study of the Behavior of PM₁₀ and SO₂ in the Urban Atmosphere of Sfax: Influence of Anthropised Contributions and Special Meteorological Conditions, 2008
Authors: Allagui Mohamed
Abstract:
The study of the temporal variation of PM₁₀ and SO₂ in the area of Sfax during 2008 showed very significant fluctuations in concentrations, which depend on the emitting sources and the weather conditions. The study of the behavior of PM₁₀ and SO₂ during a sirocco episode revealed the determining influence of the Sahara, confirmed by a strong enrichment of the atmosphere with particulate matter. The analysis of a sea-breeze situation highlighted an increase in PM₁₀ concentrations in agreement with the increase in marine wind speed, particularly during the diurnal period, possibly testifying to the enrichment of the aerosol with a considerable maritime component. A winter anticyclonic situation, during which PM₁₀ accumulated to about 70 μg/m³, showed that such concentrations remained low by comparison with other studies, which report levels of about 300 μg/m³.
Keywords: PM₁₀, sea breeze, SO₂, sirocco, anticyclone
Procedia PDF Downloads 126
882 Robotic Lingulectomy for Primary Lung Cancer: A Video Presentation
Authors: Abraham J. Rizkalla, Joanne F. Irons, Christopher Q. Cao
Abstract:
Purpose: Lobectomy was considered the standard of care for early-stage non-small cell lung cancer (NSCLC) after the Lung Cancer Study Group trial demonstrated increased locoregional recurrence for sublobar resections. However, there has been heightened interest in segmentectomies for selected patients with peripheral lesions ≤2 cm, as investigated by the JCOG0802 and CALGB140503 trials. Minimally invasive robotic surgery facilitates segmentectomies with improved maneuverability and visualization of intersegmental planes using indocyanine green. We hereby present a patient who underwent robotic lingulectomy for an undiagnosed ground-glass opacity. Methodology: This video demonstrates a robotic portal lingulectomy using three 8 mm ports and a 12 mm port. Stereoscopic direct vision facilitated the identification of the lingular artery and vein, and intra-operative bronchoscopy was performed to confirm the lingular bronchus. The intersegmental plane was identified with indocyanine green and a near-infrared camera. Thorough lymph node sampling was performed in accordance with international standards. Results: The 18 mm lesion was successfully excised with clear margins to achieve R0 resection, with no evidence of malignancy in the 8 lymph nodes sampled. Histopathological examination revealed lepidic-predominant adenocarcinoma, pathological stage IA. Conclusion: This video presentation exemplifies the standard approach for robotic portal lingulectomy in appropriately selected patients.
Keywords: lung cancer, robotic segmentectomy, indocyanine green, lingulectomy
Procedia PDF Downloads 67
881 Overhead Lines Induced Transient Overvoltage Analysis Using Finite Difference Time Domain Method
Authors: Abdi Ammar, Ouazir Youcef, Laissaoui Abdelmalek
Abstract:
In this work, an approach based on transmission-line theory is presented. It is exploited for the calculation of the overvoltage created by direct impacts of lightning waves on a guard cable of an overhead high-voltage line. First, we show the theoretical developments leading to the propagation equation, its discretization by the finite-difference time-domain (FDTD) method, and the resulting linear algebraic equations, followed by the calculation of the linear parameters of the line. The second step consists of solving the transmission-line system of equations by the FDTD method. This enabled us to determine the spatio-temporal evolution of the induced overvoltage.
Keywords: lightning surge, transient overvoltage, eddy current, FDTD, electromagnetic compatibility, ground wire
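To make the FDTD step concrete, the sketch below applies a standard finite-difference time-domain update to the telegrapher's (transmission-line) equations; the line parameters, grid sizes, and Gaussian excitation are illustrative assumptions, not the values or the lightning-surge model used in the study.

```python
# 1-D FDTD update for the telegrapher's equations (illustrative sketch).
#   dV/dx = -R*I - L*dI/dt ,   dI/dx = -G*V - C*dV/dt
import numpy as np

R, L, G, C = 0.1, 0.25e-6, 1e-6, 100e-12   # per-metre parameters (placeholders)
nx, nt, dx = 400, 2000, 0.05
dt = 0.9 * dx * np.sqrt(L * C)             # time step below the Courant limit

V = np.zeros(nx)          # node voltages
I = np.zeros(nx - 1)      # branch currents, staggered half a cell
vmax = 0.0

for n in range(nt):
    # stand-in excitation: a Gaussian pulse injected at the first node
    V[0] = np.exp(-(((n - 50) * dt) / (15 * dt)) ** 2)
    # current update from the local voltage gradient
    I += -dt / L * ((V[1:] - V[:-1]) / dx + R * I)
    # voltage update from the local current divergence (interior nodes)
    V[1:-1] += -dt / C * ((I[1:] - I[:-1]) / dx + G * V[1:-1])
    vmax = max(vmax, V[1:].max())

print("peak voltage observed along the line:", round(vmax, 3))
```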
Procedia PDF Downloads 83
880 Emerging Virtual Linguistic Landscape Created by Members of Language Community in TikTok
Authors: Kai Zhu, Shanhua He, Yujiao Chang
Abstract:
This paper explores the virtual linguistic landscape of an emerging virtual language community on TikTok, a language community that realizes immediate and non-immediate communication without a precise spatio-temporal domain, a specific socio-cultural boundary, or an interpersonal network. This kind of language community generates a large number and various forms of virtual linguistic landscapes, for which we conducted a virtual ethnographic survey together with telephone interviews to collect data. We followed two language communities on TikTok for several months, first illustrating the composition of the two communities and some typical virtual linguistic landscapes in both, and then exploring why and how they are formed through the organization, transcription, and analysis of the interviews. Our analysis reveals the richness and diversity of the virtual linguistic landscape, and finally, we summarize some of the characteristics of this language community.
Keywords: virtual linguistic landscape, virtual language community, virtual ethnographic survey, TikTok
Procedia PDF Downloads 103
879 X-Ray Detector Technology Optimization in Computed Tomography
Authors: Aziz Ikhlef
Abstract:
Most multi-slice Computed Tomography (CT) scanners are built with detectors composed of scintillator–photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for detectors under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the challenge of the higher number of runs and connections required by front-illuminated diodes. In backlit diodes, the electronic noise has already been improved because of the reduction of the load capacitance due to the reduced routing. This translates into better image quality in low-signal applications, improving low-dose imaging in large patient populations. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, significant concerns have been raised in both the medical and regulatory communities about the radiation dose received by the patient. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP), which suggests that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is trying to implement strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral CT or dual-energy CT, in which projection data at two different tube potentials are collected. One approach utilizes a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce the cross-contamination of signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples. In addition, this paper presents an overview of detector technologies and image-chain improvements that have been investigated in the last few years to improve the signal-to-noise ratio and the dose efficiency of CT scanners in regular examinations and in energy-discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We go through the properties of the post-patient collimation to improve the scatter-to-primary ratio; the scintillator material properties, such as light output, afterglow, primary speed, and crosstalk, to improve spectral imaging; the photodiode design characteristics; and the data acquisition system (DAS), to optimize for crosstalk, noise, and temporal/spatial resolution.
Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts
Procedia PDF Downloads 194
878 Digital Transformation as the Subject of the Knowledge Model of the Discursive Space
Authors: Rafal Maciag
Abstract:
Due to the development of the current civilization, one must create suitable models of its pervasive, massive phenomena. Such a phenomenon is digital transformation, which has a substantial number of disciplined, methodical interpretations forming a diversified body of reflection. This reflection could be understood pragmatically as the current, temporally and locally differential state of knowledge. The model of the discursive space is proposed as a model for the analysis and description of this knowledge. Discursive space is understood as an autonomous multidimensional space in which separate discourses traverse specific trajectories, which can be presented in a multidimensional parallel coordinate system. Discursive space built on the world of facts preserves the complex character of that world. Digital transformation as a discursive space has a relativistic character, which means that it is created by the dynamic discourses and, at the same time, these discourses are molded by the shape of this space.
Keywords: complexity, digital transformation, discourse, discursive space, knowledge
Procedia PDF Downloads 192
877 Determining the Performance of Data Mining Algorithms in Determining the Influential Factors and Prediction of Ischemic Stroke: A Comparative Study in the Southeast of Iran
Authors: Y. Mehdipour, S. Ebrahimi, A. Jahanpour, F. Seyedzaei, B. Sabayan, A. Karimi, H. Amirifard
Abstract:
Ischemic stroke is one of the common causes of disability and mortality: it is the fourth leading cause of death in the world, and the third according to some other sources. Only 1/3 of patients with ischemic stroke fully recover, 1/3 are left with permanent disability, and 1/3 die. Thus, the use of predictive models has a vital role in reducing the complications and costs related to this disease. The aim of this study was therefore to specify the effective factors and predict ischemic stroke with the help of data mining (DM) methods. The present study was a descriptive-analytic study. The population was 213 cases from among patients referred to Ali ibn Abi Talib (AS) Hospital in Zahedan. The data collection tool was a checklist whose validity and reliability had been confirmed. This study used decision-tree DM algorithms for modeling. Data analysis was performed using SPSS 19 and SPSS Modeler 14.2. The comparison of algorithms showed that the CHAID algorithm, with 95.7% accuracy, has the best performance. Moreover, based on the model created, factors such as anemia, diabetes mellitus, hyperlipidemia, transient ischemic attacks, coronary artery disease, and atherosclerosis are the most effective factors in stroke. Decision-tree algorithms, especially the CHAID algorithm, have acceptable precision and predictive ability to determine the factors affecting ischemic stroke. Thus, creating predictive models through this algorithm will play a significant role in decreasing the mortality and disability caused by ischemic stroke.
Keywords: data mining, ischemic stroke, decision tree, Bayesian network
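For illustration, the sketch below trains a decision tree on synthetic binary risk-factor data and ranks feature importances. Note that scikit-learn implements CART rather than CHAID, so CART stands in here, and the data, outcome rule, and accuracy have nothing to do with the study's results.

```python
# Decision-tree sketch for ranking stroke risk factors (illustrative only).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
features = ["anemia", "diabetes", "hyperlipidemia", "TIA", "CAD", "atherosclerosis"]
X = rng.integers(0, 2, size=(213, len(features)))                    # binary risk factors
y = (X[:, 1] + X[:, 3] + X[:, 5] + rng.random(213) > 2).astype(int)  # toy outcome rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10).fit(X_tr, y_tr)

print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
for name, imp in sorted(zip(features, clf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:16s} importance = {imp:.3f}")
```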
Procedia PDF Downloads 174
876 Allium Cepa Extract Provides Neuroprotection Against Ischemia Reperfusion Induced Cognitive Dysfunction and Brain Damage in Mice
Authors: Jaspal Rana (Alkem Laboratories, Baddi, Himachal Pradesh, India; Chitkara University, Punjab, India)
Abstract:
Oxidative stress has been identified as an underlying cause of ischemia-reperfusion (IR) related cognitive dysfunction and brain damage. Therefore, antioxidant-based therapies to treat IR injury are being investigated. Allium cepa L. (onion) is used as a culinary medicine and is documented to have marked antioxidant effects. Hence, the present study was designed to evaluate the effect of A. cepa outer scale extract (ACE) against IR-induced cognitive and biochemical deficits in mice. ACE was prepared by maceration with 70% methanol and fractionated into ethyl acetate and aqueous fractions. Bilateral common carotid artery occlusion for 10 min followed by 24 h of reperfusion was used to induce cerebral IR injury. Following IR injury, ACE (100 and 200 mg/kg) was administered orally to the animals once daily for 7 days. Behavioral outcomes (memory and sensorimotor functions) were evaluated using the Morris water maze and the neurological severity score. Cerebral infarct size, brain thiobarbituric acid reactive species, reduced glutathione, and superoxide dismutase activity were also determined. Treatment with ACE significantly ameliorated the IR-mediated deterioration of memory and sensorimotor functions and the rise in brain oxidative stress. The results of the present investigation revealed that ACE improved functional outcomes after cerebral IR injury, which may be attributed to its antioxidant properties.
Keywords: stroke, neuroprotection, ischemia reperfusion, herbal drugs
Procedia PDF Downloads 106
875 Effect of Inspiratory Muscle Training on Diaphragmatic Strength Following Coronary Revascularization
Authors: Abeer Ahmed Abdelhamed
Abstract:
Introduction: Postoperative pulmonary complications (PPCs) are the most common complications observed and managed after abdominal or cardiothoracic surgery. Hypoxemia, atelectasis, pleural effusion, and diaphragmatic dysfunction are often a source of morbidity in cardiac surgery patients and are more common in patients receiving unilateral or bilateral internal mammary artery (IMA) grafts than in patients receiving saphenous vein (SV) grafts alone. Purpose: The aim of this work was to investigate the effect of threshold-load inspiratory muscle training on pulmonary gas exchange and maximum inspiratory pressure (MIP) in patients undergoing coronary revascularization. Subjects: Thirty-three male patients eligible for coronary revascularization were selected to participate in the study. Method: They were divided into two groups (17 patients in the intervention group and 16 patients in the control group); the intervention group received inspiratory muscle training at 30% of their maximum inspiratory pressure throughout the hospitalization period, in addition to routine postoperative care. Result: The results of this study showed a significant improvement in maximum inspiratory pressure (MIP), the alveolar-arterial pressure gradient (A-a gradient), and oxygen saturation in the intervention group. Conclusion: Inspiratory muscle training using a threshold mode significantly improves maximum inspiratory pressure, pulmonary gas exchange (assessed by the alveolar-arterial gradient), and oxygen saturation in patients undergoing coronary revascularization.
Keywords: coronary revascularization, inspiratory muscle training, maximum inspiratory pressure, pulmonary gas exchange
Procedia PDF Downloads 300
874 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions
Authors: Pirta Palola, Richard Bailey, Lisa Wedding
Abstract:
Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as, to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and the quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights into the study of spatio-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.
Keywords: economics of biodiversity, environmental valuation, natural capital, value function
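A minimal sketch of how an Environmental Value Function with parameter uncertainty might be encoded is given below; the logistic form, the coral-cover example, and all parameter values are illustrative assumptions rather than the framework's prescribed implementation.

```python
# Sketch of an Environmental Value Function (EVF) with Monte Carlo uncertainty.
# A logistic curve discriminates between "resilience retained" and "lost";
# the midpoint/steepness priors below are purely illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_draws = 10_000
midpoint = rng.normal(0.5, 0.05, n_draws)   # state at which value collapses
steepness = rng.normal(25.0, 5.0, n_draws)

def evf(state, k, x0):
    """Value of the asset (0..1) as a function of ecosystem state (0..1)."""
    return 1.0 / (1.0 + np.exp(-k * (state - x0)))

current, degraded = 0.70, 0.55              # hypothetical coral-cover fractions
delta = evf(degraded, steepness, midpoint) - evf(current, steepness, midpoint)

print("mean change in value:", round(delta.mean(), 3))
print("5th-95th percentile:", np.round(np.percentile(delta, [5, 95]), 3))
```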
Procedia PDF Downloads 194
873 Parallelization by Domain Decomposition for 1-D Sugarcane Equation with Message Passing Interface
Authors: Ewedafe Simon Uzezi
Abstract:
In this paper, we present a method based on Domain Decomposition (DD) for the parallelization of the 1-D Sugarcane Equation on a parallel platform, using master-slave parallel paradigms with the Message Passing Interface (MPI). The 1-D Sugarcane Equation was discretized using an explicit method requiring the evaluation of the temporal and spatial distribution of temperature. This platform gives better predictions of the effects of the temperature distribution in the sugarcane problem. This work presents the parallel overheads, with overlapping of communication and computation across parallel computers, and numerical results for different block sizes together with scalability. Performance improvement strategies from the DD on various mesh sizes were compared experimentally, and the parallel results show speedup and efficiency for the parallel algorithm design.
Keywords: sugarcane, parallelization, explicit method, domain decomposition, MPI
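The sketch below shows a generic 1-D domain decomposition with ghost-cell exchange in mpi4py for an explicit diffusion-type update; it is not the authors' sugarcane code, and the grid size, diffusivity, and boundary values are placeholders (the global grid is assumed divisible by the number of ranks).

```python
# Generic 1-D explicit finite-difference update with MPI domain decomposition.
# Run with, for example:  mpiexec -n 4 python dd_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_global, alpha, dt, dx = 400, 1.0e-6, 0.1, 0.01   # assumed parameters
n_local = n_global // size
u = np.full(n_local + 2, 25.0)          # local block plus two ghost cells
if rank == 0:
    u[1] = 100.0                        # hot boundary on the left subdomain

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(1000):
    # exchange ghost cells with neighbouring subdomains
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # explicit finite-difference update on interior points
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    if rank == 0:
        u[1] = 100.0                    # keep the boundary condition fixed

local_max = float(u[1:-1].max())
global_max = comm.reduce(local_max, op=MPI.MAX, root=0)
if rank == 0:
    print("max temperature after 1000 steps:", global_max)
```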
Procedia PDF Downloads 21
872 Poincare Plot for Heart Rate Variability
Authors: Mazhar B. Tayel, Eslam I. AlSaba
Abstract:
The heart is the most important organ in any organism. It affects, and is affected by, every factor in the body; therefore, it is a good detector of any disturbance in the body. Since the heart signal is a non-stationary signal, its variability should be studied. Heart Rate Variability (HRV) has thus attracted considerable attention in psychology and medicine and has become an important dependent measure in psychophysiology and behavioral medicine. Quantification and interpretation of heart rate variability, however, remain complex issues fraught with pitfalls. This paper presents one of the non-linear techniques used to analyze HRV. It discusses what the Poincaré plot is, how it works, the benefits of its use, especially in HRV analysis, the limitation of the Poincaré plot caused by the standard descriptors SD1 and SD2, and how to overcome this limitation by using the complex correlation measure (CCM). The CCM is the most sensitive to changes in the temporal structure of the Poincaré plot, as compared to SD1 and SD2.
Keywords: heart rate variability, chaotic system, poincare, variance, standard deviation, complex correlation measure
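As a quick illustration of the standard Poincaré descriptors discussed above, the sketch below computes SD1 and SD2 from a synthetic RR-interval series; the CCM, which additionally uses the temporal ordering of consecutive plot points, is not reproduced here.

```python
# Poincaré-plot descriptors of heart-rate variability (illustrative sketch).
# SD1/SD2 follow the standard definitions; the RR series below is synthetic.
import numpy as np

rng = np.random.default_rng(2)
rr = 800 + np.cumsum(rng.normal(0, 5, 500)) + rng.normal(0, 20, 500)  # RR in ms

x, y = rr[:-1], rr[1:]                  # Poincaré plot: RR(n) against RR(n+1)
diff = y - x
sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)                            # short-term
sd2 = np.sqrt(2 * np.var(rr, ddof=1) - np.var(diff, ddof=1) / 2.0)   # long-term

print(f"SD1 = {sd1:.1f} ms, SD2 = {sd2:.1f} ms, SD1/SD2 = {sd1 / sd2:.2f}")
```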
Procedia PDF Downloads 399
871 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength
Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos
Abstract:
Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance. In this way, statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasília, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables
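To illustrate how the choice of distribution for the friction angle propagates into the failure probability, the sketch below runs a Monte Carlo comparison on a deliberately simplified infinite-slope, cohesionless factor of safety; the slope angle and distribution parameters are assumptions, and a lognormal stands in for the skewed (Dagum) fit, so the numbers have nothing to do with the study's analytical results.

```python
# Monte Carlo sketch of slope failure probability with a random friction angle.
# Simplified infinite-slope, cohesionless factor of safety: FS = tan(phi)/tan(beta).
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
beta = np.radians(30.0)                                  # hypothetical slope angle

phi_normal = rng.normal(33.0, 3.0, n)                    # symmetric model [deg]
phi_skewed = 25.0 + rng.lognormal(np.log(8.0), 0.35, n)  # skewed stand-in [deg]

def prob_failure(phi_deg):
    fs = np.tan(np.radians(phi_deg)) / np.tan(beta)      # factor of safety
    return np.mean(fs < 1.0)                             # fraction of failures

print("P(failure), normal friction angle:", prob_failure(phi_normal))
print("P(failure), skewed friction angle:", prob_failure(phi_skewed))
```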
Procedia PDF Downloads 338
870 Temporal Variation of PM10-Bound Benzo(a)Pyrene Concentration in an Urban and a Rural Site of Northwestern Hungary
Authors: Zs. Csanádi, A. Szabó Nagy, J. Szabó, J. Erdős
Abstract:
The main objective of this study was to assess the annual concentration and seasonal variation of benzo(a)pyrene (BaP) associated with PM10 at an urban site in Győr and a rural site in Sarród over the sampling period 2008–2012. A total of 280 PM10 aerosol samples were collected at each sampling site and analyzed for BaP by a gas chromatography method. The BaP concentrations ranged from undetected to 8 ng/m³ with a mean value of 1.01 ng/m³ at the Győr site, and from undetected to 4.07 ng/m³ with a mean value of 0.52 ng/m³ at the Sarród site. Relatively higher concentrations of BaP were detected in samples collected at both sites in the heating seasons compared with the non-heating periods. The annual mean BaP concentrations were comparable with published data from other Hungarian sites.
Keywords: air quality, benzo(a)pyrene, PAHs, polycyclic aromatic hydrocarbons
Procedia PDF Downloads 392
869 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes
Authors: Nadarajah I. Ramesh
Abstract:
Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent developments on this topic and presents results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket-tip time series. In this context, the arrival pattern of rain gauge bucket-tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite-state irreducible Markov process X(t). Since the likelihood function of this process can be obtained by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip times to rainfall depths prior to fitting the models. One advantage of this approach is that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model
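As a toy analogue of the doubly stochastic Poisson process described above, the sketch below simulates bucket-tip times whose rate is modulated by a hidden two-state Markov chain; the rates and switching parameters are arbitrary assumptions, and no fitting (maximum likelihood or otherwise) is attempted.

```python
# Simulate a Markov-modulated (doubly stochastic) Poisson process of bucket tips.
# All rates below are illustrative assumptions, not fitted values.
import numpy as np

rng = np.random.default_rng(4)
rates = np.array([0.2, 6.0])          # tips per hour in hidden states 0 and 1
switch = np.array([0.05, 0.5])        # exit rates of the hidden states [1/h]

t, state, horizon, tips = 0.0, 0, 240.0, []
while t < horizon:
    dwell = rng.exponential(1.0 / switch[state])     # time spent in this state
    n = rng.poisson(rates[state] * dwell)            # tips during the dwell
    tips.extend(np.sort(t + rng.uniform(0.0, dwell, n)))
    t += dwell
    state = 1 - state                                # jump to the other state

tips = np.array(tips)
tips = tips[tips < horizon]
print(f"{tips.size} simulated bucket tips over {horizon:.0f} h")
```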
Procedia PDF Downloads 278
868 Compromised Sexual Territoriality under Reflexive Cosmopolitanism: From Coffee Bean to Gay Bean in South Korea
Authors: Robert Christopher Hamilton
Abstract:
This research examined the effects of reflexive cosmopolitanism on the competition for sexual territoriality. By adopting Michel de Certeau's (1984) spatial didactic model, the article maps out the key elements at play and the dynamics explaining how gays gay place against the backdrop of rapid modernization. It found that heterosexual space and heteronormative assumptions helped to create temporal and spatial opportunities that allow for the sexual performativity of gay males. Moreover, using data collected from multiple semi-controlled one-on-one interviews over 13 months, this article illustrates how spatial competition culminates in non-zero-sum game outcomes and, particularly, in the compromise of sexual territoriality, while further demonstrating the need to understand the sexual coping tactics used in cultures with similar backgrounds. The findings enable researchers to better understand how gay men gay space, and how space performatively embodies gay men.
Keywords: South Korea, coffee bean, sexual territoriality, reflexive cosmopolitanism
Procedia PDF Downloads 327
867 Complicated Corneal Ulceration in Cats: Clinical Diagnosis and Surgical Management of 80 Cases
Authors: Khaled M. Ali, Ayman A. Mostafa, Soliman M. Soliman
Abstract:
Objectives: To describe the most common clinical and endoscopic findings associated with complicated corneal ulcers in cats, and to determine the short-term outcomes after surgical treatment of these cats. Animals: Eighty client-owned cats of different breeds (52 females and 28 males), ranging in age from 3 months to 6 years, with corneal ulcers. Procedures: Cats were clinically evaluated to initially determine the concurrent corneal abnormalities. Endoscopic examination was performed to determine the anterior and posterior segment abnormalities. Superficial and deep stromal ulcers were treated using a conjunctival flap. Corneal sequestrum was treated by partial keratectomy and a conjunctival flap. Anterior synechia was treated via peripheral iridectomy and separation of the adhesion between the iris and the inner cornea. Symblepharon was treated by removal of the conjunctival membrane adhered to the cornea. Incurable endophthalmitis was treated surgically by extirpation. Short-term outcomes after surgical management of selected corneal abnormalities were then assessed clinically and endoscopically. Results: Deep stromal ulcer with descemetocele, endophthalmitis, symblepharon, corneal sequestration, and anterior synechia with secondary glaucoma and corneal scarring were the most common complications of corneal ulcers. FHV-1 was a common etiologic factor in corneal ulceration. Persistent corneal scars of varying shape and size developed in cats with deep stromal ulcers, anterior synechia, and corneal sequestration. Conclusions: Domestic shorthaired and Persian cats were the breeds most predisposed to FHV-1 infection and subsequent corneal ulceration. Immediate management of patients with corneal ulcers would prevent serious complications. No age or sex predisposition to complicated corneal ulceration was found in cats.
Keywords: cats, complicated corneal ulceration, clinical, endoscopic diagnosis, FHV-1
Procedia PDF Downloads 283
866 Perception of Tactile Stimuli in Children with Autism Spectrum Disorder
Authors: Kseniya Gladun
Abstract:
Tactile stimulation of the dorsal side of the wrist can have a strong impact on our attitude toward physical objects, eliciting pleasant or unpleasant impressions. This study explored different aspects of tactile perception to investigate atypical touch sensitivity in children with autism spectrum disorder (ASD). The study included 40 children with ASD and 40 healthy children aged 5 to 9 years. We recorded rsEEG (sampling rate of 250 Hz) during 20 min using the EEG amplifier “Encephalan” (Medicom MTD, Taganrog, Russian Federation) with 19 AgCl electrodes placed according to the International 10–20 System. The electrodes placed on the left and right mastoids served as joint references under a unipolar montage. EEG registration was carried out at the following derivations: frontal (Fp1-Fp2; F3-F4), anterior temporal (T3-T4), posterior temporal (T5-T6), parietal (P3-P4), and occipital (O1-O2). Subjects were passively touched with 4 types of tactile stimuli on the left wrist. The stimuli were presented with a velocity of about 3–5 cm per second. The stimulus materials and procedure were chosen for being the most "pleasant," "rough," "prickly" and "recognizable". The types of tactile stimulation were: a soft cosmetic brush ("pleasant"), a rough shoe brush ("rough"), a Wartenberg pinwheel roller ("prickly"), and, as the cognitive tactile stimulation, letters traced by finger (mostly the patient's name) ("recognizable"). To designate the moments of stimulus onset and offset, we marked the moments when the touch began and ended; the stimulation was manual, and synchronization was not precise enough for event-related measures. EEG epochs were cleaned of eye movements by an ICA-based algorithm in the EEGLAB plugin for MatLab 7.11.0 (Mathworks Inc.). Muscle artifacts were cut out by manual data inspection. The response to tactile stimuli was significantly different between the group of children with ASD and the healthy children, and it also depended on the type of tactile stimulus and the severity of ASD. The amplitude of the alpha rhythm increased in the parietal region in response to the pleasant stimulus only; for the other types of stimuli ("rough," "prickly," "recognizable"), no difference in amplitude was observed. The correlation dimension D2 was higher in healthy children compared to children with ASD (main effect, ANOVA). In the ASD group, D2 was lower for the pleasant and unpleasant stimuli compared to the background in the right parietal area. Hilbert-transform changes in the frequency of the theta rhythm were found only for rough tactile stimulation, compared with healthy participants, and only in the right parietal area. Thus, children with autism spectrum disorders and healthy children responded to tactile stimulation differently, with a specific frequency distribution of the alpha and theta bands in the right parietal area. Our data support the hypothesis that rsEEG may serve as a sensitive index of altered neural activity caused by ASD. Children with autism have difficulty in distinguishing emotional stimuli ("pleasant," "rough," "prickly" and "recognizable").
Keywords: autism, tactile stimulation, Hilbert transform, pediatric electroencephalography
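For readers unfamiliar with the Hilbert-transform step mentioned above, the sketch below shows a generic SciPy pipeline for extracting theta-band instantaneous frequency from a single synthetic EEG channel sampled at 250 Hz; it is not the authors' analysis code.

```python
# Theta-band instantaneous frequency via the Hilbert transform (generic sketch).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                                   # sampling rate used in the study [Hz]
t = np.arange(0, 20, 1 / fs)
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)  # toy P4 channel

b, a = butter(4, [4, 8], btype="bandpass", fs=fs)   # theta band, 4-8 Hz
theta = filtfilt(b, a, eeg)                         # zero-phase band-pass filter

analytic = hilbert(theta)                           # analytic signal
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)       # instantaneous frequency [Hz]

print("mean theta frequency:", round(inst_freq.mean(), 2), "Hz")
```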
Procedia PDF Downloads 250
865 Investigation of Chronic Drug Use Due to Chronic Diseases in Patients Admitted to Emergency Department
Authors: Behcet Al, Şener Cindoruk, Suat Zengin, Mehmet Murat Oktay, Mehmet Mustafa Sunar, Hatice Eroglu, Cuma Yildirim
Abstract:
Objective: In the present study, we aimed to investigate chronic drug use due to chronic diseases in patients admitted to the emergency department. Materials-Methods: 144 patients who presented to the emergency department (ED) of the medical school of Gaziantep University between June 2013 and September 2013 with chronic diseases and chronic drug use were included. Information about the drugs used by the patients was recorded. Results: Of the patients, half were male, half were female, and the mean age was 58 years. The three most common diseases were diabetes mellitus, hypertension, and coronary artery disease. Of the patients, 79.2% knew their illness. Fifty patients had begun to use their drugs within the last three months, and 36 patients within the last year. While 42 patients brought all of their drugs with them, 17 patients brought only a portion of their drugs. While three patients had stopped their medication completely, 125 patients took their medication regularly. Fifty-two patients described their drugs by name, 13 by color, 3 by grams, and 45 by tablet size, while 13 patients could not describe their drugs. Ninety-two patients could explain which drug was used for which disease, 17 patients could explain this partly, and 35 patients had no idea. One hundred patients took their medication by themselves; for 44 patients, the medications were given by relatives or carers. Of the medications, 140 were prescribed directly by doctors, three were given by pharmacists, and one patient had bought the drug by himself. For 11 patients, the drugs were not appropriate for their diseases. Fifty-one patients had been admitted to the ED twice within the last week, and 73 had been admitted twice within the last month. Conclusion: The majority of patients with chronic diseases who use chronic medication know their diseases and take their drugs regularly, but do not have enough information about their medication.
Keywords: chronic disease, drug use, emergency department, medication
Procedia PDF Downloads 463
864 A Critical Exploration of Dominant Perspectives Regarding Inclusion and Disability: Shifts Toward Meaningful Approaches
Authors: Luigi Iannacci
Abstract:
This study critically explores how inclusion and disability are presently and problematically configured within education. As such, the pedagogies, discourses, and practices that shape this configuration are examined in order to forward a reconceptualization of disability as it relates to education and the inclusion of students with special needs in mainstream classroom contexts. The study examines how the dominant medical/deficit model of disability positions students with special needs and advocates for a shift towards a social/critical model of disability as applied to education and classrooms. This is demonstrated through a critical look at how language, processes, and 'interventions' name and address the deficits people who have a disability are presumed to have and, as such, conceptualize these deficits as inherent flaws in need of 'fixing.' The study will demonstrate the necessary shifts in thinking, language, and practice required to forward a critical/social model of disability. The ultimate aim of this research is to offer a much-needed reconceptualization of inclusion that recognizes disability as epistemology, identity, and diversity through a critical exploration of dominant discourses that impact language, policy, instruction and, ultimately, the experiences students with disabilities have within mainstream classrooms. The presentation seeks to explore disability as neurodiversity and therefore elucidate how people with disabilities can demonstrate these ways of knowing within inclusive education that avoids superficial approaches that are not responsive to their needs. This research is, therefore, of interest and use to educators teaching at the elementary, secondary, and in-service levels, as well as graduate students and scholars working in the areas of inclusion, special education, and literacy. Ultimately, the presentation attempts to foster a social justice and human rights-focused approach to inclusion that is responsive to students with disabilities and, as such, ensures a reconceptualization of present language, understandings, and practices that continue to configure disability in problematic ways.
Keywords: inclusion, disability, critical approach, social justice
Procedia PDF Downloads 75