Search results for: MBES echo sounder
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 127

37 In vitro Characterization of Mice Bone Microstructural Changes by Low-Field and High-Field Nuclear Magnetic Resonance

Authors: Q. Ni, J. A. Serna, D. Holland, X. Wang

Abstract:

The objective of this study is to develop Nuclear Magnetic Resonance (NMR) techniques to enhance bone-related research on normal and disuse (Biglycan knockout) mice bone in vitro, using low-field and high-field NMR together. It is known that the total amplitude of the T₂ relaxation envelope, measured by the Carr-Purcell-Meiboom-Gill (CPMG) NMR spin-echo train, represents the liquid phase inside the pores. The CPMG magnetization amplitude can therefore be converted to a water volume after calibration against the NMR signal amplitude of a known volume of water. In this study, the distribution of mobile water and the porosity are determined using the low-field (20 MHz) CPMG relaxation technique, and the pore size distributions are determined by a computational inversion relaxation method. It is also known that the total proton intensity of magnetization from the NMR free induction decay (FID) signal arises from the water present inside the pores (mobile water), the water that has undergone hydration with the bone (bound water), and the protons in the collagen and mineral matter (solid-like protons). The components of total mobile and bound water within bone are therefore determined by the low-field NMR free induction decay technique. Furthermore, the bound water in the solid phase (mineral and organic constituents), especially the dominant component, calcium hydroxyapatite (Ca₁₀(PO₄)₆(OH)₂), can be determined by high-field (400 MHz) magic angle spinning (MAS) NMR. Because MAS reduces the inhomogeneous and susceptibility broadening of the NMR spectral linewidth in a liquid-solid mixture, we can further investigate the ¹H and ³¹P nuclei and their environments in bone material to identify the locations of bound water, such as OH⁻ groups within the minerals and the bone architecture.
We hypothesize that combined low-field and high-field magic angle spinning NMR can provide a more complete interpretation of water distribution, particularly of bound water, and that these data are important for assessing bone quality and predicting the mechanical behavior of bone.
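The amplitude-to-volume calibration described above can be sketched numerically. The following is an illustrative sketch, not the authors' code: the CPMG echo train is modeled as a bi-exponential T₂ decay, fitted by least squares, and the total extrapolated amplitude is converted to a water volume via a calibration factor. The component T₂ values, amplitudes, and the calibration factor are invented for demonstration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Bi-exponential CPMG model: one term per pore-water population.
def cpmg_decay(t, a1, t2_1, a2, t2_2):
    return a1 * np.exp(-t / t2_1) + a2 * np.exp(-t / t2_2)

# Synthetic echo train with hypothetical T2 components: 5 ms (bound-like)
# and 50 ms (mobile water), amplitudes 0.3 and 0.7, plus noise.
t = np.linspace(0.2, 200, 500)                      # echo times, ms
rng = np.random.default_rng(0)
signal = cpmg_decay(t, 0.3, 5.0, 0.7, 50.0) + rng.normal(0, 0.002, t.size)

popt, _ = curve_fit(cpmg_decay, t, signal, p0=[0.5, 2.0, 0.5, 30.0])
a1, t2_1, a2, t2_2 = popt

# The total amplitude extrapolated to t = 0 is proportional to liquid volume;
# a reference sample of known volume supplies the calibration factor.
calib_amp_per_ml = 2.0                              # hypothetical calibration
water_volume_ml = (a1 + a2) / calib_amp_per_ml
```

In practice a full inversion recovers a continuous T₂ distribution (a regularized inverse Laplace transform) rather than two discrete components, which is what yields the pore size distribution mentioned in the abstract.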

Keywords: bone, mice bone, NMR, water in bone

Procedia PDF Downloads 145
36 Multiphase Flow Regime Detection Algorithm for Gas-Liquid Interface Using Ultrasonic Pulse-Echo Technique

Authors: Serkan Solmaz, Jean-Baptiste Gouriet, Nicolas Van de Wyer, Christophe Schram

Abstract:

The efficiency of the cooling process for cryogenic propellants boiling in the cooling channels of rocket engines is strongly affected by the phase change that occurs during boiling, and the effectiveness of cooling depends on the type of boiling regime, such as nucleate or film boiling. Geometric constraints, such as a non-transparent cooling channel, preclude the use of visualization methods. The ultrasonic (US) technique, a non-destructive testing (NDT) method, has therefore been applied in almost every engineering field for different purposes. Discontinuities arise at boundaries between mediums, such as different phases: the sound wave emitted by the US transducer is both transmitted and reflected at a gas-liquid interface, which makes it possible to detect the different phases. Owing to thermal and structural concerns, it is impractical to maintain direct contact between the US transducer and the working fluid. The transducer must therefore be located outside the cooling channel, which introduces additional interfaces and raises questions about the applicability of the method. In this work, an exploratory study was undertaken to determine the detection ability and applicability of the US technique for a cryogenic boiling process in a cooling cycle where the US transducer is placed outside the channel. Boiling of cryogenics is a complex phenomenon whose thermal properties pose several hindrances to an experimental protocol, so substitute materials were purposefully selected on the basis of these parameters to simplify the experiments. In addition, instead of conducting a real-time boiling process, the nucleate and film boiling regimes were simulated using non-deformable stainless steel balls, air-bubble injection apparatuses, and air clearances. A versatile detection algorithm was then developed on the basis of these exploratory studies.
With the developed algorithm, the phases can be distinguished with 99% accuracy as no-phase, air-bubble, and air-film presences. The results demonstrate the detection ability and applicability of the US technique for exploratory purposes.
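The physical basis of the detection can be made concrete with a short sketch (illustrative only, not from the paper): the pressure reflection coefficient at an interface between two media follows from their acoustic impedances, and the near-total reflection at a liquid-gas boundary is what lets the echo amplitude discriminate gas from liquid behind the channel wall. The impedance values below are approximate textbook figures.

```python
# Pressure reflection coefficient R = (Z2 - Z1) / (Z2 + Z1),
# where Z = rho * c is the acoustic impedance of each medium.
def reflection_coefficient(z1, z2):
    return (z2 - z1) / (z2 + z1)

# Approximate impedances (kg m^-2 s^-1), not from the paper:
Z_water = 998 * 1480       # ~1.48 MRayl
Z_air   = 1.2 * 343        # ~412 Rayl
Z_steel = 7850 * 5900      # ~46 MRayl (channel wall, the extra interface)

r_water_air   = reflection_coefficient(Z_water, Z_air)    # near -1: almost total reflection
r_water_steel = reflection_coefficient(Z_water, Z_steel)  # strong but partial
```

The steel-water interface also reflects strongly, which is why mounting the transducer outside the wall adds the ambiguity the abstract mentions: every echo must be attributed to the correct interface before the phase behind the wall can be classified.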

Keywords: ultrasound, ultrasonic, multiphase flow, boiling, cryogenics, detection algorithm

Procedia PDF Downloads 143
35 Computed Tomography Myocardial Perfusion on a Patient with Hypertrophic Cardiomyopathy

Authors: Jitendra Pratap, Daphne Prybyszcuk, Luke Elliott, Arnold Ng

Abstract:

Introduction: Coronary CT angiography is a non-invasive imaging technique for the assessment of coronary artery disease with high sensitivity and negative predictive value. However, the correlation between the degree of CT coronary stenosis and the significance of hemodynamic obstruction is poor. The assessment of myocardial perfusion has mostly been undertaken by Nuclear Medicine (SPECT), but it is now possible to perform stress myocardial CT perfusion (CTP) scans quickly and effectively using CT scanners with high temporal resolution. Myocardial CTP is in many ways similar to the neuro perfusion imaging technique: radiopaque iodinated contrast is injected intravenously, transits the pulmonary and cardiac structures, and then perfuses through the coronary arteries into the myocardium. On the Siemens Force CT scanner, a myocardial perfusion scan is performed using a dynamic axial acquisition, in which the scanner shuffles in and out every 1-3 seconds (heart-rate dependent) to cover the heart in the z plane; this is usually performed over 38 seconds. Report: A CT myocardial perfusion scan can be utilised to complement the findings of a CT coronary angiogram. Implementing a CT myocardial perfusion study as part of a routine CT coronary angiogram procedure provides a 'One Stop Shop' for the diagnosis of coronary artery disease. This case study demonstrates that although the CT coronary angiogram was within normal limits, the perfusion scan provided additional, clinically significant information with regard to the haemodynamics within the myocardium of a patient with Hypertrophic Obstructive Cardiomyopathy (HOCM). This negated the need for further diagnostic studies such as cardiac echo or Nuclear Medicine stress tests. Conclusion: CT coronary angiography with adenosine stress myocardial CTP was utilised in this case to specifically exclude coronary artery disease while also assessing perfusion within the hypertrophic myocardium.
Adenosine stress myocardial CTP demonstrated reduced myocardial blood flow within the hypertrophic myocardium, while the coronary arteries did not show any obstructive disease. A CT coronary angiogram protocol that incorporates myocardial perfusion can provide diagnostic information on the haemodynamic significance of any coronary artery stenosis and has the potential to be a "One Stop Shop" for cardiac imaging.

Keywords: CT, cardiac, myocardium, perfusion

Procedia PDF Downloads 94
34 Exploring the History of Chinese Music Acoustic Technology through Data Fluctuations

Authors: Yang Yang, Lu Xin

Abstract:

The study of extant musical sites can provide a complementary picture of historical ethnomusicological information. In their data collection on Chinese opera houses, researchers found that one Ming Dynasty opera house reached a width of nearly 18 meters, while all opera houses of the same period and after it were significantly narrower. This transient fluctuation in the width data, occurring in the absence of construction-scale constraints, raised the question of why the widths varied: what factors prevented theatres from being built any wider? To address this question, this study used a comparative approach, conducting venue experiments on this stage and on another theatre stage used for intangible-heritage opera performances, collecting the subjective perceptions of performers and audiences on the different stages, and measuring data such as echo and delay with the BK Connect software platform. From the subjective and objective results, it is inferred that during the Ming Dynasty, by exploring the effect of stage width on musical performance and on the listening state of the audience, the Chinese ancients discovered and understood the acoustical phenomenon now known as the Haas effect, and utilized this discovery to serve music in subsequent stage construction. This discovery marks a node in the evolution of Chinese architectural acoustics driven by musical demands. It is also instructive that, in contrast to many of the world's "unsuccessful civilizations," China can use a combination of tangible-heritage and intangible-cultural research to chart a clear, demand-driven course for the evolution of human music technology, and the findings of such research will complete the record of human exploration of musical acoustics.
This practical experience can also be applied to the exploration and understanding of other musical heritage base data.

Keywords: Haas effect, musical acoustics, history of acoustical technology, Chinese opera stage, structure

Procedia PDF Downloads 156
33 Role of Telehealth in Expansion of Medical Care

Authors: Garima Singh, Kunal Malhotra

Abstract:

Objective: The expansion of telehealth has been instrumental in increasing access to medical services, especially for underserved and rural communities. In 2020, 14 million patients received virtual care through telemedicine, and the global telemedicine market is expected to reach $185 million by 2023. Telehealth provides a platform that allows a patient to receive primary as well as specialized care using technology, from the comfort of their home. Telemedicine was particularly useful during the COVID-19 pandemic, when the number of telehealth visits increased by 5000%. It continues to serve as a significant resource for patients seeking care and bridges the gap between disease and treatment. Method: As defined by the American Psychiatric Association (APA), telepsychiatry is the process of providing psychiatric care from a distance through technology. A subset of telemedicine, it can involve a range of services, including evaluations, therapy, patient education, and medication management. It can involve direct interaction between a physician and the patient; it also encompasses supporting primary care providers with specialist consultation and expertise, and can involve recording medical information (images, videos, etc.) and sending it to a distant site for later review. Results: Our organization uses telepsychiatry to serve 25 counties and approximately 1.4 million people. We provide multiple services, including inpatient, outpatient, crisis intervention, rehabilitation, autism services, case management, community treatment, and several other modalities. Project ECHO (Extension for Community Healthcare Outcomes) has been used to advise and assist primary care providers in treating mental health conditions; by sharing knowledge, it empowers them to treat patients in their own communities. Conclusion: Telemedicine has proven to be an effective medium for meeting patients' needs and making mental health care accessible.
It has been shown to improve access to care in both urban and rural settings by bringing care to the patient and reducing barriers such as transportation, financial stress, and limited resources. Telemedicine also helps reduce ER visits, integrate primary care, and improve continuity of care and follow-up. There is substantial evidence and research supporting its effectiveness and usage.

Keywords: telehealth, telemedicine, access to care, medical technology

Procedia PDF Downloads 72
32 Berberine Ameliorates Glucocorticoid-Induced Hyperglycemia: An In-Vitro and In-Vivo Study

Authors: Mrinal Gupta, Mohammad Rumman, Babita Singh, Abbas Ali Mahdi, Shivani Pandey

Abstract:

Introduction: Berberine (BBR), a bioactive compound isolated from Coptidis Rhizoma, possesses diverse pharmacological activities, including anti-bacterial, anti-inflammatory, antitumor, hypolipidemic, and anti-diabetic effects. However, its role as an anti-diabetic agent in animal models of dexamethasone (Dex)-induced diabetes remains unknown. Studies have shown that natural compounds, including aloe, caper, cinnamon, cocoa, green and black tea, and turmeric, can be used to treat type 2 diabetes mellitus (DM); compared with conventional drugs, natural compounds have fewer side effects and are easily available. Herein, we studied the anti-diabetic effects of BBR in a mouse model of Dex-induced diabetes. Methods: The HepG2 cell line was used for glucose release and glycogen synthesis studies, and cell proliferation was measured by the methylthiotetrazole (MTT) assay. For animal studies, mice were treated with Dex (2 mg/kg, i.m.) for 30 days, and the effect of BBR at doses of 100, 200, and 500 mg/kg (p.o.) was analyzed. Glucose, insulin, and pyruvate tolerance tests were performed to confirm the development of the diabetic model, and EchoMRI was performed to assess fat mass. Further, to elucidate the mechanism of action of BBR, the mRNA expression of genes regulating gluconeogenesis, glucose uptake, and glycolysis was analyzed. Results: In vitro, BBR had no impact on cell viability up to a concentration of 50 μM. Moreover, BBR suppressed hepatic glucose release and improved glucose tolerance in HepG2 cells. In vivo, BBR improved glucose homeostasis in diabetic mice, as evidenced by enhanced glucose clearance, increased glycolysis, elevated glucose uptake, and decreased gluconeogenesis. Dex treatment increased the total fat mass in mice, and this increase was ameliorated by BBR treatment. Conclusion: BBR improves glucose tolerance by increasing glucose clearance, inhibiting hepatic glucose release, and decreasing obesity.
Thus, BBR may become a potential therapeutic agent for treating glucocorticoid-induced diabetes and obesity in the future.

Keywords: glucocorticoid, hyperglycemia, berberine, HepG2 cells, insulin resistance, glucose

Procedia PDF Downloads 33
31 Application of Compressed Sensing and Different Sampling Trajectories for Data Reduction of Small Animal Magnetic Resonance Image

Authors: Matheus Madureira Matos, Alexandre Rodrigues Farias

Abstract:

Magnetic Resonance Imaging (MRI) is a vital imaging technique used in both clinical and pre-clinical settings to obtain detailed anatomical and functional information. However, MRI scans can be expensive and time-consuming, and often require anesthetics to keep animals still during the imaging process. Prolonged or repeated exposure to anesthetics can have adverse effects on animals, including physiological alterations and potential toxicity, so minimizing the duration and frequency of anesthesia is crucial for the well-being of research animals. In recent years, various sampling trajectories have been investigated to reduce the number of MRI measurements, leading to shorter scanning times and minimizing the animals' exposure to anesthetics. Compressed sensing (CS) and sampling trajectories such as Cartesian, spiral, and radial have emerged as powerful tools to reduce MRI data while preserving diagnostic quality. This work applies CS with Cartesian, spiral, and radial sampling trajectories to the reconstruction of abdominal MRI of mice sub-sampled below the level defined by the Nyquist theorem. The methodology consists of using a fully sampled reference MRI of a female C57BL/6 mouse, acquired experimentally in a 4.7 Tesla small-animal MRI scanner using spin echo pulse sequences. The image is down-sampled along Cartesian, radial, and spiral sampling paths and then reconstructed by CS. The quality of the reconstructed images is objectively assessed by three quality assessment techniques: RMSE (Root Mean Square Error), PSNR (Peak Signal-to-Noise Ratio), and SSIM (Structural Similarity Index Measure).
The utilization of optimized sampling trajectories and the CS technique has demonstrated the potential for a reduction of up to 70% in acquired image data. This translates into shorter scan times, minimizing the duration and frequency of anesthesia administration and reducing the potential risks associated with it.
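Two of the three quality metrics named above have compact definitions and can be sketched directly; SSIM is usually taken from a library such as scikit-image and is omitted here. The images below are hypothetical placeholders, not data from the study.

```python
import numpy as np

# Root Mean Square Error between a reference and a reconstruction.
def rmse(ref, test):
    return np.sqrt(np.mean((ref - test) ** 2))

# Peak Signal-to-Noise Ratio in dB, relative to the image peak value.
def psnr(ref, test, peak=1.0):
    return 20 * np.log10(peak / rmse(ref, test))

# Hypothetical example: a reference image and a noisy 'reconstruction'.
rng = np.random.default_rng(1)
ref = rng.random((64, 64))
recon = ref + rng.normal(0, 0.01, ref.shape)
quality_db = psnr(ref, recon)
```

Higher PSNR and lower RMSE indicate a reconstruction closer to the fully sampled reference, which is how the degradation from each sub-sampling trajectory is compared.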

Keywords: compressed sensing, magnetic resonance, sampling trajectories, small animals

Procedia PDF Downloads 40
30 The Diary of Dracula, by Marin Mincu: Inquiries into a Romanian 'Book of Wisdom' as a Fictional Counterpart for Corpus Hermeticum

Authors: Lucian Vasile Bagiu, Paraschiva Bagiu

Abstract:

The novel, written in Italian and published in Italy in 1992 by the Romanian scholar Marin Mincu, is meant for the foreign reader, aiming apparently at a better knowledge of the historical character of Vlad the Impaler (Vlad Dracul), within the European cultural, political, and historical context of 1463. Throughout the very well written tome, one comes to realize that one of the underlying levels of the fiction is the exposure of various fundamental features of Romanian culture and civilization. The author of the diary, Dracula, mentions Corpus Hermeticum no fewer than fifteen times, suggesting that his own diary is a kind of philosophical counterpart. The essay focuses on several 'truths' and items of 'wisdom' revealed in the fictional teachings of Dracula. The boycott of History by the Romanians is identified as an echo of the philosophical approach of the famous Romanian scholar and writer Lucian Blaga. The orality of Romanian culture is a landmark opposed to the written culture of Western Europe. The religion of the ancient Dacian god Zalmoxis is seen as the basis of the Romanian existential and/or metaphysical ethnic philosophy (a feature tackled by the famous Romanian historian of religion Mircea Eliade), with the suggestion that Hermes Trismegistus may have written his Corpus Hermeticum under the influence of Zalmoxis. The historical figure of the last Dacian king, Decebalus (d. 106 AD), is a good pretext for the tantalizing Indo-European suggestion that the prehistoric Thraco-Dacian people may have been the ancestors of the first Romans settled in Latium. The lost diary of the Emperor Trajan, De Bello Dacico, may have proved that the unknown language of the Dacians was very much like Latin (a secret well hidden by the Vatican). The attitude of the Dacians towards death, as described by Herodotus, may have later inspired Pythagoras, Socrates, the Eleusinian and Orphic Mysteries, etc.
All of this unfolds within the Humanist and Renaissance European context of the epoch, Dracula having close relationships with scholars such as Nicolaus Cusanus, Cosimo de' Medici, Marsilio Ficino, Pope Pius II, etc. Thus The Diary of Dracula turns out to be as exciting and stupefying as Corpus Hermeticum: a book impossible to assimilate entirely, yet a reference unwise to ignore.

Keywords: Corpus Hermeticum, Dacians, Dracula, Zalmoxis

Procedia PDF Downloads 131
29 Patterns of Eosinophilia in Cardiac Patients and its Association with Endomyocardial Disease Presenting to Tertiary Care Hospital in Peshawar

Authors: Rashid Azeem

Abstract:

Introduction: Eosinophilia, which can be categorized as mild, moderate, or severe on the basis of increasing eosinophil counts, can be responsible for a wide range of cardiac manifestations, varying from simple myocarditis to a severe state such as endomyocardial fibrosis. Eosinophils are involved in the pathogenesis of a variety of cardiovascular disorders, such as Löffler endocarditis, eosinophilic granulomatosis with polyangiitis (EGPA), and hypereosinophilic syndrome (HES). Among these, HES carries an incidence rate of cardiac involvement between 48% and 75% and is the main cause of eosinophilia-related cardiac morbidity and mortality. Aims and objectives: The aim of this study is to determine the frequency of eosinophilia in cardiac patients and to ascertain the evidence of endomyocardial disease in eosinophilic patients in a cardiology institution. Material and Methods: This cross-sectional analytical study was conducted in the Hematology Department of the Peshawar Institute of Cardiology after approval from the hospital ethics and research committee. All 70 patients underwent detailed history-taking and clinical examination. Investigations such as CBC, chest X-ray, ECG, echocardiography, and angiography findings were used to monitor the patients' clinical status. Data were analyzed using SPSS version 25 and MS Excel. Results: Of the 70 patients in our study, 66 (94%) showed evidence of cardiac manifestations. We observed a number of abnormal ECG patterns in cardiac patients presenting with eosinophilia, such as T-wave changes, loss of R wave, sinus bradycardia with LVH strain, and ST-wave abnormality. Abnormal echocardiographic findings were also observed, including valvular abnormalities (45.7%), RWMA abnormalities (2.8%), and isolated ventricular dysfunction (21.4%), while 10% of patients had normal echocardiography. We further noted abnormal coronary angiography findings in cardiac patients with eosinophilia, ranging from single-vessel to multi-vessel occlusions.
Conclusions: Eosinophils are involved in the pathogenesis of a variety of cardiovascular disorders that can be detected by various diagnostic means, and the severity of the disease increases with time and with increasing eosinophil count, ranging from simple myocarditis to a fatal condition such as endomyocardial fibrosis. An increased eosinophil count as a laboratory parameter in cardiac patients may thus be a sign of endomyocardial damage, which will help cardiologists intervene more aggressively than with the routine approach to a cardiac patient.

Keywords: eosinophilia, endomyocardial fibrosis, cardiac, hypereosinophilic syndrome

Procedia PDF Downloads 35
28 Enhancement Effect of Superparamagnetic Iron Oxide Nanoparticle-Based MRI Contrast Agent at Different Concentrations and Magnetic Field Strengths

Authors: Bimali Sanjeevani Weerakoon, Toshiaki Osuga, Takehisa Konishi

Abstract:

Magnetic Resonance Imaging contrast agents (MRI-CM) are significant in clinical and biological imaging because of their ability to alter the normal tissue contrast, thereby affecting the signal intensity to enhance the visibility and detectability of images. Superparamagnetic Iron Oxide (SPIO) nanoparticles, coated with dextran or carboxydextran, are currently available for clinical MR imaging of the liver. Most SPIO contrast agents are T2-shortening agents, and Resovist (Ferucarbotran) is a clinically tested, organ-specific SPIO agent with a low-molecular-weight carboxydextran coating. The enhancement effect of Resovist depends on its relaxivity, which in turn depends on factors such as magnetic field strength, concentration, nanoparticle properties, pH, and temperature. Therefore, this study was conducted to investigate the impact of field strength and contrast concentration on the enhancement effect of Resovist. Using a mathematical simulation of a T2-weighted spin echo sequence, the study explored the MRI signal intensity of Resovist in the physiological range of plasma at three magnetic field strengths: 0.47 T (r1=15, r2=101), 1.5 T (r1=7.4, r2=95), and 3 T (r1=3.3, r2=160), over a range of contrast concentrations. The relaxivities r1 and r2 (L mmol⁻¹ s⁻¹) were obtained from a previous study, and the selected concentrations were 0.05, 0.06, 0.07, 0.08, 0.09, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 2.0, and 3.0 mmol/L. T2-weighted images were simulated using TR/TE = 2000 ms/100 ms. According to the reference literature, the r1 relaxivity tends to decrease with increasing field strength, while r2 showed no systematic relationship with the selected field strengths. In parallel, the results of this study revealed that the signal intensity of Resovist tends to be higher at lower concentrations than at higher concentrations, and the highest signal intensity was observed at the lowest field strength, 0.47 T.
The maximum signal intensities at 0.47 T, 1.5 T, and 3 T were found at concentrations of 0.05, 0.06, and 0.05 mmol/L, respectively. At concentrations above these values, the signal intensity decreased exponentially. An inverse relationship was found between field strength and T2 relaxation time: as the field strength increased, the T2 relaxation time decreased accordingly. However, the resulting T2 relaxation times were not significantly different between 0.47 T and 1.5 T in this study. Moreover, a linear correlation of the transverse relaxation rate (1/T2, s⁻¹) with the concentration of Resovist was observed. From these results, it can be concluded that the concentration of the SPIO nanoparticle contrast agent and the field strength of the MRI are two important parameters affecting the signal intensity of a T2-weighted SE sequence, and both should be considered carefully in MR imaging.
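The simulation described above can be sketched with the standard spin-echo signal model, S = (1 − exp(−TR·R1))·exp(−TE·R2), where the relaxation rates grow linearly with agent concentration: R_i = 1/T_i0 + r_i·C. The r1/r2 values below are the 1.5 T Resovist figures quoted in the abstract, but the baseline plasma T1_0/T2_0 are hypothetical placeholders, so the exact peak concentration differs from the paper's 0.05-0.06 mmol/L; only the qualitative trend (higher signal at low concentration, exponential drop at high concentration) is reproduced.

```python
import numpy as np

# Spin-echo signal with concentration-dependent relaxation rates.
# c in mmol/L; r1, r2 in L mmol^-1 s^-1; times in seconds.
def se_signal(c, r1=7.4, r2=95.0, t1_0=1.2, t2_0=0.2, tr=2.0, te=0.1):
    r1_eff = 1.0 / t1_0 + r1 * c     # longitudinal rate, s^-1
    r2_eff = 1.0 / t2_0 + r2 * c     # transverse rate, s^-1
    return (1.0 - np.exp(-tr * r1_eff)) * np.exp(-te * r2_eff)

c = np.linspace(0.01, 3.0, 300)      # concentration sweep, mmol/L
s = se_signal(c)
c_peak = c[np.argmax(s)]             # signal is highest at low concentration
```

With TE = 100 ms and r2 = 95 L mmol⁻¹ s⁻¹, the T2 term exp(−TE·r2·C) dominates well before the T1 term saturates, which is why the signal collapses at high concentrations.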

Keywords: concentration, Resovist, field strength, relaxivity, signal intensity

Procedia PDF Downloads 331
27 The Phenomenology in the Music of Debussy through Inspiration of Western and Oriental Culture

Authors: Yu-Shun Elisa Pong

Abstract:

Music aesthetics related to phenomenology is rarely discussed and still in the ascendant, even though multi-dimensional discourses of philosophy emerged as an important trend in the 20th century. The present study first outlines the basic theory of phenomenology of Edmund Husserl (1859-1938), introducing the concepts of intentionality, eidetic reduction, horizon, world, and inter-subjectivity. The phenomenology of music and of art in general is then brought to attention through Roman Ingarden's The Work of Music and the Problems of Its Identity (1933) and Mikel Dufrenne's The Phenomenology of Aesthetic Experience (1953). Finally, Debussy's music is analyzed and discussed from the perspective of phenomenology. Phenomenology is not so much a methodology or an analytics as a common conviction: to describe, in as much detail as possible, the varieties of human experience relative to their intended objects. Such an idea had been practiced in various guises for centuries, but it was only in the early 20th century that phenomenology was refined through the works of Husserl, Heidegger, Sartre, Merleau-Ponty, and others. Debussy was born in an age when Western society began to accept a multi-cultural baptism. With his unusual sensitivity to oriental culture, Debussy drew considerable inspiration from it, absorbed it, and echoed it in his works. In fact, his relationship with nature goes beyond a mere echo of the ancient Chinese literati's idea of nature. Although he was not the first composer to associate music with humanity and nature, the unique quality and impact of his works made him a significant figure in music aesthetics. Debussy's music sought a quality analogous to nature and, more importantly, built on vivid life experience and artistic transformation to achieve the realm of pure art.
Such an idea, that life experience comes before the artwork, whether clear or vague, simple or complex, and was later presented abstractly in his late works, remains an interesting subject worth further discussion. Debussy's music has existed for close to a century or more and has received as much attention from musicology researchers as other important works in the history of Western music. Among the pluralistic discussions of Debussy's art and ideas, phenomenological aesthetics has provided new ideas and viewpoints from which to re-examine his great works, and has even lent legitimacy to some earlier arguments. Overall, this article offers a new insight into Debussy's music through phenomenological exploration, and it is believed that phenomenology will be an important pathway in research on music aesthetics.

Keywords: Debussy's music, music aesthetics, oriental culture, phenomenology

Procedia PDF Downloads 235
26 Perception of Eco-Music From the Contents of the Earth's Sound Ecosystem

Authors: Joni Asitashvili, Eka Chabashvili, Maya Virsaladze, Alexander Chokhonelidze

Abstract:

Studying the soundscape is a major challenge in many countries of the civilized world today. The sound environment, and music itself, are part of the Earth's ecosystem; researching their positive or negative impact is therefore important for a clean and healthy environment. The acoustics of nature gave people many musical ideas, and people enriched their musical features and performance skills by imitating the surrounding sound. For example, populations surrounded by mountains invented the technique of antiphonal singing, which mimics the effect of an echo. The Canadian composer Raymond Murray Schafer viewed the world as a kind of musical instrument with ever-renewing tuning, and coined the term "soundscape" for the natural environmental sound, including the sound field of the Earth: the material, it can be said, from which the "music of nature" is constructed. In the 21st century, a new field, ecomusicology, has emerged within musical art to study the sound ecosystem and the various issues related to it; according to Aaron Allen, ecomusicology considers the interconnections between music, culture, and nature. Eco-music is a branch of ecomusicology concerned with the depiction and realization of practical processes using modern composition techniques: finding an artificial sound source (instrumental or electronic) that blends into the soundscape of a "sound oasis", and creating a composition that sounds in harmony with the vibrations of human, nature, environment, and the micro- and macrocosm as a whole. We are currently exploring the ambient sound of Georgian urban and suburban environments to discover sound oases and compose eco-music works. We call a "sound oasis" an environment with a specific ecosystem sound that can be used in a musical piece as an instrument. Among the most interesting examples of eco-music are the round dances, already created in the BC era, in which people would feel a united energy.
This urge for unity reveals itself in our age too, manifesting in a variety of social media; the virtual world, however, is not enough for healthy interaction. We therefore created a plan for a "contemporary round dance" in a sound oasis found during an expedition in Georgian caves, where people interact with the cave's soundscape and eco-music, feel each other's shared energy, and listen to the sound of the earth. This project can be considered a contemporary round dance: a long improvisation and a particular type of art therapy in which everyone can participate in the artistic process. We would like to present the research results of our eco-music experimental performance.

Keywords: eco-music, environment, sound, oasis

Procedia PDF Downloads 36
25 Predictors of Pericardial Effusion Requiring Drainage Following Coronary Artery Bypass Graft Surgery: A Retrospective Analysis

Authors: Nicholas McNamara, John Brookes, Michael Williams, Manish Mathew, Elizabeth Brookes, Tristan Yan, Paul Bannon

Abstract:

Objective: Pericardial effusions are an uncommon but potentially fatal complication after cardiac surgery. The goal of this study was to describe the incidence and risk factors associated with the development of pericardial effusion requiring drainage after coronary artery bypass graft surgery (CABG). Methods: A retrospective analysis was undertaken using prospectively collected data. All adult patients who underwent CABG at our institution between 1st January 2017 and 31st December 2018 were included. Pericardial effusion was diagnosed using transthoracic echocardiography (TTE) performed for clinical suspicion of pre-tamponade or tamponade. Drainage was undertaken if considered clinically necessary and performed via a sub-xiphoid incision, pericardiocentesis, or re-sternotomy at the discretion of the treating surgeon. Patient demographics, operative characteristics, anticoagulant exposure, and postoperative outcomes were examined to identify the variables associated with the development of pericardial effusion requiring drainage. Tests of association were performed using the Fisher exact test for dichotomous variables and the Student t-test for continuous variables. Logistic regression models were used to determine univariate predictors of pericardial effusion requiring drainage. Results: Between January 1st, 2017, and December 31st, 2018, a total of 408 patients underwent CABG at our institution, and eight (1.9%) required drainage of a pericardial effusion. There was no difference in age, gender, or the proportion of patients on preoperative therapeutic heparin between the study and control groups.
Univariate analysis identified preoperative atrial arrhythmia (37.5% vs 8.8%, p = 0.03), reduced left ventricular ejection fraction (47% vs 56%, p = 0.04), longer cardiopulmonary bypass (130 vs 84 min, p < 0.01) and cross-clamp (107 vs 62 min, p < 0.01) times, higher drain output in the first four postoperative hours (420 vs 213 mL, p <0.01), postoperative atrial fibrillation (100% vs 32%, p < 0.01), and pleural effusion requiring drainage (87.5% vs 12.5%, p < 0.01) to be associated with development of pericardial effusion requiring drainage. Conclusion: In this study, the incidence of pericardial effusion requiring drainage was 1.9%. Several factors, mainly related to preoperative or postoperative arrhythmia, length of surgery, and pleural effusion requiring drainage, were identified to be associated with developing clinically significant pericardial effusions. High clinical suspicion and low threshold for transthoracic echo are pertinent to ensure this potentially lethal condition is not missed.
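The univariate associations above rest on the Fisher exact test for dichotomous variables. As a hedged illustration only (the 2×2 counts below are reconstructed from the reported percentages, not taken from the study data), a two-sided Fisher exact test can be computed with the Python standard library:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Sums the probabilities of all tables with the same margins that are
    no more likely than the observed table (hypergeometric model).
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):
        # Hypergeometric probability of x "successes" in the first row.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Illustrative counts: 3 of 8 effusion patients vs. ~35 of 400 controls
# with preoperative atrial arrhythmia (reconstructed from 37.5% vs 8.8%).
p = fisher_exact_two_sided(3, 5, 35, 365)
```

With these reconstructed counts the test lands in the significant range reported in the abstract (p ≈ 0.03).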

Keywords: coronary artery bypass, pericardial effusion, pericardiocentesis, tamponade, sub-xiphoid drainage

Procedia PDF Downloads 140
24 Retrospective Demographic Analysis of Patients Lost to Follow-Up from Antiretroviral Therapy in Mulanje Mission Hospital, Malawi

Authors: Silas Webb, Joseph Hartland

Abstract:

Background: Long-term retention of patients on ART has become a major health challenge in Sub-Saharan Africa (SSA). In 2010, a systematic review of 39 papers found that 30% of patients were no longer taking their ARTs two years after starting treatment. In the same review, it was noted that there was a paucity of data as to why patients become lost to follow-up (LTFU) in SSA. This project was performed in Mulanje Mission Hospital (MMH) in Malawi as part of Swindon Academy's Global Health eSSC. The HIV prevalence for Malawi is 10.3%, one of the highest rates in the world; however, prevalence soars to 18% in Mulanje. It is therefore essential that patients at risk of being LTFU are identified early and managed appropriately to help them continue to participate in the service. Methodology: All patients on adult antiretroviral formulations at MMH who were classified as 'defaulters' (patients missing a scheduled follow-up visit by more than two months) over the last 12 months were included in the study. Demographic variables were collected from Mastercards for data analysis. A comparison group of patients not lost to follow-up was created using all of the patients who attended the HIV clinic between 18th and 22nd July 2016 and had never defaulted from ART. Data was analysed using the chi-squared (χ²) test, as the data collected was categorical, with alpha levels set at 0.05. Results: Overall, 136 patients had defaulted from ART over the past 12 months at MMH. Of these, 43 patients had missing Mastercards, so 93 defaulter datasets were analysed. In the comparison group, 93 datasets were also analysed and statistical analysis done using chi-squared testing. A higher proportion of men was noted in the defaulting group (p = 0.034), and defaulters tended to be younger (p = 0.052). 94.6% of patients who defaulted were taking Tenofovir, Lamivudine and Efavirenz, the standard first-line ART therapy in Malawi.
The mean length of time on ART was 39.0 months (RR: -22.4-100.4) in the defaulters group and 47.3 months (RR: -19.71-114.23) in the control group, a mean difference of 8.3 fewer months in the defaulters group (p = 0.056). Discussion: The findings of this study echo the literature; this review expands on them by showing that the demographic at most risk of defaulting and being LTFU is a young male who has missed more than 4 doses of ART and is within his first year of treatment. For the hospital, this data is important as it identifies significant areas for public health focus. For instance, fear of disclosure and stigma may be disproportionately affecting younger men, so interventions can be aimed specifically at them to improve their health outcomes. The mean length of time on medication was 8.3 months less in the defaulters group, with a p-value of 0.056, emphasising the need for more intensive follow-up in the early stages of treatment, when patients are at the highest risk of defaulting.
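The chi-squared testing used above can be sketched with the standard library alone, since for one degree of freedom the chi-squared p-value reduces to erfc(sqrt(χ²/2)). The counts below are invented for illustration (the abstract reports only proportions and p-values, not the underlying table):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic and p-value (df = 1) for [[a, b], [c, d]]."""
    n = a + b + c + d
    # Shortcut form of the Pearson statistic for a 2x2 table.
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For df = 1 the chi-squared survival function reduces to erfc.
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical counts: 60 of 93 defaulters male vs. 45 of 93 non-defaulters.
chi2, p = chi2_2x2(60, 33, 45, 48)
```

A statistic above the df = 1 critical value of 3.84 corresponds to p < 0.05, the alpha level the study used.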

Keywords: anti-retroviral therapy, ART, HIV, lost to follow up, Malawi

Procedia PDF Downloads 157
23 Circle of Learning Using High-Fidelity Simulators Promoting a Better Understanding of Resident Physicians on Point-of-Care Ultrasound in Emergency Medicine

Authors: Takamitsu Kodama, Eiji Kawamoto

Abstract:

Introduction: Ultrasound in the emergency room has the advantages of being safe, fast, repeatable, and noninvasive. Focused Point-Of-Care Ultrasound (POCUS) in particular is used daily for prompt and accurate diagnoses and for quickly identifying critical and life-threatening conditions, which is why ultrasound has demonstrated its usefulness in emergency medicine. The true value of ultrasound has been once again recognized in recent years. All resident physicians working in the emergency room should be able to perform an ultrasound scan to interpret the signs and symptoms of deteriorating patients. However, practical education on ultrasound is still in development. To address this issue, we established a new educational program using high-fidelity simulators and evaluated the efficacy of this course. Methods: The educational program includes didactic lectures and skill stations in a half-day course. An instructor gives a lecture on POCUS, such as the Rapid Ultrasound in Shock (RUSH) and/or Focused Assessment Transthoracic Echo (FATE) protocol, at the beginning of the course. Then, attendees receive scanning training with the cooperation of healthy simulated patients. Finally, attendees learn how to apply focused POCUS skills in clinical situations using high-fidelity simulators such as SonoSim® (SonoSim, Inc) and SimMan® 3G (Laerdal Medical). Evaluation was conducted through surveillance questionnaires given to 19 attendees after two pilot courses. The questionnaires focused on understanding of the course concept and on satisfaction. Results: All attendees answered the questionnaires. With respect to the degree of understanding, 12 attendees (number of valid responses: 13) scored four or more points out of five. The high-fidelity simulators, especially SonoSim®, were highly appreciated by 11 attendees (number of valid responses: 12) for enhancing learning of how to handle ultrasound at an actual practice site.
All attendees encouraged colleagues to take this course because a high level of satisfaction was achieved. Discussion: The newly introduced educational course using high-fidelity simulators realizes a circle of learning that deepens the understanding of focused POCUS in gradual stages. SonoSim® can faithfully reproduce scan images with pathologic ultrasound findings and provide experiential learning for a growing number of beginners such as resident physicians. In addition, valuable education can be provided when it is combined with SimMan® 3G. Conclusions: The newly introduced educational course using high-fidelity simulators appears to be effective and to provide better education than conventional courses for emergency physicians.

Keywords: point-of-care ultrasound, high-fidelity simulators, education, circle of learning

Procedia PDF Downloads 253
22 Space Tourism Pricing Model Revolution from Time Independent Model to Time-Space Model

Authors: Kang Lin Peng

Abstract:

Space tourism emerged in 2001 and became famous in 2021, following the development of space technology. The space market is distorted because of excess demand. Space tourism is currently rare and extremely expensive, priced as a luxury good in a seller's market in which consumers cannot bargain. Spaceship companies such as Virgin Galactic, Blue Origin, and Space X have charged space tourism prices from 200 thousand to 55 million depending on the height reached in space. There should be a reasonable price set on a fair basis. This study aims to derive a spacetime pricing model, which is different from the general pricing model on the earth's surface. We apply general relativity theory to derive the mathematical formula for the space tourism pricing model, which contains the traditional time-independent model as a special case. In the future, the price of space travel will differ from that of current flight travel when space travel is measured in light-year units. The pricing of general commodities mainly considers the general equilibrium of supply and demand. A pricing model that considers risks and returns with a time variable is acceptable when commodities are on the earth's surface, called flat spacetime. Current economic theories, based on an independent time scale in flat spacetime, do not consider the curvature of spacetime. Current flight services flying at heights of 6, 12, and 19 kilometers charge with a pricing model that measures the time coordinate independently. However, the emerging space tourism flights reach heights of 100 to 550 kilometers, which enlarges the spacetime curvature span: tourists escape from effectively zero curvature on the earth's surface to the large curvature of space. Different spacetime spans should therefore be considered in the pricing model of space travel, to echo general relativity theory. Intuitively, this spacetime commodity needs to account for the change in spacetime curvature from the earth to space.
We can assume a value for each unit of spacetime curvature, corresponding to the gradient change of the Ricci or energy-momentum tensor. The total price then follows by integrating over the spacetime path from the earth to space. The concept is to add a price component p corresponding to general relativity theory; when the curvature term vanishes, the space travel pricing model degenerates into the time-independent model of traditional commodity pricing. The contribution is that deriving the space tourism pricing model is a breakthrough in philosophical and practical issues for space travel. The results of the space tourism pricing model extend the traditional time-independent flat-spacetime model. A pricing model embedding spacetime, as in general relativity theory, can better reflect the rationality and accuracy of space travel pricing on the universal scale. Moving from an independent-time scale to a spacetime scale will bring a brand-new pricing concept for space travel commodities. Fair and efficient spacetime economics will also benefit human travel when we can travel in light-year units in the future.
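The verbal argument above can be condensed into a single hedged formula. The notation here is ours, not the authors': $P_0$ is the conventional time-independent price, $R(r)$ a scalar curvature measure along the ascent from the Earth's surface $r_\oplus$ to the target altitude $r_s$, and $\alpha$ an assumed price per unit of integrated curvature:

```latex
P \;=\; \underbrace{P_0}_{\text{flat-spacetime price}} \;+\; \alpha \int_{r_\oplus}^{r_s} R(r)\,\mathrm{d}r
```

When $R(r) \to 0$ the integral vanishes and $P \to P_0$, recovering the traditional time-independent model as the degenerate case the abstract describes.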

Keywords: space tourism, spacetime pricing model, general relativity theory, spacetime curvature

Procedia PDF Downloads 87
21 Fire Risk Information Harmonization for Transboundary Fire Events between Portugal and Spain

Authors: Domingos Viegas, Miguel Almeida, Carmen Rocha, Ilda Novo, Yolanda Luna

Abstract:

Forest fires along the more than 1200 km of the Spanish-Portuguese border are increasingly frequent, currently reaching around 2000 fire events per year. Some of these events develop into large international wildfires requiring concerted operations based on shared information between the two countries. The fire event of Valencia de Alcantara (2003), causing several fatalities and more than 13000 ha burnt, is a reference example of these international events. Currently, Portugal and Spain have a specific cross-border cooperation protocol on wildfire response for a strip of about 30 km (15 km on each side). The success of this collaboration is recognized by public authorities; however, it is also assumed that this cooperation should include more functionalities, such as the development of a common risk information system for transboundary fire events. Since Portuguese and Spanish authorities use different approaches to determine the fire risk index inputs and different methodologies to assess fire risk, conjoint firefighting operations are sometimes jeopardized because the information is not harmonized and the understanding of the situation by the civil protection agents from the two countries is not unique. Thus, a methodology aiming at the harmonization of fire risk calculation and perception by Portuguese and Spanish civil protection authorities is presented here, together with the final results. The fire risk index used in this work is the Canadian Fire Weather Index (FWI), which is based on meteorological data. The FWI is limited in its application as it does not take into account other important factors with great effect on fire ignition and development. The combination of these factors is very complex since, besides meteorology, it addresses several parameters from different topics, namely sociology, topography, vegetation, and soil cover.
Therefore, the meaning of FWI values differs from region to region, according to the specific characteristics of each region. In this work, a methodology is proposed for FWI calibration based on the number of fire occurrences and on the burnt area in the transboundary regions of Portugal and Spain, in order to assess fire risk based on calibrated FWI values. As previously mentioned, cooperative firefighting operations require a common perception of the information shared. Therefore, a common classification of fire risk for fire events occurring in the transboundary strip is proposed, with the objective of harmonizing this type of information. This work is integrated in the ECHO project SpitFire - Spanish-Portuguese Meteorological Information System for Transboundary Operations in Forest Fires, which aims to develop a web platform for sharing information and decision-support tools to be used in international fire events involving Portugal and Spain.
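The abstract does not spell out the calibration procedure, so the following is only a plausible sketch of regional FWI calibration: class thresholds are set from the FWI values recorded on historical fire days, so that each risk class covers an equal share of past fire activity in that region (the function name and the sample data are ours):

```python
import statistics

def calibrate_fwi_classes(fire_day_fwi, n_classes=5):
    """Derive regional FWI class boundaries from the FWI values observed on
    historical fire days in that region. Returns n_classes - 1 cut points,
    so that each class covers an equal share of past fire activity."""
    return statistics.quantiles(sorted(fire_day_fwi), n=n_classes)

# Hypothetical FWI values on past fire days in one border region.
history = [8, 12, 15, 18, 21, 24, 27, 30, 34, 39, 45, 52]
thresholds = calibrate_fwi_classes(history)
```

Two regions with different fire climatologies would get different cut points for the same nominal risk classes, which is the harmonization problem the project addresses.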

Keywords: data harmonization, FWI, international collaboration, transboundary wildfires

Procedia PDF Downloads 224
20 The Commodification of Internet Culture: Online Memes and Differing Perceptions of Their Commercial Uses

Authors: V. Esteves

Abstract:

As products of participatory culture, internet memes represent a global form of interaction with online culture. These digital objects draw upon a rich historical engagement with remix practices that dates back decades: from the copy and paste practices of Dadaism and punk to the re-appropriation techniques of the Situationist International; memes echo a long established form of cultural creativity that pivots on the art of the remix. Online culture has eagerly embraced the changes that the Web 2.0 afforded in terms of making use of remixing as an accessible form of societal expression, bridging these remix practices of the past into a more widely available and accessible platform. Memes embody the idea of 'intercreativity', allowing global creative collaboration to take place through networked digital media; they reflect the core values of participation and interaction that are present throughout much internet discourse whilst also existing in a historical remix continuum. Memes hold the power of cultural symbolism manipulated by global audiences through which societies make meaning, as these remixed digital objects have an elasticity and low literacy level that allows for a democratic form of cultural engagement and meaning-making by and for users around the world. However, because memes are so elastic, their ability to be re-appropriated by other powers for reasons beyond their original intention has become evident. Recently, corporations have made use of internet memes for advertising purposes, engaging in the circulation and re-appropriation of internet memes in commercial spaces – which has, in turn, complicated this relation between online users and memes' democratic possibilities further. 
By engaging in a widespread online ethnography supplemented by in-depth interviews with meme makers, this research not only tracked different online meme uses across commercial contexts but also made it possible to engage in qualitative discussions with meme makers and users regarding their perception and experience of these varying commercial uses of memes. These fall broadly into two categories: internet memes that are turned into physical merchandise, and the use of memes in advertising to sell other (non-meme-related) products. Whilst there has been considerable acceptance of the former type of commercial meme use, the use of memes in adverts to sell unrelated products has been met with resistance. The changes in reception regarding commercial meme use are dependent on ideas of cultural ownership and perceptions of authorship, ultimately uncovering underlying socio-cultural ideologies that come to the fore within these overlapping contexts. Additionally, this adoption of memes by corporate powers echoes the recuperation process that the Situationist International endured, creating a further link with older remix cultures and their lifecycles.

Keywords: commodification, internet culture, memes, recuperation, remix

Procedia PDF Downloads 98
19 Fast and Non-Invasive Patient-Specific Optimization of Left Ventricle Assist Device Implantation

Authors: Huidan Yu, Anurag Deb, Rou Chen, I-Wen Wang

Abstract:

The use of left ventricle assist devices (LVADs) has been a proven and effective therapy for patients with severe end-stage heart failure. Due to the limited availability of suitable donor hearts, LVADs will probably become the alternative solution for patients with heart failure in the near future. While the LVAD is being continuously improved toward enhanced performance, increased device durability, and reduced size, a better understanding of implantation management becomes critical in order to achieve better long-term blood supply and fewer post-surgical complications such as thrombus generation. Important issues related to LVAD implantation include the location of the outflow grafting (OG), the angle of the OG, the combination of LVAD and native heart pumping, and uniform versus pulsatile flow at the OG. We have hypothesized that the optimal implantation of an LVAD is patient-specific. To test this hypothesis, we employ a novel in-house computational modeling technique, named InVascular, to conduct a systematic evaluation of cardiac output at the aortic arch together with other pertinent hemodynamic quantities for each patient under various implantation scenarios, aiming at an optimal implantation strategy. InVascular is a powerful computational modeling technique that integrates unified mesoscale modeling for both image segmentation and fluid dynamics with cutting-edge GPU parallel computing. It first segments the aortic artery from the patient's CT image, then seamlessly feeds the extracted morphology, together with the velocity wave from an echo ultrasound image of the same patient, into the computational model to quantify 4-D (time + space) velocity and pressure fields. Using one NVIDIA Tesla K40 GPU card, InVascular completes a computation from CT image to 4-D hemodynamics within 30 minutes; it thus has great potential for massive numerical simulation and analysis.
The systematic evaluation for one patient includes three OG anastomosis sites (ascending aorta, descending thoracic aorta, and subclavian artery), three combinations of LVAD and native heart pumping (1:1, 1:2, and 1:3), three angles of OG anastomosis (inclined upward, perpendicular, and inclined downward), and two LVAD inflow conditions (uniform and pulsatile). The optimal LVAD implantation is suggested through a comprehensive analysis of the cardiac output and related hemodynamics from the simulations over the resulting fifty-four scenarios. To confirm the hypothesis, 5 random patient cases will be evaluated.
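The scenario count follows directly from the factorial design described above (3 × 3 × 3 × 2 = 54); a minimal sketch of the evaluation grid (variable names are ours):

```python
from itertools import product

# Factors of the per-patient evaluation grid described in the abstract.
og_sites = ["ascending aorta", "descending thoracic aorta", "subclavian artery"]
pump_ratios = ["1:1", "1:2", "1:3"]
og_angles = ["inclined upward", "perpendicular", "inclined downward"]
inflows = ["uniform", "pulsatile"]

# Full factorial combination: 3 * 3 * 3 * 2 = 54 simulation scenarios.
scenarios = list(product(og_sites, pump_ratios, og_angles, inflows))
```

At roughly 30 minutes of GPU time per scenario, one patient's full grid would take about 27 hours on a single card, which puts the stated 30-minute-per-run performance in context.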

Keywords: graphic processing unit (GPU) parallel computing, left ventricle assist device (LVAD), lumped-parameter model, patient-specific computational hemodynamics

Procedia PDF Downloads 109
18 Development of a Culturally Safe Wellbeing Intervention Tool for and with the Inuit in Quebec

Authors: Liliana Gomez Cardona, Echo Parent-Racine, Joy Outerbridge, Arlene Laliberté, Outi Linnaranta

Abstract:

Background: Suicide rates among Inuit in Nunavik are six to eleven times higher than the Canadian average. Colonization, religious missions, residential schools, as well as economic and political marginalization, are factors that have challenged the well-being and mental health of these populations. In psychiatry, screening for mental illness is often done using questionnaires in which the patient is expected to report how often he/she has certain symptoms. However, the Indigenous view of mental wellbeing may not fit well with this approach. Moreover, biomedical treatments do not always meet the needs of Indigenous peoples because they do not account for the culture and traditional healing methods that persist in many communities. Objectives: To assess whether the symptom questionnaires commonly used in psychiatry are appropriate and culturally safe for the Inuit in Quebec, and to identify the most appropriate tool to assess and promote wellbeing, following the process necessary to improve its cultural sensitivity and safety for the Inuit population. Methods: A qualitative, collaborative, and participatory action research project which respects First Nations and Inuit protocols and the principles of ownership, control, access, and possession (OCAP). Data collection was based on five focus groups with stakeholders working with these populations and members of Indigenous communities. Thematic analysis of the collected data was carried out, and an advisory group led a revision of the content, use, and cultural and conceptual relevance of the instruments. Results: The questionnaires measuring psychiatric symptoms face significant limitations in the local Indigenous context. We present the factors that make these tools not relevant among Inuit.
Although the scale called the Growth and Empowerment Measure (GEM) was originally developed among Indigenous Australians, the Inuit in Quebec found that this tool captures critical aspects of their mental health and wellbeing more respectfully and accurately than questionnaires focused on measuring symptoms. We document the process of cultural adaptation of this tool, which was supported by community members, to create a culturally safe instrument that supports resilience and empowerment. The cultural adaptation of the GEM provides valuable information about the factors affecting wellbeing and contributes to mental health promotion. This process improves mental health services by giving health care providers useful information about the Inuit population and their clients. We believe that integrating this tool into interventions can help create a bridge to improve communication between the Indigenous cultural perspective of the patient and the biomedical view of health care providers. Further work is needed to confirm the clinical utility of this tool in psychological and psychiatric intervention, along with social and community services.

Keywords: cultural adaptation, cultural safety, empowerment, Inuit, mental health, Nunavik, resiliency

Procedia PDF Downloads 82
17 Study of Silent Myocardial Ischemia in Type 2 Diabetic Males: Egyptian Experience

Authors: Ali Kassem, Yhea Kishik, Ali Hassan, Mohamed Abdelwahab

Abstract:

Introduction: Accelerated coronary and peripheral vascular atherosclerosis is one of the most common chronic complications of diabetes mellitus. A notable aspect of coronary artery disease in this condition is its silent nature. Aim of the work: To detect the prevalence of silent myocardial ischemia (SMI) in Upper Egyptian type 2 diabetic males and to define the male diabetic population that should be screened for SMI. Patients and methods: 100 type 2 diabetic male patients with a negative history of angina or anginal-equivalent symptoms and 30 healthy controls were included. A full medical history and thorough clinical examination were obtained for all participants. Fasting and postprandial blood glucose levels, lipid profile, HbA1c, microalbuminuria, and C-reactive protein were measured for all participants. Resting ECG, trans-thoracic echocardiography, treadmill exercise ECG, and myocardial perfusion imaging were performed for all participants, and patients positive for one or more non-invasive tests (NITs) underwent coronary angiography. Results: Twenty-nine patients (29%) were positive for one or more NITs in the patient group, compared to only one case (3.3%) in the controls. After coronary angiography, 20 patients in the patient group were positive for significant coronary artery stenosis, while the single positive control refused the procedure. There were statistically significant differences between the two groups regarding hypertension, dyslipidemia, obesity, and family history of DM and IHD, with higher levels of microalbuminuria, C-reactive protein, and total lipids in the patient group versus controls. According to coronary angiography, patients were subdivided into two subgroups: 20 positive for SMI (positive coronary angiography) and 80 negative for SMI (negative coronary angiography). No statistical difference regarding family history of DM or type of diabetic therapy was found between the two subgroups.
However, smoking, hypertension, obesity, dyslipidemia, and family history of IHD were significantly higher in diabetics positive versus those negative for SMI. 90% of patients in the subgroup positive for SMI had two or more cardiac risk factors, while only two patients (10%) had one cardiac risk factor. Uncontrolled DM was detected more often in patients positive for SMI, and diabetic complications were more prevalent in patients positive for SMI versus those negative for SMI. Most of the patients positive for SMI had had DM for more than 5 years. Resting ECG and resting echo detected only 6 and 11, respectively, of the 20 positive cases in the SMI-positive subgroup, compared to treadmill exercise ECG and myocardial perfusion imaging, which detected 16 and 18 cases, respectively. Conclusion: Type 2 diabetic male patients should be screened for SMI when aged above 50 years, with diabetes duration of more than 5 years, in the presence of two or more cardiac risk factors, and/or when suffering from one or more of the chronic diabetic complications. CRP is an important parameter for selecting type 2 diabetic male patients who should be screened for SMI. Non-invasive cardiac tests are reliable for screening of SMI in these patients in our locality.
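The screening recommendation in the conclusion amounts to a simple decision rule. The sketch below is one reading of the "and/or" wording (the function name and the exact boolean structure are our interpretation, not the authors' specification):

```python
def should_screen_for_smi(age, diabetes_years, n_risk_factors,
                          has_chronic_complication):
    """One reading of the study's conclusion: screen type 2 diabetic males
    aged over 50 with diabetes for more than 5 years who have two or more
    cardiac risk factors and/or chronic diabetic complications."""
    return (age > 50 and diabetes_years > 5
            and (n_risk_factors >= 2 or has_chronic_complication))

# A 58-year-old with 7 years of diabetes and two cardiac risk factors qualifies.
qualifies = should_screen_for_smi(58, 7, 2, False)
```

Encoding the rule explicitly also exposes the ambiguity in the prose: whether age and duration are strict requirements or merely additional risk markers is a judgment the clinician still has to make.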

Keywords: C-reactive protein, Silent myocardial ischemia, Stress tests, type 2 DM

Procedia PDF Downloads 354
16 Screening of Osteoporosis in Aging Populations

Authors: Massimiliano Panella, Sara Bortoluzzi, Sophia Russotto, Daniele Nicolini, Carmela Rinaldi

Abstract:

Osteoporosis affects more than 200 million people worldwide. About 75% of osteoporosis cases are undiagnosed or diagnosed only when a bone fracture occurs. Since osteoporosis-related fractures are significant determinants of the burden of disease and of the health and social costs of aging populations, we believe that the early identification and treatment of high-risk patients should be a priority in current healthcare systems. Screening for osteoporosis by dual-energy x-ray absorptiometry (DEXA) is not cost-effective for the general population. An alternative is pulse-echo ultrasound (PEUS) because of its lower cost. To this end, we developed an early detection program for osteoporosis with PEUS and evaluated its possible impact and sustainability. We conducted a cross-sectional study including 1,050 people in Italy. Subjects with >1 major or >2 minor risk factors for osteoporosis were invited to PEUS bone mass density (BMD) measurement at the proximal tibia. Based on BMD values, subjects were classified as healthy (BMD > 0.783 g/cm²) or pathological, the latter including subjects with suspected osteopenia (0.719 g/cm² < BMD ≤ 0.783 g/cm²) or osteoporosis (BMD ≤ 0.719 g/cm²). The response rate was 60.4% (634/1050). According to risk, a PEUS scan was recommended to 436 people, of whom 300 (mean age 45.2, 81% women) accepted to participate. We identified 240 (80%) healthy and 60 (20%) pathological subjects (47 osteopenic and 13 osteoporotic). We observed a significant association between high-risk people and reduced bone density (p=0.043), with increased risks for female gender, older age, and menopause (p<0.01). The yearly cost of the screening program was 8,242 euros. With current Italian fracture incidence rates in osteoporotic patients, we can reasonably expect that at least 6 fractures will occur in our sample within 20 years. If we consider that the mean cost per fracture in Italy is today 16,785 euros, we can estimate a theoretical cost of 100,710 euros.
According to the literature, we can assume that the early treatment of osteoporosis could avoid 24,170 euros of such costs. If we add the actual yearly cost of the treatments to the cost of our program and compare this final amount of 11,682 euros to the avoidable costs of fractures (24,170 euros), we can measure a possible positive benefit/cost ratio of 2.07. As a major outcome, our study allowed us to identify early 60 people with significant bone loss who were not aware of their condition. This diagnostic anticipation constitutes an important element of value for the project, both for the patients, given the preventable negative outcomes caused by fractures, and for society in general, because of the related avoidable costs. Therefore, based on our findings, we believe that PEUS-based screening could be a cost-effective approach to the early identification of osteoporosis. However, our study has some major limitations: the economic analysis is based on theoretical scenarios, and specific studies are needed for a better estimation of the possible benefits and costs of our program.
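The benefit/cost arithmetic above can be checked in a few lines. All figures are taken from the abstract; the treatment-cost component is the difference implied by the text (11,682 − 8,242 euros), not a number the authors state directly:

```python
# Figures reported in the abstract (euros).
program_cost = 8_242             # yearly cost of the PEUS screening program
total_cost = 11_682              # program cost plus yearly early-treatment cost
avoidable_fracture_costs = 24_170  # fracture costs avoidable by early treatment

# Treatment cost implied by the difference between the two reported totals.
treatment_cost = total_cost - program_cost

# Benefit/cost ratio as computed in the abstract.
benefit_cost_ratio = avoidable_fracture_costs / total_cost
```

The ratio rounds to the reported 2.07, and the implied yearly treatment cost comes out at 3,440 euros.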

Keywords: osteoporosis, prevention, public health, screening

15 Deficient Multisensory Integration with Concomitant Resting-State Connectivity in Adult Attention Deficit/Hyperactivity Disorder (ADHD)

Authors: Marcel Schulze, Behrem Aslan, Silke Lux, Alexandra Philipsen

Abstract:

Objective: Patients with Attention Deficit/Hyperactivity Disorder (ADHD) often report being flooded by sensory impressions. Studies investigating sensory processing show hypersensitivity to sensory inputs across the senses in children and adults with ADHD. The auditory modality is especially affected, showing deficient acoustical inhibition and modulation of signals. While studying unimodal signal processing is relevant and well suited to a controlled laboratory environment, everyday life situations are multimodal. A complex interplay of the senses is necessary to form a unified percept. To achieve this, the unimodal sensory modalities are bound together in a process called multisensory integration (MI). In the current study, we investigate MI in an adult ADHD sample using the McGurk effect, a well-known illusion in which incongruent speech-like phonemes lead, in the case of successful integration, to a newly perceived phoneme via late top-down attentional allocation. In ADHD, neuronal dysregulation at rest, e.g., aberrant within- or between-network functional connectivity, may also account for difficulties in integrating across the senses. Therefore, the current study includes resting-state functional connectivity to investigate a possible relation between deficient network connectivity and the ability to integrate stimuli. Method: Twenty-five ADHD patients (6 females, age: 30.08 (SD: 9.3) years) and twenty-four healthy controls (9 females; age: 26.88 (SD: 6.3) years) were recruited. MI was examined using the McGurk effect, in which, in the case of successful MI, incongruent speech-like phonemes between the visual and auditory modalities lead to the perception of a new phoneme. The Mann-Whitney U test was applied to assess statistical differences between groups. Echo-planar resting-state functional MRI was acquired on a 3.0 Tesla Siemens Magnetom MR scanner. A seed-to-voxel analysis was realized using the CONN toolbox.
Results: Susceptibility to the McGurk effect was significantly lower for ADHD patients (ADHD Mdn: 5.83%, Controls Mdn: 44.2%, U=160.5, p=0.022, r=-0.34). When ADHD patients did integrate phonemes, reaction times were significantly longer (ADHD Mdn: 1260 ms, Controls Mdn: 582 ms, U=41.0, p<.001, r=-0.56). In functional connectivity, the middle temporal gyrus (seed) was negatively associated with the primary auditory cortex, inferior frontal gyrus, precentral gyrus, and fusiform gyrus. Conclusion: MI seems to be deficient in ADHD patients for stimuli that need top-down attentional allocation. This finding is supported by stronger functional connectivity from unimodal sensory areas to polymodal MI convergence zones for complex stimuli in ADHD patients.
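The group comparisons above rely on the Mann-Whitney U statistic, which can be sketched directly from its definition as a count of cross-group pairs. The sample values below are illustrative, not the study data:

```python
def mann_whitney_u(a, b):
    """U statistic for group a vs. group b: the number of pairs (x, y)
    with x from a, y from b, and x > y, plus half-credit for ties."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical McGurk susceptibility values (%) for two small groups.
adhd = [5.8, 0.0, 12.0, 3.3]
controls = [44.2, 20.0, 60.0, 35.0]
u_adhd = mann_whitney_u(adhd, controls)
print(u_adhd)  # 0.0 -> every ADHD value falls below every control value
```

A U of 0 (or of n1*n2) indicates complete separation of the groups; in practice a library routine would also supply the p-value via the normal approximation or exact tables.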

Keywords: attention-deficit hyperactivity disorder, audiovisual integration, McGurk-effect, resting-state functional connectivity

14 Evaluation of Trabectedin Safety and Effectiveness at a Tertiary Cancer Center at Qatar: A Retrospective Analysis

Authors: Nabil Omar, Farah Jibril, Oraib Amjad

Abstract:

Purpose: Trabectedin is a potent marine-derived antineoplastic drug that binds to the minor groove of DNA, bending DNA towards the major groove and resulting in a changed conformation that interferes with several DNA transcription factors, repair pathways, and cell proliferation. Trabectedin was approved by the European Medicines Agency (EMA; London, UK) for the treatment of adult patients with advanced-stage soft tissue sarcomas in whom treatment with anthracyclines and ifosfamide has failed, or who are not candidates for these therapies. The recommended dosing regimen is 1.5 mg/m² IV over 24 hours every 3 weeks. The purpose of this study was to comprehensively review available data on the safety and efficacy of trabectedin used as indicated for patients at a tertiary cancer center in Qatar. Methods: A medication administration report generated from the electronic health record identified all patients who received trabectedin between November 1, 2015 and November 1, 2017. This retrospective chart review evaluated the indication for trabectedin use, compliance with the administration protocol and the recommended monitoring parameters, the number of patients who improved on the drug and continued treatment, the number of patients who discontinued treatment due to side effects, and the reported side effects. Progress and discharge notes were used to record side effects experienced during trabectedin therapy. A total of 3 patients were reviewed. Results: 2 of the 3 patients who received trabectedin were receiving it for non-FDA- and non-EMA-approved indications: metastatic rhabdomyosarcoma and stage IV ovarian cancer with poor prognosis. Only one patient received it as indicated, for leiomyosarcoma of the left ureter with metastases to the liver, lungs, and bone. None of the patients continued therapy, due to the development of serious side effects.
One patient stopped the medication after one cycle due to disease progression and transient hepatic toxicity; the second had disease progression and developed a 12% reduction in LVEF after 12 cycles of trabectedin; and the third patient, who died, had disease progression on trabectedin after the 10th cycle, which was administered through a peripheral line, resulting in extravasation and left-arm cellulitis requiring debridement. Regarding monitoring parameters, all three patients had an ECHO and creatine phosphokinase (CPK) measurement at baseline, but these were not monitored during treatment as recommended. Conclusion: Using this medication as indicated and performing the recommended monitoring can benefit patients who receive it. It is important to reinforce intravenous administration via a central intravenous line; re-assessment of left ventricular ejection fraction (LVEF) by echocardiogram or multigated acquisition (MUGA) scan at 2- to 3-month intervals until therapy is discontinued; and CPK and LFT levels prior to each administration of trabectedin.
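For illustration, the recommended 1.5 mg/m² regimen translates into an absolute per-cycle dose via body surface area (BSA). The sketch below uses the Mosteller BSA formula and a hypothetical patient; the abstract does not specify which BSA formula the center uses, so both are assumptions:

```python
import math

def mosteller_bsa(height_cm, weight_kg):
    """Body surface area in m^2 via the Mosteller formula
    (an assumption; other BSA formulas exist)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def trabectedin_dose_mg(height_cm, weight_kg, dose_per_m2=1.5):
    """Per-cycle dose at the recommended 1.5 mg/m^2 regimen,
    infused IV over 24 hours every 3 weeks."""
    return dose_per_m2 * mosteller_bsa(height_cm, weight_kg)

# Hypothetical patient: 170 cm, 70 kg.
print(round(trabectedin_dose_mg(170, 70), 2))  # 2.73 (mg per cycle)
```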

Keywords: trabectedin, drug-use evaluation, safety, effectiveness, adverse drug reaction, monitoring

13 Risks beyond Cyber in IoT Infrastructure and Services

Authors: Mattias Bergstrom

Abstract:

Significance of the Study: This research provides new insights into the risks of digital embedded infrastructure. We analyze each risk and its potential mitigation strategies, especially for AI and autonomous automation. The analysis presented in this paper conveys valuable information for future research aimed at creating more stable, secure, and efficient autonomous systems. To learn and understand the risks, a large IoT system was envisioned, and risks related to hardware, tampering, and cyberattacks were collected, researched, and evaluated to create a comprehensive understanding of the potential risks. Potential solutions were then evaluated on an open-source IoT hardware setup. The following list shows the passive and active risks identified and evaluated in the research. Passive risks: (1) Hardware failures: critical systems relying on high-rate, high-quality data are growing; SCADA systems for infrastructure are good examples. (2) Hardware delivering erroneous data: sensors break, and when they do, they don't always go silent; they can keep going, but the data they deliver is garbage, and if that data is not filtered out, it becomes disruptive noise in the system. (3) Bad hardware injection: erroneous sensor data can be pumped into a system by malicious actors with the intent to create disruptive noise in critical systems. (4) Data gravity: the weight of the data collected affects data mobility. (5) Cost inhibitors: running services that need huge centralized computing is cost-inhibiting; large, complex AI can be extremely expensive to run. Active risks: Denial of service: one of the simplest attacks, in which an attacker overloads the system with bogus requests so that valid requests disappear in the noise.
Malware: malware can be anything from simple viruses to complex botnets created with specific goals, where the creator steals computing power and bandwidth from you to attack someone else. Ransomware: a kind of malware, but so different in its implementation that it is worth its own mention; the goal of these pieces of software is to encrypt your system so that it can only be unlocked with a key that is held for ransom. DNS spoofing: by spoofing DNS calls, valid requests and data dumps can be sent to bad destinations, where the data can be extracted for extortion, or corrupted and re-injected into a running system, creating a data-echo noise loop. After testing multiple potential solutions, we found that the most prominent answer to these risks was to use a peer-to-peer consensus algorithm over a blockchain to validate the data and behavior of the devices (sensors, storage, and computing) in the system. With the devices autonomously policing themselves for deviant behavior, all the risks listed above can be mitigated. In conclusion, an Internet middleware providing these features would be an easy and secure solution for future autonomous IoT deployments, as it provides separation from the open Internet while remaining accessible via blockchain keys.
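A minimal sketch of the peer-policing idea, assuming a simple median-based consensus on sensor readings. The device names, values, and tolerance are illustrative; the paper's actual consensus algorithm runs over a blockchain and is not specified here:

```python
import statistics

def flag_deviant_devices(readings, tolerance):
    """Peers agree on the median of all reported readings, then flag
    any device whose report deviates from that consensus by more than
    the tolerance (a stand-in for 'deviant behavior')."""
    consensus = statistics.median(readings.values())
    return sorted(dev for dev, value in readings.items()
                  if abs(value - consensus) > tolerance)

# Five temperature sensors; one is faulty or has been tampered with.
readings = {"s1": 21.2, "s2": 20.9, "s3": 21.1, "s4": 58.0, "s5": 21.0}
print(flag_deviant_devices(readings, tolerance=2.0))  # ['s4']
```

A median-based consensus tolerates a minority of bad reporters, which is the same robustness property a blockchain-backed majority vote would provide at larger scale.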

Keywords: IoT, security, infrastructure, SCADA, blockchain, AI

12 Comparative Investigation of Two Non-Contact Prototype Designs Based on a Squeeze-Film Levitation Approach

Authors: A. Almurshedi, M. Atherton, C. Mares, T. Stolarski, M. Miyatake

Abstract:

Transportation and handling of delicate and lightweight objects is currently a significant issue in some industries. Two common contactless movement prototype designs, an ultrasonic transducer design and a vibrating plate design, are compared. Both designs are based on the method of squeeze-film levitation, and this study aims to identify the limitations and challenges of each. The designs are evaluated in terms of levitation capabilities and characteristics. To this end, theoretical and experimental explorations are made. It is demonstrated that the ultrasonic transducer prototype design is better suited in terms of levitation capability; however, it presents some operational and mechanical design difficulties. For making accurate industrial products in micro-fabrication and nanotechnology contexts, such as semiconductor silicon wafers, micro-components, and integrated circuits, non-contact, oil-free, ultra-precision, low-wear transport along the production line is crucial. One of the designs (design A), called the ultrasonic chuck, has an ultrasonic transducer (Langevin, FBI 28452 HS) as its main part. The other (design B) is a vibrating plate design, consisting of a plain rectangular aluminium plate firmly fastened at both ends. The size of the rectangular plate is 200 × 100 × 2 mm. In addition, four round piezoelectric actuators, 28 mm in diameter and 0.5 mm thick, are glued to the underside of the plate. The vibrating plate is clamped at both ends in the horizontal plane through a steel supporting structure. The dynamics of levitation using designs A and B have been investigated based on squeeze-film levitation (SFL). The input apparatus used with both designs consists of a sine wave signal generator connected to an amplifier of type ENP-1-1U (Echo Electronics), which magnifies the sine wave voltage produced by the signal generator.
For design A, the measured maximum levitation heights for three semiconductor wafers weighing 52, 70, and 88 g are 240, 205, and 187 µm, respectively, whereas for design B the average separation distance for a 5 g disk reaches 70 µm. Using the squeeze-film levitation methodology, it is possible to hold an object in a non-contact manner. The analysis of the investigation outcomes indicates that design A provides better non-contact levitation than design B; however, design A is more complicated to manufacture. In order to identify an adequate non-contact SFL design, a comparison between these two common designs has been adopted for the current investigation. Specifically, the study involves comparisons in terms of the following issues: floating-component geometries and material-type constraints; the final generated pressure distributions; dangerous interactions with the surrounding space; working-environment constraints; and the complexity and compactness of the mechanical design. Considering all these matters is essential for proficiently distinguishing the better SFL design.

Keywords: ANSYS, floating, piezoelectric, squeeze-film

11 Leveraging Multimodal Neuroimaging Techniques to in vivo Address Compensatory and Disintegration Patterns in Neurodegenerative Disorders: Evidence from Cortico-Cerebellar Connections in Multiple Sclerosis

Authors: Efstratios Karavasilis, Foteini Christidi, Georgios Velonakis, Agapi Plousi, Kalliopi Platoni, Nikolaos Kelekis, Ioannis Evdokimidis, Efstathios Efstathopoulos

Abstract:

Introduction: Advanced structural and functional neuroimaging techniques contribute to the study of anatomical and functional brain connectivity and its role in the pathophysiology and symptom heterogeneity of several neurodegenerative disorders, including multiple sclerosis (MS). Aim: In the present study, we applied multiparametric neuroimaging techniques to investigate structural and functional cortico-cerebellar changes in MS patients. Material: We included 51 MS patients (28 with clinically isolated syndrome [CIS], 31 with relapsing-remitting MS [RRMS]) and 51 age- and gender-matched healthy controls (HC) who underwent MRI in a 3.0T MRI scanner. Methodology: The acquisition protocol included high-resolution 3D T1-weighted, diffusion-weighted imaging and echo-planar imaging sequences for the analysis of volumetric, tractography, and functional resting-state data, respectively. We performed between-group comparisons (CIS, RRMS, HC) using the CAT12 and CONN16 MATLAB toolboxes for the analysis of volumetric (cerebellar gray matter density) and functional (cortico-cerebellar resting-state functional connectivity) data, respectively. The Brainance suite was used for the analysis of tractography data (cortico-cerebellar white matter integrity; fractional anisotropy [FA]; axial and radial diffusivity [AD; RD]) and to reconstruct the cerebellar tracts. Results: Patients with CIS did not show significant gray matter (GM) density differences compared with HC. However, they showed decreased FA and increased diffusivity measures in cortico-cerebellar tracts, and increased cortico-cerebellar functional connectivity. Patients with RRMS showed decreased GM density in cerebellar regions, decreased FA and increased diffusivity measures in cortico-cerebellar WM tracts, as well as a pattern of increased and mostly decreased functional cortico-cerebellar connectivity compared to HC.
The comparison between CIS and RRMS patients revealed significant GM density differences, reduced FA, and increased diffusivity measures in WM cortico-cerebellar tracts, together with increased/decreased functional connectivity. The identification of decreased WM integrity and increased functional cortico-cerebellar connectivity without GM changes in CIS, and the pattern of decreased GM density, decreased WM integrity, and mostly decreased functional connectivity in RRMS patients, emphasizes the role of compensatory mechanisms in early disease stages and the disintegration of structural and functional networks with disease progression. Conclusions: Our study highlights the added value of multimodal neuroimaging techniques for the in vivo investigation of cortico-cerebellar brain changes in neurodegenerative disorders. A natural extension, and a future opportunity to leverage multimodal neuroimaging data, is the integration of such data into recently applied machine-learning approaches to more accurately classify patients and predict their disease course.

Keywords: advanced neuroimaging techniques, cerebellum, MRI, multiple sclerosis

10 A Second Chance to Live and Move: Lumbosacral Spinal Cord Ischemia-Infarction after Cardiac Arrest and the Artery of Adamkiewicz

Authors: Anna Demian, Levi Howard, L. Ng, Leslie Simon, Mark Dragon, A. Desai, Timothy Devlantes, W. David Freeman

Abstract:

Introduction: Out-of-hospital cardiac arrest (OHCA) carries high mortality. For survivors, the most common complication is hypoxic-ischemic brain injury (HIBI). Rarely, lumbosacral spinal cord and/or other spinal cord artery ischemia can occur due to anatomic variation and variable mean arterial pressure after the return of spontaneous circulation. We present the case of an OHCA survivor who later woke up with bilateral leg weakness with preserved sensation (ASIA grade B, L2 level). Methods: We describe the clinical, radiographic, and laboratory presentation, along with a National Library of Medicine (NLM) search methodology characterizing the incidence/prevalence of this entity. A 70-year-old male, a longtime smoker and alcohol user, suddenly collapsed at a bar surrounded by friends. He had complained of chest pain before collapsing, and 911 was called. EMS arrived and found the patient in pulseless electrical activity (PEA); cardiopulmonary resuscitation (CPR) was initiated, the patient was intubated, and a LUCAS device was applied for continuous, high-quality CPR in the field. In the ED, central lines were placed, and thrombolysis was administered for a suspected pulmonary embolism (PE). The code was prolonged, lasting 90 minutes, with eventual return of spontaneous circulation. The patient was placed on epinephrine and norepinephrine drips to maintain blood pressure. An ECHO was performed and showed a “D-shaped” ventricle worrisome for PE, as well as an ejection fraction around 30%. A CT with PE protocol was performed and confirmed bilateral PE. Results: The patient woke up 24 hours later, following commands, and was extubated. He was found to be paraplegic below L2 with preserved sensation, with hypotonia and areflexia consistent with “spinal shock” or anterior spinal cord syndrome. MRI of the thoracic and lumbar spine showed a spinal cord infarction at the level of the conus medullaris.
The patient was given IV steroids upon initial discovery of the cord infarct. An NLM search using “cardiac arrest” and “spinal cord infarction” revealed 57 results, with only 8 review articles. Risk factors include age, atherosclerotic disease, and intraaortic balloon pump placement. Anatomic variation of the artery of Adamkiewicz (AoA), along with existing atherosclerotic factors and low perfusion, are also known risk factors. Conclusion: Acute paraplegia from anterior spinal cord infarction of the AoA territory after cardiac arrest is rare. Larger prospective, multicenter trials are needed to examine potential interventions such as hypothermia, lumbar drains (which are sometimes used in aortic surgery to reduce ischemia), and/or other neuroprotectants.

Keywords: cardiac arrest, spinal cord infarction, artery of Adamkiewicz, paraplegia

9 Iron-Metal-Organic Frameworks: Potential Application as Theranostics for Inhalable Therapy of Tuberculosis

Authors: Gabriela Wyszogrodzka, Przemyslaw Dorozynski, Barbara Gil, Maciej Strzempek, Bartosz Marszalek, Piotr Kulinowski, Wladyslaw Piotr Weglarz, Elzbieta Menaszek

Abstract:

MOFs (metal-organic frameworks) belong to a new group of porous materials with a hybrid organic-inorganic construction. Their structure is a network consisting of metal cations or clusters (acting as metallic centers, or nodes) and organic linkers between the nodes. The interest in MOFs is primarily associated with their well-developed surface area and large porosity. The possibility of building MOFs from biocompatible components allows their use as potential drug carriers. Furthermore, forming the MOF structure from cations possessing paramagnetic properties (e.g., iron cations) allows their use as MRI (magnetic resonance imaging) contrast agents. The concept of particles that combine the ability to carry an active substance with imaging properties is called theranostics (from the combination of the words therapy and diagnostics). By building the MOF structure from iron cations, it is possible to use MOFs as theranostic agents and to monitor the distribution of the active substance in real time after administration. In this study, the iron MOF Fe-MIL-101-NH2 was chosen, consisting of iron clusters at the nodes of the structure and amino-terephthalic acid as the linker. The aim of the study was to investigate the possibility of applying Fe-MIL-101-NH2 as an inhalable theranostic particulate system for the first-line anti-tuberculosis antibiotic isoniazid. The drug content incorporated into Fe-MIL-101-NH2 was evaluated by a dissolution study using a spectrophotometric method. Results showed an isoniazid encapsulation efficiency of ca. 12.5% wt. The possibility of applying Fe-MIL-101-NH2 as an MRI contrast agent was demonstrated by magnetic resonance tomography: Fe-MIL-101-NH2 effectively shortened T1 and T2 relaxation times (increased R1 and R2 relaxation rates) linearly with the concentration of suspended material.
Images obtained using a multi-echo magnetic resonance imaging sequence revealed the possibility of using Fe-MIL-101-NH2 as a positive or negative contrast agent depending on the applied repetition time. Micronization of the MOFs via ultrasound was evaluated by XRD, nitrogen adsorption, FTIR, and SEM imaging, and did not influence crystal shape or size. Ultrasonication broke up the aggregates and yielded very homogeneous-looking SEM images. MOF cytotoxicity was evaluated in an in vitro test with the highly sensitive resazurin-based reagent PrestoBlue™ on the L929 fibroblast cell line. After 24 h, no inhibition of cell proliferation was observed. All results proved the potential applicability of iron MOFs as isoniazid carriers and MRI contrast agents in the inhalatory treatment of tuberculosis. Acknowledgments: The authors gratefully acknowledge the National Science Center, Poland, for providing financial support, grant no. 2014/15/B/ST5/04498.
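The reported linear dependence of relaxation rate on concentration is what allows a relaxivity to be extracted: R1 = R1(water) + r1·C. A minimal least-squares sketch, using made-up (concentration, R1) pairs rather than the study's measurements:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept, used here to extract
    a relaxivity (slope) and solvent rate (intercept) from
    (concentration, relaxation-rate) pairs."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Illustrative data only (NOT the study's measurements).
conc = [0.0, 0.5, 1.0, 2.0]          # contrast agent concentration
r1_rates = [0.40, 0.90, 1.40, 2.40]  # observed R1 in 1/s
r1, r1_solvent = linear_fit(conc, r1_rates)
print(round(r1, 2), round(r1_solvent, 2))  # 1.0 0.4
```

The slope is the relaxivity r1; the intercept recovers the relaxation rate of the pure suspension medium.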

Keywords: imaging agents, metal-organic frameworks, theranostics, tuberculosis

8 Yu Kwang-Chung vs. Yu Kwang-Chung: Untranslatability as the Touchstone of a Poet

Authors: Min-Hua Wu

Abstract:

The untranslatability of an established poet’s tour de force is thoroughly explored by Matthew Arnold (1822-1888). In his On Translating Homer (1861), Arnold lists the four most striking poetic qualities of Homer, namely his rapidity, plainness and directness of style and diction, plainness and directness of ideas, and nobleness. He concludes that such celebrated English translators as Cowper, Pope, Chapman, and Mr. Newman are all doomed, due to their respective failures in rendering the totality of the four Homeric poetic qualities. Why does poetic translation always prove such a mission impossible for the translator? According to Arnold, it is because there constantly exists a mist interposed between the translator’s own literary self-obsession and the objective artistic qualities that reside in the work of the original author. Foregrounding such a seemingly empowering yet actually detrimental poetic mist, he explains why the aforementioned translators fail in their attempts to bring the Homeric charm to the British reader. Drawing on Arnold’s analytical study of Homeric translation, this research brings Yu Kwang-chung the poet vis-à-vis Yu Kwang-chung the translator, with an aim not so much to find any mist similar to that revealed by Arnold between his Chinese poetry and English translation as to probe into a latent and veiled literary and lingual mist interposed between Chinese and English, if not between Chinese and English literatures. The major work studied and analyzed for this study is Yu’s own Chinese poetry and his own English translation collected in The Night Watchman: Yu Kwang-chung 1958-2004. The research argues that the following critical elements that characterize Yu’s poetics are to a certain extent 'transformed,' if not 'lost,' in his English translation: a.
the Chinese pictographic and ideographic unit terms which so unfailingly characterize the poet’s incredible creativity, allowing him to habitually and conveniently coin concrete textual images or word-scapes almost at will; b. the subtle wordplay and punning which appear at a reasonable frequency; c. the parallel contrastive repetitive syntactic structure within a single poetic line; d. the ambiguous and highly associative diction in the adjective and noun categories; e. the literary allusion that harks back to the old times of Chinese literature; f. the alliteration that adds rhythm and smoothness to the lines; g. the rhyming patterns that bring impressive sonority and a lingering echo to the ears of the reader; h. the grandeur-imposing and sublimity-arousing word-scaping which hinges on the employment of verbs; i. the meandering cultural heritage that embraces such elements as Chinese medicine and kung fu; and j. other features of the like. Once we appeal to the Arnoldian tribunal and resort to the strict standards of such a Victorian cultural and literary critic, who insists on seeing 'the object as in itself it really is,' we may serve as a potential judge in the tug of war between Yu Kwang-chung the poet and Yu Kwang-chung the translator, a tug of war that will not merely broaden our understanding of Chinese poetics but deepen our apprehension of Chinese-English translatology.

Keywords: Yu Kwang-chung, The Night Watchman, poetry translation, Chinese-English translation, translation studies, Matthew Arnold
