Search results for: severe acute respiratory distress syndrome
598 Rainwater Management in Smart City: Focus in Gomti Nagar Region, Lucknow, Uttar Pradesh, India
Authors: Priyanka Yadav, Rajkumar Ghosh, Alok Saini
Abstract:
Human civilization cannot exist and thrive in the absence of adequate water. As a result, even in smart cities, water plays an important role in human existence. The key causes of the current catastrophic water scarcity crisis are lifestyle changes, over-exploitation of groundwater, overuse of water, rapid urbanization, and uncontrolled population growth. Furthermore, salty water seeps into deeper aquifers, causing land subsidence. The purpose of this study on artificial groundwater recharge is to address the water shortage in Gomti Nagar, Lucknow. Submersible pumps are the most common means of collecting freshwater from groundwater in the Gomti Nagar neighbourhood of Lucknow. The Gomti Nagar area has a groundwater depletion rate of 1968 m³/day/km² and is categorized as Zone-A (very high levels) on the existing groundwater abstraction scale of A to D. Harvesting rainwater using roof top rainwater harvesting systems (RTRWHs) is an effective method for reducing aquifer depletion in a sustainable water management system. Due to a water imbalance of 24519 ML/yr, the Gomti Nagar region is facing severe groundwater depletion. According to the Lucknow Development Authority (LDA), the impact of installed RTRWHs (plot area 300 sq. m.) is that only 0.04 percent of rainfall is collected through RTRWHs in the Gomti Nagar region of Lucknow. When RTRWHs are deployed in all buildings, their influence will be greater. Bye-laws in India have mandated the installation of RTRWHs on plots greater than 300 sq. m. A better India without any water problem is a dream that may be realized by installing residential and commercial rooftop rainwater collecting systems in every structure. According to the current study, RTRWHs should be used as an alternate source of water to bridge the gap between groundwater recharge and extraction in smart cities such as Gomti Nagar, Lucknow, India.
Keywords: groundwater recharge, RTRWHs, harvested rainwater, rainfall, water extraction
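As a rough illustration of the rooftop harvesting potential discussed above, the following is a minimal back-of-envelope sketch using the standard volume = area x rainfall x runoff-coefficient estimate; the rainfall figure and runoff coefficient are illustrative assumptions, not values reported by the authors.

```python
# A minimal back-of-envelope sketch (not from the study): first-order estimate of
# annual rooftop rainwater harvesting potential, V = A * R * C.
# All inputs below are illustrative assumptions, not values reported by the authors.

def rooftop_harvest_litres(roof_area_m2: float,
                           annual_rainfall_mm: float,
                           runoff_coefficient: float = 0.8) -> float:
    """Annual harvestable volume in litres: area (m^2) x rainfall (mm) x runoff coefficient."""
    # 1 mm of rain falling on 1 m^2 equals 1 litre of water.
    return roof_area_m2 * annual_rainfall_mm * runoff_coefficient

if __name__ == "__main__":
    # Assumed inputs: a 300 m^2 plot (the bye-law threshold cited in the abstract),
    # with a hypothetical 1000 mm/yr rainfall and a 0.8 runoff coefficient.
    volume_l = rooftop_harvest_litres(roof_area_m2=300, annual_rainfall_mm=1000)
    print(f"Estimated harvest: {volume_l:,.0f} litres/year (~{volume_l / 1e6:.2f} ML/yr)")
```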
Procedia PDF Downloads 106
597 Ophthalmic Self-Medication Practices and Associated Factors among Adult Ophthalmic Patients
Authors: Sarah Saad Alamer, Shujon Mohammed Alazzam, Amjad Khater Alanazi, Mohamed Ahmed Sankari, Jana Sameer Sendy, Saleh Al-Khaldi, Khaled Allam, Amani Badawi
Abstract:
Background: Self-medication is defined as the selection of medicines by individuals to treat self-diagnosed conditions. There are many concerns about the safety of long-term use of nonprescription ophthalmic drugs, which may lead to a variety of serious ocular complications. Topical steroids can produce severe eye-threatening complications, including elevation of intraocular pressure (IOP) with possible development of glaucoma and, infrequently, optic nerve damage. In recent times, many OTC ophthalmic preparations have become available without a prescription. Objective: In our study, we aimed to determine the prevalence of self-medication with ocular topical steroids and associated factors among adult ophthalmic patients attending King Saud Medical City. Methods: This study was conducted as a cross-sectional study targeting participants aged 18 years or above who had used topical steroid eye drops, to determine the prevalence of self-medication with ocular topical steroids and associated factors among adult patients attending the ophthalmology clinic in King Saud Medical City (KSMC) in the central region. Results: Of a total of 308 responses, 92 (29.8%) were using ocular topical steroids: 58 (18.8%) with a prescription, 5 (1.6%) without a prescription, and 29 (9.4%) both with and without a prescription, while 216 (70.1%) did not use them. Among participants using ocular topical steroids without a prescription, 11 (12%) did so once and 33 (35%) many times. 26 (28.3%) reported complications, most commonly eye infection (11, 12.4%), glaucoma (8, 9%), and cataracts (6, 6.7%). Reasons for self-medication with ocular topical steroids among participants were repeated symptoms (14, 15.2%), advice heard from a friend (11, 15.2%), and the belief that they had enough knowledge (11, 15.2%). Conclusion: Our study reveals that, even though a high level of knowledge and acceptable practices and attitudes were detected among participants, self-medication with steroid eye drops was still observed. This practice is mainly due to participants having repeated symptoms and thinking they have enough knowledge. Increasing patients' education on self-medication with steroid eye drops and its associated complications would help reduce the incidence of this practice.
Keywords: self-medication, ophthalmic medicine, steroid eye drop, over the counter
Procedia PDF Downloads 89
596 Effects of Dietary Polyunsaturated Fatty Acids and Beta Glucan on Maturity, Immunity and Fry Quality of Pabdah Catfish, Ompok pabda
Authors: Zakir Hossain, Md. Saddam Hossain
Abstract:
A nutritionally balanced diet and selection of appropriate species are important criteria in aquaculture. The present study was conducted to evaluate the effects of a polyunsaturated fatty acid (PUFA) and beta glucan containing diet on growth performance, feed utilization, maturation, immunity, and early embryonic and larval development of the endangered Pabdah catfish, Ompok pabda. In this study, squid-extracted lipids and mushroom powder were used as the sources of PUFAs and beta glucan, respectively, and two isonitrogenous diets were formulated, a basal or control (CON) diet and a treated (PBG) diet, each maintaining a 30% protein level. During the study period, the physicochemical conditions of the water were similar in each cistern, with temperature, pH, and dissolved oxygen (DO) of 26.5±2 °C, 7.4±0.2, and 6.7±0.5 ppm, respectively. The results showed that final mean body weight, final mean length gain, food conversion ratio (FCR), specific growth rate (SGR), food conversion efficiency (%), hepatosomatic index (HSI), kidney index (KI), and viscerosomatic index (VSI) were significantly (P<0.01 and P<0.05) higher in fish fed the PBG diet than in fish fed the CON diet. The length-weight relationship and relative condition factor (K) of O. pabda were significantly (P<0.05) affected by the PBG diet. The gonadosomatic index (GSI), sperm viability, blood serum calcium ion concentration (Ca²⁺), and vitellogenin level, which were used as indicators of fish maturation, were significantly (P<0.05) higher in fish fed the PBG diet than in fish fed the CON diet. During the spawning season, lipid granules and normal morphological structure were observed in the liver of the treated fish, whereas fewer lipid granules were observed in the liver of the control group. Immunity and stress resistance-related parameters such as hematological indices, antioxidant activity, lysozyme level, respiratory burst activity, blood reactive oxygen species (ROS), complement activity (ACH50 assay), specific IgM, brain AChE, and plasma PGOT and PGPT enzyme activity were significantly (P<0.01 and P<0.05) higher in fish fed the PBG diet than in fish fed the CON diet. The fecundity, fertilization rate (92.23±2.69%), hatching rate (87.43±2.17%) and survival (76.62±0.82%) of offspring were significantly higher (P<0.05) in the PBG diet than in the control. Consequently, early embryonic and larval development was better in the PBG-treated group than in the control. Therefore, the present study showed that the polyunsaturated fatty acid (PUFA) and beta glucan enriched experimental diet was more effective, achieving better growth, feed utilization, maturation, immunity, and spawning performance in O. pabda.
Keywords: polyunsaturated fatty acids, beta glucan, maturity, immunity, catfish
Procedia PDF Downloads 2
595 Delusional Parasitosis (A Rare Primary Psychiatric Diagnosis)
Authors: Jaspinder Kaur, Jatinder Pal Singh
Abstract:
Introduction- Delusional parasitosis is a rare psychotic illness characterized by a fixed belief of being infested with parasites when, in reality, there are none. It is also known as Ekbom syndrome, delusional infestation, or acarophobia. Although the patient has no primary skin pathology, all skin findings are secondary to skin manipulation by the patient, which is why up to 90% of patients first seek consultation from a dermatologist. It is most commonly seen in older people, with a female to male ratio of 2:1. For treatment, the patient first needs to be investigated to rule out all other possible causes, as delusional parasitosis can be caused by vitamin B12 deficiency, pellagra, hepatic and renal disease, diabetes mellitus, multiple sclerosis, and leprosy. When all possible causes have been ruled out, a psychiatric referral should be made. Other psychiatric comorbidities should be ruled out and treated accordingly. Patients with delusional parasitosis respond well to second generation antipsychotics and need continuous medication over years; relapse is likely if treatment is stopped. Case Presentation- A 79-year-old female, belonging to lower socio-economic status, presented with complaints of an itching sensation with erythematous patches over the scalp and multiple scratch excoriation lesions over the scalp, face and neck for the past 7-8 months. She had a feeling of small insects crawling under her skin and scalp area. To reduce the itching and kill the insects, she would scratch and squeeze her skin repeatedly. When the family tried to explain that there were no insects in her body, she would not be convinced; rather, she got angry and abused family members for not believing her. Gradually, her sleep became disturbed; she would be seen awake at night, scratching her skin, pulling her scalp hair, and even squeezing her healed lesions. She collected her skin debris and scalp hairs to look for insects. Because of her continuous illness, the patient started to remain sad and had crying spells. Her appetite decreased. She became socially isolated and stopped doing her activities of daily living. The family members first consulted a dermatologist, who investigated thoroughly with routine investigations and an autoimmune and malignancy workup. As all investigations were normal, the patient was referred for psychiatric evaluation. The patient was started on Tablet Olanzapine 2.5 mg, gradually increased to 7.5 mg. Over 1 month, there was a reduction in itching and skin picking. The lesions gradually healed, and the patient continued to take other dermatological medications and ointments and has been in regular follow up with psychiatric liaison for the past 2 months, with 70-80% improvement in her symptoms. Conclusion- Delusional parasitosis is a psychiatric disorder of insidious onset, seen commonly in middle-aged and old people. Both psychiatric and dermatology consultation liaison will help the patient towards an early diagnosis and adequate treatment. As a primary psychiatric diagnosis, the patient responds well to second generation antipsychotics but always requires further evaluation and treatment management if the condition is secondary to a physical or other psychiatric comorbidity.
Keywords: delusional parasitosis, delusional infestations, rare, primary psychiatric diagnosis, antipsychotic agents
Procedia PDF Downloads 82
594 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals
Authors: Christine F. Boos, Fernando M. Azevedo
Abstract:
Electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma and brain death; locating damaged areas of the brain after head injury, stroke and tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science and whose diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of the epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long term EEG recordings at least 24 hours long, acquired with a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that the EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time consuming, complex and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for the pattern classification. One of the differences between all of these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced to the network. Five types of input stimuli have been commonly found in the literature: raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks that were implemented using each of these inputs. The performance using the raw signal varied between 43 and 84% efficiency. The results of the FFT spectrum and STFT spectrograms were quite similar, with average efficiencies of 73 and 77%, respectively. The efficiency of Wavelet Transform features varied between 57 and 81%, while the descriptors presented efficiency values between 62 and 93%. After the simulations, we could observe that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing
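The five input representations compared above can be illustrated with a short sketch. The code below is a hypothetical example (not the authors' implementation), assuming a 1-D EEG segment sampled at 256 Hz and using NumPy, SciPy and PyWavelets; the segment itself is a random placeholder.

```python
# Minimal sketch of the five input types compared in the abstract (hypothetical, not the
# authors' code): raw signal, morphological descriptors, FFT spectrum, STFT spectrogram
# and wavelet features, computed for a single EEG segment.
import numpy as np
from scipy.signal import stft
import pywt

fs = 256                                   # assumed sampling rate (Hz)
segment = np.random.randn(2 * fs)          # placeholder 2-second EEG segment

raw_input = segment                                         # 1) raw EEG signal
descriptors = np.array([segment.max() - segment.min(),      # 2) simple morphological
                        np.abs(np.diff(segment)).mean(),    #    descriptors (amplitude,
                        segment.std()])                     #    mean slope, spread)
fft_spectrum = np.abs(np.fft.rfft(segment))                 # 3) FFT magnitude spectrum
_, _, Zxx = stft(segment, fs=fs, nperseg=64)                # 4) STFT spectrogram
stft_spectrogram = np.abs(Zxx)
wavelet_features = np.concatenate(                          # 5) wavelet decomposition
    pywt.wavedec(segment, 'db4', level=4))                  #    coefficients

for name, feat in [("raw", raw_input), ("descriptors", descriptors),
                   ("fft", fft_spectrum), ("stft", stft_spectrogram.ravel()),
                   ("wavelet", wavelet_features)]:
    print(name, feat.shape)                # each array could feed a neural network input layer
```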
Procedia PDF Downloads 528
593 Importance of Detecting Malingering Patients in Clinical Setting
Authors: Sakshi Chopra, Harsimarpreet Kaur, Ashima Nehra
Abstract:
Objectives: Malingering is fabricating or exaggerating the symptoms of mental or physical disorders for a variety of secondary gains or motives, which may include financial compensation, avoiding work, getting lighter criminal sentences, or simply attracting attention or sympathy. Malingering is different from somatization disorder and factitious disorder. The prevalence of malingering is unknown and difficult to determine. One study estimated that in the forensic population it can reach up to 17% of cases, but the accuracy of such estimates is questionable, as successful malingerers are not detected and thus not included. Methods: This is the case study of a 58-year-old, right-handed graduate, pre-morbidly working in a national company, with a reported history of stroke leading to head injury, cerebral infarction/facial palsy and dementia. He was referred for disability certification so that his job position could be transferred to his son, as he could no longer work. A series of neuropsychological tests were administered. Results: The patient presented with a mental age of < 2.5 years; overall social adaptive functioning was < 20, indicating profound mental retardation, with a social age of less than 1 year in the abilities of self-help, eating, dressing, locomotion, occupation, communication, self-direction, and socialization; severely impaired verbal and performance ability; 96% impairment in Activities of Daily Living; and an indication of very severe depression. With inconsistent and fluctuating medical findings and problem descriptions given to the different health professionals forming the board for his disability, it was concluded that this patient was malingering. Conclusions: Even though it can be easily defined, malingering can be very challenging to diagnose. Cases of malingering impose a substantial economic burden on the health care system, and false attribution of malingering imposes a substantial burden of suffering on a significant proportion of the patient population. Timely, tactful diagnosis and management can help ease this patient burden on the healthcare system. Malingering can be detected only by trained mental health professionals in the clinical setting.
Keywords: disability, India, malingering, neuropsychological assessment
Procedia PDF Downloads 419
592 A Review of Evidence on the Use of Digital Healthcare Interventions to Provide Follow-Up Care for Coeliac Disease Patients
Abstract:
Background: Coeliac disease affects around 1 in 100 people. Untreated, it can result in serious morbidity such as malabsorption and cancers. The only treatment is to adhere to a gluten free diet (GFD). International guidelines recommend that people with coeliac disease receive follow-up healthcare annually to detect complications early and support their adherence to a GFD. However, there is a finite amount of healthcare in the UK, and as such, not all patients receive follow-up care as recommended by the guidelines. Furthermore, an increasing number of patients are being diagnosed with coeliac disease. Given the potentially severe morbidity that non-adherence to a GFD could result in, alongside reports that the rate of non-adherence to a GFD could be as high as 91%, it is imperative that action is taken. One potential solution would be to provide follow-up care digitally through utilising technology. This abstract reports on a rapid review undertaken to explore the existing evidence in this area. Methods: In June 2020, 11 bibliographic databases were searched to find any pertinent studies. The inclusion criteria required the study to be written in the English language and report on the use of digital healthcare interventions for people with coeliac disease. Results: A small amount of evidence (n=8) was found which met our inclusion criteria and pertained to the provision of CD follow-up digitally. These studies focussed either on educating and supporting patients to adhere to a GFD or on providing consultation remotely with a focus on detecting complications early. These studies showed that there is potential for digital healthcare interventions to positively impact people with coeliac disease. However, it is suggested that the effectiveness of these interventions may depend on local circumstances, individual knowledge of CD and general attitudes. Conclusion: The above studies suggest that providing follow-up care digitally may offer a potential solution; however, the evidence about how this should be done and in what circumstances it will work for individuals is scarce. In the light of the COVID-19 pandemic, the introduction of digital healthcare interventions appears to be highly topical, and as such, this review may benefit from being refreshed in the future.
Keywords: coeliac disease, follow-up, gluten free diet, digital healthcare interventions
Procedia PDF Downloads 174
591 Poster for Sickle Cell Disease and Barriers to Care in South Yorkshire from 2017 to 2023
Authors: Amardass Dhami, Clare Samuelson
Abstract:
Background: Sickle cell disease (SCD) is a complex, multisystem condition that significantly impacts patients' quality of life, characterized by acute illness episodes, progressive organ damage, and reduced life expectancy. In the UK, over 13,000 individuals are affected, with South Yorkshire having the fifth highest prevalence, including approximately 800 patients. Retinal complications in SCD can manifest as either proliferative or non-proliferative disease, with proliferative changes being more prevalent. These retinal issues can cause significant morbidity, including visual loss and increased care requirements, underscoring the need for regular monitoring. An integrated approach was applied to ensure timely interventions, ultimately enhancing patient outcomes and reducing 'did not attend' (DNA) rates. Aim: To assess the factors which may influence attendance at haematology and ophthalmology clinics, with attention to levels of deprivation and their relation to non-attendance. Method: A retrospective study of 84 eligible patients from the regional tertiary Centre for Sickle Cell Care (Sheffield Teaching Hospital) from 2017 to 2023. The study focused on the incidence of sickle cell eye disease, specifically examining the outcomes of patients who attended the combined haematology and ophthalmology clinics. Patients who did not attend either clinic were excluded from the analysis to ensure a clear understanding of the combined clinic's impact. These data were then compared with the United Kingdom's Index of Multiple Deprivation (IMD) datasets to assess whether inequalities of care affected this population. Results: The study found that the effectiveness of combining haematology and ophthalmology clinics was reduced following the intervention. The DNA rate increased to 40% for the haematology clinic. Additionally, a significant proportion of the cohort was classified as residing in areas of deprivation, suggesting a possible link between socioeconomic factors and non-attendance rates. Conclusion: These findings underscore the challenges of integrating care for SCD patients, particularly in relation to socioeconomic barriers. Despite the intent to streamline care and improve patient outcomes, the increase in DNA rates points to the need for further investigation into the underlying causes of non-attendance. Addressing these issues, especially in deprived areas, could enhance the effectiveness of combined clinics and ensure that patients receive the necessary monitoring and interventions for their eye health and overall well-being. Future strategies may need to focus on improving accessibility, outreach, and support for patients to mitigate the impact of socioeconomic factors on healthcare attendance.
Keywords: South Yorkshire, sickle cell anemia, deprivation, factors, haematology
Procedia PDF Downloads 13
590 The Effect of Kangaroo Mother Care and Swaddling Method on Venipuncture Pain in Premature Infant: Randomized Clinical Trials
Authors: Faezeh Jahanpour, Shahin Dezhdar, Saeedeh Firouz Bakht, Afshin Ostovar
Abstract:
Objective: Hospitalized premature babies often undergo various painful procedures such as venous sampling. The kangaroo mother care (KMC) method is one of the pain reduction methods, but as the mother's presence is not always possible, this research was done to compare the effect of the swaddling and KMC methods on venous sampling pain in premature neonates. Methods: In this randomized clinical trial, 90 premature infants were selected and randomly allocated into three groups: Group A (swaddling), Group B (kangaroo care), and Group C (control). From 10 minutes before blood sampling until 2 minutes after it, the infants in Group A were wrapped in a thin sheet, and the infants in Group B received kangaroo care. In all three groups, heart rate and arterial oxygen saturation were measured and recorded 30 seconds before, during, and 30, 60, 90, and 120 seconds after sampling. Each infant's face was video recorded from the start of sampling until 2 minutes afterwards; the videos were checked by a researcher who was unaware of the kind of intervention, and the pain assessment tool for infants (PIPP) was completed for 30-second intervals. Data were analyzed by t-test, Chi-square, repeated measures ANOVA, Kruskal-Wallis, and post-hoc Bonferroni tests. Results: Findings revealed that pain was reduced to a great extent in the swaddling and kangaroo care groups compared with the control group, but there was not a significant difference between the kangaroo and swaddling care methods (P ≥ 0.05). In addition, the findings showed that heart rate and arterial oxygen saturation were low and stable in the swaddling and kangaroo care groups and returned to baseline faster, whereas the changes were severe in the control group and did not return to baseline even after 120 seconds. Discussion: The results of this study showed that there was not a meaningful difference between the swaddling and kangaroo care methods in physiological indices and pain in infants. Therefore, the swaddling method can be a good substitute for the kangaroo care method in this regard.
Keywords: kangaroo mother care, neonate, pain, premature, swaddling, venipuncture
Procedia PDF Downloads 215
589 Staphylococcus Aureus Septic Arthritis and Necrotizing Fasciitis in a Patient With Undiagnosed Diabetes Mellitus.
Authors: Pedro Batista, André Vinha, Filipe Castelo, Bárbara Costa, Ricardo Sousa, Raquel Ricardo, André Pinto
Abstract:
Background: Septic arthritis is a diagnosis that must be considered in any patient presenting with acute joint swelling and fever. Among the several risk factors for septic arthritis, such as age, rheumatoid arthritis, recent surgery, or skin infection, diabetes mellitus can sometimes be the main risk factor. Staphylococcus aureus is the most common pathogen isolated in septic arthritis; however, it is uncommon in monomicrobial necrotizing fasciitis. Objectives: A case report of concomitant septic arthritis and necrotizing fasciitis in a patient with previously undiagnosed diabetes, based on the clinical history. Study Design & Methods: We report the case of a 58-year-old Portuguese, previously healthy man who presented to the emergency department with fever and left knee swelling and pain for two days. The blood work revealed ketonemia of 6.7 mmol/L and glycemia of 496 mg/dL. The vital signs were significant for a temperature of 38.5 °C and a heart rate of 123 bpm. The left knee had edema and inflammatory signs. Computed tomography of the left knee showed diffuse edema of the subcutaneous cellular tissue and soft tissue air bubbles. A diagnosis of septic arthritis and necrotizing fasciitis was made. He was taken to the operating room for surgical debridement. The samples collected intraoperatively were sent for microbiological analysis, revealing infection by multi-sensitive Staphylococcus aureus. Given this result, empiric flucloxacillin (500 mg IV) and clindamycin (1000 mg IV) were maintained for 3 weeks. On the seventh day of hospitalization, there was a significant improvement in the subcutaneous and musculoskeletal tissues. After two weeks of hospitalization, there was no purulent content, and partial closure of the wounds was possible. After 3 weeks, he was switched to oral antibiotics (flucloxacillin 500 mg). A week later, a urinary infection by Pseudomonas aeruginosa was diagnosed, and ciprofloxacin 500 mg was administered for 7 days without complications. After 30 days of hospital admission, the patient was discharged home and recovered. Results: The final diagnosis of concomitant septic arthritis and necrotizing fasciitis was made based on the imaging findings, surgical exploration and microbiological test results. Conclusions: Early antibiotic administration and surgical debridement are key in the management of septic arthritis and necrotizing fasciitis. Furthermore, risk factor control (euglycemic blood glucose levels) must always be taken into account, given its crucial role in the patient's recovery.
Keywords: septic arthritis, necrotizing fasciitis, diabetes, Staphylococcus aureus
Procedia PDF Downloads 315
588 Chemical Analysis of Particulate Matter (PM₂.₅) and Volatile Organic Compound Contaminants
Authors: S. Ebadzadsahraei, H. Kazemian
Abstract:
The main objective of this research was to measure particulate matter (PM₂.₅) and volatile organic compounds (VOCs), as two classes of air pollutants, in a Prince George (PG) neighborhood in the warm and cold seasons. To fulfill this objective, analytical protocols were developed for accurate sampling and measurement of the targeted air pollutants. PM₂.₅ samples were analyzed for their chemical composition (i.e., toxic trace elements) in order to assess their potential sources of emission. The City of Prince George, widely known as the capital of northern British Columbia (BC), Canada, has been dealing with air pollution challenges for a long time. The city has several local industries, including pulp mills, a refinery, and a couple of asphalt plants, that are the primary contributors of industrial VOCs. This research project, which is the first study of this kind in the region, measures the physical and chemical properties of particulate air pollutants (PM₂.₅) in the city neighborhood. Furthermore, this study quantifies the percentage of VOCs in the city air samples. One of the outcomes of this project is updated data on the PM₂.₅ and VOC inventory in the selected neighborhoods. For examining the PM₂.₅ chemical composition, an elemental analysis methodology was developed to measure major trace elements, including but not limited to mercury and lead. The toxicity of inhaled particulates depends on both their physical and chemical properties; thus, an understanding of aerosol properties is essential for the evaluation of such hazards and the treatment of respiratory and other related diseases. Mixed cellulose ester (MCE) filters were selected as suitable filters for PM₂.₅ air sampling in this research. Chemical analyses were conducted using Inductively Coupled Plasma Mass Spectrometry (ICP-MS) for elemental analysis. VOC measurement of the air samples was performed using a Gas Chromatography-Flame Ionization Detector (GC-FID) and Gas Chromatography-Mass Spectrometry (GC-MS), allowing for quantitative measurement of VOC molecules at sub-ppb levels. In this study, a sorbent tube (Anasorb CSC, coconut charcoal; 6 x 70 mm, 2 sections, 50/100 mg sorbent, 20/40 mesh) was used for VOC air sampling, followed by solvent extraction and solid-phase microextraction (SPME) techniques to prepare samples for measurement by a GC-MS/FID instrument. Air sampling for both PM₂.₅ and VOCs was conducted in the summer and winter seasons for comparison. Average PM₂.₅ concentrations differed greatly between wildfire and daily samples: the average concentration during the wildfire period was 83.0 μg/m³, compared with 23.7 μg/m³ for daily samples. Also, higher concentrations of iron, nickel and manganese were found in all samples, and mercury was found in some samples; sustained exposure at such elevated levels can have negative health effects.
Keywords: air pollutants, chemical analysis, particulate matter (PM₂.₅), volatile organic compounds, VOCs
Procedia PDF Downloads 142
587 Effect of Fermented Orange Juice Intake on Urinary 6‑Sulfatoxymelatonin in Healthy Volunteers
Authors: I. Cerrillo, A. Carrillo-Vico, M. A. Ortega, B. Escudero-López, N. Álvarez-Sánchez, F. Martín, M. S. Fernández-Pachón
Abstract:
Melatonin is a bioactive compound involved in multiple biological activities such as glucose tolerance, circadian rhythm regulation, antioxidant defense and immune system action. In elderly subjects, the intake of foods and drinks rich in melatonin is very important because its endogenous level decreases with age. Alcoholic fermentation is a process carried out on fruits, vegetables and legumes to obtain new products with an improved bioactive compound profile relative to the original substrates. The alcoholic fermentation process carried out by Saccharomycetaceae var. Pichia kluyveri induces an important synthesis of melatonin in orange juice. A novel beverage derived from fermented orange juice could therefore be a promising source of this bioactive compound. The aim of the present study was to determine whether the acute intake of fermented orange juice increases the levels of urinary 6-sulfatoxymelatonin in healthy humans. Nine healthy volunteers (7 women and 2 men), aged between 20 and 25 years and with a BMI of 21.1 ± 2.4 kg/m², were recruited. On the study day, participants ingested 500 mL of fermented orange juice. The first urine collection was made before fermented orange juice consumption (basal). The remaining urine collections were made in the following time intervals after fermented orange juice consumption: 0-2, 2-5, 5-10, 10-15 and 15-24 hours. During the experimental period, only the consumption of water was allowed. At lunch time, a meal was provided (60 g of white bread, two slices of ham, a slice of cheese, 125 g of sweetened natural yoghurt and water). The subjects repeated the protocol with orange juice following a 2-week washout period between both types of beverage. The levels of 6-sulfatoxymelatonin (6-SMT) were measured in urine collected at the different time points using the Melatonin-Sulfate Urine ELISA (IBL International GmbH, Hamburg, Germany). Levels of 6-SMT were corrected to those of creatinine for each sample. A significant (p < 0.05) increase in urinary 6-SMT levels was observed between 2-5 hours after fermented orange juice ingestion with respect to basal values (an increase of 67.8%). The consumption of orange juice did not induce any significant change in urinary 6-SMT levels. In addition, urinary 6-SMT levels obtained between 2-5 hours after fermented orange juice ingestion (115.6 ng/mg) were significantly different (p < 0.05) from those after orange juice (42.4 ng/mg). The enhancement of urinary 6-SMT after the ingestion of 500 mL of fermented orange juice in healthy humans, compared to orange juice, could be an important advantage of this novel product as an excellent source of melatonin. Fermented orange juice could be a new functional food, and its consumption could exert a potentially positive effect on health, both in the maintenance of health status and in the prevention of chronic diseases.
Keywords: fermented orange juice, functional beverage, healthy human, melatonin
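The creatinine correction mentioned above (expressing urinary 6-SMT per mg of creatinine to adjust for urine dilution) can be sketched as follows; this is a hypothetical illustration, not the study's analysis code, and the values shown are placeholders.

```python
# Minimal sketch (hypothetical, not the study's analysis code) of the creatinine
# correction described in the abstract: urinary 6-SMT is expressed per mg of
# creatinine to adjust for urine dilution.
def creatinine_corrected_6smt(smt_ng_per_ml: float, creatinine_mg_per_ml: float) -> float:
    """Return 6-SMT in ng per mg creatinine for one urine sample."""
    return smt_ng_per_ml / creatinine_mg_per_ml

# Example with placeholder values (not data from the study):
print(creatinine_corrected_6smt(smt_ng_per_ml=90.0, creatinine_mg_per_ml=0.8))
```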
Procedia PDF Downloads 405
586 Pattern of External Injuries Sustained during Bomb Blast Attacks in Karachi, Pakistan from 2000 to 2007
Authors: Arif Anwar Surani, Salman Ali, Asif Surani, Sohaib Zahid, Akbar Shoukat Ali, Zeeshan-Ul-Hassan Usmani, Joseph Varon, Salim Surani
Abstract:
Objective: Terrorism and suicide bomb blast attacks are commonplace in Karachi, Pakistan. During the years 2000 to 2007, there were over 60 bomb explosions resulting in more than 1500 casualties. These explosions produce a wide variety of external injuries. We undertook this study to evaluate the pattern of external injuries produced after bomb blast attacks and to compare the injury profiles resulting from explosions in open versus semi-confined blast environments. Method: A retrospective, cross-sectional study was conducted to review injuries sustained after bomb blast attacks in Karachi, Pakistan, from January 2000 to October 2007. Emergency medical records and medico-legal certificates of patients presenting to three major public sector hospitals of Karachi were evaluated using a self-designed proforma. Results: Data from 481 victims met the inclusion criteria and were incorporated into the final analysis. Of these, 63.6% were injured in open spaces and 36.4% were injured in semi-confined blast environments. Lacerations were the most commonly encountered external injury (47.7%), followed by penetrating wounds (15.3%). Lower and upper extremities were most commonly affected (38.6% and 19%, respectively). Open and semi-confined blast environments produced specific injury patterns and profiles (p < 0.001). Conclusions: Bomb blast attacks in Karachi produce an external injury pattern consistent with other studies, with the exception of an increased frequency of penetrating wounds. Semi-confined blast environments were associated with severe injuries. Further studies are required to better classify injuries and their severity based on standardized scoring systems. Effective emergency response systems must be designed to cope with mass casualties following bomb explosions.
Keywords: bomb blast attacks, injury pattern, external injury, open space, semi-confined space, blast environment
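The open versus semi-confined comparison of injury profiles reported above (p < 0.001) is the kind of categorical association typically tested with a chi-square test of independence. The sketch below is a hypothetical illustration only: the contingency counts are placeholders, not the study's data, and the abstract does not state which test was used.

```python
# Hypothetical illustration of an injury-pattern-by-environment comparison like the one
# in the abstract; the contingency counts below are placeholders, not study data.
from scipy.stats import chi2_contingency

# Rows: blast environment (open, semi-confined); columns: injury type counts
# (laceration, penetrating wound, other) -- placeholder values for illustration only.
table = [[150, 30, 126],
         [ 80, 44,  51]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```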
Procedia PDF Downloads 396
585 Food for Health: Understanding the Importance of Food Safety in the Context of Food Security
Authors: Carmen J. Savelli, Romy Conzade
Abstract:
Background: Access to sufficient amounts of safe and nutritious food is a basic human necessity, required to sustain life and promote good health. Food safety and food security are therefore inextricably linked, yet the importance of food safety in this relationship is often overlooked. Methodologies: A literature review and desk study were conducted to examine existing frameworks for discussing food security, especially from an international perspective, to determine the entry points for enhancing considerations for food safety in national and international policies. Major Findings: Food security is commonly understood as the state when all people at all times have physical, social and economic access to sufficient, safe and nutritious food to meet their dietary needs and food preferences for an active and healthy life. Conceptually, food security is built upon four pillars: food availability, access, utilization and stability. Within this framework, the safety of food is often wrongly assumed to be a given. However, in places where food supplies are insufficient, coping mechanisms for food insecurity are primarily focused on access to food without consideration for ensuring safety. Under such conditions, hygiene and nutrition are often ignored as people shift to less nutritious diets and consume more potentially unsafe foods, in which chemical, microbiological, zoonotic and other hazards can pose serious, acute and chronic health risks. While food supplies might be safe and nutritious, if consumed in quantities insufficient to support normal growth, health and activity, the result is hunger and famine. Recent estimates indicate that at least 842 million people, or roughly one in eight, still suffer from chronic hunger. Even if people eat enough food that is safe, they will become malnourished if the food does not provide the proper amounts of micronutrients and/or macronutrients to meet daily nutritional requirements, resulting in under- or over-nutrition. Two billion people suffer from one or more micronutrient deficiencies, and over half a billion adults are obese. Access to sufficient amounts of nutritious food is not enough. If food is unsafe, whether arising from poor quality supplies or inadequate treatment and preparation, it increases the risk of foodborne infections such as diarrhoea. 70% of diarrhoea episodes occurring annually in children under five are due to biologically contaminated food. Conclusions: An integrated approach is needed in which food safety and nutrition are systematically introduced into mainstream food system policies and interventions worldwide in order to achieve health and development goals. A new framework, "Food for Health", is proposed to guide policy development; it requires all three aspects of food security to be addressed in balance: sufficiency, nutrition and safety.
Keywords: food safety, food security, nutrition, policy
Procedia PDF Downloads 421
584 Elevated Celiac Antibodies and Abnormal Duodenal Biopsies Associated with IBD Markers: Possible Role of Altered Gut Permeability and Inflammation in Gluten Related Disorders
Authors: Manav Sabharwal, Ruda Rai Md, Candace Parker, James Ridley
Abstract:
Wheat, which contains gluten, is one of the most commonly consumed grains worldwide. Nowadays, gluten intake is considered to be a trigger for gluten-related disorders (GRDs), including celiac disease (CD), a common genetic disease affecting 1% of the US population, non-celiac gluten sensitivity (NCGS) and wheat allergy. NCGS is being recognized as an acquired gluten-sensitive enteropathy that is prevalent across age, ethnic and geographic groups. The cause of this entity is not fully understood, and recent studies suggest that it is more common in participants with irritable bowel syndrome (IBS), iron deficiency anemia and symptoms of fatigue, and that it has considerable overlap in symptoms with IBS and Crohn's disease. However, these studies lacked complete serologies, imaging tests and/or pan-endoscopy. We performed a prospective study of 745 adult patients who presented to an outpatient clinic for evaluation of chronic upper gastrointestinal symptoms and subsequently underwent an upper endoscopic (EGD) examination as standard of care. Evaluation comprised a comprehensive celiac antibody panel, inflammatory bowel disease (IBD) serologic markers, duodenal biopsies and small bowel video capsule endoscopy (VCE), when available. At least 6 biopsy specimens were obtained from the duodenum and proximal jejunum during EGD; CD3+ intraepithelial lymphocytes (IELs) and villous architecture were evaluated by a single experienced pathologist, and VCE was performed by a single experienced gastroenterologist. Of the 745 patients undergoing EGD, 12% (93/745) showed elevated CD3+ IELs in the duodenal biopsies. 52% (387/745) completed a comprehensive CD panel, and 7.2% (28/387) were positive for at least 1 CD antibody (tissue transglutaminase (tTG) being the most common antibody, in 65% (18/28)). Of these patients, 18% (5/28) showed increased duodenal CD3+ IELs, but none showed villous blunting or distortion meeting the criteria for CD. Surprisingly, 43% (12/28) were positive for at least 1 IBD serology (ASCA, ANCA or expanded IBD panel (LabCorp)). Of these 28 patients, 29% (8/28) underwent SB VCE, of whom 100% (8/8) showed significant jejuno-ileal mucosal lesions diagnostic of IBD. Findings of abnormal CD antibodies (7.2%, 28/387) and increased CD3+ IELs on duodenal biopsy (12%, 93/745) were observed frequently in patients with UGI symptoms undergoing EGD in an outpatient clinic. None met the criteria for CD, and a high proportion (43%, 12/28) showed evidence of overlap with IBD. This suggests a potential causal link of acquired GRDs to underlying inflammation and gut mucosal barrier disruption. Further studies to investigate a role for abnormal antigen presentation of dietary gluten to gut-associated lymphoid tissue as a cause are justified. This may explain the high prevalence of GRDs in the population and their correlation with IBS, IBD and other gut inflammatory disorders.
Keywords: celiac, gluten sensitive enteropathy, lymphocytic enteritis, IBS, IBD
Procedia PDF Downloads 169
583 The Positive Effects of Social Distancing on Individual Work Outcomes in the Context of COVID-19
Authors: Fan Wei, Tang Yipeng
Abstract:
The outbreak of COVID-19 in early 2020 raged around the world and severely affected people's work and life. In today's post-pandemic era, although the pandemic has been effectively controlled, people still need to maintain social distancing at all times to prevent the further spread of the virus. Against this background, social distancing in the context of the pandemic has attracted widespread attention from scholars. At present, most studies of the factors surrounding social distancing examine its negative impact on the physical and mental state of special groups at the between-individual level, and they focus more on the forced, complete social distancing during the severe period of the pandemic. Few studies have focused on the impact of social distancing on working groups in the post-pandemic era at the within-individual level. In order to explore this problem, this paper constructs a cross-level moderating model based on conservation of resources theory from the perspective of psychological resources. A total of 81 subjects were recruited to fill in three-stage questionnaires each day for 10 working days, and 661 valid questionnaires were finally obtained. The empirical tests yielded the following conclusions: (1) At the within-individual level, daily social distancing is positively correlated with next-day recovery, and individual sociability moderates the relationship between social distancing and recovery. The indirect effect of daily social distancing, through recovery, on employees' work engagement and work-goal progress is positive only when the individual has low sociability; for individuals with high sociability, none of these paths is significant. (2) At the within-individual level, there is a significant relationship between an individual's recovery and both work engagement and work-goal progress, indicating that the recovery of resources can produce positive work outcomes. Based on these results, this study argues that in the post-pandemic era, social distancing can not only help prevent and control the pandemic but also have positive impacts. Employees can invest the time and energy that social distancing frees from social activities in things that replenish resources and help them recover.
Keywords: social distancing, recovery, work engagement, work-goal progress, sociability
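The within-individual (daily diary) analysis described above, with days nested in persons and sociability as a cross-level moderator, can be sketched as follows. This is a hypothetical illustration with assumed column names and a generic mixed-effects model, not the authors' analysis code, which may have used dedicated multilevel SEM software.

```python
# Hypothetical sketch of a within-individual (daily diary) analysis like the one described:
# daily social distancing predicting next-day recovery, with days nested in persons.
# The file name, column names and the person-mean centering step are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

# Expected long-format data: one row per person-day.
df = pd.read_csv("diary_data.csv")  # columns: person_id, day, social_distancing, recovery, sociability

# Person-mean center the daily predictor so the coefficient reflects within-person variation.
df["sd_within"] = df["social_distancing"] - df.groupby("person_id")["social_distancing"].transform("mean")

# Random-intercept multilevel model with a cross-level interaction (sociability as moderator).
model = smf.mixedlm("recovery ~ sd_within * sociability", data=df, groups=df["person_id"])
result = model.fit()
print(result.summary())
```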
Procedia PDF Downloads 133
582 Tuberculosis (TB) and Lung Cancer
Authors: Asghar Arif
Abstract:
Lung cancer has been recognized as one of the most common cancers, causing an annual mortality of about 1.2 million people in the world. Lung cancer is the most prevalent cancer in men and the third most common cancer among women (after breast and digestive cancers). Recent evidence has shown the inflammatory process to be one of the potential factors in cancer. Tuberculosis (TB), pneumonia, and chronic bronchitis are among the most important inflammation-inducing factors in the lungs, among which TB has a more profound role in the emergence of cancer. TB is one of the important mortality factors throughout the world, and 205,000 deaths are reported annually due to this disease. Chronic inflammation and fibrosis due to TB can induce genetic mutations and alterations. The lung parenchyma is involved in both TB and lung cancer, and continuous cough in lung cancer, morphological vascular variations, lymphocytosis processes, and the generation of immune system mediators such as interleukins are all among the factors leading to the hypothesis regarding the role of TB in lung cancer. Some reports have shown that the induction of necrosis and apoptosis or TB reactivation, especially in patients with immunodeficiency, may result in increased IL-17 and TNF-α, which will either decrease p53 activity or increase the expression of Bcl-2, decrease Bax-T, and inhibit caspase-3 expression by decreasing the expression of mitochondrial cytochrome oxidase. It has also been indicated that following injection of the BCG vaccine, the host immune system is reinforced; in particular, the levels of gamma interferon, nitric oxide, and interleukin-2 are increased. Therefore, CD4+ lymphocyte function is improved, and the person becomes more resistant to cancer. Numerous prospective studies have so far been conducted on the role of TB in lung cancer, and it seems that this disease does play a role in that particular cancer. One of the main challenges of lung cancer is its correct and timely diagnosis. Unfortunately, clinical symptoms (such as continuous cough, hemoptysis, weight loss, fever, chest pain, dyspnea, and loss of appetite) and radiological images are similar in TB and lung cancer. Therefore, anti-TB drugs are routinely prescribed for patients in countries with a high prevalence of TB, like Pakistan. Given the similarity in clinical symptoms and radiological findings, proper differentiation of lung cancer from TB and respiratory infections due to nontuberculous mycobacteria (NTM) is necessary. Some drug-resistant TB cases are, in fact, lung cancer or NTM lung infections. Acid-fast staining and histological study of sputum and bronchial washings, culture, and polymerase chain reaction for TB are among the most important tools for the differential diagnosis of these diseases. Briefly, it is assumed that TB is one of the risk factors for cancer. Numerous studies have been conducted in this regard throughout the world, and it has been observed that there is a significant relationship between previous TB infection and lung cancer. However, to prove this hypothesis, further and more extensive studies are required. In addition, as the clinical symptoms and radiological findings of TB, lung cancer, and non-TB mycobacterial lung infections are similar, the latter two can be misdiagnosed as TB.
Keywords: TB and lung cancer, TB patients, TB survivors, TB and HIV/AIDS
Procedia PDF Downloads 72
581 Numerical Study of the Breakdown of Surface Divergence Based Models for Interfacial Gas Transfer Velocity at Large Contamination Levels
Authors: Yasemin Akar, Jan G. Wissink, Herlina Herlina
Abstract:
The effect of various levels of contamination on the interfacial air-water gas transfer velocity is studied by Direct Numerical Simulation (DNS). The interfacial gas transfer is driven by isotropic turbulence, introduced at the bottom of the computational domain, diffusing upwards. The isotropic turbulence is generated in a separate, concurrently running large-eddy simulation (LES). The flow fields in the main DNS and the LES are solved using fourth-order discretisations of convection and diffusion. To solve the transport of dissolved gases in water, a fifth-order-accurate WENO scheme is used for scalar convection, combined with a fourth-order central discretisation for scalar diffusion. The damping effect of the surfactant contamination on the near-surface (horizontal) velocities in the DNS is modelled using horizontal gradients of the surfactant concentration. An important parameter in this model, which corresponds to the level of contamination, is ReMa/We, where Re is the Reynolds number, Ma is the Marangoni number, and We is the Weber number. It was previously found that even small levels of contamination (small ReMa/We) lead to a significant drop in the interfacial gas transfer velocity KL. It is known that KL depends on both the Schmidt number Sc (the ratio of the kinematic viscosity and the gas diffusivity in water) and the surface divergence β, i.e. KL ∝ √(β/Sc). Previously it has been shown that this relation works well for surfaces with low to moderate contamination; however, it breaks down for β close to zero. To study the validity of this dependence in the presence of surface contamination, simulations were carried out for ReMa/We = 0, 0.12, 0.6, 1.2, 6, 30 and Sc = 2, 4, 8, 16, 32. First, it will be shown that the scaling of KL with Sc remains valid also for larger ReMa/We. This is an important result that indicates that, for various levels of contamination, the numerical results obtained at low Schmidt numbers are also valid for significantly higher and more realistic Sc. Subsequently, it will be shown that, with increasing levels of ReMa/We, the dependency of KL on β begins to break down as the increased damping of near-surface fluctuations results in an increased damping of β. Especially for large levels of contamination, this damping is so severe that KL is found to be underestimated significantly.
Keywords: contamination, gas transfer, surfactants, turbulence
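As a quick illustration of the KL ∝ √(β/Sc) scaling discussed above, the sketch below evaluates the relation across the Schmidt numbers used in the study. The proportionality constant and the surface divergence values are arbitrary placeholders, since the abstract does not report them.

```python
# Minimal sketch of the surface-divergence scaling K_L ~ c * sqrt(beta / Sc) discussed in the
# abstract. The constant c and the beta values are illustrative placeholders, not results
# from the simulations.
import math

def transfer_velocity(beta: float, sc: float, c: float = 1.0) -> float:
    """Gas transfer velocity from the surface-divergence model (dimensionless form)."""
    return c * math.sqrt(beta / sc)

for sc in (2, 4, 8, 16, 32):                  # Schmidt numbers used in the study
    for beta in (1.0, 0.5, 0.1, 0.01):        # decreasing surface divergence (more contamination)
        kl = transfer_velocity(beta, sc)
        print(f"Sc={sc:2d}  beta={beta:<5}  K_L={kl:.4f}")
# As beta -> 0 (a heavily contaminated, strongly damped surface), the predicted K_L -> 0,
# which is the regime where the abstract reports the model significantly underestimates K_L.
```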
Procedia PDF Downloads 300
580 Decommissioning of Nuclear Power Plants: The Current Position and Requirements
Abstract:
Undoubtedly, from a construction perspective, the use of explosives can remove a large facility such as a 40-storey building, which took almost 3 to 4 years to construct, in a few minutes. Usually, reconstruction or decommissioning, the last phase of the life cycle of any facility, is considered to be the shortest. However, this proves to be wrong in the case of nuclear power plants. Statistics show that in the last 30 years, the construction of a nuclear power plant took an average of 6 years, whereas it is estimated that the decommissioning of such plants may take a decade or more. This paper is about the decommissioning phase of nuclear power plants, which needs to be given more attention and encouragement by research institutes as well as the nuclear industry. Currently, there are 437 nuclear power reactors in operation and 70 reactors under construction. Around 139 nuclear facilities have already been shut down and are in different decommissioning stages, and approximately 347 nuclear reactors will be in the decommissioning phase in the next 20 years (assuming a reactor operating time of 40 years). This raises the following two questions: (1) How ready are the nuclear and construction industries to face the challenges of decommissioning projects? (2) What is required for safe and reliable decommissioning project delivery? The decommissioning of nuclear facilities across the globe suffers severe time and budget overruns. The decommissioning processes are still largely executed by manual labour, while the relevant regulations keep changing. In terms of research and development, some research projects and activities are being carried out in this area, but much more is required. The near future of decommissioning can be improved through a sustainable development strategy in which all stakeholders agree to implement innovative technologies, especially for dismantling and decontamination processes, and to deliver reliable and safe decommissioning. The scope for technology transfer from other industries should be explored. For example, remotely operated robotic technologies, used in the automobile and production industries to reduce time and improve efficiency and safety, should be tried here. However, while innovative technologies are in high demand, they alone are not enough; the implementation of creative and innovative management methodologies should also be investigated and applied. Lean management, with its main concept of "elimination of waste within a process", is a suitable example here. Thus, cooperation between international organisations and related industries, together with knowledge-sharing, may serve as a key factor for successful decommissioning projects.
Keywords: decommissioning of nuclear facilities, innovative technology, innovative management, sustainable development
Procedia PDF Downloads 471
579 Towards the Need of Resilient Design and Its Assessment in South China
Authors: Alan Lai, Wilson Yik
Abstract:
With rapid urbanization, there has been a dramatic increase in the urban population of Asia, and over half of Asia's population will live in urban regions in the near future. Facing increasing exposure to climate-related stresses and shocks, most Asian cities will very likely experience more frequent heat waves and flooding with rising sea levels; coastal cities in particular will grapple with intense typhoons and storm surges. These climate changes have severe impacts on urban areas, at great cost to infrastructure and population, for example through effects on human health and wellbeing and high risks of dengue fever, malaria and diarrheal disease. With the increasing prominence of adaptation to climate change, there have been changes in corresponding policies. Smaller cities have greater potential for integrating the concept of resilience into their infrastructure as well as keeping pace with their rapid population growth. It is therefore important to explore the potential of Asian cities to adapt to climate change and the opportunities for building climate resilience into urban planning and building design. Furthermore, previous studies have mainly attempted to exploit the potential of resilience at the macro level within urban planning rather than at the micro level within the context of the individual building. The resilience of individual buildings as a research field has not yet been much explored. Nonetheless, recent studies define a resilient individual building as one that is able to respond to physical damage and recover from such damage in a quick and cost-effective manner while maintaining its primary functions. There is also a need to develop an assessment tool to evaluate resilience at the building scale, which is still largely uninvestigated even though it should be regarded as a basic function of a building. Due to the lack of literature reporting metrics for assessing building resilience together with sustainability, the research will be designed as a case study to provide insight into the issue. The aim of this research project is to encourage and assist in developing neighborhood climate resilience design strategies for Hong Kong so as to bridge the gap between different scales and between theory and practice.
Keywords: resilient cities, building resilience, resilient buildings and infrastructure, climate resilience, hot and humid southeast area, high-density cities
Procedia PDF Downloads 163578 Principal Well-Being at Hong Kong: A Quantitative Investigation
Authors: Junjun Chen, Yingxiu Li
Abstract:
The occupational well-being of school principals plays a vital role in the pursuit of individual and school wellness and success. However, principals' well-being worldwide is under increasing threat because of the challenging and complex nature of their work and growing demands for school standardisation and accountability. Pressure is particularly acute in the post-pandemic future as principals attempt to deal with the impact of the pandemic on top of more regular demands. This is particularly true in Hong Kong, as school principals are increasingly wedged between unparalleled political, social, and academic responsibilities. Recognizing the semantic breadth of well-being, scholars have not settled on a single, mutually agreeable definition but agree that the concept of well-being has multiple dimensions across various disciplines. The multidimensional approach promises more precise assessments of the relationships between well-being and other concepts than the 'affect-only' approach or other single domains for capturing the essence of principal well-being. This multidimensional concept of well-being is adopted in the present study to understand principal well-being. The study aimed to understand the state of principal well-being and its influential drivers in a sample of 670 principals from Hong Kong and Mainland China. An online survey was sent to the participants by the researchers after the outbreak of COVID-19. All participants were well informed about the purposes and procedure of the project and the confidentiality of the data prior to filling in the questionnaire. Confirmatory factor analysis and structural equation modelling performed with Mplus were employed to analyse the dataset. The data analysis procedure involved three steps. First, the descriptive statistics (e.g., mean and standard deviation) were calculated. Second, confirmatory factor analysis (CFA) with maximum likelihood estimation was used to trim the principal well-being measurement. Third, structural equation modelling (SEM) was employed to test the influential factors of principal well-being. The results indicated that overall principal well-being was above the average score. The highest rating given by the principals was for their psychological and social well-being (M = 5.21). This was followed by spiritual (M = 5.14; SD = .77), cognitive (M = 5.14; SD = .77), emotional (M = 4.96; SD = .79), and physical well-being (M = 3.15; SD = .73). Participants ranked their physical well-being the lowest. Moreover, professional autonomy, supervisor and collegial support, school physical conditions, professional networking, and social media showed a significant impact on principal well-being. The findings of this study can potentially enhance not only principal well-being but also the functioning of individual principals and schools, without sacrificing principal well-being for quality education in the process. This would move one step forward towards a new future: a wellness society as advocated by the OECD. Importantly, well-being is an inside job that begins with choosing wellness, while support for becoming a wellness principal is also imperative.Keywords: well-being, school principals, quantitative, influential factors
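The abstract outlines a three-step analysis (descriptive statistics, CFA, then SEM) run in Mplus but gives no syntax. Below is a minimal sketch of that workflow in Python using the semopy package rather than Mplus; the file name, column names, and model specification are hypothetical illustrations, not the authors' actual measurement model.

```python
# Hypothetical sketch of the three-step analysis described above:
# (1) descriptive statistics, (2) CFA for the well-being measurement,
# (3) SEM testing the influential factors. The authors used Mplus;
# semopy is substituted here, and all variable names are assumptions.
import pandas as pd
from semopy import Model

data = pd.read_csv("principal_wellbeing.csv")  # hypothetical dataset

# Step 1: means and standard deviations of the well-being dimensions.
dims = ["psychological", "social", "spiritual", "cognitive", "emotional", "physical"]
print(data[dims].agg(["mean", "std"]))

# Steps 2-3: measurement model plus structural paths from the drivers.
desc = """
wellbeing =~ psychological + social + spiritual + cognitive + emotional + physical
wellbeing ~ autonomy + support + school_conditions + networking + social_media
"""
model = Model(desc)
model.fit(data)          # maximum-likelihood-based estimation
print(model.inspect())   # loadings, path coefficients, p-values
```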
Procedia PDF Downloads 83577 Ecological Evaluation and Conservation Strategies of Economically Important Plants in Indian Arid Zone
Authors: Sher Mohammed, Purushottam Lal, Pawan K. Kasera
Abstract:
The Thar Desert of Rajasthan covers a wide geographical area between 23.3° and 30.12° north latitude and 69.3° and 76° east longitude and supports a unique spectrum of arid zone vegetation. The desert spreads over 12 districts and holds a rich diversity of economically important and threatened plants interacting with and growing under the adverse climatic conditions of the area. Owing to variable geological, physiographic, climatic, edaphic and biotic factors, the arid zone medicinal flora comprises a wide range of angiosperm families. The herbal diversity of this arid region is medicinally important in household remedies among tribal communities as well as in traditional systems. The ongoing increase in disturbances to natural ecosystems is due to climatic and biological factors, including anthropogenic ones. The unique flora, and the faunal diversity that depends on it, is losing its biotic potential. A large number of plants have no future unless immediate steps are taken to arrest the causes of decline and support their biological recovery. At present, the loss in ecological amplitude of various genera and species has pushed several plants onto the red list of arid zone vegetation, such as Commiphora wightii, Tribulus rajasthanensis, Calligonum polygonoides, Ephedra foliata, Leptadenia reticulata, Tecomella undulata, Blepharis sindica, Peganum harmala, Sarcostemma viminale, etc. Most arid zone species are under serious pressure from the prevailing ecosystem factors in completing their life cycles. Genetic, molecular, cytological, biochemical, metabolic, reproductive and germination processes are among the points at which the floral diversity of the arid zone faces severe ecological pressure. There is therefore an urgent need to conserve these plants, and ample opportunity for remarkable field work that protects native plants in their natural habitat rather than relying only on their in vitro multiplication.Keywords: ecology, evaluation, xerophytes, economically, threatened plants, conservation
Procedia PDF Downloads 267576 A 4-Month Low-carb Nutrition Intervention Study Aimed to Demonstrate the Significance of Addressing Insulin Resistance in 2 Subjects with Type-2 Diabetes for Better Management
Authors: Shashikant Iyengar, Jasmeet Kaur, Anup Singh, Arun Kumar, Ira Sahay
Abstract:
Insulin resistance (IR) is a condition that occurs when cells in the body become less responsive to insulin, leading to higher levels of both insulin and glucose in the blood. This condition is linked to metabolic syndromes, including diabetes. It is crucial to address IR promptly after diagnosis to prevent the long-term complications associated with high insulin and high blood glucose. This four-month case study highlights the importance of treating the underlying condition to manage diabetes effectively. Insulin is essential for regulating blood sugar levels by facilitating the uptake of glucose into cells for energy or storage. In insulin-resistant individuals, cells are less efficient at taking up glucose from the blood, resulting in elevated blood glucose levels. As a result of IR, beta cells produce more insulin to make up for the body's inability to use insulin effectively. This leads to high insulin levels, a condition known as hyperinsulinemia, which further impairs glucose metabolism and can contribute to various chronic diseases. In addition to regulating blood glucose, insulin has anti-catabolic effects, preventing the breakdown of molecules in the body: it inhibits glycogen breakdown in the liver, inhibits gluconeogenesis, and inhibits lipolysis. If a person is insulin-sensitive or metabolically healthy, an optimal level of insulin prevents fat cells from releasing fat and promotes the storage of glucose and fat in the body. Optimal insulin levels are thus crucial for maintaining energy balance and play a key role in metabolic processes. During the four-month study, researchers looked at the impact of a low-carb diet (LCD) intervention on two male individuals (A and B) who had type-2 diabetes. Although neither of these individuals was obese, both were slightly overweight and had abdominal fat deposits. Before the trial began, important markers such as fasting blood glucose (FBG), triglycerides (TG), high-density lipoprotein (HDL) cholesterol, and HbA1c were measured. These markers are essential in defining metabolic health; their individual values and variability are integral to assessing it. The ratio of TG to HDL is used as a surrogate marker for IR. This ratio correlates strongly with the prevalence of metabolic syndrome and with IR itself, and it is a convenient measure because it can be calculated from a standard lipid profile and does not require more complex tests. In this four-month trial, an improvement in insulin sensitivity was observed through the TG/HDL ratio, which, in turn, improved fasting blood glucose levels and HbA1c. For subject A, HbA1c dropped from 13 to 6.28, and for subject B, it dropped from 9.4 to 5.7. During the trial, neither subject was taking any diabetic medication. The significant improvements in their health markers, such as better glucose control, along with an increase in energy levels, demonstrate that incorporating LCD interventions can help manage diabetes effectively.Keywords: metabolic disorder, insulin resistance, type-2 diabetes, low-carb nutrition
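The TG/HDL ratio described above is simple to compute from a standard lipid panel. The short sketch below is only an illustration: the example values are hypothetical, and the cut-off of roughly 3.0 (with both values in mg/dL) is a commonly cited threshold rather than one stated in the abstract.

```python
# Minimal sketch of the TG/HDL surrogate marker mentioned above. The cut-off
# of roughly 3.0 (both values in mg/dL) is a commonly cited threshold, not
# one stated in the abstract, and the example values are hypothetical.
def tg_hdl_ratio(triglycerides_mg_dl: float, hdl_mg_dl: float) -> float:
    """Return the triglyceride-to-HDL-cholesterol ratio."""
    return triglycerides_mg_dl / hdl_mg_dl

def likely_insulin_resistant(ratio: float, cutoff: float = 3.0) -> bool:
    """Flag a ratio above the (assumed) cut-off as suggestive of IR."""
    return ratio > cutoff

baseline = tg_hdl_ratio(triglycerides_mg_dl=210, hdl_mg_dl=38)   # ~5.5
follow_up = tg_hdl_ratio(triglycerides_mg_dl=110, hdl_mg_dl=52)  # ~2.1
print(baseline, likely_insulin_resistant(baseline))    # 5.53 True
print(follow_up, likely_insulin_resistant(follow_up))  # 2.12 False
```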
Procedia PDF Downloads 40575 Characterization of the Music Admission Requirements and Evaluation of the Relationship among Motivation and Performance Achievement
Authors: Antonio M. Oliveira, Patricia Oliveira-Silva, Jose Matias Alves, Gary McPherson
Abstract:
Specialist music teaching is oriented towards offering formal music training. Because of its specific demands, this vocational programme starts at a very young age. Although provided by the State, the offer is limited to six schools throughout the country, which means that vacancies for prospective students are very limited every year. It is therefore crucial that these vacancies be taken by especially motivated children raised in households that offer the ideal setting for success. Some of the instruments used to evaluate musical performance are highly sensitive to specific previous training, which represents a severe validity problem for testing children who have had restricted opportunities for formal training. Moreover, these practices may be unfair because, for instance, they may not reflect the candidates' musical aptitudes. Considering what constitutes a prerequisite for an excellent music student, researchers in this field have long argued that motivation, task commitment, and parents' support are as important as ability. Thus, the aims of this study are: (1) to prepare an inventory of admission requirements in Australia, Portugal and Ireland; (2) to examine whether the motivation of candidates to music conservatories and of their parents, assessed at three evaluation points (admission, the end of the first year, and the end of the second year), correlates positively with the candidates' progress in learning a musical instrument (i.e., whether motivation at admission may predict student musicianship); and (3) to adapt an existing instrument to assess motivation (i.e., to adapt the items to the music setting, focusing on motivation for playing a musical instrument). The inclusion criterion is that only children registered with the administrative services to be evaluated for entrance to the conservatory will be accepted into the study. The expected number of participants is fifty children (5-6 years old) across the three attendance schemes: integrated, articulated and supplementary. Revisiting music admission procedures is of particular importance and relevance to music education because this debate may provide guidance on the improvements needed to make the admission process fairer and more transparent.Keywords: music learning, music admission requirements, student’s motivation, parent’s motivation
Procedia PDF Downloads 166574 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia
Authors: Rohan Bhasin
Abstract:
Agrammatism in non-fluent aphasia can be defined as a language disorder wherein a patient can only use content words (nouns, verbs and adjectives) for communication, and their speech is devoid of functional word types such as conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some form, with conversation analysis used to analyse pre-therapy speech patterns and the qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (e.g., prepositions and articles) around content words (nouns, verbs and adjectives) using a combination of natural language processing and deep learning algorithms. This approach can be applied to assist communication. The approach the paper investigates is an LSTM-based sequence-to-sequence (seq2seq) model, which takes a sequence of inputs and produces an output sequence. This approach needs a significant amount of training data, with each training example consisting of a pair of the form (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing the functional words to obtain just the content words. However, this approach would require a large amount of training data to produce coherent output. The assumption of this approach is that the content words received as input are preserved, i.e., they are not altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where the given word order might not be inherently correct. The approach can therefore be used to assist communication in mild agrammatism in non-fluent aphasia cases. By generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversation. Our project thus translates the use case of generating sentences from content words into an assistive technology for non-fluent aphasia patients.Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM
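The data-preparation step described above, stripping function words from complete sentences to obtain (content words, complete sentence) training pairs, can be sketched in a few lines. The function-word list and tokenisation below are simplified assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch of the training-data generation step described above:
# start from complete sentences and strip function words to obtain
# (content words, complete sentence) pairs for a seq2seq model.
# The function-word list here is a small illustrative subset, not the
# authors' actual vocabulary or tokeniser.
import re

FUNCTION_WORDS = {
    "a", "an", "the", "and", "or", "but", "of", "in", "on", "at",
    "to", "for", "with", "is", "are", "was", "were", "be", "been",
}

def make_training_pair(sentence: str) -> tuple[str, str]:
    """Return (content-word sequence, original sentence)."""
    tokens = re.findall(r"[A-Za-z']+", sentence.lower())
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content), sentence

corpus = [
    "The boy is playing in the park.",
    "She went to the market for fresh vegetables.",
]
for source, target in (make_training_pair(s) for s in corpus):
    print(f"{source!r} -> {target!r}")
# 'boy playing park' -> 'The boy is playing in the park.'
# 'she went market fresh vegetables' -> 'She went to the market for fresh vegetables.'
```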
Procedia PDF Downloads 164573 Mood Symptom Severity in Service Members with Posttraumatic Stress Symptoms after Service Dog Training
Authors: Tiffany Riggleman, Andrea Schultheis, Kalyn Jannace, Jerika Taylor, Michelle Nordstrom, Paul F. Pasquina
Abstract:
Introduction: Posttraumatic stress (PTS) and posttraumatic stress disorder (PTSD) remain significant problems for military and veteran communities. Symptoms of PTSD often include poor sleep, intrusive thoughts, difficulty concentrating, and trouble with emotional regulation. Unfortunately, despite its high prevalence, service members diagnosed with PTSD often do not seek help, usually because of the perceived stigma surrounding behavioral health care. To help address these challenges, non-pharmacological therapeutic approaches are being developed to improve care and enhance compliance. The Service Dog Training Program (SDTP), which involves teaching patients how to train puppies to become mobility service dogs, has been successfully implemented in PTS/PTSD care programs, with anecdotal reports of improved outcomes. This study was designed to assess the biopsychosocial effects of SDTP in military beneficiaries with PTS symptoms. Methods: Individuals between the ages of 18 and 65 with PTS symptoms were recruited to participate in this prospective study. Each subject completes 4 weeks of baseline testing, followed by 6 weeks of active service dog training (two one-hour sessions per week) with a professional service dog trainer. Outcome measures included the Posttraumatic Stress Checklist for the DSM-5 (PCL-5), Generalized Anxiety Disorder questionnaire-7 (GAD-7), Patient Health Questionnaire-9 (PHQ-9), social support/interaction, anthropometrics, blood/serum biomarkers, and qualitative interviews. Preliminary analysis of 17 participants examined mean scores on the GAD-7, PCL-5, and PHQ-9 pre- and post-SDTP, and changes were assessed using Wilcoxon signed-rank tests. Results: Post-SDTP, there was a statistically significant mean decrease in PCL-5 scores of 13.5 on an 80-point scale (p=0.03) and a significant mean decrease of 2.2 in PHQ-9 scores on a 27-point scale (p=0.04), suggestive of decreased PTSD and depression symptoms. While there was a decrease in mean GAD-7 scores post-SDTP, the difference was not significant (p=0.20). Recurring themes from the qualitative interviews included decreased pain, forgetting about stressors, an improved sense of calm, increased confidence, improved communication, and establishing a connection with the service dog. Conclusion: Preliminary results from the first 17 participants in this study suggest that individuals who received SDTP had a statistically significant decrease in PTS symptoms, as measured by the PCL-5 and PHQ-9. This ongoing study seeks to enroll a total of 156 military beneficiaries with PTS symptoms. Future analyses will include additional psychological outcomes, pain scores, blood/serum biomarkers, and other measures of the social aspects of PTSD, such as relationship satisfaction and sleep hygiene.Keywords: post-concussive syndrome, posttraumatic stress, service dog, service dog training program, traumatic brain injury
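For readers unfamiliar with the paired, non-parametric comparison used above, the sketch below shows how a Wilcoxon signed-rank test on pre- and post-intervention scores can be run; the score values are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of the pre/post comparison described above, using a
# Wilcoxon signed-rank test on paired scores. The values are hypothetical
# placeholders, not the study's actual PCL-5 data.
from scipy.stats import wilcoxon

pcl5_pre  = [52, 61, 45, 58, 49, 63, 55, 47, 60, 51]  # hypothetical baseline scores
pcl5_post = [40, 47, 41, 44, 38, 50, 42, 39, 46, 43]  # hypothetical post-SDTP scores

stat, p_value = wilcoxon(pcl5_pre, pcl5_post)  # paired, non-parametric test
mean_change = sum(b - a for a, b in zip(pcl5_pre, pcl5_post)) / len(pcl5_pre)
print(f"W = {stat}, p = {p_value:.3f}, mean change = {mean_change:.1f}")
```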
Procedia PDF Downloads 113572 Prevalence of Hemorrhagic Septicemia in Dromedary Camel (Camelus Dromedarius) for Some Selected Farms in Benadir Region, Somalia
Authors: Abdirahman Barre, Abdihamid Salad Hassan, Iftin Abdi Mohamud, Abdirahman Mohamed Mohamud, Ahmed Adan Mohamed, Mukhtaar Mohamed Idow
Abstract:
Pasteurellosis (hemorrhagic septicemia) is a common respiratory disease of camels; it is an acutely fatal disease caused by Pasteurella multocida type A or several serotypes of Mannheimia haemolytica, and it also affects other animals. The disease has been shown to spread between animals, across herds, and to humans, meaning that it is a zoonosis. The study aimed to establish the seroprevalence of pasteurellosis in selected camel-rearing districts of the Benadir Region. It was a cross-sectional study in which the study population was purposively chosen to consist of animals from three sub-districts of the Benadir Region, namely Daynile, Yaaqshid, and Kaxda. These sites normally handle many camels in a day, making it easy for the investigator to access the required number conveniently; it was also assumed that data collected from these for-slaughter camels were representative of the situation in each sub-district. A total of one hundred and sixty camels were tested serologically using the Rose Bengal Plate Test (RBPT) and the Complement Fixation Test (CFT). The serological tests were purposively chosen to increase the chances of detecting positive cases and also to compare their sensitivities with respect to camel serum, since they were originally meant for use on bovine serum. Blood samples (15 ml) were collected from the jugular veins of the animals for serum harvesting as they were waiting to be examined. The RBPT and CFT were run at a laboratory within the Department of Veterinary Medicine, University of Horsed, 21 October campus; serum samples were transported in a cool box. Out of an overall total of 300 serum samples, 180 samples were selected according to the sampling procedure and gave eleven (11) positive results, amounting to a prevalence of 6.67%. For the three districts, the respective prevalences (averaged over the two serological tests) were 7% (3/50) for Yaaqshid, 8% (3/60) for Daynile, and 10% (3/70) for Kaxda. When the sensitivities of the two serological tests were compared, there was no significant difference between them in detecting positive cases (p=0.05). The study has demonstrated the presence of pasteurellosis in camels in the Benadir Region, and the authors recommend the use of the RBPT and CFT as screening tests, since they are cheap, quick, and easy to carry out. Any of the other available tests can then be used if one wants to establish the respective titers. Further detailed investigation therefore needs to be conducted to identify the specific etiological agents causing pasteurellosis in camels so that control measures can be instituted to optimize the benefit obtained from the camel sector.Keywords: hemorrhagic septicemia, camel, prevalence, Benadir region, Somalia
Procedia PDF Downloads 72571 Identification of Rare Mutations in Genes Involved in Monogenic Forms of Obesity and Diabetes in Obese Guadeloupean Children through Next-Generation Sequencing
Authors: Lydia Foucan, Laurent Larifla, Emmanuelle Durand, Christine Rambhojan, Veronique Dhennin, Jean-Marc Lacorte, Philippe Froguel, Amelie Bonnefond
Abstract:
In the population of Guadeloupe Island (472,124 inhabitants, 80% of subjects of African descent), overweight and obesity were estimated at 23% and 9%, respectively, among children. A high prevalence of diabetes (~10%) has been reported in the adult population. Nevertheless, no study has investigated the contribution of gene mutations to childhood obesity in this population. We aimed to investigate rare genetic mutations in genes involved in monogenic obesity or diabetes in obese Afro-Caribbean children from Guadeloupe Island using next-generation sequencing. The present investigation included unrelated obese children from a previous study on overweight conducted in Guadeloupe Island in 2013. We sequenced the coding regions of 59 genes involved in monogenic obesity or diabetes. A total of 25 obese schoolchildren (body mass index [BMI] Z-scores: 2.0 to 2.8) were screened for rare mutations (non-synonymous, splice-site, or insertion/deletion) in these 59 genes. The mean age of the study population was 12.4 ± 1.1 years. Seventeen children (68%) had insulin resistance (HOMA-IR > 3.16). A family history of obesity (mother or father) was observed in eight children, and three of the accompanying parents presented with type 2 diabetes. None of the children had gonadotrophic abnormalities or mental retardation. We detected five rare heterozygous mutations, in four genes involved in monogenic obesity, in five different obese children: MC4R p.Ile301Thr and SIM1 p.Val326Thrfs*43, which were pathogenic; SIM1 p.Ser343Pro and SH2B1 p.Pro90His, which were likely pathogenic; and NTRK2 p.Leu140Phe, which was of uncertain significance. In parallel, we identified seven carriers of mutations in ABCC8 or KCNJ11 (involved in monogenic diabetes) that were of uncertain significance (KCNJ11 p.Val13Met, KCNJ11 p.Val151Met, ABCC8 p.Lys1521Asn and ABCC8 p.Ala625Val). Rare pathogenic or likely pathogenic mutations linked to severe obesity were detected in more than 15% of this Afro-Caribbean population at high risk of obesity and type 2 diabetes.Keywords: childhood obesity, MC4R, monogenic obesity, SIM1
Procedia PDF Downloads 193570 A Foodborne Cholera Outbreak in a School Caused by Eating Contaminated Fried Fish: Hoima Municipality, Uganda, February 2018
Authors: Dativa Maria Aliddeki, Fred Monje, Godfrey Nsereko, Benon Kwesiga, Daniel Kadobera, Alex Riolexus Ario
Abstract:
Background: Cholera is a severe gastrointestinal disease caused by Vibrio cholerae. It has caused several pandemics. On 26 February 2018, a suspected cholera outbreak, with one death, occurred in School X in Hoima Municipality, western Uganda. We investigated to identify the scope and mode of transmission of the outbreak and to recommend evidence-based control measures. Methods: We defined a suspected case as onset of diarrhea, vomiting, or abdominal pain in a student or staff member of School X, or in their family members, during 14 February–10 March. A confirmed case was a suspected case with V. cholerae cultured from stool. We reviewed medical records at Hoima Hospital and searched for cases at School X. We conducted descriptive epidemiologic analysis and hypothesis-generating interviews of 15 case-patients. In a retrospective cohort study, we compared attack rates between exposed and unexposed persons. Results: We identified 15 cases among 75 students and staff of School X and their family members (attack rate=20%), with onset from 25-28 February. One patient died (case-fatality rate=6.6%). The epidemic curve indicated a point-source exposure. On 24 February, a student brought fried fish from her home in a fishing village, where a cholera outbreak was ongoing. Of the 21 persons who ate the fish, 57% developed cholera, compared with 5.6% of the 54 persons who did not eat it (RR=10; 95% CI=3.2-33). None of the 4 persons who recooked the fish before eating developed cholera, compared with 71% of the 17 who did not recook it (RR=0.0; 95% CI, Fisher exact=0.0-0.95). Of 12 stool specimens cultured, 6 yielded V. cholerae. Conclusion: This cholera outbreak was caused by eating fried fish, which might have been contaminated with V. cholerae in a village with an ongoing outbreak. Lack of thorough cooking of the fish might have facilitated the outbreak. We recommended thoroughly cooking fish before consumption.Keywords: cholera, disease outbreak, foodborne, global health security, Uganda
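The relative risk and confidence interval reported above can be reproduced from the attack-rate data. The sketch below assumes counts inferred from the reported percentages (57% of 21 eaters is about 12 cases; 5.6% of 54 non-eaters is about 3 cases) and uses a standard Wald-type interval on the log scale; the investigators' exact method may have differed.

```python
# Minimal sketch of the relative-risk calculation behind the figures above.
# Counts are inferred from the reported attack rates (57% of 21 eaters ~ 12
# cases; 5.6% of 54 non-eaters ~ 3 cases); the study itself may have used
# different software or exact methods.
import math

a, n_exposed = 12, 21     # cases / total among those who ate the fish
c, n_unexposed = 3, 54    # cases / total among those who did not

rr = (a / n_exposed) / (c / n_unexposed)

# Wald-type 95% confidence interval on the log scale.
se_log_rr = math.sqrt(1/a - 1/n_exposed + 1/c - 1/n_unexposed)
lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.1f}, 95% CI = {lower:.1f}-{upper:.1f}")  # RR = 10.3, CI = 3.2-32.8
```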
Procedia PDF Downloads 199569 Growth and Bone Health in Children following Liver Transplantation
Authors: Faris Alkhalil, Rana Bitar, Amer Azaz, Hisham Natour, Noora Almeraikhi, Mohamad Miqdady
Abstract:
Background: Children with liver transplants are achieving very good survival, so there is now a need to concentrate on achieving good health in these patients and preventing disease. Immunosuppressive medications have side effects that need to be monitored and, if possible, avoided. Glucocorticoids and calcineurin inhibitors are detrimental to bone and mineral homeostasis; in addition, steroids can affect linear growth. Steroid-sparing regimens in renal transplant children have been shown to improve height. Aim: We aim to review the growth and bone health of children post liver transplant by measuring bone mineral density (BMD) with dual-energy X-ray absorptiometry (DEXA) scans and assessing whether there is a clear link between poor growth, impaired bone health, and long-term steroid use. Subjects and Methods: This is a single-centre retrospective cohort study. We reviewed the medical notes of children (0-16 years) who underwent liver transplantation between November 2000 and November 2016 and are currently being followed at our centre. Results: 39 patients were identified (25 males and 14 females); the median age at transplant was 2 years (range 9 months to 16 years), and the median follow-up was 6 years. Four patients received a combined transplant: 2 a kidney and liver transplant and 2 a liver and small bowel transplant. The indications for transplant included biliary atresia (31%), acute liver failure (18%), progressive familial intrahepatic cholestasis (15%), transplantable metabolic disease (10%), TPN-related liver disease (8%), primary hyperoxaluria (5%), hepatocellular carcinoma (3%) and other causes (10%). 36 patients (95%) were on a calcineurin inhibitor (34 on tacrolimus and 2 on cyclosporin); the other three patients were on sirolimus. Low-dose long-term steroids were used in 21% of the patients. A considerable proportion of the patients had poor growth: 15% were below the 3rd centile for weight for age and 21% were below the 3rd centile for height for age. Most of our patients with poor growth were not on long-term steroids. 49% of patients had a DEXA scan post transplantation; 21% of these children had low bone mineral density, and one patient met osteoporosis criteria with a vertebral fracture. Most of our patients with impaired bone health were not on long-term steroids. 20% of the patients who did not undergo a DEXA scan developed long bone fractures, and 50% of these were on long-term steroids, which may suggest impaired bone health in these patients. Summary and Conclusion: The incidence of impaired bone health, although studied in a limited number of patients, was high. Early recognition and treatment should be instituted to avoid fractures and improve bone health. Many of the patients were below the 3rd centile for weight and height; however, there was no clear relationship between steroid use and impaired bone health, reduced weight, or reduced linear height.Keywords: bone, growth, pediatric, liver, transplantation
Procedia PDF Downloads 279