Search results for: two-loop control structure
692 Tool Development for Assessing Antineoplastic Drugs Surface Contamination in Healthcare Services and Other Workplaces
Authors: Benoit Atge, Alice Dhersin, Oscar Da Silva Cacao, Beatrice Martinez, Dominique Ducint, Catherine Verdun-Esquer, Isabelle Baldi, Mathieu Molimard, Antoine Villa, Mireille Canal-Raffin
Abstract:
Introduction: Healthcare workers' exposure to antineoplastic drugs (ADs) is a pressing issue for occupational medicine practitioners. Biological monitoring of occupational exposure (BMOE) is an essential tool for assessing AD contamination of healthcare workers. In addition to BMOE, surface sampling is a useful tool for understanding how workers become contaminated, identifying sources of environmental contamination, verifying the effectiveness of surface decontamination methods, and monitoring these surfaces over time. The objective of this work was to develop a complete tool comprising a kit for surface sampling and an analytical method for quantifying AD traces. The development was guided by three criteria: the kit's capacity to sample in any professional environment (healthcare services, veterinary practices, etc.), the detection of very low AD traces with a validated analytical method, and ease of use of the sampling kit regardless of who performs the sampling. Material and method: The ADs most used in terms of quantity and frequency were identified through an analysis of the literature and of the consumption of different hospitals, veterinary services, and home care settings. The adsorbent device, the surface moistening solution, and the solvent mixture for extracting ADs from the adsorbent device were tested to maximize yield. AD quantification was achieved by ultra-high-performance liquid chromatography coupled with tandem mass spectrometry (UHPLC-MS/MS). Results: Based on their high frequency of use and their good representation of the diverse activities across healthcare, 15 ADs (cyclophosphamide, ifosfamide, doxorubicin, daunorubicin, epirubicin, 5-FU, dacarbazine, etoposide, pemetrexed, vincristine, cytarabine, methotrexate, paclitaxel, gemcitabine, mitomycin C) were selected.
The analytical method was optimized to achieve high sensitivity, with very low limits of quantification (25 to 5000 ng/mL), equivalent to or lower than those previously published (for 13 of the 15 ADs). The sampling kit is easy to use and is supplied with didactic support (an online video and a written protocol). It proved effective without inter-individual variation (n=5 samples/person; n=5 persons; p=0.85; ANOVA), regardless of who performed the sampling. Conclusion: This validated tool (sampling kit + analytical method) is highly sensitive, easy to use, and well suited to controlling the chemical risk posed by ADs. Moreover, combined with BMOE, it enables targeted prevention. Used routinely, this tool is available for any occupational health intervention.
Keywords: surface contamination, sampling kit, analytical method, sensitivity
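The inter-individual reproducibility claim above rests on a one-way ANOVA across samplers (5 samples per person, 5 persons). A minimal sketch of that style of test, using SciPy; all recovery values below are invented for illustration and do not come from the study:

```python
from scipy.stats import f_oneway

# Hypothetical recovery values (%) for 5 samples taken by each of 5 persons;
# these numbers are illustrative only, not data from the study.
recoveries_by_person = [
    [92.1, 89.9, 91.0, 90.2, 90.8],
    [90.0, 91.1, 89.9, 90.3, 90.7],
    [89.9, 90.2, 91.0, 89.8, 90.1],
    [90.2, 89.8, 91.1, 89.9, 90.0],
    [91.0, 89.9, 90.2, 90.1, 89.8],
]

# One-way ANOVA: a large p-value means no detectable
# inter-individual (between-sampler) variation.
f_stat, p_value = f_oneway(*recoveries_by_person)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

A p-value well above 0.05 (the study reports p = 0.85) supports the claim that the result does not depend on who performs the sampling.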
Procedia PDF Downloads 132
691 Virtual Reference Service as a Space for Communication and Interaction: Providing Infrastructure for Learning in Times of Crisis at Uppsala University
Authors: Nadja Ylvestedt
Abstract:
Uppsala University Library is a geographically dispersed research library consisting of nine subject libraries located in different campus areas throughout the city of Uppsala. Despite the geographical dispersion, it is the library's ambition to be perceived as a cohesive library with consistently high service and quality. A key factor in being one cohesive library is the library's online services, especially the virtual reference service (VRS). E-mail, chat and phone are answered by a team of specially trained staff under the supervision of a team leader. When COVID-19 hit, well-established routines and processes to provide an infrastructure for students and researchers at the university changed radically. The strong connection between services provided at the library locations and at the VRS has been one of the key components of the library's success in providing patrons with the help they need. With radically minimized availability at the physical locations, the infrastructure was at risk of collapsing. Objectives: The objective of this project has been to evaluate the consequences of the sudden change in the organization of the library. The focus of this evaluation is the library's VRS as an important space for learning, interaction and communication between the library and the community when other traditional spaces were not available. The goal of this evaluation is to capture the lessons learned from providing infrastructure for learning and research in times of crisis, both on a practical, user-centered level and to stress the importance of leadership in ever-changing environments, leadership that supports and creates agile, flexible services and teams instead of rigid processes adhering to obsolete goals. Results: Reduced availability at the physical library locations was one of the strategies to prevent the spread of the COVID-19 virus.
The library staff was encouraged to work from home, so student workers staffed the library's physical locations during that time, leaving the VRS as the only place where patrons could get expert help. The VRS saw a 65% increase in questions asked between spring term 2019 and spring term 2020. The VRS team had to navigate often complicated and fast-changing new routines depending on national guidelines. The team has a strong emphasis on agility in its approach to challenges and opportunities, with methods to evaluate decisions regularly with user experience in mind. Fast decision-making, collecting feedback, an open-minded approach to reviewing rules and processes with both a short-term and a long-term focus, and providing a healthy work environment have been key factors in managing this crisis and learning from it. This rested on a strong sense of ownership of the VRS, well-functioning communication tools, and agile, active communication between team members as well as between the team and the rest of the organization, which served as a second-line support system to aid the VRS team. Moving forward, the VRS has become an important space for communication, interaction and provision of infrastructure, implementing new routines and more extensive availability thanks to the lessons learned during the crisis. The evaluation shows that the virtual environment has become an important addition to the physical spaces, existing in its own right but always in connection and relationship with the library structure as a whole. This shows that the basis of human interaction stays the same while its form morphs and adapts to change, leaving the virtual environment a space of communication and infrastructure with unique opportunities for outreach and the potential to become a staple in patrons' education and learning.
Keywords: virtual reference service, leadership, digital infrastructure, research library
Procedia PDF Downloads 170
690 Artificial Intelligence and Governance in Relevance to Satellites in Space
Authors: Anwesha Pathak
Abstract:
With the increasing number of satellites and space debris, space traffic management (STM) becomes crucial. AI can aid in STM by predicting and preventing potential collisions, optimizing satellite trajectories, and managing orbital slots. Governance frameworks need to address the integration of AI algorithms in STM to ensure safe and sustainable satellite activities. AI and governance play significant roles in the context of satellite activities in space. Artificial intelligence (AI) technologies, such as machine learning and computer vision, can be utilized to process vast amounts of data received from satellites. AI algorithms can analyze satellite imagery, detect patterns, and extract valuable information for applications like weather forecasting, urban planning, agriculture, disaster management, and environmental monitoring. AI can assist in automating and optimizing satellite operations. Autonomous decision-making systems can be developed using AI to handle routine tasks like orbit control, collision avoidance, and antenna pointing. These systems can improve efficiency, reduce human error, and enable real-time responsiveness in satellite operations. AI technologies can be leveraged to enhance the security of satellite systems. AI algorithms can analyze satellite telemetry data to detect anomalies, identify potential cyber threats, and mitigate vulnerabilities. Governance frameworks should encompass regulations and standards for securing satellite systems against cyberattacks and ensuring data privacy. AI can optimize resource allocation and utilization in satellite constellations. By analyzing user demands, traffic patterns, and satellite performance data, AI algorithms can dynamically adjust the deployment and routing of satellites to maximize coverage and minimize latency. Governance frameworks need to address fair and efficient resource allocation among satellite operators to avoid monopolistic practices.
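As a toy illustration of the collision-screening step mentioned above, the closest approach between two objects can be computed from their relative position and velocity over a short window in which the motion is approximately straight-line. This is a deliberately simplified sketch (real conjunction assessment uses full orbit propagation and covariance data); all state vectors below are invented:

```python
import math

def closest_approach(r1, v1, r2, v2):
    """Time and distance of closest approach for two objects moving in
    straight lines, r_i(t) = r_i + v_i * t (toy model, not orbit
    propagation). Returns (t_min, miss_distance)."""
    dr = [b - a for a, b in zip(r1, r2)]   # relative position
    dv = [b - a for a, b in zip(v1, v2)]   # relative velocity
    dv2 = sum(c * c for c in dv)
    if dv2 == 0.0:                          # no relative motion
        t = 0.0
    else:
        # minimize |dr + dv*t|; clamp to the future (t >= 0)
        t = max(0.0, -sum(p * q for p, q in zip(dr, dv)) / dv2)
    sep = [p + q * t for p, q in zip(dr, dv)]
    return t, math.sqrt(sum(c * c for c in sep))

# Two objects on crossing paths (positions in km, velocities in km/s);
# illustrative values only.
t_min, miss = closest_approach((0, 0, 0), (1, 0, 0), (10, 1, 0), (-1, 0, 0))
print(f"closest approach at t = {t_min} s, miss distance = {miss} km")
```

A screening system would flag pairs whose predicted miss distance falls below a threshold for closer analysis.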
Satellite activities involve multiple countries and organizations. Governance frameworks should encourage international cooperation, information sharing, and standardization to address common challenges, ensure interoperability, and prevent conflicts. AI can facilitate cross-border collaborations by providing data analytics and decision-support tools for shared satellite missions and data-sharing initiatives. AI and governance are critical aspects of satellite activities in space. They enable efficient and secure operations, ensure responsible and ethical use of AI technologies, and promote international cooperation for the benefit of all stakeholders involved in the satellite industry.
Keywords: satellite, space debris, traffic, threats, cyber security
Procedia PDF Downloads 76
689 Isolation of Clitorin and Manghaslin from Carica papaya L. Leaves by CPC and Its Quantitative Analysis by QNMR
Authors: Norazlan Mohmad Misnan, Maizatul Hasyima Omar, Mohd Isa Wasiman
Abstract:
Papaya (Carica papaya L., Caricaceae) is a tree mainly cultivated for its fruit in many tropical regions, including Australia, Brazil, China, Hawaii, and Malaysia. Besides the fruit, its leaves, seeds, and latex have also been used traditionally for treating diseases, and they have been reported to possess anti-cancer and anti-malarial properties. The leaves have been reported to contain various chemical compounds such as alkaloids, flavonoids and phenolics, with clitorin and manghaslin among the major flavonoids present. Thus, the aim of this study is to quantify the purity of these isolated compounds (clitorin and manghaslin) by quantitative Nuclear Magnetic Resonance (qNMR) analysis. Only fresh C. papaya leaves were used for the juice extraction procedure, and the juice was subsequently freeze-dried to obtain a dark green powdered extract prior to Centrifugal Partition Chromatography (CPC) separation. The CPC experiments were performed using a two-phase solvent system comprising ethyl acetate/butanol/water (1:4:5, v/v/v). The upper organic phase was used as the stationary phase, and the lower aqueous phase was employed as the mobile phase. Ten fractions were obtained after a one-hour run. Fractions 6 and 8 were identified as clitorin (m/z 739.21 [M-H]-) and manghaslin (m/z 755.21 [M-H]-), respectively, based on LCMS data and full NMR analysis (1H NMR, 13C NMR, HMBC, and HSQC). The 1H-qNMR measurements were carried out using a 400 MHz NMR spectrometer (JEOL ECS 400 MHz, Japan) with deuterated methanol as the solvent. Quantification was performed using the AQARI method (Accurate Quantitative NMR) with deuterated 1,4-bis(trimethylsilyl)benzene (BTMSB) as the internal reference substance. The AQARI protocol covers not only the NMR measurement but also sample preparation, providing higher precision and accuracy than other qNMR methods.
The 90° pulse length and the T1 relaxation times for the compounds and BTMSB were determined prior to quantification to give the best signal-to-noise ratio. Regions containing the two downfield signals from the aromatic part (6.00-6.89 ppm) and the singlet signal (18H) arising from BTMSB (0.63-1.05 ppm) were selected for integration. The purities of clitorin and manghaslin were calculated to be 52.22% and 43.36%, respectively; further purification is needed to increase them. This finding demonstrates the use of qNMR for the quality control and standardization of plant extracts, which can be applied to NMR fingerprinting of other plant-based products with good reproducibility, including cases where commercial standards are not readily available.
Keywords: Carica papaya, clitorin, manghaslin, quantitative Nuclear Magnetic Resonance, Centrifugal Partition Chromatography
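An internal-standard qNMR purity value of the kind reported above reduces to a ratio of per-proton signal integrals, scaled by the molar masses and weighed masses of analyte and standard and by the standard's known purity. A minimal sketch of that arithmetic (the function and all numbers in the example call are illustrative round values, not data from the study):

```python
def qnmr_purity(area_a, protons_a, mw_a, mass_a,
                area_s, protons_s, mw_s, mass_s, purity_s):
    """Purity (%) of analyte 'a' against internal standard 's' from 1H-qNMR:
    ratio of per-proton integrals, scaled by molar masses, weighed masses,
    and the standard's known purity (in %)."""
    per_proton_a = area_a / protons_a
    per_proton_s = area_s / protons_s
    return (per_proton_a / per_proton_s) * (mw_a / mw_s) * (mass_s / mass_a) * purity_s

# Illustrative inputs chosen for a clean result (not measured values):
# analyte signal of 2 protons, standard signal of 18 protons (as for BTMSB).
purity = qnmr_purity(area_a=2.0, protons_a=2, mw_a=100.0, mass_a=10.0,
                     area_s=9.0, protons_s=18, mw_s=200.0, mass_s=5.0,
                     purity_s=100.0)
print(f"purity = {purity:.2f} %")  # -> 50.00 %
```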
Procedia PDF Downloads 496
688 Review on Recent Dynamics and Constraints of Affordable Housing Provision in Nigeria: A Case of Growing Economic Precarity
Authors: Ikenna Stephen Ezennia, Sebnem Onal Hoscara
Abstract:
Successive governments in Nigeria are faced with the pressing problem of how to house an ever-expanding urban population, mostly low-income earners. The question of housing and affordability presents a complex challenge for these governments, as the commodification of housing links it inextricably to markets and capital flows, placing it at the center of the government's agenda. However, the provision of decent and affordable housing for average Nigerians has remained an illusion, despite copious schemes, policies and programs initiated and carried out by various successive governments. Similarly, this phenomenon has been observed in many countries of Africa, largely as a result of economic unpredictability, lack of housing finance and insecurity, among other factors peculiar to a struggling economy. This study reviews recent dynamics and factors challenging the provision and development of affordable housing for the low-income urban populace of Nigeria. Thus, the aim of the study is to present a comprehensive approach to understanding recent trends in the provision of affordable housing for Nigerians. The approach is based on a new paradigm of research, transdisciplinarity: a form of inquiry that crosses the boundaries of different disciplines. The review therefore takes a retrospective look at the various housing development programs, schemes and policies of successive governments of Nigeria within the last few decades and examines recent efforts geared towards eradicating the problems of housing delivery. Sources of data included relevant English-language articles and the results of literature searches of Elsevier ScienceDirect, ISI Web of Knowledge, ProQuest Central, Scopus, and Google Scholar.
The findings reveal that factors such as rapid urbanization, inadequate planning and land use control, lack of adequate and favorable finance, high prices of land, high prices of building materials, harassment of developers by youths and touts, poor urban infrastructure, multiple taxation, and risk sharing are the major hindrances to adequate housing delivery. The results show that the majority of Nigeria's affordable housing schemes, programs and policies are in most cases poorly implemented and abandoned without proper coordination. Consequently, the study concludes that affordable housing delivery strategies in Nigeria epitomize lip-service politics by successive governments, and that the current trend of leaving housing provision to the vagaries of market forces cannot be expected to support affordable housing, especially for the low-income urban populace.
Keywords: affordable housing, housing delivery, national housing policy, urban poor
Procedia PDF Downloads 220
687 Implications of Agricultural Subsidies Since Green Revolution: A Case Study of Indian Punjab
Authors: Kriti Jain, Sucha Singh Gill
Abstract:
Subsidies have been a major part of agricultural policies around the world, and more extensively so since the green revolution in developing countries, for the sake of attaining higher agricultural productivity and achieving food security. But entrenched subsidies lead to distorted incentives and promote inefficiencies in the agricultural sector, threatening the viability of these very subsidies and the sustainability of agricultural production systems, and thereby posing a threat to the livelihood of the farmers and laborers dependent on them. This paper analyzes the economic and ecological sustainability implications of prolonged input and output subsidies in agriculture by studying the case of Indian Punjab, an agriculturally developed state responsible for ensuring food security in the country when it was facing a major food crisis. The paper focuses specifically on the environmentally unsustainable cropping pattern changes resulting from the Minimum Support Price (MSP) and assured procurement, and on the resource-use efficiency and cost implications of the power subsidy for irrigation in Punjab. The study is based on an analysis of both secondary and primary data sources. Using secondary data, a time series analysis was done to capture the changes in Punjab's cropping pattern, water table depth, fertilizer consumption, and electrification of agriculture. This was done to examine the role of the price and output support adopted to encourage the adoption of green revolution technology in changing the cropping structure of the state, resulting in increased input use intensities (especially of groundwater and fertilizers), which harm the ecological balance and decrease factor productivity. An evaluation of the electrification of Punjab agriculture traced the trend in the electricity productivity of agriculture and how free power imposed further pressure on the existing agricultural ecosystem.
Using data collected from a primary survey of 320 farmers in Punjab, the extent of wasteful application of groundwater irrigation, the water productivity of output, electricity usage, and the cost to the exchequer of the irrigation-driven electricity subsidy were estimated for the dominant cropping pattern among farmers. The main findings reveal how, under a subsidy-driven agricultural framework, Punjab has lost area under agro-climatically suitable and staple crops and moved towards a paddy-wheat cropping system that is gnawing away at the state's natural resources: the water table has been declining at a significant rate of 25 cm per year since 1975-76, and excessive and imbalanced fertilizer usage has led to declining soil fertility in the state. With electricity-driven tubewells as the major source of irrigation within a regime of free electricity and water-intensive crop cultivation, there is wasteful application of both irrigation water and electricity in paddy cultivation, burning an unproductive hole in the exchequer's pocket. There is limited access to both agricultural extension services and water-conserving technology, along with policy imbalance, keeping farmers in an intensive and unsustainable production system. Punjab agriculture is witnessing diminishing returns to factors, which, under a business-as-usual scenario, will soon enter the phase of negative returns to factors.
Keywords: cropping pattern, electrification, subsidy, sustainability
Procedia PDF Downloads 186
686 Smart Laboratory for Clean Rivers in India - An Indo-Danish Collaboration
Authors: Nikhilesh Singh, Shishir Gaur, Anitha K. Sharma
Abstract:
Climate change and anthropogenic stress have severely affected ecosystems all over the globe. Indian rivers are under immense pressure, facing challenges like pollution, encroachment, extreme fluctuations in the flow regime, local ignorance and lack of coordination between stakeholders. To counter these issues, a holistic river rejuvenation plan is needed that tests, innovates and implements sustainable solutions in the river space for sustainable river management. Smart Laboratory for Clean Rivers (SLCR), an Indo-Danish collaboration project, provides a living-lab setup that brings all the stakeholders (government agencies, academic and industrial partners and locals) together to engage, learn, co-create and experiment for a clean and sustainable river that lasts for ages. Just as every mega project requires piloting, SLCR has opted for a small catchment of the Varuna River, located in the Middle Ganga Basin in India. In keeping with an integrated approach to river rejuvenation, SLCR embraces various techniques and upgrades. For maintaining flow in the channel in the lean period, Managed Aquifer Recharge (MAR) is a proven technology; in SLCR, high-resolution Floa-TEM lithological data are used in MAR models for better decision-making about MAR structures near the river to enhance river-aquifer exchanges. Furthermore, water quality in the river is a major concern. A city like Varanasi, located along the last stretch of the river, generates almost 260 MLD of domestic waste in the catchment. The existing STP system is working at full efficiency. Instead of installing a new STP, SLCR is upgrading the existing STPs with an IoT-based system that optimizes operation according to nutrient load and energy consumption. SLCR also advocates nature-based solutions, such as reed beds, for drains with low flow.
In the search for micropollutants, SLCR uses fingerprint analysis, employing advanced techniques like chromatography and mass spectrometry to create unique chemical profiles. However, rejuvenation cannot succeed without involving the entire catchment: a holistic water management plan includes storm management, water harvesting structures to efficiently manage the flow of water in the catchment, and the installation of several buffer zones to restrict pollutants entering the river. Similarly, carbon (emission and sequestration) is an important parameter for the catchment; by adopting eco-friendly practices, a ripple effect positively influences the catchment's water dynamics and aids the revival of river systems. SLCR has adopted 4 villages to make them carbon-neutral and water-positive. Moreover, for 24×7 monitoring of the river and the catchment, robust IoT devices are going to be installed to observe river and groundwater quality, groundwater level, river discharge and carbon emissions in the catchment, ultimately feeding the data analytics. On completion, SLCR will deliver a river restoration manual laying out the detailed plan and means of implementation for stakeholders. Lastly, the entire process is planned so that it will be managed by local administrations and stakeholders equipped through capacity-building activities. This holistic approach makes SLCR unique in the field of river rejuvenation.
Keywords: sustainable management, holistic approach, living lab, integrated river management
Procedia PDF Downloads 59
685 (Re)Processing of Nd-Fe-B Permanent Magnets Using Electrochemical and Physical Approaches
Authors: Kristina Zuzek, Xuan Xu, Awais Ikram, Richard Sheridan, Allan Walton, Saso Sturm
Abstract:
Recycling of end-of-life REE-based Nd-Fe-B magnets is an important strategy for reducing the environmental dangers associated with rare-earth mining and overcoming the well-documented supply risks related to the REEs. However, challenges in their reprocessing remain. We report on the possibility of direct electrochemical recycling and reprocessing of Nd-Fe(B)-based magnets. In this investigation, we were first able to electrochemically leach the end-of-life Nd-Fe-B magnet and to electrodeposit Nd-Fe using a 1-ethyl-3-methylimidazolium dicyanamide ([EMIM][DCA]) ionic-liquid-based electrolyte. We observed that Nd(III) could not be reduced independently; however, it can be co-deposited on a substrate with the addition of Fe(II). Using the advanced TEM technique of electron energy-loss spectroscopy (EELS), it was shown that Nd(III) is reduced to Nd(0) during the electrodeposition process. This gave new insight into determining the Nd oxidation state, as X-ray photoelectron spectroscopy (XPS) has certain limitations: the binding energies of metallic Nd (Nd0) and neodymium oxide (Nd₂O₃) are very close, i.e., 980.5-981.5 eV and 981.7-982.3 eV, respectively, making it almost impossible to differentiate between the two states. These new insights into the electrodeposition process represent an important step towards efficient recycling of rare earths in metallic form at mild temperatures, thus providing an alternative to high-temperature molten-salt electrolysis and a step closer to depositing Nd-Fe-based magnetic materials. Further, we propose a new concept of recycling sintered Nd-Fe-B magnets by directly recovering the 2:14:1 matrix phase. Via an electrochemical etching method, we are able to recover pure individual 2:14:1 grains that can be re-used for the production of new types of magnets.
Within the frame of physical reprocessing, we have successfully synthesized new magnets from hydrogen-processed (HDDR) recycled stock using the contemporary technique of pulsed electric current sintering (PECS). The optimal PECS conditions yielded fully dense Nd-Fe-B magnets with a coercivity of Hc = 1060 kA/m, which was boosted to 1160 kA/m by post-PECS thermal treatment. Br and Hc were improved further: increased applied pressures of 100-150 MPa resulted in Br = 1.01 T. We showed that by fine-tuning the PECS and post-annealing it is possible to revitalize Nd-Fe-B end-of-life magnets. By applying advanced TEM, i.e. atomic-scale Z-contrast STEM combined with EDXS and EELS, the resulting magnetic properties were critically assessed against various types of structural and compositional discontinuities down to the atomic scale, which we believe control the microstructure evolution during the PECS processing route.
Keywords: electrochemistry, Nd-Fe-B, pulsed electric current sintering, recycling, reprocessing
Procedia PDF Downloads 156
684 Study on the Rapid Start-up and Functional Microorganisms of the Coupled Process of Short-range Nitrification and Anammox in Landfill Leachate Treatment
Authors: Lina Wu
Abstract:
The excessive discharge of nitrogen in sewage greatly intensifies the eutrophication of water bodies and poses a threat to water quality. Nitrogen pollution control has become a global concern, and the problem of water pollution in China remains serious. As a typical high-ammonia-nitrogen organic wastewater, landfill leachate is more difficult to treat than domestic sewage because of its complex composition, high toxicity, and high concentration. Many studies have shown that autotrophic anammox bacteria in nature can combine nitrite and ammonia nitrogen, without a carbon source, through functional genes to achieve total nitrogen removal, which is very suitable for removing nitrogen from leachate. In addition, the process saves much of the aeration energy consumed by the traditional nitrogen removal process, so anammox plays an important role in nitrogen conversion and energy saving. A process coupling short-range (partial) nitrification with anammox ensures total nitrogen removal and improves removal efficiency, meeting society's need for an ecologically friendly and cost-effective nutrient removal treatment technology. A continuous-flow process for treating mature (late-stage) leachate [an up-flow anaerobic sludge blanket reactor (UASB), anoxic/oxic (A/O) and anaerobic ammonia oxidation reactor (ANAOR, or anammox reactor)] has been developed to achieve autotrophic deep nitrogen removal. In this process, optimal parameters such as hydraulic retention time and nitrification flow rate have been obtained and applied to achieve rapid start-up, stable operation and high removal efficiency of the process system. Furthermore, identifying the characteristics of the microbial community during start-up of the anammox process system and analyzing its microbial ecological mechanism provide a basis for enriching the anammox microbial community under high environmental stress.
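The pairing of short-range nitrification with anammox can be illustrated with the commonly cited anammox stoichiometry (roughly 1.32 mol nitrite consumed and 0.26 mol nitrate produced per mole of ammonium consumed). A back-of-the-envelope sketch using those textbook coefficients, not values measured in this study:

```python
# Commonly cited anammox stoichiometric ratios (per mole NH4-N consumed);
# textbook values, not measurements from this study.
NO2_PER_NH4 = 1.32   # nitrite consumed
NO3_PER_NH4 = 0.26   # nitrate produced

# Partial nitrification should oxidize just enough ammonium to nitrite
# that the remaining NH4 and the produced NO2 match the anammox ratio:
#   f / (1 - f) = 1.32  =>  f = 1.32 / 2.32
f_nitritation = NO2_PER_NH4 / (1.0 + NO2_PER_NH4)

# Residual nitrogen leaves as nitrate produced by anammox, which caps the
# total-nitrogen (TN) removal achievable by an ideal PN/A system.
tn_removal = 1.0 - NO3_PER_NH4 * (1.0 - f_nitritation)

print(f"target nitritation fraction: {f_nitritation:.1%}")
print(f"ideal TN removal ceiling:   {tn_removal:.1%}")
```

This is why PN/A systems aim to convert only a bit over half of the influent ammonium to nitrite, rather than fully nitrifying it.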
One study developed partial nitrification-anammox (PN/A) using an internal circulation (IC) system and a biological aerated filter (BAF) biofilm reactor (IBBR), where the amount of water treated is closer to that of landfill leachate. However, high-throughput sequencing still needs to be applied to analyze the changes in microbial diversity of this system, and the related functional genera and functional genes under optimal conditions, providing a theoretical and, further, a practical basis for the engineering application of the novel anammox system in biogas slurry treatment and resource utilization.
Keywords: nutrient removal and recovery, leachate, anammox, partial nitrification
Procedia PDF Downloads 51
683 MiRNA Expression Profile is Different in Human Amniotic Mesenchymal Stem Cells Isolated from Obese with Respect to Normal Weight Women
Authors: Carmela Nardelli, Laura Iaffaldano, Valentina Capobianco, Antonietta Tafuto, Maddalena Ferrigno, Angela Capone, Giuseppe Maria Maruotti, Maddalena Raia, Rosa Di Noto, Luigi Del Vecchio, Pasquale Martinelli, Lucio Pastore, Lucia Sacchetti
Abstract:
Maternal obesity and nutrient excess in utero increase the risk of future metabolic diseases in adult life. The mechanisms underlying this process are probably based on genetic and epigenetic alterations and changes in foetal nutrient supply. In mammals, the placenta is the main interface between foetus and mother: it regulates intrauterine development, modulates adaptive responses to suboptimal in utero conditions, and is also an important source of human amniotic mesenchymal stem cells (hA-MSCs). We previously highlighted a specific microRNA (miRNA) profile in amnion from obese (Ob) pregnant women; here we compared the miRNA expression profiles of hA-MSCs isolated from Ob and control (Co) women, aiming to identify alterations in metabolic pathways that could predispose the newborn to the obese phenotype. Methods: We isolated, at delivery, hA-MSCs from the amnion of 16 Ob and 7 Co women with pre-pregnancy body mass index (mean/SEM) of 40.3/1.8 and 22.4/1.0 kg/m2, respectively. hA-MSCs were phenotyped by flow cytometry. Globally, 384 miRNAs were evaluated by the TaqMan Array Human MicroRNA Panel v 1.0 (Applied Biosystems). With the TargetScan program, we selected the target genes of the miRNAs differentially expressed in Ob- vs Co-hA-MSCs; further, with the KEGG database, we selected the statistically significant biological pathways. Results: The immunophenotypic characterization confirmed the mesenchymal origin of the isolated hA-MSCs. A large percentage of the tested miRNAs, about 61.4% (232/378), was expressed in hA-MSCs, whereas 38.6% (146/378) was not. Most of the expressed miRNAs (89.2%, 207/232) did not differ between Ob- and Co-hA-MSCs and were not further investigated. Conversely, 4.8% of the miRNAs (11/232) were higher and 6.0% (14/232) were lower in Ob- vs Co-hA-MSCs. Interestingly, 7/232 miRNAs were obesity-specific, being expressed only in hA-MSCs isolated from obese women.
Bioinformatics showed that these miRNAs significantly regulate (P<0.001) genes belonging to several metabolic pathways, i.e. MAPK signalling, actin cytoskeleton, focal adhesion, axon guidance, insulin signalling, etc. Conclusions: Our preliminary data highlight an altered miRNA profile in Ob- vs Co-hA-MSCs and suggest that an epigenetic, miRNA-based mechanism of gene regulation could affect pathways involved in placental growth and function, thereby potentially increasing the newborn's risk of metabolic diseases in adult life.
Keywords: hA-MSCs, obesity, miRNA, biosystem
Procedia PDF Downloads 528
682 Between Leader-Member Exchange and Toxic Leadership: A Theoretical Review
Authors: Aldila Dyas Nurfitri
Abstract:
Nowadays, leadership has become one of the main issues in forming organizations and even countries. The concept of a social contract between leaders and subordinates is one explanation for the leadership process. The interests of the two parties are not always the same, but they must work together to achieve both sets of goals. From this concept comes the Leader-Member Exchange Theory, well known as LMX Theory, which assumes that leadership is a process of reciprocal social interaction between leaders and their subordinates. High-quality LMX relationships are characterized by strong support, informal supervision, confidence, and enabled power negotiation, whereas low-quality LMX relationships are described by low support, extensive formal supervision, little or no participation of subordinates in decision-making, and less confidence in, and attention from, the leader. The application of a formal supervision system in low-LMX behavior is in line with the strict controls of the toxic leadership model. Toxic leaders must feel in control of all aspects of the organization at all times. Leaders with this leadership model do not give autonomy to their staff. This behavior causes stagnation and creates a resistant organizational culture. In Indonesia, the pattern of toxic leadership has since evolved into a rapidly growing dysfunctional system. One consequence is the emergence of corrupt behavior. According to Kellerman, corruption is defined as a pattern in which leaders and some subordinates lie, cheat, or steal to a degree that goes beyond the norm, putting self-interest above the common good. Corruption data for Indonesia from ICW research in 2012 showed that the local government sector ranked first with 177 cases, followed by state or local enterprises with 41 cases.
LMX is defined as the quality of the relationship between superiors and subordinates, which has implications for the effectiveness and progress of the organization. This theory assumes that leadership is a process of reciprocal social interaction between leaders and their followers, characterized by a number of dimensions, such as affection, loyalty, contribution, and professional respect. Toxic leadership, meanwhile, is dysfunctional leadership in an organization led by someone who is unable to adjust, lacks integrity, and is malevolent and full of discontent, marked by a number of characteristics, such as self-centeredness, exploiting others, controlling behavior, disrespecting others, suppressing employee innovation and creativity, and inadequate emotional intelligence. Leaders who score high on self-centeredness, exploitation of others, controlling behavior, and disrespect of others tend to have lower-quality LMX relationships with subordinates than leaders who score low on these traits. The suppression of employee innovation and creativity and inadequate emotional intelligence, by contrast, tend not to directly affect the quality of LMX.
Keywords: leader-member exchange, toxic leadership, leadership
Procedia PDF Downloads 487
681 Digital Transformation of Lean Production: Systematic Approach for the Determination of Digitally Pervasive Value Chains
Authors: Peter Burggräf, Matthias Dannapfel, Hanno Voet, Patrick-Benjamin Bök, Jérôme Uelpenich, Julian Hoppe
Abstract:
The increasing digitalization of value chains can help companies handle the rising complexity of their processes and thereby reduce the steadily increasing planning and control effort needed to raise performance limits. Due to technological advances, companies face the challenge of creating smart value chains to improve productivity, handle increasing time and cost pressure, and meet the need for individualized production. Therefore, companies need to ensure quick and flexible decisions to create self-optimizing processes and, consequently, to make their production more efficient. Lean production, the most commonly used paradigm for complexity reduction, reaches its limits when it comes to variant-flexible production and constantly changing market and environmental conditions. To lift the performance limits inherent in current value chains, new methods and tools must be applied. Digitalization provides the potential to derive these new methods and tools. However, companies lack the experience to harmonize different digital technologies, and there is no practicable framework that guides the transformation of current value chains into digitally pervasive value chains. Current research shows that a connection between lean production and digitalization exists; this link is based on factors such as people, technology, and organization. In this paper, the introduced method for the determination of digitally pervasive value chains takes the factors of people, technology, and organization into account and extends existing approaches by a new dimension. It is the first systematic approach for the digital transformation of lean production and consists of four steps: The first step, ‘target definition’, describes the target situation and defines the depth of the analysis with regard to the inspection area and the level of detail.
The second step, ‘analysis of the value chain’, verifies the ‘lean-ability’ of processes and places a special focus on the integration capacity of digital technologies in order to raise the limits of lean production. Furthermore, the ‘digital evaluation process’ ensures the usefulness of digital adaptations regarding their practicability and their integrability into the existing production system. Finally, the method defines actions to be performed based on the evaluation process and in accordance with the target situation. As a result, the validation and optimization of the proposed method in a German company from the electronics industry show that the digital transformation of current value chains based on lean production raises their inbuilt performance limits.
Keywords: digitalization, digital transformation, Industrie 4.0, lean production, value chain
Procedia PDF Downloads 313
680 The Invaluable Contributions of Radiography and Radiotherapy in Modern Medicine
Authors: Sahar Heidary
Abstract:
Radiography and radiotherapy have emerged as crucial pillars of modern medical practice, revolutionizing diagnostics and treatment for a myriad of health conditions. This abstract highlights the pivotal roles of radiography and radiotherapy in healthcare and society. Radiography, a non-invasive imaging technique, has significantly advanced medical diagnostics by enabling the visualization of internal structures and abnormalities within the human body. With the advent of digital radiography, clinicians can obtain high-resolution images promptly, leading to faster diagnoses and informed treatment decisions. Radiography plays a pivotal role in detecting fractures, tumors, infections, and various other conditions, allowing for timely interventions and improved patient outcomes. Moreover, its widespread accessibility and cost-effectiveness make it an indispensable tool in healthcare settings worldwide. Radiotherapy, on the other hand, a branch of medical science that utilizes high-energy radiation, has become an integral component of cancer treatment and management. By precisely targeting and damaging cancerous cells, radiotherapy offers a potent strategy to control tumor growth and, in many cases, leads to cancer eradication. Additionally, radiotherapy is often used in combination with surgery and chemotherapy, providing a multifaceted approach to combat cancer comprehensively. Continuous advancements in radiotherapy techniques, such as intensity-modulated radiotherapy and stereotactic radiosurgery, have further improved treatment precision while minimizing damage to surrounding healthy tissues. Furthermore, radiography and radiotherapy have demonstrated their worth beyond oncology. Radiography is instrumental in guiding various medical procedures, including catheter placement, joint injections, and dental evaluations, reducing complications and enhancing procedural accuracy.
Radiotherapy, in turn, finds applications in non-cancerous conditions such as benign tumors, vascular malformations, and certain neurological disorders, offering therapeutic options for patients who may not benefit from traditional surgical interventions. In conclusion, radiography and radiotherapy stand as indispensable tools in modern medicine, driving transformative improvements in patient care and treatment outcomes. Their ability to diagnose, treat, and manage a wide array of medical conditions underscores their value in medical practice. As technology continues to advance, radiography and radiotherapy will undoubtedly play an ever more significant role in shaping the future of healthcare, ultimately saving lives and enhancing the quality of life for countless individuals worldwide.
Keywords: radiology, radiotherapy, medical imaging, cancer treatment
Procedia PDF Downloads 69
679 The Positive Impact of Wheelchair Service Provision on the Health and Overall Satisfaction of Wheelchair Users with the Devices
Authors: Archil Undilashvili, Ketevan Stvilia, Dustin Gilbreath, Giorgi Dzneladze, Gordon Charchward
Abstract:
Introduction: In recent years, diverse types of wheelchairs, both locally produced and imported, have been made available on the Georgian market. Some types of wheelchairs are sold together with a service package, while others, including the locally produced ones supplied through the State Program, come without adjustment and maintenance service packages. Within the USAID Physical Rehabilitation Project in Georgia, a study was conducted to assess the impact of wheelchair service provision in line with WHO guidelines on the health and overall satisfaction of wheelchair users in Georgia. Methodology: A cross-sectional survey was conducted in May 2021. A structured questionnaire was used for telephone interviews that, along with socio-demographic characteristics, included questions assessing the accessibility, availability, timeliness, cost, and quality of the wheelchair services received. Out of 1,060 individuals listed in the census of wheelchair users, 752 were available for interview; 552 wheelchair users (31%) or their caregivers (69%) agreed to participate in the survey, an actual response rate of 73.4%. In addition to descriptive statistics, the study used multivariate matching of wheelchair users who received wheelchair services with those who did not (control group). In addition, to evaluate satisfaction with service provision, respondents were asked to rate the services. Findings: The majority (67%) of the wheelchair users included in the survey were male. The average age of participants was 43. The three most frequently named reasons for using a wheelchair were cerebral palsy (29%), followed by stroke (18%) and amputation (12%). Users had had their current chair for four years on average. Overall, 60% of respondents reported that they were assessed before being provided with a wheelchair, but only half of them reported that their preferences and needs were considered.
Only 13% of respondents received services in line with WHO guidelines, and only 22% of wheelchair users received training when they received their current chair. 16% of participants said they had follow-up services, and 41% received adjustment services after receiving the chair. A slight majority (56%) of participants were satisfied with the quality of service provision and with service provision overall. Similarly, 55% were satisfied with the accessibility of service provision. A slightly larger majority (61%) were satisfied with the timeliness of service provision. The matching analysis suggests that users who received services in line with WHO guidelines were more satisfied with their chairs (a difference of 17 points on a 0-100 scale) and were four percentage points less likely to have health problems attributed to the chair. The regression analysis provides a similar finding: a 21-point increase in satisfaction attributable to services. Conclusion: The provision of wheelchair services in line with WHO guidelines and with follow-up services is likely to have a positive impact on the daily lives of wheelchair users in Georgia. Wheelchair services should be institutionalized as a standard component of wheelchair provision in Georgia.
Keywords: physical rehabilitation, wheelchair users, persons with disabilities, wheelchair production
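The regression estimate reported above (roughly a 21-point satisfaction gain attributable to services) can be illustrated with ordinary least squares on simulated data. Every number below is invented for the sketch, not taken from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
# 1 = user received wheelchair services in line with WHO guidelines (simulated)
services = rng.integers(0, 2, size=n)
# Simulated satisfaction on a 0-100 scale with an assumed 21-point service effect
satisfaction = 55 + 21 * services + rng.normal(0, 10, size=n)

# OLS with an intercept and a treatment dummy
X = np.column_stack([np.ones(n), services])
beta, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
effect = beta[1]  # recovers roughly 21 points
```

With matched groups, the coefficient on the treatment dummy is the estimated average effect of service provision, which is the quantity the study reports.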
Procedia PDF Downloads 106
678 Pond Site Diagnosis: Monoclonal Antibody-Based Farmer Level Tests to Detect the Acute Hepatopancreatic Necrosis Disease in Shrimp
Authors: B. T. Naveen Kumar, Anuj Tyagi, Niraj Kumar Singh, Visanu Boonyawiwat, A. H. Shanthanagouda, Orawan Boodde, K. M. Shankar, Prakash Patil, Shubhkaramjeet Kaur
Abstract:
Early mortality syndrome (EMS)/acute hepatopancreatic necrosis disease (AHPND) has emerged as a major obstacle for shrimp farming around the world. It is caused by a strain of Vibrio parahaemolyticus. A key preventive and control measure is early and rapid detection of the pathogen in broodstock and post-larvae and monitoring of the shrimp during the culture period. Polymerase chain reaction (PCR)-based early detection methods are good, but they are costly, time-consuming, and require a sophisticated laboratory. The present study was conducted to develop a simple, sensitive, and rapid farmer-level diagnostic kit for the reliable detection of AHPND in shrimp. A panel of monoclonal antibodies (MAbs) was raised against the recombinant PirB protein (rPirB). First, an immunodot was developed using MAbs G3B8 and G3H2, which showed specific reactivity to purified rPirB protein with no cross-reactivity to other shrimp pathogens (AHPND-free Vibrio parahaemolyticus (Indian strains), V. anguillarum, WSSV, Aeromonas hydrophila, and Aphanomyces invadans). The immunodot developed using MAb G3B8 was more sensitive than that using MAb G3H2. However, the immunodot takes almost 2.5 hours to complete, with several hands-on steps. Therefore, a flow-through assay (FTA) was developed using a plastic cassette containing a nitrocellulose membrane with absorbent pads below. The sample was dotted in the test zone on the nitrocellulose membrane, followed by the continuous addition of five solutions in the order of (i) blocking buffer (BSA), (ii) primary antibody (MAb), (iii) washing solution, (iv) secondary antibody, and (v) chromogen substrate (TMB). Clear purple dots against a white background were considered positive reactions. The FTA developed using MAb G3B8 was more sensitive than that using MAb G3H2. In the FTA, the two MAbs showed specific reactivity to purified rPirB protein and not to other shrimp pathogens.
The FTA is simple enough for farmer/field-level use, sensitive, and rapid, requiring only 8-10 min to complete. The tests can be developed into kits, which will be ideal for use in biosecurity as a first line of screening (at the port or pond site) and during monitoring and surveillance programmes, supporting good management practices to reduce the risk of the disease.
Keywords: acute hepatopancreatic necrosis disease, AHPND, flow-through assay, FTA, farmer level, immunodot, pond site, shrimp
Procedia PDF Downloads 174
677 Neoliberal Settler City: Socio-Spatial Segregation, Livelihood of Artists/Craftsmen in Delhi
Authors: Sophy Joseph
Abstract:
The study uses the concept of the ‘settler city’ to understand the nature of peripheralization that a neoliberal city initiates. The settler city designs powerless communities without inherent rights, title, and sovereignty. Kathputli Colony, home to generations of artists/craftsmen who have kept the heritage of arts/crafts alive, has seen its population evicted from urban space. The proposed study, ‘Neoliberal Settler City: Socio-spatial segregation and livelihood of artists/craftsmen in Delhi’, would problematize the settler city as a colonial technology. The colonial regime has ‘erased’ the ‘unwanted’ as primitive and swept them to the peripheries of the city. This study would also highlight how structural change in the political economy has undermined their crafts/arts by depriving them of practicing/performing them with dignity in urban space. The interconnections between citizenship and the In-Situ Public-Private Partnership in the Kathputli rehabilitation have become part of academic exercise. However, a comprehensive study connecting the inherent characteristics of the neoliberal settler city, the trajectory of the political economy of unorganized workers (artists/craftsmen), and the legal containment and exclusion leading to the dispossession and marginalization of communities from the city site is relevant for contextualizing the trauma of spatial segregation. This study would deal with the politically, culturally, socially, and economically dominant behavior of the structure in state formation, the accumulation of property, and the design of urban space, fueled by the segregation of marginalized/unorganized communities and the disowning of the ‘footloose proletariat’, the migrant workforce. The methodology involves qualitative research among the communities, and the fieldwork (oral testimonies and personal accounts) becomes the primary material for theorizing the realities. Secondary materials, in the form of archival records from various archives on the historical evolution of Delhi as a planned city, would also be used.
As the study also adopts a ‘narrative approach’ within its qualitative design, the life experiences of craftsmen/artists as performers and the emotional trauma of losing their livelihood and space form an important record for understanding the instability and insecurity that marginalization and development impose on the urban poor. The study attempts to prove that although there was a change in political tradition from colonialism to constitutional democracy, the new state still follows the policy of segregation and dispossession of the communities. It is this dispossession from space, deprivation of livelihood, and non-consultative process of rehabilitation that reflect the neoliberal approach of the state and constitute the critical findings of the study. The study would apply a critical spatial lens, analyzing ethnographic and sociological data, representational practices, and development debates to understand the ‘urban otherization’ directed against craftsmen/artists. It seeks to develop a conceptual framework for understanding the resistance of communities against the primitivity attached to them and for decolonizing the city. This would help to contextualize the demand for declaring Kathputli Colony a ‘heritage artists village’. The conceptualization and contextualization would help to argue for the communities’ right to the city and their collective rights to property, services, and self-determination. The aspirations of the communities also help to draw a normative orientation towards decolonization. It is important to study this site within the framework of ‘inclusive cities’ because cities are rarely noted as important sites of ‘community struggles’.
Keywords: neoliberal settler city, socio-spatial segregation, the livelihood of artists/craftsmen, dispossession of indigenous communities, urban planning and cultural uprooting
Procedia PDF Downloads 130
676 A Theragnostic Approach for Alzheimer’s Disease Focused on Phosphorylated Tau
Authors: Tomás Sobrino, Lara García-Varela, Marta Aramburu-Núñez, Mónica Castro, Noemí Gómez-Lado, Mariña Rodríguez-Arrizabalaga, Antía Custodia, Juan Manuel Pías-Peleteiro, José Manuel Aldrey, Daniel Romaus-Sanjurjo, Ángeles Almeida, Pablo Aguiar, Alberto Ouro
Abstract:
Introduction: Alzheimer’s disease (AD) and other tauopathies are primary causes of dementia, causing progressive cognitive deterioration that entails serious repercussions for patients’ performance of daily tasks. Currently, there is no effective approach for the early diagnosis and treatment of AD and tauopathies. This study suggests a theragnostic approach based on the importance of phosphorylated tau protein (p-Tau) in the early pathophysiological processes of AD. We have developed a novel theragnostic monoclonal antibody (mAb) to provide both diagnostic and therapeutic effects. Methods/Results: We developed a p-Tau mAb, which was conjugated with deferoxamine for radiolabeling with Zirconium-89 (89Zr) for PET imaging, as well as with fluorescent dyes for immunofluorescence assays. The p-Tau mAb was evaluated in vitro for toxicity by MTT assay, LDH activity, propidium iodide/Annexin V assay, caspase-3, and mitochondrial membrane potential (MMP) assays in both a mouse endothelial cell line (bEnd.3) and cortical primary neuron cultures. Importantly, no toxic effects were detected, even at p-Tau mAb concentrations above 100 ug/mL. In vivo experiments in the tauopathy mouse model (PS19) showed that the 89Zr-pTau-mAb and 89Zr-Fragments-pTau-mAb are stable in circulation for up to 10 days without toxic effects. However, less than 0.2% reached the brain, so further strategies have to be designed for crossing the blood-brain barrier (BBB). Moreover, an intraparenchymal treatment strategy was carried out. The PS19 mice were operated on to implant osmotic pumps (Alzet 1004) at two different ages, 4 and 7 months, providing one month of controlled release of either the B6 antibody or an IgG1 control antibody. We demonstrated that B6-treated mice maintained their motor and memory abilities significantly better than IgG1-treated mice. In addition, we observed a significant reduction in p-Tau deposits in the brain.
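The 10-day circulation window quoted above is consistent with the physical decay of the 89Zr label. A quick check, assuming the literature half-life of 89Zr (about 78.4 hours, a value not stated in the abstract):

```python
# Fraction of initial 89Zr activity remaining after a 10-day imaging window.
HALF_LIFE_H = 78.4     # assumed 89Zr half-life in hours (literature value)
t_hours = 10 * 24      # the 10-day circulation period reported above

remaining = 0.5 ** (t_hours / HALF_LIFE_H)
print(f"{remaining:.1%}")  # roughly 12% of the label is still active
```

About one-eighth of the initial activity survives at day 10, which is why long-lived 89Zr (rather than a short-lived PET isotope) is the usual choice for antibody imaging.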
Conclusions/Discussion: A theragnostic pTau-mAb was developed. We demonstrated that our p-Tau mAb recognizes very early pathological forms of p-Tau via non-invasive techniques, such as PET. In addition, the p-Tau mAb showed no toxic effects, both in vitro and in vivo. Although the p-Tau mAb is stable in circulation, only 0.2% reaches the brain. However, direct intraventricular treatment significantly reduces cognitive impairment in Alzheimer’s animal models, as well as the accumulation of toxic p-Tau species.
Keywords: Alzheimer’s disease, theragnosis, tau, PET, immunotherapy, tauopathies
Procedia PDF Downloads 70
675 The Effect of Different Strength Training Methods on Muscle Strength, Body Composition and Factors Affecting Endurance Performance
Authors: Shaher A. I. Shalfawi, Fredrik Hviding, Bjornar Kjellstadli
Abstract:
The main purpose of this study was to measure the effect of two different strength training methods on muscle strength, muscle mass, fat mass, and endurance factors. Fourteen physical education students agreed to participate in this study. The participants were randomly divided into three groups: a traditional training group (TTG), a cluster training group (CTG), and a control group (CG). The TTG consisted of 4 participants aged (mean ± SD) 22.3 ± 1.5 years, with body mass 79.2 ± 15.4 kg and height 178.3 ± 11.9 cm. The CTG consisted of 5 participants aged 22.2 ± 3.5 years, with body mass 81.0 ± 24.0 kg and height 180.2 ± 12.3 cm. The CG consisted of 5 participants aged 22 ± 2.8 years, with body mass 77 ± 19 kg and height 174 ± 6.7 cm. The participants underwent a hypertrophy strength training program twice a week for 8 weeks, consisting of 4 sets of 10 reps at 70% of one-repetition maximum (1RM) in the barbell squat and barbell bench press. The CTG performed 2 x 5 reps with 10 s recovery between repetitions and 50 s recovery between sets, while the TTG performed 4 sets of 10 reps with 90 s recovery between sets. Pre- and post-tests were administered to assess body composition (weight, muscle mass, and fat mass), 1RM (bench press and barbell squat), and a laboratory endurance test (Bruce protocol). The instruments used to collect the data were a Tanita BC-601 scale (Tanita, Illinois, USA), a Woodway treadmill (Woodway, Wisconsin, USA), and a Vyntus CPX breath-to-breath system (Jaeger, Hoechberg, Germany). Analysis was conducted on all measured variables, including time to peak VO2, peak VO2, heart rate (HR) at peak VO2, respiratory exchange ratio (RER) at peak VO2, and the number of breaths per minute. The results indicate an increase in 1RM performance after 8 weeks of training. The change in 1RM squat was 30 ± 3.8 kg for the TTG, 28.6 ± 8.3 kg for the CTG, and 10.3 ± 13.8 kg for the CG. Similarly, the change in 1RM bench press was 9.8 ± 2.8 kg for the TTG, 7.4 ± 3.4 kg for the CTG, and 4.4 ± 3.4 kg for the CG.
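The rest structures of the two protocols above can be compared with simple arithmetic. This sketch assumes the cluster description means each of the four sets was split into two clusters of five reps with 10 s rest within the set and 50 s rest between sets, an interpretation not stated explicitly in the abstract:

```python
SETS = 4
REPS_PER_SET = 10

# Traditional protocol (TTG): 4 x 10 with 90 s recovery between sets
ttg_rest_s = (SETS - 1) * 90

# Cluster protocol (CTG), assumed interpretation: each set split as 2 x 5
# with one 10 s intra-set pause, and 50 s recovery between the 4 sets
ctg_rest_s = SETS * 10 + (SETS - 1) * 50

total_reps = SETS * REPS_PER_SET  # identical training volume per exercise
print(ttg_rest_s, ctg_rest_s, total_reps)  # 270 190 40
```

Under this reading, both groups perform the same 40 reps per exercise, so the comparison isolates the rest distribution rather than the training volume.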
The within-group analysis of the oxygen consumption measured during the incremental exercise indicated that the TTG had a statistically significant increase only in RER, from 1.16 ± 0.04 to 1.23 ± 0.05 (P < 0.05). The CTG had a statistically significant improvement in HR at peak VO2, from 186 ± 24 to 191 ± 12 beats per minute (P < 0.05), and in RER at peak VO2, from 1.11 ± 0.06 to 1.18 ± 0.05 (P < 0.05). Finally, the CG had a statistically significant increase only in RER at peak VO2, from 1.11 ± 0.07 to 1.21 ± 0.05 (P < 0.05). The between-group analysis showed no statistically significant differences between the groups in any of the measured variables from the incremental oxygen consumption test or in the changes in muscle mass, fat mass, and weight (kg). The results indicate a similar effect of hypertrophy strength training on untrained subjects irrespective of the training method used. Because there were no notable changes in body composition measures, the results suggest that the improvements in performance observed in all groups are most probably due to neuromuscular adaptation to training.
Keywords: hypertrophy strength training, cluster set, Bruce protocol, peak VO2
Procedia PDF Downloads 250
674 Case Report: A Case of Confusion with Review of Sedative-Hypnotic Alprazolam Use
Authors: Agnes Simone
Abstract:
A 52-year-old male with unknown psychiatric and medical history was brought to the psychiatric emergency room by ambulance directly from jail. He had been detained for three weeks for possession of a firearm while intoxicated. On initial evaluation, the patient was unable to provide a reliable history. He presented with odd jerking movements of his extremities and catatonic features, including mutism and stupor. His vital signs were stable. The patient was transferred to the medical emergency department for work-up of altered mental status. Due to suspicion of opioid overdose, the patient was given naloxone (Narcan) with no improvement. Laboratory work-up included complete blood count, comprehensive metabolic panel, thyroid stimulating hormone, vitamin B12, folate, magnesium, rapid plasma reagin, HIV, blood alcohol level, aspirin and Tylenol blood levels, urine drug screen, and urinalysis, all of which were negative. Head CT and chest X-ray were also negative. With this negative work-up, the medical team concluded there was no organic etiology and requested inpatient psychiatric admission. Upon re-evaluation by psychiatry, it was evident that the patient continued to have an altered mental status. Of note, the medical team did not include substance withdrawal in the differential diagnosis because of the stable vital signs and negative urine drug screen. The psychiatry team decided to check California's prescription drug monitoring program (CURES) and discovered that the patient was prescribed the benzodiazepine alprazolam (Xanax) 2 mg BID, a sedative-hypnotic, and the opioid hydrocodone/acetaminophen 10 mg/325 mg (Norco) QID. After a thorough chart review, his daughter's contact information was found, and she confirmed his benzodiazepine and opioid use, with recent escalation and misuse.
Given this collateral information, his current symptoms, the negative urine drug screen, and the abrupt discontinuation of his medications while incarcerated, it was determined that the patient was experiencing alprazolam withdrawal. After admission to the medical unit and two doses of alprazolam 2 mg, the patient's mental status, alertness, and orientation improved, but he had no memory of the events that led to his hospitalization. He was discharged with a limited supply of alprazolam and close follow-up to arrange a taper. Accompanying this case report, a qualitative review of presentations of alprazolam withdrawal was completed. This case and the review highlight: (1) Alprazolam withdrawal can occur at low doses and within just one week of use. (2) Alprazolam withdrawal can present without any vital sign instability. (3) Alprazolam withdrawal does not respond to short-acting benzodiazepines but does respond to certain long-acting benzodiazepines, due to alprazolam's unique chemical structure. (4) Alprazolam withdrawal is distinct from, and more severe than, other benzodiazepine withdrawals. This case also highlights: (1) The importance of physician utilization of drug monitoring programs; this case, in particular, relied on California's program. (2) The importance of obtaining collateral information, especially when the patient is unable to provide a reliable history. (3) The importance of including substance intoxication and withdrawal in the differential diagnosis even when the urine drug screen is negative; the withdrawal toxidrome can be delayed. (4) The importance of discussing the addiction and withdrawal risks of medications with patients.
Keywords: addiction risk of benzodiazepines, alprazolam withdrawal, altered mental status, benzodiazepines, drug monitoring programs, sedative-hypnotics, substance use disorder
Procedia PDF Downloads 138
673 A Paradigm Shift in the Cost of Illness of Type 2 Diabetes Mellitus over a Decade in South India: A Prevalence Based Study
Authors: Usha S. Adiga, Sachidanada Adiga
Abstract:
Introduction: Diabetes mellitus (DM) is one of the most common non-communicable diseases and imposes a large economic burden on the global healthcare system. Cost of illness studies in India have assessed the healthcare cost of DM but have certain limitations due to a lack of standardization of the methods used, improper documentation of data, lack of follow-up, etc. The objective of the study was to estimate the cost of illness of uncomplicated versus complicated type 2 diabetes mellitus in Coastal Karnataka, India. The study also aimed to determine the trend in the cost of illness of the disease over a decade. Methodology: A prevalence-based, bottom-up study was carried out in two tertiary care hospitals located in Coastal Karnataka after ethical approval. Direct medical costs, such as annual laboratory costs, pharmacy costs, consultation charges, hospital bed charges, and surgical/intervention costs, of 238 and 340 diabetic patients, respectively, from the two hospitals were obtained from the medical records sections. Patients were divided into six groups: uncomplicated diabetes, diabetic retinopathy (DR), nephropathy (DN), neuropathy (DNeu), diabetic foot (DF), and ischemic heart disease (IHD). The costs incurred in 2008 and 2017 in these groups were compared to study the trend in the cost of illness. The Kruskal-Wallis test followed by Dunn's test was used to compare median costs between the groups, and Spearman's correlation test was used for correlation studies. Results: Uncomplicated patients had significantly lower costs (p < 0.0001) compared to the other groups. Patients with IHD had the highest medical expenses (p < 0.0001), followed by DN and DF (p < 0.0001). Annual medical costs were 1.8, 2.76, 2.77, 1.76, and 4.34 times higher in retinopathy, nephropathy, diabetic foot, neuropathy, and IHD patients, respectively, compared to the costs incurred in managing uncomplicated diabetics. Other costs showed a similar rising pattern.
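The nonparametric tests named above can be illustrated on synthetic cost data (all numbers invented; Dunn's post-hoc test is omitted for brevity). A minimal sketch using only NumPy:

```python
import numpy as np

def rankdata(a):
    """1-based ranks with ties averaged (midranks)."""
    a = np.asarray(a, dtype=float)
    order = np.argsort(a, kind="stable")
    ranks = np.empty(len(a))
    s = a[order]
    i = 0
    while i < len(a):
        j = i
        while j + 1 < len(a) and s[j + 1] == s[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2 + 1  # mean of positions i+1..j+1
        i = j + 1
    return ranks

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction) for k independent groups."""
    pooled = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    n = len(pooled)
    ranks = rankdata(pooled)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += r.sum() ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    return np.corrcoef(rankdata(x), rankdata(y))[0, 1]

# Synthetic annual costs (arbitrary units) for three of the study's groups
uncomplicated = [10, 12, 14]
nephropathy = [25, 28, 30]
ihd = [40, 44, 48]
h = kruskal_h(uncomplicated, nephropathy, ihd)  # 7.2 > 5.99 (chi2, df=2, a=0.05)

# Synthetic duration-of-diabetes vs cost: a monotonic relation gives rho = 1
rho = spearman_rho([1, 3, 5, 8, 12], [100, 150, 210, 300, 420])
```

Comparing H against the chi-squared critical value with k-1 degrees of freedom gives the group-difference test; the positive rho mirrors the duration-cost correlation the study reports.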
A positive correlation was observed between the costs incurred and the duration of diabetes, and a negative correlation between glycemic status and the cost incurred. The cost incurred in the management of DM in 2017 was 1.4-2.7 times higher than in 2008. Conclusion: It is evident from the study that the economic burden of diabetes mellitus is substantial. It poses a significant financial burden on the healthcare system, the individual, and society as a whole. There is a need for strategies to achieve optimal glycemic control and to operationalize regular and early screening for complications so as to reduce the burden of the disease.
Keywords: COI, diabetes mellitus, bottom-up approach, economics
Procedia PDF Downloads 116
672 Neural Synchronization - The Brain’s Transfer of Sensory Data
Authors: David Edgar
Abstract:
To understand how the brain's subconscious and conscious functions work, we must conquer the physics of unity, which leads to duality's algorithm, where the subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence. We use terms like 'time is relative,' but do we really understand the meaning? In the brain, there are different processes and, therefore, different observers. These different processes experience time at different rates. A sensory system such as the eyes cycles its measurements around every 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds. Three different observers experience time differently. To bridge the observers, the thalamus, the fastest of the processes, maintains a synchronous state and entangles the different components of the brain's physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain's linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components: only unpredictable motion is transferred through the synchronous state, because predictable motion already exists in the shared framework. The brain's synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds, the eyes dump their sensory data into the thalamus. The thalamus then performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick: the thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms).
This creates a data payload of synchronous motion that preserves the original sensory observation: essentially, a frozen moment in time (flat 4D). The single moment in time can then be processed through the single state maintained by the synchronous process. Other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Synchronous data traveling through a separate, faster synchronous process creates a theoretical time tunnel, where observation time is tunneled through the synchronous process and reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation, so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus a linear subconscious process generating sensory perception and thought production is being executed. It all occurs in the time available because the other observation times are slower than the thalamic measurement time. Life in the physical universe requires a linear measurement process; it just hides by operating at a faster time relativity. What is interesting is that time dilation is not the problem; it is the solution. Einstein said there was no universal time.
Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)
Procedia PDF Downloads 126
671 Perception of Nurses and Caregivers on Fall Preventive Management for Hospitalized Children Based on Ecological Model
Authors: Mirim Kim, Won-Oak Oh
Abstract:
Purpose: The purpose of this study was to identify hospitalized children's fall risk factors, fall prevention status, and fall prevention strategies as recognized by nurses and by caregivers of hospitalized children, and to present an ecological model for fall preventive management in hospitalized children. Method: The participants of this study were 14 nurses working in medical institutions with more than one year of child care experience and 14 adult caregivers of children under 6 years of age receiving inpatient treatment at a medical institution. One-to-one interviews were conducted to identify their perception of fall preventive management. Transcribed data were analyzed through the latent content analysis method. Results: Fall risk factors in hospitalized children were 'unpredictable behavior', 'instability', 'lack of awareness about danger', 'lack of awareness about falls', 'lack of child control ability', 'lack of awareness about the importance of fall prevention', 'lack of sensitivity to children', 'untidy environment around children', 'lack of personalized facilities for children', 'unsafe facility', 'lack of partnership between healthcare provider and caregiver', 'lack of human resources', 'inadequate fall prevention policy', 'lack of promotion about fall prevention', 'a performance-oriented culture'. The fall preventive management status of hospitalized children comprised 'absence of fall prevention capability', 'efforts not to fall', 'blocking fall risk situations', 'limiting the scope of children's activity when there is no caregiver', 'encouraging caregivers' fall prevention activities', 'creating a safe environment surrounding hospitalized children', 'special management for children at high risk of falls', 'mutual cooperation between healthcare providers and caregivers', 'implementation of fall prevention policy', 'providing guide signs about fall risk'.
Fall preventive management strategies for hospitalized children were 'restraining dangerous behavior', 'inspiring awareness about falls', 'providing fall preventive education considering the child's eye level', 'efforts to become an active subject of fall prevention activities', 'providing customized fall prevention education', 'open communication between healthcare providers and caregivers', 'infrastructure and personnel management to create a safe hospital environment', 'expanding fall prevention campaigns', 'development and application of a valid fall assessment instrument', 'conversion of awareness about safety'. Conclusion: In this study, the ecological model of fall preventive management for hospitalized children reflects various factors that directly or indirectly affect fall prevention in hospitalized children. Therefore, these results can be considered useful baseline data for developing systematic fall prevention programs and hospital policies to prevent fall accidents in hospitalized children. Funding: This study was funded by the National Research Foundation of South Korea (grant number NRF-2016R1A2B1015455).
Keywords: fall down, safety culture, hospitalized children, risk factors
Procedia PDF Downloads 164
670 Enabling Rather Than Managing: Organizational and Cultural Innovation Mechanisms in a Heterarchical Organization
Authors: Sarah M. Schoellhammer, Stephen Gibb
Abstract:
Bureaucracy, in particular its core element, a formal and stable hierarchy of authority, is proving less and less appropriate under the conditions of today's knowledge economy. Centralization and formalization have consistently been found to hinder innovation, undermining cross-functional collaboration, personal responsibility, and flexibility. With its focus on systematically planning, controlling, and monitoring the development of new or improved solutions for customers, even innovation management as a discipline is to a significant extent based on a mechanistic understanding of organizations. The most important drivers of innovation, human creativity and initiative, however, can be more hindered than supported by central elements of classic innovation management, such as predefined innovation strategies, rigid stage-gate processes, and decisions made in management gate meetings. Heterarchy, as an alternative network form of organization, is essentially characterized by its dynamic influence structures, whereby the greatest influence is allocated by the collective to the persons perceived as the most competent on a certain issue. Theoretical arguments that the non-hierarchical concept supports innovation better than bureaucracy have been backed by empirical research. These prior studies either focus on the structure and general functioning of non-hierarchical organizations or on their innovativeness, that is, innovation as an outcome. Complementing classic innovation management approaches, this work aims to shed light on how innovations are initiated and realized in heterarchies in order to identify alternative solutions practiced under the conditions of the post-bureaucratic organization. Through an initial individual case study, which is part of a multiple-case project, the innovation practices of an innovative and highly heterarchical medium-sized company in the German fire engineering industry are investigated.
In a pragmatic mixed-methods approach, media resonance, company documents, and workspace architecture are analyzed, in addition to qualitative interviews with the CEO and employees of the case company, as well as a quantitative survey aiming to characterize the company along five scaled dimensions of a heterarchy spectrum. The analysis reveals some similarities and striking differences to the approaches suggested by classic innovation management. The studied heterarchy has no predefined innovation strategy guiding new product and service development. Instead, strategic direction is provided by the CEO, described as visionary and creative. Procedures for innovation are hardly formalized, with new product ideas being evaluated on the basis of gut feeling and flexible, rather general criteria. With employees still hesitant to take responsibility and make decisions, hierarchical influence remains prominent. Described as open-minded and collaborative, culture and leadership were found largely congruent with definitions of innovation culture. Overall, innovation efforts at the case company tend to be coordinated more through cultural than through formal organizational mechanisms. To better enable innovation in mainstream organizations, responsible practitioners are recommended not to limit changes to reducing the central elements of the bureaucratic organization, formalization and centralization. The freedoms this entails need to be sustained through cultural coordination mechanisms, with personal initiative and responsibility by employees as well as common innovation-supportive norms and values. These allow the integration of diverse competencies, opinions, and activities and, thus, guide innovation efforts.
Keywords: bureaucracy, heterarchy, innovation management, values
Procedia PDF Downloads 187
669 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking
Authors: Trevor Toy, Josef Langerman
Abstract:
Around a quarter of the world's data is generated by the financial industry, with global non-cash transactions estimated to have reached 708.5 billion in recent years. And with Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately, and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. Therefore, there is a growing demand for accessible transactional data for analytical purposes and also to support the rapid global adoption of Open Banking. This research provides a solution framework that aims to deliver a secure decentralised marketplace for 1) data providers to list their transactional data, 2) data consumers to find and access that data, and 3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transactional-related data from merchants, enriching the data product available to build a comprehensive view of a data subject's spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component of the platform is developed as a decentralised blockchain contract, with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features that pertain to user interactions on the platform. One of the platform's key features is enabling the participation and management of personal data by the individuals from whom the data is being generated. The framework was developed into a proof-of-concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to that individual's identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour in correlation with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
Keywords: big data markets, open banking, blockchain, personal data management
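As a rough illustration of the consent logic the market mechanism manages, the sketch below is plain Python, not the authors' Ethereum contract; the class, method names, listing identifiers, and price field are all invented for the example.

```python
from dataclasses import dataclass, field

# Conceptual sketch of the consent registry described in the abstract: data
# subjects control which consumers may access the card-transaction data that
# providers list about them. All names here are illustrative assumptions.
@dataclass
class Listing:
    provider: str                               # institution listing the data
    subject: str                                # individual the data describes
    price: int                                  # asking price (arbitrary units)
    approved: set = field(default_factory=set)  # consumers granted access

class DataMarket:
    def __init__(self):
        self.listings = {}

    def list_data(self, listing_id, provider, subject, price):
        self.listings[listing_id] = Listing(provider, subject, price)

    def grant_access(self, listing_id, caller, consumer):
        lst = self.listings[listing_id]
        if caller != lst.subject:   # only the data subject may consent
            raise PermissionError("only the data subject can grant access")
        lst.approved.add(consumer)

    def can_read(self, listing_id, consumer):
        return consumer in self.listings[listing_id].approved

market = DataMarket()
market.list_data("tx-2021-Q1", provider="BankA", subject="alice", price=100)
market.grant_access("tx-2021-Q1", caller="alice", consumer="fintechX")
print(market.can_read("tx-2021-Q1", "fintechX"))  # True
```

On the actual platform, this registry would be a smart contract so that grants and revocations are tamper-evident and auditable without a central operator.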
Procedia PDF Downloads 73
668 Analysis of Unconditional Conservatism and Earnings Quality before and after the IFRS Adoption
Authors: Monica Santi, Evita Puspitasari
Abstract:
International Financial Reporting Standards (IFRS) developed a principles-based accounting standard. On this basis, the IASB then eliminated the conservatism concept from the accounting framework. The conservatism concept represents a prudent reaction to uncertainty, trying to ensure that the uncertainties and risks inherent in business situations are adequately considered. The conservatism concept has two ingredients: conditional conservatism, or ex-post (news-dependent) prudence, and unconditional conservatism, or ex-ante (news-independent) prudence. IFRS in substance disregards unconditional conservatism because it can cause understated assets or overstated liabilities, making the financial statement irrelevant since the information does not represent the real facts. The IASB therefore eliminated the conservatism concept. However, this has not decreased the practice of unconditional conservatism in financial statement reporting. We therefore expected the earnings quality to be affected by this situation, even though the IFRS implementation was expected to increase the earnings quality. The objective of this study was to provide empirical findings about unconditional conservatism and earnings quality before and after the IFRS adoption. The earnings-per-accrual measure was used as the proxy for unconditional conservatism: if earnings per accrual were negative (positive), the company was classified as conservative (not conservative). Earnings quality was defined as the ability of earnings to reflect future earnings, considering earnings persistence and stability. We used the earnings response coefficient (ERC) as the proxy for earnings quality. The ERC measures the extent of a security's abnormal market return in response to the unexpected component of the reported earnings of the firm issuing that security. A higher ERC indicates higher earnings quality.
The manufacturing companies listed on the Indonesian Stock Exchange (IDX) were used as the sample companies; the 2009-2010 period was used to represent the condition before the IFRS adoption, and 2011-2013 to represent the condition after it. Data were analyzed using the Mann-Whitney test and regression analysis. We used firm size as the control variable, on the consideration that firm size would affect the earnings quality of a company. This study showed that unconditional conservatism did not change, either before or after the IFRS adoption. However, we found different results for earnings quality: the earnings quality decreased after the IFRS adoption. These empirical results imply that the earnings quality before the IFRS adoption was higher. This study also found that unconditional conservatism positively but insignificantly influenced the earnings quality. The findings imply that the implementation of the IFRS has neither decreased the practice of unconditional conservatism nor altered the earnings quality of the manufacturing companies. Although the empirical results show a positive influence of unconditional conservatism on the earnings quality, the influence was not significant. Thus, we conclude that the implementation of the IFRS did not increase the earnings quality.
Keywords: earnings quality, earnings response coefficient, IFRS adoption, unconditional conservatism
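The two statistical tools named above, the Mann-Whitney comparison and the ERC regression, can be sketched as follows. All firm data are simulated, and the single-slope regression is a simplification of the study's model, which also controls for firm size.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
# Simulated earnings-per-accrual samples before (2009-2010) and after
# (2011-2013) IFRS adoption; a negative value classifies a firm as conservative.
pre = rng.normal(-0.02, 0.05, 60)
post = rng.normal(-0.02, 0.05, 80)
u, p = mannwhitneyu(pre, post)
print(f"Mann-Whitney U = {u:.0f}, p = {p:.2f}")  # same distribution: no change

# ERC estimated as the slope of abnormal return on unexpected earnings
unexpected = rng.normal(0, 1, 100)
abnormal_return = 0.8 * unexpected + rng.normal(0, 0.5, 100)
erc = np.polyfit(unexpected, abnormal_return, 1)[0]
print(f"ERC (slope) = {erc:.2f}")
```

A steeper slope means the market reacts more strongly to earnings surprises, which is why a higher ERC is read as higher earnings quality.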
Procedia PDF Downloads 258
667 The Effects of Alpha-Lipoic Acid Supplementation on Post-Stroke Patients: A Systematic Review and Meta-Analysis of Randomized Controlled Trials
Authors: Hamid Abbasi, Neda Jourabchi, Ranasadat Abedi, Kiarash Tajernarenj, Mehdi Farhoudi, Sarvin Sanaie
Abstract:
Background: Alpha-lipoic acid (ALA), a fat- and water-soluble, sulfur-containing coenzyme, has received considerable attention for its potential therapeutic role in diabetes, cardiovascular diseases, cancers, and central nervous system diseases. This investigation aims to evaluate the possible protective effects of ALA in stroke patients. Methods: This meta-analysis was performed based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The PICO criteria were as follows: Population/Patients (P: stroke patients); Intervention (I: ALA); Comparison (C: control); Outcome (O: blood glucose, lipid profile, oxidative stress, inflammatory factors). Studies excluded from the analysis were in vitro, in vivo, and ex vivo studies, case reports, and quasi-experimental studies. The Scopus, PubMed, Web of Science, and EMBASE databases were searched until August 2023. Results: Of the 496 records screened at the title/abstract stage, 9 studies were included in this meta-analysis. The sample sizes in the included studies varied between 28 and 90. Risk of bias was assessed via the second version of the Cochrane risk-of-bias (RoB) tool for randomized controlled trials (RCTs). Eight studies had a definitely high risk of bias. Discussion: To the best of our knowledge, the present meta-analysis is the first study addressing the effectiveness of ALA supplementation in enhancing post-stroke metabolic markers, including lipid profile, oxidative stress, and inflammatory indices. It is imperative to acknowledge certain potential limitations inherent in this study. First, the type of treatment (oral or intravenous infusion) could alter the bioavailability of ALA. Our study had restricted evidence regarding the impact of ALA supplementation on the included outcomes.
Therefore, further research is warranted to delve into the effects of ALA, specifically on inflammation and oxidative stress. Funding: The research protocol was approved and supported by the Student Research Committee, Tabriz University of Medical Sciences (grant number: 72825). Registration: This study was registered in the International Prospective Register of Systematic Reviews (PROSPERO ID: CR42023461612).
Keywords: alpha-lipoic acid, lipid profile, blood glucose, inflammatory factors, oxidative stress, meta-analysis, post-stroke
Procedia PDF Downloads 63
666 Salmonella Emerging Serotypes in Northwestern Italy: Genetic Characterization by Pulsed-Field Gel Electrophoresis
Authors: Clara Tramuta, Floris Irene, Daniela Manila Bianchi, Monica Pitti, Giulia Federica Cazzaniga, Lucia Decastelli
Abstract:
This work presents the results obtained by the Regional Reference Centre for Salmonella Typing (CeRTiS) in a retrospective study aimed at investigating, through pulsed-field gel electrophoresis (PFGE) analysis, the genetic relatedness of emerging Salmonella serotypes of human origin circulating in the North-West of Italy. Furthermore, the goal of this work was to create a regional database to facilitate foodborne outbreak investigation and to detect outbreaks at an earlier stage. A total of 112 strains, isolated from 2016 to 2018 in hospital laboratories, were included in this study. The isolates had previously been identified as Salmonella according to standard microbiological techniques, and serotyping was performed according to ISO 6579-3 and the Kauffmann-White scheme using O and H antisera (Statens Serum Institut®). All strains were characterized by PFGE; the analysis was conducted according to a standardized PulseNet protocol. The restriction enzyme XbaI was used to generate several distinguishable genomic fragments on the agarose gel. PFGE was performed on a CHEF Mapper system, separating large fragments and generating comparable genetic patterns. The agarose gel was then stained with GelRed® and photographed under ultraviolet transillumination. The PFGE patterns obtained from the 112 strains were compared using BioNumerics version 7.6 software with the Dice coefficient, with 2% band tolerance and 2% optimization. For each serotype, the data obtained with PFGE were compared according to the geographical origin and the year of isolation. The Salmonella strains were identified as follows: S. Derby, n. 34; S. Infantis, n. 38; S. Napoli, n. 40. All the isolates had appreciable restriction digestion patterns ranging from approximately 40 to 1100 kb. In general, a fairly heterogeneous distribution of pulsotypes emerged in the different provinces. Cluster analysis indicated high genetic similarity (≥ 83%) among the strains of S. Derby (n. 30; 88%), S. Infantis (n. 36; 95%), and S. Napoli (n. 38; 95%) circulating in north-western Italy. The study underlines the genomic similarities shared by the emerging Salmonella strains in Northwest Italy and allowed the creation of a database to detect outbreaks at an early stage. The results therefore confirm that PFGE is a powerful and discriminatory tool for investigating the genetic relationships among strains in order to monitor and control the spread of salmonellosis outbreaks. Pulsed-field gel electrophoresis (PFGE) still represents one of the most suitable approaches to characterize strains, in particular for laboratories where NGS techniques are not available.
Keywords: emerging Salmonella serotypes, genetic characterization, human strains, PFGE
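The Dice band-matching comparison underlying the cluster analysis can be sketched in a few lines of Python. The fragment sizes below are hypothetical, and real PFGE software such as BioNumerics applies tolerance and optimization during gel normalization rather than this simple pairwise check.

```python
# Illustrative Dice similarity between two PFGE lanes. A band is counted as
# shared if some band in the other lane lies within the relative tolerance
# (a crude stand-in for the 2% band tolerance used in the study).
def dice(bands_a, bands_b, tolerance=0.02):
    shared = sum(
        any(abs(a - b) / max(a, b) <= tolerance for b in bands_b)
        for a in bands_a
    )
    return 2 * shared / (len(bands_a) + len(bands_b))

lane1 = [48.5, 97.0, 145.5, 242.5, 485.0, 970.0]  # strain A fragment sizes (kb)
lane2 = [48.5, 97.0, 145.5, 291.0, 485.0, 970.0]  # strain B differs by one band
similarity = dice(lane1, lane2)
print(f"Dice similarity: {similarity:.2f}")  # 5 shared of 6 each -> 0.83
```

Pairwise similarities like this one, computed for all 112 strains, feed the hierarchical clustering that produced the ≥ 83% similarity groups reported above.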
Procedia PDF Downloads 105
665 Predicting Blockchain Technology Installation Cost in Supply Chain System through Supervised Learning
Authors: Hossein Havaeji, Tony Wong, Thien-My Dao
Abstract:
1. Research Problems and Research Objectives: A Blockchain Technology-enabled Supply Chain System (BT-enabled SCS) is a system using BT to drive SCS transparency, security, durability, and process integrity, as SCS data is not always visible, available, or trusted. The costs of operating BT in an SCS are a common problem in several organizations. These costs must be estimated, as they can impact existing cost control strategies. To account for system and deployment costs, the following hurdle must be overcome: the costs of developing and running a BT in an SCS are not yet clear in most cases. Many industries aiming to use BT pay special attention to the BT installation cost, which has a direct impact on the total costs of the SCS. Predicting the BT installation cost in an SCS may help managers decide whether BT will be an economic advantage. The purpose of the research is to identify the main BT installation cost components in an SCS needed for deeper cost analysis. We then identify and categorize the main groups of cost components in more detail to utilize them in the prediction process. The second objective is to determine the suitable supervised learning technique in order to predict the costs of developing and running BT in an SCS in a particular case study. The last aim is to investigate how the running BT cost contributes to the total cost of the SCS. 2. Work Performed: Applied successfully in various fields, supervised learning is a method to set up the data frame, prepare the data, and train the chosen model. It is a learning model directed at making predictions of an outcome measurement based on a set of unseen input data. The following steps must be conducted to pursue the objectives of our subject. The first step is a literature review to identify the different cost components of BT installation in an SCS.
Based on the literature review, we then choose supervised learning methods suitable for BT installation cost prediction in an SCS. According to the literature review, supervised learning algorithms that provide a powerful tool to classify BT installation components and predict BT installation cost are the Support Vector Regression (SVR) algorithm, the Back Propagation (BP) neural network, and the Artificial Neural Network (ANN). Choosing a case study to feed data into the models is the third step. Finally, we propose the best predictive performance to find the minimum BT installation costs in the SCS. 3. Expected Results and Conclusion: This study aims to propose a cost prediction of BT installation in an SCS with the help of supervised learning algorithms. As a first step, we will select a case study in the field of BT-enabled SCS and then use supervised learning algorithms to predict the BT installation cost. We will then seek the best predictive performance for developing and running BT in the SCS. Finally, the paper will be presented at the conference.
Keywords: blockchain technology, blockchain technology-enabled supply chain system, installation cost, supervised learning
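A minimal sketch of the SVR approach named above, using scikit-learn on synthetic data; the feature choices (node count, transaction volume, integration points) and the cost function generating the labels are assumptions made for illustration, not results from the study's case study.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
# Hypothetical feature matrix: [number of nodes, transactions/day (thousands),
# integration points]. Ranges and weights below are illustrative assumptions.
X = rng.uniform([5, 1, 1], [50, 100, 20], size=(120, 3))
# Assumed cost structure for the synthetic labels (arbitrary currency units)
y = 2000 * X[:, 0] + 30 * X[:, 1] + 5000 * X[:, 2] + rng.normal(0, 5000, 120)

# SVR is scale-sensitive, so features are standardized inside a pipeline
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1e5, epsilon=100))
model.fit(X[:100], y[:100])

pred = model.predict(X[100:])
mae = np.mean(np.abs(pred - y[100:]))
print(f"Mean absolute error on held-out samples: {mae:.0f}")
```

In the study itself, the held-out error of SVR, BP, and ANN models on the case-study data would be compared to pick the best predictor.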
Procedia PDF Downloads 122
664 Infrared Spectroscopy Fingerprinting of Herbal Products- Application of the Hypericum perforatum L. Supplements
Authors: Elena Iacob, Marie-Louise Ionescu, Elena Ionescu, Carmen Elena Tebrencu, Oana Teodora Ciuperca
Abstract:
Infrared spectroscopy (FT-IR) is an advanced technique frequently used to authenticate both raw materials and final products using their specific fingerprints and to determine plant extract biomarkers based on their functional groups. In recent years, the market for Hypericum has grown rapidly, and so have cases of adulteration/substitution, especially for the species Hypericum perforatum L. The presence/absence of the same biomarkers provides preliminary identification of Hypericum species for safe use in the manufacture of food supplements. The main objective of the work was to characterize the main biomarkers of Hypericum perforatum L. (St. John's wort) and to identify this species in herbal food supplements via its specific FT-IR fingerprint. An experimental program was designed in order to test: (1) the raw material (St. John's wort); (2) the intermediate raw material (St. John's wort dry extract); (3) the finished products: tablets based on powders, on extracts, on powder and extract, and a hydroalcoholic solution from a herbal mixture based on St. John's wort. Using FT-IR spectroscopy, spectra of the raw materials, intermediates, and finished products were obtained, with absorption bands corresponding to aliphatic and aromatic structures; examination was done individually and through comparison between the Hypericum perforatum L. plant species and the finished products. The tests were done in correlation with the phytochemical markers for authenticating the species Hypericum perforatum L.: hyperoside, rutin, quercetin, isoquercetin, luteolin, apigenin, hypericin, hyperforin, and chlorogenic acid. Samples were analyzed using a Shimadzu FTIR spectrometer, and the infrared spectrum of each sample was recorded in the MIR region, from 4000 to 1000 cm-1; the fingerprint region was then selected for data analysis.
The following functional groups were identified; the stretching vibrations suggest groups present in the compounds of interest (flavones: rutin, hyperoside; polyphenolcarboxylic acids: chlorogenic acid; naphthodianthrones: hypericin): hydroxyl (OH) groups of the free alcohol type: rutin, hyperoside, chlorogenic acid; C=O bonds from structures with free carbonyl groups of the aldehyde, ketone, carboxylic, or ester type: hypericin; C=O bonds with free carbonyl groups of the aldehyde, ketone, carboxylic acid, or ester type, present in chlorogenic acid; C=C bonds of the aromatic ring (condensed aromatic hydrocarbons, heterocyclic compounds), present in all compounds of interest; phenolic OH groups, present in all compounds of interest; C-O-C groups from glycoside structures: rutin, hyperoside, chlorogenic acid. The experimental results show that: (I) the analysis of six fingerprint regions indicated the presence of specific functional groups: (1) 1000-1130 cm-1 (C-O-C of glycoside structures); (2) 1200-1380 cm-1 (carbonyl C-O or phenolic O-H); (3) 1400-1450 cm-1 (aromatic C=C); (4) 1600-1730 cm-1 (C=O carbonyl); (5) 2850-2930 cm-1 (-CH3, -CH2-, =CH-); (6) 338-3920 cm-1 (OH of the free alcohol type); (II) comparative FT-IR spectral analysis indicates the authenticity of the finished products (tablets) in terms of Hypericum perforatum L. content; (III) infrared spectroscopy is an adequate technique for the identification and authentication of medicinal herbs, intermediate raw materials, and food supplements, except those in the form of solutions, where the results are not conclusive.
Keywords: Authentication, FT-IR fingerprint, Herbal supplements, Hypericum perforatum L.
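The band-to-region assignment in point (I) can be expressed as a small lookup. The five unambiguous region boundaries come from the abstract, while the sample peak positions are invented for the example.

```python
# Sketch of mapping an absorption band (cm^-1) to the functional-group regions
# listed in the abstract. The sample spectrum peaks below are hypothetical.
REGIONS = [
    ((1000, 1130), "C-O-C of glycoside structures"),
    ((1200, 1380), "carbonyl C-O or phenolic O-H"),
    ((1400, 1450), "aromatic C=C"),
    ((1600, 1730), "C=O carbonyl"),
    ((2850, 2930), "-CH3, -CH2-, =CH-"),
]

def assign(wavenumber):
    """Return the functional-group label for a peak position, if any."""
    for (lo, hi), group in REGIONS:
        if lo <= wavenumber <= hi:
            return group
    return "unassigned"

for band in (1075, 1715, 2900):  # hypothetical peaks from a tablet spectrum
    print(band, "->", assign(band))
```

In practice, authentication rests on the joint presence of peaks in all expected regions, matched against the reference spectra of the plant species, not on any single band.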
Procedia PDF Downloads 374
663 The Investigation of Effect of Alpha Lipoic Acid against Damage on Neonatal Rat Lung to Maternal Tobacco Smoke Exposure
Authors: Elif Erdem, Nalan Kaya, Gonca Ozan, Durrin Ozlem Dabak, Enver Ozan
Abstract:
This study was carried out to determine the histological and biochemical changes in the lungs of rat pups exposed to tobacco smoke during the pregnancy period and to investigate the protective effects of alpha-lipoic acid (ALA), administered during pregnancy, on these changes. In our study, 24 six-week-old Sprague-Dawley female rats weighing 160 ± 10 g were used (n: 7). The rats were randomly divided into four equal groups: group I (control), group II (tobacco smoke), group III (tobacco smoke + alpha-lipoic acid), and group IV (alpha-lipoic acid). Rats in groups II and III were exposed to tobacco smoke twice a day for one hour, starting from eight weeks before mating and continuing throughout pregnancy. In addition to tobacco smoke, 20 mg/kg of alpha-lipoic acid was administered via oral gavage to the rats in group III. Only alpha-lipoic acid was administered to the rats in group IV. After delivery, all administrations were stopped. On the 7th and 21st days, seven pups from each group were decapitated. A portion of the lung was taken and stained with HE, PAS, and Masson's trichrome. In addition to the immunohistochemical staining of surfactant protein A (SP-A), vascular endothelial growth factor (VEGF), and caspase-3, the TUNEL method was used to determine apoptosis. Biochemical analyses were performed on part of the lung tissue specimens. In the histological evaluations performed under light microscopy, an increase in inflammatory cells, hemorrhagic areas, edema, interalveolar septal thickening, a decrease in alveolar numbers, degeneration of some bronchi and bronchial epithelium, epithelial cells shed into the lumen, and hyaline membrane formation were observed in the tobacco smoke group. These findings were ameliorated in the tobacco smoke + ALA group; hyaline membrane formation was not detected in this group. A significant increase in TUNEL-positive cell numbers was detected in the tobacco smoke group, whereas a significant decrease was detected in the tobacco smoke + ALA group.
In terms of the immunoreactivity of both SP-A and VEGF, a significant decrease was observed in the tobacco smoke group, and a significant increase was observed in the tobacco smoke + ALA group. Regarding the immunoreactivity of caspase-3, there was a significant increase in the tobacco smoke group and a significant decrease in the tobacco smoke + ALA group. The malondialdehyde levels were found to be significantly increased in the tobacco smoke group and significantly decreased in the tobacco smoke + ALA group. Glutathione and superoxide dismutase enzyme activities showed a significant decrease in the tobacco smoke group and a significant increase in the tobacco smoke + ALA group. In conclusion, we suggest that exposure to tobacco smoke during pregnancy leads to morphological, histopathological, and functional changes in lung development by causing oxidative damage in the lung tissues of neonatal rats, and that maternal use of alpha-lipoic acid can provide a protective effect on neonatal lung development against this tobacco smoke-derived oxidative stress.
Keywords: alpha lipoic acid, lung, neonate, tobacco smoke, pregnancy
Procedia PDF Downloads 211