Search results for: extreme precipitation events
2603 Aspects of Diglossia in Arabic Language Learning
Authors: Adil Ishag
Abstract:
Diglossia emerges in a situation where two distinct varieties of a language are used side by side within a community: one is considered the high or standard variety and the other the low or colloquial variety. Arabic is an extreme example of a highly diglossic language. This diglossia stems from the fact that Arabic is one of the most widely spoken languages, used as a mother tongue in 22 countries across two continents, and is also widely used in many other Islamic countries as a second language or simply as the language of the Quran. The geographical variation among the countries where the language is spoken, together with the duality of Classical Arabic and the daily spoken dialects across the Arab world, makes Arabic one of the most diglossic languages. This paper investigates this phenomenon and its relation to learning Arabic as a first and second language.
Keywords: Arabic language, diglossia, first and second language, language learning
Procedia PDF Downloads 564
2602 Comparison of Low Velocity Impact Test on Coir Fiber Reinforced Polyester Composites
Authors: Ricardo Mendoza, Jason Briceño, Juan F. Santa, Gabriel Peluffo, Mauricio Márquez, Beatriz Cardozo, Carlos Gutiérrez
Abstract:
The most common controlled method for obtaining the impact strength of composite materials is the Charpy impact test, which consists of a pendulum of calibrated mass and length released from a known height. In service, composite components experience impact events in normal operations, such as when a tool drops or a foreign object strikes them. These events are categorized as low velocity impact (LVI), which typically occurs at velocities below 10 m/s. In this study, the major aim was to calculate the energy absorbed during impact. Tests were performed on three types of composite panels: fiberglass laminated panels, coir fiber reinforced polyester, and coir fiber reinforced polyester subjected to water immersion for 48 hours. Coir fibers were obtained from local plantations on the Caribbean coast of Colombia and were alkali treated in 5% aqueous NaOH solution for 2 h. Three impactor shapes were used in the drop-weight impact test: hemispherical, ogive, and pointed. Failure mechanisms and failure modes of the specimens were examined using an optical microscope. Results demonstrate a reduction in absorbed energy correlated with the increase in water absorption of the panels. For each level of absorbed energy, it was possible to associate a different fracture state. This study compares the absorbed-energy results obtained from the two impact test methods.
Keywords: coir fiber, polyester composites, low velocity impact, Charpy impact test, drop-weight impact test
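The drop-weight energies involved can be sanity-checked from first principles: the nominal impact energy is E = mgh and the impact velocity is v = sqrt(2gh). A minimal sketch, where the 5 kg mass and 0.5 m drop height are illustrative values, not values from the study:

```python
import math

def impact_energy_and_velocity(mass_kg, drop_height_m, g=9.81):
    """Potential energy converted to kinetic energy at impact, and the
    corresponding impact velocity, for a drop-weight test."""
    energy_j = mass_kg * g * drop_height_m            # E = m * g * h
    velocity_ms = math.sqrt(2.0 * g * drop_height_m)  # v = sqrt(2 * g * h)
    return energy_j, velocity_ms

# Illustrative values (not from the study): a 5 kg impactor dropped from 0.5 m
e, v = impact_energy_and_velocity(5.0, 0.5)
print(e, v)  # about 24.5 J at about 3.1 m/s, well inside the sub-10 m/s LVI regime
```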
Procedia PDF Downloads 452
2601 Overcoming Obstacles in UHT High-Protein Whey Beverages by the Microparticulation Process: Scientific and Technological Aspects
Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh, Seyed Jalal Razavi Zahedkolaei
Abstract:
Herein, a shelf-stable (no refrigeration required), UHT-processed, aseptically packaged whey protein drink was formulated using a new strategy in the microparticulation process. By applying thermal and two-dimensional mechanical treatments simultaneously, a modified protein (MWPC-80) was produced. The physical, thermal, and thermodynamic properties of MWPC-80 were then assessed using particle size analysis, dynamic temperature sweep (DTS), and differential scanning calorimetry (DSC) tests. Finally, using MWPC-80, a new RTD beverage was formulated, and its shelf stability was assessed for three months at ambient temperature (25 °C). A non-isothermal dynamic temperature sweep was performed, and the results were analyzed by a combination of the classic rate equation, the Arrhenius equation, and the time-temperature relationship. Generally, the results showed that the temperature dependency of the modified sample was significantly (p < 0.05) less than that of the control containing WPC-80. The elastic modulus of the MWPC did not show any critical point at any processing stage, whereas the control sample showed two critical points, during the heating (82.5 °C) and cooling (71.10 °C) stages. Thermal properties of the samples (WPC-80 and MWPC-80) were assessed using DSC at a heating rate of 4 °C/min over the range 20-90 °C. The MWPC DSC curve showed no thermal peak, which suggests high thermal resistance. On the other hand, the WPC-80 sample showed a significant thermal peak with thermodynamic properties of ΔG: 942.52 kJ/mol, ΔH: 857.04 kJ/mol, and ΔS: -1.22 kJ/(mol·K). Dynamic light scattering showed average particle sizes of 0.7 µm and 15 nm for the MWPC-80 and WPC-80 samples, respectively. Moreover, the particle size distributions of MWPC-80 and WPC-80 were Gaussian-Lorentzian and normal, respectively.
After verification of the microparticulation process by the DTS, PSD, and DSC analyses, a 10% whey protein beverage (10% w/w MWPC-80, 0.6% w/w vanilla flavoring agent, 0.1% masking flavor, 0.05% stevia natural sweetener, and 0.25% citrate buffer) was formulated, and UHT treatment was performed at 137 °C for 4 s. The shelf-life study showed no gelation or precipitation of the beverage containing MWPC-80 during three months of storage at ambient temperature, whereas the beverage containing WPC-80 showed significant precipitation and gelation after thermal processing, even at 3% w/w concentration. Growing consumer awareness of the nutritional advantages of whey protein has increased demand for this protein in different food systems, especially RTD beverages. These results could make a significant difference in this industry.
Keywords: high protein whey beverage, microparticulation, two-dimensional mechanical treatments, thermodynamic properties
Procedia PDF Downloads 73
2600 Investigating Salafism and Its Founder
Authors: Vahid Hosseinzadeh
Abstract:
Salafism is a religio-intellectual movement that was born within Sunni Islam and the Hanbali school. Although many groups with different attitudes call themselves Salafis, they all share common characteristics, chief among them a radical and retrograde interpretation of Islamic sources. Taqi ad-Din Ahmad ibn Taymiyyah was the first thinker in the Muslim world to establish these ideas. The authors of this article first set out the meaning of Salafism and its appellation in order to focus on the beliefs and thoughts of Ibn Taymiyyah. The intellectual foundations of Ibn Taymiyyah's thought were extracted from his own literature and scholarly works using a descriptive-analytical method. An extreme focus on the literal surface of Quranic phrases, and opposition to anything new that did not exist in the Quran, the Sunnah, or the first three centuries of Islam, are among the central features of his thought.
Keywords: Salafism, Ibn Taymiyyah, radical literalism, monotheism, polytheism, takfir
Procedia PDF Downloads 620
2599 Patient Agitation and Violence in Medical-Surgical Settings at BronxCare Hospital, Before and During COVID-19 Pandemic; A Retrospective Chart Review
Authors: Soroush Pakniyat-Jahromi, Jessica Bucciarelli, Souparno Mitra, Neda Motamedi, Ralph Amazan, Samuel Rothman, Jose Tiburcio, Douglas Reich, Vicente Liz
Abstract:
Violence is defined as an act of physical force that is intended to cause harm and may lead to physical and/or psychological damage. Violence toward healthcare workers (HCWs) is more common in psychiatric settings, emergency departments, and nursing homes; however, healthcare workers in medical settings are not spared from such events. Workplace violence places a huge burden on the healthcare industry and has a major impact on the physical and mental wellbeing of staff. The purpose of this study is to compare the prevalence of patient agitation and violence in medical-surgical settings at BronxCare Hospital (BCH), Bronx, New York, one year before and during the COVID-19 pandemic. Data collection occurred between June 2021 and August 2021, while the sampling period was from 2019 to 2021. The data were separated into two time categories: pre-COVID-19 (03/2019-03/2020) and COVID-19 (03/2020-03/2021). We created frequency tables for 19 variables and used a chi-square test to determine each variable's statistical significance. We tested all variables against "restraint type", which indicates whether a patient was violent or became violent enough to restrain; the restraint types were "chemical", "physical", or both. This analysis was also used to determine whether there was a statistical difference between the pre-COVID-19 and COVID-19 timeframes. Our data show an increase in incidents of violence in the COVID-19 era (03/2020-03/2021), with a total of 194 (62.8%) reported events, compared to the pre-COVID-19 era (03/2019-03/2020) with 115 (37.2%) events (p = 0.01). Our final analysis, completed using a chi-square test, determined the difference in patient violence between the pre-COVID-19 and COVID-19 eras by testing the violence marker against restraint type; the result was statistically significant (p = 0.01).
This is the first paper to systematically review the prevalence of violence in medical-surgical units in a hospital in New York, pre-COVID-19 and during the COVID-19 era. Our data are in line with the global trend of increased prevalence of patient agitation and violence in medical settings during the COVID-19 pandemic. Violence and its management are a challenge in healthcare settings, and the COVID-19 pandemic has brought to bear a complexity of circumstances that may have increased its incidence. It is important to identify and teach healthcare workers the best preventive approaches for dealing with patient agitation, to decrease the number of restraints in medical settings, and to create a less restrictive environment in which to deliver care.
Keywords: COVID-19 pandemic, patient agitation, restraints, violence
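The era-versus-restraint comparison described above can be sketched with a hand-computed Pearson chi-square statistic on a 2x2 table. The cell counts below are hypothetical, since the abstract reports only the era totals (115 and 194), not the restraint breakdown:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table given as
    [[a, b], [c, d]] (no continuity correction)."""
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    total = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total  # E_ij = row_i * col_j / N
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical restraint counts (illustrative only, not the study's raw data):
# rows = era (pre-COVID-19, COVID-19), cols = (restrained, not restrained)
stat = chi_square_2x2([[30, 85], [80, 114]])
print(stat > 3.841)  # exceeds the df=1, alpha=0.05 critical value -> significant
```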
Procedia PDF Downloads 143
2598 Easy Method of Synthesis and Functionalization of ZnO Nanoparticles with (3-Aminopropyl)triethoxysilane (APTES)
Authors: Haythem Barrak, Gaetan Laroche, Adel M’nif, Ahmed Hichem Hamzaoui
Abstract:
The use of semiconductor oxides as chemical or biological sensors requires their functionalization with appropriate molecules suited to the substance to be detected. Generally, the support materials used are TiO2 and SiO2. In the present work, we used zinc oxide (ZnO), known for its interesting physical properties. The synthesis of nanoscale ZnO was performed by co-precipitation at low temperature (60 °C); to our knowledge, this material has been obtained at this temperature for the first time, which shows the low cost of the operation. On the other hand, the surface functionalization of ZnO was performed with (3-aminopropyl)triethoxysilane (APTES) by a specific ethanol-based method used here for the first time; in addition, the duration of this stage is very short compared to the literature. The samples obtained were analyzed by XRD, TEM, DLS, FTIR, TGA, and XPS, which showed that the grafting of APTES onto our support was carried out successfully.
Keywords: functionalization, nanoparticle, ZnO, APTES, characterization
Procedia PDF Downloads 361
2597 Subtropical Potential Vorticity Intrusion Drives Increasing Tropospheric Ozone over the Tropical Central Pacific
Authors: Debashis Nath
Abstract:
Drawn from multiple reanalysis datasets, an increasing trend and a westward shift in the number of potential vorticity (PV) intrusion events over the Pacific are evident. The increased frequency can be linked to a long-term trend in the upper tropospheric (UT, 200 hPa) equatorial westerly wind and the subtropical jets (STJ) during boreal winter to spring. These may result from anomalous warming and cooling over the western Pacific warm pool and the tropical eastern Pacific, respectively. The intrusions brought dry, ozone-rich air of stratospheric origin deep into the tropics. In the tropical UT, interannual ozone variability is mainly related to convection associated with the El Niño/Southern Oscillation. The zonal mean stratospheric overturning circulation organizes the transport of ozone-rich air poleward and downward to the high latitudes and midlatitudes, leading to higher ozone concentrations there. In addition to these well-described mechanisms, we observe a long-term increasing trend in ozone flux over the northern hemispheric outer tropical (10-25°N) central Pacific that results from equatorward transport and downward mixing from the midlatitude UT and lower stratosphere (LS) during PV intrusions. This increase in tropospheric ozone flux over the Pacific Ocean may affect radiative processes and change the budget of atmospheric hydroxyl radicals. The results demonstrate a long-term increase in outer tropical Pacific PV intrusions linked with the strengthening of the upper tropospheric equatorial westerlies and the weakening of the STJ. Zonal variation in SST, characterized by gradual warming in the western Pacific warm pool and cooling in the central-eastern Pacific, is associated with the strengthening of the Pacific Walker circulation. In the western Pacific, enhanced convective activity leads to precipitation, and the latent heat released in the process strengthens the Pacific Walker circulation.
This, in turn, is linked with the trend in global mean temperature, which is related to the emerging anthropogenic greenhouse signal and the negative phase of the PDO. On the other hand, the central-eastern Pacific cooling trend is linked to the weakening of the central-eastern Pacific Hadley circulation, which suppresses convective activity through sinking air motion and imports less angular momentum to the STJ, leading to a weakened STJ. More PV intrusions result from this weaker STJ on its equatorward side, significantly increasing stratosphere-troposphere exchange processes on longer timescales. This plays an important role in determining the atmospheric composition, particularly of tropospheric ozone, in the northern outer tropical central Pacific. It may lead to more ozone of stratospheric origin in the lower troposphere and even in the marine boundary layer, where it may act as a harmful pollutant and affect radiative processes by changing the global budget of atmospheric hydroxyl radicals.
Keywords: PV intrusion, westerly duct, ozone, Central Pacific
Procedia PDF Downloads 238
2596 Exploring Counting Methods for the Vertices of Certain Polyhedra with Uncertainties
Authors: Sammani Danwawu Abdullahi
Abstract:
Vertex enumeration algorithms explore the methods and procedures for generating the vertices of general polyhedra formed by systems of equations or inequalities. These problems of enumerating the extreme points (vertices) of general polyhedra are shown to be NP-hard. This led to exploring how to count the vertices of general polyhedra without listing them, which is also shown to be #P-complete. Some fully polynomial randomized approximation schemes (fpras) for counting the vertices of some special classes of polyhedra associated with down-sets, independent sets, 2-knapsack problems, and 2 x n transportation problems are presented, together with some discovered open problems.
Keywords: counting with uncertainties, mathematical programming, optimization, vertex enumeration
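For intuition about why listing vertices is expensive, a brute-force vertex enumeration for a 2-D polyhedron can be sketched by intersecting every pair of constraint boundaries and keeping the feasible intersection points; the number of candidate pairs grows combinatorially with the number of constraints, which is exactly what motivates counting without listing. A minimal sketch (the unit-square example is illustrative):

```python
from itertools import combinations

def enumerate_vertices_2d(constraints, tol=1e-9):
    """Brute-force vertex enumeration for {x : a*x1 + b*x2 <= c}.
    Intersects every pair of constraint boundaries and keeps the feasible,
    non-duplicate intersection points."""
    vertices = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < tol:
            continue  # parallel boundaries: no unique intersection
        x = (c1 * b2 - c2 * b1) / det  # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + tol for a, b, c in constraints):
            if not any(abs(x - vx) < tol and abs(y - vy) < tol for vx, vy in vertices):
                vertices.append((x, y))
    return vertices

# Unit square: x >= 0, y >= 0, x <= 1, y <= 1, written as a*x + b*y <= c
square = [(-1, 0, 0), (0, -1, 0), (1, 0, 1), (0, 1, 1)]
print(len(enumerate_vertices_2d(square)))  # 4
```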
Procedia PDF Downloads 357
2595 A Corporate Social Responsibility View on Bribery Control in Business Relationships
Authors: Irfan Ameer
Abstract:
Bribery control in developing countries is one of the biggest challenges for multinational enterprises (MNEs). Bribery practices are socially embedded and institutionalized and may therefore achieve collective legitimacy in society. MNEs often have better and stricter norms, codes, and standards regarding such corrupt practices. Bribery in B2B sales relationships has been researched, but studies focusing on the role of the firm in controlling bribery are scarce. The main objective of this paper is to explore MNEs' strategies for controlling bribery in an environment where bribery is institutionalized. This qualitative study uses a narrative approach and focuses on key events, actors, and their roles in controlling bribery in B2B sales relationships. The context of this study is the pharmaceutical industry of Pakistan, and data were collected through 23 episodic interviews supported by secondary data. The corporate social responsibility (CSR) literature, e.g., the CSR three-domain model and the CSR pyramid, is used to make sense of MNEs' strategies for controlling bribery in developing countries. Results show that MNEs' bribery control strategies are rather emergent, shaped by the roles of some key stakeholders and events. Five key bribery control strategies were found through which MNEs can control both the demand and supply sides of bribery: developing bribery-related codes; implementing bribery-related codes; focusing on competitive advantage; finding mutually beneficial ethical solutions; and collaborating with ethical stakeholders. The results also highlight the problems associated with each strategy. The study is unique in the sense that it focuses on stakeholders with unethical interests and provides guidelines to MNEs for controlling bribery practices in B2B sales relationships.
Keywords: bribery, developing countries, CSR, narrative research, B2B sales, MNEs
Procedia PDF Downloads 374
2594 Fake Accounts Detection in Twitter Based on Minimum Weighted Feature Set
Authors: Ahmed ElAzab, Amira M. Idrees, Mahmoud A. Mahmoud, Hesham Hefny
Abstract:
Social networking sites such as Twitter and Facebook attract over 500 million users across the world, and for those users, their social life, and even their practical life, has become interrelated; their interaction with social networking has affected their lives forever. Accordingly, social networking sites have become among the main channels responsible for the vast dissemination of different kinds of information during real-time events. This popularity has led to various problems, including the possibility of exposing users to incorrect information through fake accounts, which results in the spread of malicious content during live events. This situation can cause huge damage in the real world to society in general, including citizens, business entities, and others. In this paper, we present a classification method for detecting fake accounts on Twitter. The study determines the minimized set of the main factors that influence the detection of fake accounts on Twitter; the determined factors are then applied using different classification techniques, the results of these techniques are compared, and the most accurate algorithm is selected according to the accuracy of its results. The study has been compared with different recent research in the same area, and this comparison has proved the accuracy of the proposed study. We claim that this study can be continuously applied to the Twitter social network to automatically detect fake accounts; moreover, it can be applied to different social networking sites such as Facebook with minor changes according to the nature of the social network, which are discussed in this paper.
Keywords: fake accounts detection, classification algorithms, twitter accounts analysis, features based techniques
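The pipeline described above -- extract a minimal feature set per account, apply a classifier, and score it by accuracy -- can be sketched as follows. The feature set, the sample values, and the k-NN classifier are illustrative stand-ins, not the study's actual features or algorithms:

```python
def knn_predict(train, labels, x, k=3):
    """Tiny k-nearest-neighbours classifier over per-account feature
    vectors (a stand-in for the classification techniques compared)."""
    order = sorted(range(len(train)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
    votes = [labels[i] for i in order[:k]]
    return max(set(votes), key=votes.count)  # majority vote

def accuracy(train, labels, test, test_labels, clf):
    """Fraction of test accounts classified correctly -- the criterion
    by which the most accurate algorithm would be selected."""
    hits = sum(clf(train, labels, x) == y for x, y in zip(test, test_labels))
    return hits / len(test)

# Hypothetical minimal feature set per account:
# (followers/following ratio, posts per day, profile completeness)
train = [(0.1, 40, 0.2), (0.2, 55, 0.1), (2.5, 3, 0.9), (1.8, 5, 0.8)]
labels = ["fake", "fake", "real", "real"]
test = [(0.15, 50, 0.15), (2.0, 4, 0.85)]
print(accuracy(train, labels, test, ["fake", "real"], knn_predict))  # 1.0
```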
Procedia PDF Downloads 416
2593 Two-Sided Information Dissemination in Takeovers: Disclosure and Media
Authors: Eda Orhun
Abstract:
Purpose: This paper analyzes a target firm's decision to voluntarily disclose information during a takeover event and the effect of such disclosures on the outcome of the takeover. Such voluntary disclosures, especially in the form of earnings forecasts made around takeover events, may affect shareholders' assessment of the target firm's value and, in turn, the takeover result. This study aims to shed light on this question. Design/methodology/approach: The paper seeks to understand, both theoretically and empirically, the role of voluntary disclosures by target firms during a takeover event in the likelihood of takeover success. A game-theoretical model is set up to analyze the voluntary disclosure decision of a target firm to inform shareholders about its real worth. The empirical implication of the model is tested by employing binary outcome models in which the disclosure variable is obtained by identifying the target firms in the sample that provide positive news by issuing increasing management earnings forecasts. Findings: The model predicts that a voluntary disclosure of positive information by the target decreases the likelihood that the takeover succeeds. The empirical analysis confirms this prediction by showing that positive earnings forecasts by target firms during takeover events increase the probability of takeover failure. Overall, it is shown that information dissemination through voluntary disclosures by target firms is an important factor affecting takeover outcomes. Originality/Value: To the author's knowledge, this is the first study to examine the impact of voluntary disclosures by the target firm during a takeover event on the likelihood of takeover success. The results contribute to the information economics, corporate finance, and M&A literatures.
Keywords: takeovers, target firm, voluntary disclosures, earnings forecasts, takeover success
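A binary outcome model of the kind described can be sketched as a single-feature logistic regression of takeover success on a disclosure dummy. The data below are fabricated solely to illustrate the predicted sign of the coefficient, and plain gradient descent is a stand-in for whatever estimator the study actually uses:

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Single-feature logistic regression P(y=1) = sigmoid(w*x + b),
    fitted by gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x / n
            gb += (p - y) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Fabricated sample: x = 1 if the target issued an increasing earnings
# forecast, y = 1 if the takeover succeeded.
xs = [1, 1, 1, 1, 0, 0, 0, 0]
ys = [0, 0, 0, 1, 1, 1, 1, 0]
w, b = fit_logistic(xs, ys)
print(w < 0)  # True: a negative coefficient means disclosure lowers success odds
```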
Procedia PDF Downloads 317
2592 Natural Law in the Mu'tazilite Theology
Authors: Samaneh Khalili
Abstract:
Natural law theory, in moral philosophy, refers to a system of unchanging values held to be common to all humans and discoverable through reason. Natural law theory is commonly associated with Western philosophers; in contrast, discussions of notions of natural law in Islamic intellectual history are relatively rare. This paper aims to show that the moral theory developed by the Mu'tazilite thinkers can be classified among the ideas of natural law. In doing so, this study demonstrates that objective and unchanging values, according to the Mu'tazilite theologians, provide guidelines for assessing the rules of Islamic law in the field of human coexistence. The focus of the paper lies on ʿAbd al-Ǧabbār, the most influential thinker of the late epoch of the Muʿtazila. Although ʿAbd al-Ǧabbār did not leave a text with a systematic discussion of natural law, his teachings on nature, human reason, and the moral values of actions are scattered throughout his work 'al-Muġnī fī abwāb at-tawḥīd wa-l-'adl'. Since natural law revolves around the basic concepts of nature, reason, and moral value, it is necessary to focus on ʿAbd al-Ǧabbār's theories of reason, nature, and ethics. In analyzing the concept of nature, the paper attempts to answer how he explains the world's physical structure and God's relationship to natural events. Moreover, from ʿAbd al-Ǧabbār's point of view, is nature a self-determined system that follows its inner principles in every kind of change, or is nature guided by an external power? Does causality govern natural events? Regarding the concept of reason, an attempt is made to examine how human reason, according to ʿAbd al-Ǧabbār, conceives moral attributes. Finally, the author discusses the concept of objective values and the place of rights and duties derived from Islamic law in ʿAbd al-Ǧabbār's thought.
Keywords: Islamic law, Mu'tazilite theology, natural law in Islamic theology, objective and unchanging values
Procedia PDF Downloads 97
2591 Employing GIS to Analyze Areas Prone to Flooding: Case Study of Thailand
Authors: Sanpachai Huvanandana, Settapong Malisuwan, Soparwan Tongyuak, Prust Pannachet, Anong Phoepueak, Navneet Madan
Abstract:
Many regions of Thailand are prone to flooding due to the tropical climate, and commonly increasing precipitation in the region raises the risk of flooding. Many measures have been implemented, such as drainage control systems, multiple dams, and irrigation canals. In order to decide where the drainages, dams, and canals should be appropriately located, the flooding risk areas must first be determined. This paper aims to identify the appropriate features that can be used to classify flooding risk areas in Thailand. Several features have been analyzed and used to classify the areas. Unsupervised clustering techniques have been used, and the results have been compared with the ten-year average of actual flooded areas.
Keywords: flood area clustering, geographical information system, flood features
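The unsupervised clustering step can be sketched with a plain k-means over flood-related features. Both the feature choice (elevation, annual rainfall) and the sample values are hypothetical illustrations, not the study's data:

```python
def kmeans(points, centers, iters=20):
    """Plain k-means with caller-supplied initial centers -- a stand-in
    for the unsupervised clustering applied to flood features."""
    k = len(centers)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical (elevation_m, annual_rainfall_mm) grid cells: low-lying wet
# cells should cluster apart from higher, drier ones.
pts = [(2, 1800), (3, 1750), (4, 1900), (45, 1100), (50, 1050), (48, 1200)]
centers, clusters = kmeans(pts, [pts[0], pts[-1]])
print(sorted(len(c) for c in clusters))  # [3, 3]
```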
Procedia PDF Downloads 295
2590 An Exploratory Study on the Impact of Climate Change on Design Rainfalls in the State of Qatar
Authors: Abdullah Al Mamoon, Niels E. Joergensen, Ataur Rahman, Hassan Qasem
Abstract:
The Intergovernmental Panel on Climate Change (IPCC), in its Fourth Assessment Report (AR4), predicts a more extreme climate towards the end of the century, which is likely to affect the design of engineering infrastructure projects with a long design life. A 2013 study developed new design rainfalls for Qatar, which provide an improved design basis for drainage infrastructure in the State of Qatar under the current climate; however, the current design standards in Qatar do not consider increased rainfall intensity caused by climate change. The focus of this paper is to update the recently developed design rainfalls for Qatar under changing climatic conditions based on the IPCC's AR4, allowing a later revision of the proposed design standards relevant for projects with a longer design life. The future climate has been investigated based on the climate models released for the IPCC's AR4 and the A2 storyline of the emission scenarios (SRES) using a stationary approach. Annual maximum series (AMS) of predicted 24-hour rainfall data for both the wet (NCAR-CCSM) and dry (CSIRO-MK3.5) scenarios at the Qatari grid points in the climate models have been extracted for three periods: the current climate (2010-2039), the medium-term climate (2040-2069), and the end-of-century climate (2070-2099). A homogeneous region of the Qatari grid points has been formed, and an L-moments-based regional frequency approach is adopted to derive design rainfalls. The results indicate no significant changes in design rainfall in the medium term (2040-2069), but significant changes are expected towards the end of the century (2070-2099). New design rainfalls have been developed taking climate change into account for the 2070-2099 scenario by averaging results from the two scenarios. The IPCC's AR4-based analysis predicts that the rainfall intensity for a 5-year return period storm with a duration of 1 to 2 hours will increase by 11% in 2070-2099 compared to the current climate.
Similarly, the rainfall intensity for more extreme rainfall, with a return period of 100 years and a duration of 1 to 2 hours, will increase by 71% in 2070-2099 compared to the current climate. Infrastructure with a design life exceeding 60 years should incorporate safety factors that take the predicted effects of climate change into due consideration.
Keywords: climate change, design rainfalls, IDF, Qatar
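Applying the projected increases is a one-line scaling of the current-climate design intensity. The 20 mm/h base value below is purely illustrative; only the 11% and 71% increases come from the study:

```python
def adjusted_intensity(current_intensity, increase_pct):
    """Scale a current-climate design rainfall intensity by a projected
    percentage increase (here, from the AR4-based 2070-2099 analysis)."""
    return current_intensity * (1.0 + increase_pct / 100.0)

# Illustrative 20 mm/h current-climate intensities (not values from the study),
# scaled by the projected 2070-2099 increases for 1-2 h storms:
print(round(adjusted_intensity(20.0, 11), 1))  # 5-yr event:   22.2 mm/h
print(round(adjusted_intensity(20.0, 71), 1))  # 100-yr event: 34.2 mm/h
```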
Procedia PDF Downloads 393
2589 Rupture Termination of the 1950 C. E. Earthquake and Recurrent Interval of Great Earthquake in North Eastern Himalaya, India
Authors: Rao Singh Priyanka, Jayangondaperumal R.
Abstract:
The Himalayan active fault has the potential to generate great earthquakes in the future, posing a major existential threat to humans in the Himalayan and adjacent regions. Quantitative evaluation of accumulated and released interseismic strain is crucial to assess the magnitude and spatio-temporal variability of future great earthquakes along the Himalayan arc, and to mitigate the destruction and hazards associated with such earthquakes it is important to understand their recurrence cycle. The eastern Himalayan and Indo-Burman plate boundary systems accommodate oblique convergence across two orthogonal plate boundaries, resulting in a zone of distributed deformation both within and away from the plate boundary and clockwise rotation of fault-bounded blocks. This seismically active region has a poorly documented historical archive of past large earthquakes. Paleoseismological studies confirm surface rupture evidence of great continental earthquakes (Mw ≥ 8) along the Himalayan Frontal Thrust (HFT), which, together with geodetic studies, collectively provides crucial information for understanding and assessing the seismic potential. These investigations reveal the rupture of three-quarters of the HFT during great events since medieval times, but with debatable opinions on the timing of events due to unclear evidence, neglect of transverse segment boundaries, and a lack of detailed studies. Recent paleoseismological investigations in the eastern Himalaya and Mishmi ranges confirm the primary surface ruptures of the 1950 C.E. great earthquake (M > 8). However, a seismic gap exists between the 1714 C.E. and 1950 C.E. Assam earthquakes, which has not slipped since the 1697 C.E. event. Unlike the latest large blind 2015 Gorkha earthquake (Mw 7.8), the 1950 C.E. event was not triggered by the large 1947 C.E. event that occurred near the western edge of the great upper Assam event.
Moreover, the western segment of the eastern Himalaya has not witnessed any surface-breaking earthquake along the HFT for over the past 300 years. The frontal fault excavations reveal that during the 1950 earthquake, a ~3.1-m-high scarp along the HFT was formed by a co-seismic slip of 5.5 ± 0.7 m at Pasighat in the eastern Himalaya, while a 10-m-high scarp at Kamlang Nagar along the Mishmi Thrust in the Eastern Himalayan Syntaxis is the outcome of a dip-slip displacement of 24.6 ± 4.6 m along a 25 ± 5°E dipping fault. This event ruptured along the two orthogonal fault systems in an oblique thrust faulting mechanism. Approximately 130 km west of the Pasighat site, the Himebasti village witnessed two earthquakes, the historical 1697 Sadiya earthquake and the 1950 event, with a cumulative dip-slip displacement of 15.32 ± 4.69 m. At the Niglok site, Arunachal Pradesh, a cumulative slip of ~12.82 m during at least three events since pre 19585 B.P. has produced a ~6.2-m-high scarp, while the youngest scarp, of ~2.4-m height, was produced during the 1697 C.E. event. The site preserves two deformational events along the eastern HFT, suggesting serial ruptures at an interval of ~850 years, while successive surface-rupturing earthquakes are lacking in the Mishmi Range, precluding an estimate of the recurrence cycle there.
Keywords: paleoseismology, surface rupture, recurrence interval, Eastern Himalaya
Procedia PDF Downloads 84
2588 Study of the Impact of Synthesis Method and Chemical Composition on Photocatalytic Properties of Cobalt Ferrite Catalysts
Authors: Katerina Zaharieva, Vicente Rives, Martin Tsvetkov, Raquel Trujillano, Boris Kunev, Ivan Mitov, Maria Milanova, Zara Cherkezova-Zheleva
Abstract:
The nanostructured cobalt ferrite-type materials Sample A (Co0.25Fe2.75O4), Sample B (Co0.5Fe2.5O4), and Sample C (CoFe2O4) were prepared by co-precipitation in our previous investigations. The co-precipitated Sample B and Sample C were mechanochemically activated to produce Sample D (Co0.5Fe2.5O4) and Sample E (CoFe2O4). PXRD, Moessbauer and FTIR spectroscopies, specific surface area determination by the BET method, thermal analysis, elemental chemical analysis, and temperature-programmed reduction were used to investigate the prepared nano-sized samples. Changes in malachite green dye concentration during the photocatalytic decolorization reaction over the cobalt ferrite-type catalysts of different chemical compositions were followed. The photocatalytic results show that increasing the degree of incorporation of cobalt ions into the magnetite host structure of the co-precipitated cobalt ferrite-type samples increases the photocatalytic activity: Sample A (4 × 10⁻³ min⁻¹) < Sample B (5 × 10⁻³ min⁻¹) < Sample C (7 × 10⁻³ min⁻¹). The mechanochemically activated photocatalysts showed higher activity than the co-precipitated ferrite materials: Sample D (16 × 10⁻³ min⁻¹) > Sample E (14 × 10⁻³ min⁻¹) > Sample C (7 × 10⁻³ min⁻¹) > Sample B (5 × 10⁻³ min⁻¹) > Sample A (4 × 10⁻³ min⁻¹). With a decreasing degree of substitution of iron ions by cobalt ions, higher sorption of the dye after the dark period was observed for the co-precipitated cobalt ferrite materials: Sample C (72%) < Sample B (78%) < Sample A (80%). The mechanochemically treated ferrite catalysts and the co-precipitated Sample B possess similar sorption capacities: Sample D (78%) ~ Sample E (78%) ~ Sample B (78%). The prepared nano-sized cobalt ferrite-type materials demonstrate good photocatalytic and sorption properties.
Mechanochemically activated Sample D (Co0.5Fe2.5O4, 16×10⁻³ min⁻¹) and Sample E (CoFe2O4, 14×10⁻³ min⁻¹) possess higher photocatalytic activity than the most commonly used UV-light catalyst, Degussa P25 (12×10⁻³ min⁻¹). The dependence of the photocatalytic activity and sorption properties on the preparation method and on the degree of substitution of iron ions by cobalt ions in the synthesized cobalt ferrite samples is established. Mechanochemical activation leads to the formation of nanostructured cobalt ferrite-type catalysts (Samples D and E) with higher rate constants than those of the ferrite materials (Samples A, B, and C) prepared by the co-precipitation procedure. Increasing the degree of substitution of iron ions by cobalt ones leads to improved photocatalytic properties and lower sorption capacities of the co-precipitated ferrite samples. The good sorption properties (between 72 and 80%) of the prepared ferrite-type materials show that they could be used as potentially cheap adsorbents for the purification of polluted waters.
Keywords: nanodimensional cobalt ferrites, photocatalyst, synthesis, mechanochemical activation
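The rate constants quoted above in min⁻¹ are the kind obtained from pseudo-first-order decolorization kinetics, where k is the slope of ln(C₀/C) versus irradiation time. A minimal sketch with hypothetical data (the function and values are illustrative, not the study's own code):

```python
import math

def first_order_rate_constant(times, concentrations):
    """Least-squares slope of ln(C0/C) versus time for a
    pseudo-first-order decolorization model C(t) = C0 * exp(-k t)."""
    c0 = concentrations[0]
    ys = [math.log(c0 / c) for c in concentrations]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(ys) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, ys))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den  # k in min^-1 when times are in minutes

# Hypothetical decolorization data generated with k = 5e-3 min^-1
times = [0, 30, 60, 90, 120]
conc = [1.0 * math.exp(-5e-3 * t) for t in times]
k = first_order_rate_constant(times, conc)
```

With real measurements the fit would use the measured dye concentrations after each irradiation interval.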
Procedia PDF Downloads 264
2587 Configuring Systems to Be Viable in a Crisis: The Role of Intuitive Decision-Making
Authors: Ayham Fattoum, Simos Chari, Duncan Shaw
Abstract:
Volatile, uncertain, complex, and ambiguous (VUCA) conditions threaten system viability with emerging and novel events requiring immediate and localized responses. Such responsiveness is only possible through devolved freedom and emancipated decision-making. The Viable System Model (VSM) recognizes this need and suggests maximizing autonomy to localize decision-making and minimize residual complexity. However, exercising delegated autonomy under VUCA requires confidence and knowledge to use intuition, and guidance to maintain systemic coherence. This paper explores the role of intuition as an enabler of emancipated decision-making and autonomy under VUCA. Intuition allows decision-makers to use their knowledge and experience to respond rapidly to novel events. This paper offers three contributions to VSM. First, it designs a system model that illustrates the role of intuitive decision-making in managing complexity and maintaining viability. Second, it takes a black-box approach to theory development in VSM to model the role of autonomy and intuition. Third, the study uses a multi-stage discovery-oriented approach (DOA) to develop theory, with each stage combining literature, data analysis, and model/theory development, and identifying further questions for the subsequent stage. We synthesize literature (e.g., VSM, complexity management) with seven months of field-based insights (interviews, workshops, and observation of a live disaster exercise) to develop an intuitive complexity management framework and VSM models. The results have practical implications for enhancing the resilience of organizations and communities.
Keywords: intuition, complexity management, decision-making, viable system model
Procedia PDF Downloads 67
2586 Epigenetic and Archeology: A Quest to Re-Read Humanity
Authors: Salma A. Mahmoud
Abstract:
Epigenetics, or alteration in gene expression influenced by extragenetic factors, has emerged as one of the most promising areas for addressing gaps in our current knowledge of patterns of human variation. In the last decade, research investigating epigenetic mechanisms has flourished in many fields and witnessed significant progress. It has paved the way for a new era of integrated research, especially between anthropology/archeology and the life sciences. Skeletal remains are considered the most significant source of information for studying human variation across history, and by utilizing these valuable remains, we can interpret past events, cultures and populations. In addition to their archeological, historical and anthropological importance, studying bones has great implications for other fields such as medicine and science. Bones can also hold within them the secrets of the future, as they can act as predictive tools for health, societal characteristics and dietary requirements. Bones in their basic form are composed of cells (osteocytes) that are affected by both genetic and environmental factors, of which the genetic component can only explain a small part of their variability. The primary objective of this project is to examine the epigenetic landscape/signature within the bones of archeological remains as a novel marker that could reveal new ways to conceptualize chronological events, gender differences, social status and ecological variations. We attempt here to address discrepancies in common variants such as the methylome, as well as novel epigenetic regulators such as chromatin remodelers, which to the best of our knowledge have not yet been investigated by anthropologists/paleoepigenetists, using a plethora of techniques (biological, computational, and statistical).
Moreover, extracting epigenetic information from bones will highlight the importance of osseous material as a vector for studying human beings in several contexts (social, cultural and environmental), and strengthen its essential role as a model system that can be used to investigate and reconstruct various cultural, political and economic events. We also address all the steps required to plan and conduct an epigenetic analysis of bone materials (modern and ancient), and discuss the key challenges facing researchers aiming to investigate this field. In conclusion, this project will serve as a primer for bioarcheologists/anthropologists and human biologists interested in incorporating epigenetic data into their research programs. Understanding the roles of epigenetic mechanisms in bone structure and function will be very helpful for a better comprehension of bone biology, and will highlight its essentiality as an interdisciplinary vector and a key material in archeological research.
Keywords: epigenetics, archeology, bones, chromatin, methylome
Procedia PDF Downloads 108
2585 Comparison of Extracellular miRNA from Different Lymphocyte Cell Lines and Isolation Methods
Authors: Christelle E. Chua, Alicia L. Ho
Abstract:
The development of a panel of differential gene expression signatures has been of interest in the field of biomarker discovery for radiation exposure. In the absence of exposed human subjects, lymphocyte cell lines have often been used as a surrogate for human whole blood in ex vivo irradiation studies. The extent of variation between different lymphocyte cell lines is currently unclear, especially with regard to the expression of extracellular miRNA. This study compares the expression profiles of extracellular miRNA isolated from different lymphocyte cell lines, as well as the profiles obtained when different exosome isolation kits are used. Lymphocyte cell lines were created using lymphocytes isolated from healthy adult males of similar racial descent (Chinese American and Chinese Singaporean) and immortalised with Epstein-Barr virus. The cell lines were cultured in exosome-free cell culture media for 72 h, and the cell culture supernatant was removed for exosome isolation. Two exosome isolation kits were used: Total Exosome Isolation Reagent (TEIR, ThermoFisher), a polyethylene glycol (PEG)-based exosome precipitation kit, and ExoSpin (ES, Cell Guidance Systems), a PEG-based precipitation kit that includes an additional size exclusion chromatography step. miRNA from the isolated exosomes was isolated using the miRNeasy Mini Kit (Qiagen) and analysed using the nCounter miRNA assay (NanoString). Principal component analysis (PCA) suggested that the overall extracellular miRNA expression profile differed between the lymphocyte cell line originating from the Chinese American donor and that originating from the Chinese Singaporean donor. As the gender, age and racial origins of both donors are similar, this may suggest that other genetic or epigenetic differences account for the variation in extracellular miRNA expression between lymphocyte cell lines.
However, statistical analysis showed that only 3 miRNA genes had a fold difference > 2 at p < 0.05, suggesting that the differences may not be great enough to affect overall conclusions drawn from different cell lines. Subsequent analysis using cell lines from other donors will give further insight into the reproducibility of results when different cell lines are used. PCA also suggested that the method of exosome isolation affected the expression profile: 107 miRNA had a fold difference > 2 at p < 0.05. This suggests that the additional size exclusion chromatography step altered the subset of extracellular vesicles that were isolated. In conclusion, these results suggest that extracellular miRNA can be isolated and analysed from exosomes derived from lymphocyte cell lines. However, care must be taken in the choice of cell line and method of exosome isolation used.
Keywords: biomarker, extracellular miRNA, isolation methods, lymphocyte cell line
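The "fold difference > 2 at p < 0.05" screen described above can be sketched as a simple filter; the miRNA names and values below are hypothetical, not the study's data:

```python
def differentially_expressed(results, fold_cutoff=2.0, alpha=0.05):
    """Select miRNAs whose absolute fold difference exceeds the cutoff
    at the given significance level. `results` maps miRNA name to
    (fold_change, p_value); fold changes below 1 are inverted so that a
    2-fold decrease also passes a 2-fold cutoff."""
    hits = []
    for name, (fc, p) in results.items():
        magnitude = fc if fc >= 1.0 else 1.0 / fc
        if magnitude > fold_cutoff and p < alpha:
            hits.append(name)
    return sorted(hits)

# Hypothetical nCounter-style summary: {miRNA: (fold change, p value)}
example = {
    "miR-150": (2.6, 0.01),   # up, significant -> kept
    "miR-21":  (0.3, 0.04),   # ~3.3-fold down, significant -> kept
    "miR-16":  (2.4, 0.20),   # up but not significant -> dropped
    "miR-451": (1.3, 0.001),  # significant but small change -> dropped
}
```

In practice the p-values would come from the differential expression test applied to the normalized nCounter counts.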
Procedia PDF Downloads 199
2584 Recycling of Sintered NdFeB Magnet Waste Via Oxidative Roasting and Selective Leaching
Authors: W. Kritsarikan, T. Patcharawit, T. Yingnakorn, S. Khumkoa
Abstract:
Neodymium-iron-boron (NdFeB) magnets, classified as high-power magnets, are widely used in various applications such as electrical and medical devices and account for 13.5% of the permanent magnet market. Their typical composition of 29–32% Nd, 64.2–68.5% Fe and 1–1.2% B contains a significant amount of rare-earth metals, which will be subject to shortages in the future. Domestic NdFeB magnet waste recycling should therefore be developed in order to reduce social and environmental impacts and move toward a circular economy. Most research works focus on recycling magnet wastes from both the manufacturing process and end of life. Each type of waste has different characteristics and compositions, which directly affect recycling efficiency as well as the types and purity of the recyclable products. This research therefore focused on the recycling of manufacturing NdFeB magnet waste obtained from the sintering stage of magnet production; the waste contained 23.6% Nd, 60.3% Fe and 0.261% B. The aim was to recover high-purity neodymium oxide (Nd₂O₃) using a hybrid metallurgical process combining oxidative roasting and selective leaching. The sintered NdFeB waste was first ground to below 70 mesh prior to oxidative roasting at 550–800 °C to enable selective leaching of neodymium in the subsequent leaching step using 2.5 M H₂SO₄ over 24 h. The leachate was then dried and roasted at 700–800 °C prior to precipitation with oxalic acid and calcination to obtain neodymium oxide as the recycled product. According to XRD analyses, increasing the oxidative roasting temperature led to an increasing amount of hematite (Fe₂O₃) as the main phase, with a smaller amount of magnetite (Fe₃O₄). Peaks of neodymium oxide (Nd₂O₃) were also observed in lesser amounts. Furthermore, neodymium iron oxide (NdFeO₃) was present, and its XRD peaks became more pronounced at higher oxidative roasting temperatures.
After acid leaching and drying, iron sulfate and neodymium sulfate were mainly obtained. After the roasting step prior to water leaching, iron sulfate was converted to hematite as the main compound, while neodymium sulfate remained. However, a small amount of magnetite was still detected by XRD. The higher roasting temperature of 800 °C resulted in a greater Fe₂O₃ to Nd₂(SO₄)₃ ratio, indicating a more effective roasting temperature. Iron oxides were subsequently removed by water leaching and filtration, while the solution contained mainly neodymium sulfate. Therefore, an oxidative roasting temperature not exceeding 600 °C followed by acid leaching and roasting at 800 °C gave the optimum condition for the subsequent precipitation and calcination steps to finally achieve neodymium oxide.
Keywords: NdFeB magnet waste, oxidative roasting, recycling, selective leaching
Procedia PDF Downloads 182
2583 The Development of an Anaesthetic Crisis Manual for Acute Critical Events: A Pilot Study
Authors: Jacklyn Yek, Clara Tong, Shin Yuet Chong, Yee Yian Ong
Abstract:
Background: While emergency manuals and cognitive aids (CA) have been used in high-hazard industries for decades, this is a nascent field in healthcare. CAs can potentially offset the large cognitive load involved in crisis resource management and facilitate the efficient performance of key steps in treatment. A crisis manual was developed based on local guidelines and the latest evidence-based information and introduced to a tertiary hospital setting in Singapore. The objective of this study was to evaluate the effectiveness of the crisis manual in guiding the response to and management of critical events. Methods: 7 surgical teams were recruited to participate in a series of simulated emergencies in a high-fidelity operating room simulator between April and June 2018. All teams consisted of a surgical consultant and medical officer/registrar, an anesthesia consultant and medical officer/registrar, as well as a circulating, scrub and anesthetic nurse. Each team performed a simulated operation in which 1 or more crisis events occurred. The teams were randomly assigned to scenarios from the crisis manual, and all teams were deemed to be equal in experience and knowledge. Before the simulation, teams were instructed on proper checklist use, but use of the checklist was optional. Results: 7 simulation sessions were performed, covering the following scenarios: Airway Fire, Massive Transfusion Protocol, Malignant Hyperthermia, Eclampsia, and Difficult Airway. Of the 7 surgical teams, 2 made use of the crisis manual, both of which had encountered a 'Malignant Hyperthermia' scenario. These team members reflected that the crisis manual allowed them to work as a team, in particular by involving the surgical doctors, who were unfamiliar with the condition and its management.
A run chart showed a possible upward trend, suggesting that with increasing awareness and training, staff would become more likely to initiate use of the crisis manual. Conclusion: Despite the high volume load in this tertiary hospital, certain crises remain rare, and clinicians are often caught unprepared. A crisis manual is an effective, easy-to-use repository that can improve patient outcomes and encourage teamwork. With training, familiarity will make clinicians increasingly comfortable with reaching for the crisis manual. More simulation training needs to be conducted to determine its effectiveness.
Keywords: crisis resource management, high fidelity simulation training, medical errors, visual aids
Procedia PDF Downloads 127
2582 Assessing Prescribed Burn Severity in the Wetlands of the Paraná River - Argentina
Authors: Virginia Venturini, Elisabet Walker, Aylen Carrasco-Millan
Abstract:
Latin America stands at the forefront of climate change impacts, with forecasts projecting accelerated temperature and sea level rises compared to the global average. These changes are set to trigger a cascade of effects, including coastal retreat, intensified droughts in some nations, and heightened flood risks in others. In Argentina, wildfires have historically affected forests, but since 2004, wetland fires have emerged as a pressing concern. During 2021, a high-risk scenario formed naturally in the wetlands of the Paraná River: very low water levels in the rivers and excessive standing dead plant material (fuel) triggered most of the fires recorded in the vast wetland region of the Paraná during 2020-2021. During 2008, fire events devastated nearly 15% of the Paraná Delta, and by late 2021 new fires had burned more than 300,000 ha of these same wetlands. The goal of this work is therefore to explore remote sensing tools to monitor environmental conditions and the severity of prescribed burns in the Paraná River wetlands. Two prescribed burning experiments were carried out in the study area (31°40'05'' S, 60°34'40'' W) during September 2023. The first experiment was carried out on September 13th in a 0.5 ha plot whose dominant vegetation was Echinochloa sp. and Thalia, while the second was done on September 29th in a 0.7 ha plot next to the first burned parcel, where the dominant species were Echinochloa sp. and Solanum glaucophyllum. Field campaigns were conducted between September 8th and November 8th to assess the severity of the prescribed burns. Flight surveys were conducted using a DJI® Inspire II drone equipped with a Sentera® NDVI camera. Burn severity was then quantified by analyzing images captured by the Sentera camera along with data from the Sentinel-2 satellite mission.
This involved subtracting the NDVI images obtained before and after the burn experiments. The results from both data sources demonstrate a highly heterogeneous impact of fire within the patch. Mean severity values for the first experiment were about 0.16 with drone NDVI images and 0.18 with Sentinel images. For the second experiment, mean values were approximately 0.17 with the drone and 0.16 with Sentinel images. Thus, most of the pixels showed low fire severity, and only a few pixels presented moderate burn severity on the wildfire scale. The undisturbed plots maintained consistent mean NDVI values throughout the experiments. Moreover, the severity assessment of each experiment revealed that the vegetation was not completely dry, despite the extreme drought conditions.
Keywords: prescribed-burn, severity, NDVI, wetlands
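The severity computation described above (pre-fire NDVI minus post-fire NDVI, averaged over the patch) can be sketched as follows; the 2x2 grids are hypothetical values chosen only to illustrate the arithmetic:

```python
def burn_severity(ndvi_before, ndvi_after):
    """Per-pixel burn severity as the pre-fire minus post-fire NDVI
    difference (dNDVI); higher values mean stronger vegetation loss."""
    return [[b - a for b, a in zip(row_b, row_a)]
            for row_b, row_a in zip(ndvi_before, ndvi_after)]

def mean_severity(dndvi):
    """Patch-average severity over all pixels."""
    vals = [v for row in dndvi for v in row]
    return sum(vals) / len(vals)

# Hypothetical 2x2 NDVI grids for a burned patch
pre = [[0.62, 0.58], [0.60, 0.55]]
post = [[0.45, 0.44], [0.43, 0.39]]
sev = burn_severity(pre, post)
```

With real drone or Sentinel-2 rasters the same subtraction would be done per pixel on co-registered arrays.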
Procedia PDF Downloads 67
2581 Recycling of Sintered Neodymium-Iron-Boron (NdFeB) Magnet Waste via Oxidative Roasting and Selective Leaching
Authors: Woranittha Kritsarikan
Abstract:
Neodymium-iron-boron (NdFeB) magnets, classified as high-power magnets, are widely used in various applications such as electrical and medical devices and account for 13.5% of the permanent magnet market. Their typical composition of 29–32% Nd, 64.2–68.5% Fe and 1–1.2% B contains a significant amount of rare-earth metals, which will be subject to shortages in the future. Domestic NdFeB magnet waste recycling should therefore be developed in order to reduce social and environmental impacts and move toward a circular economy. Most research works focus on recycling magnet wastes from both the manufacturing process and end of life. Each type of waste has different characteristics and compositions, which directly affect recycling efficiency as well as the types and purity of the recyclable products. This research therefore focused on the recycling of manufacturing NdFeB magnet waste obtained from the sintering stage of magnet production; the waste contained 23.6% Nd, 60.3% Fe and 0.261% B. The aim was to recover high-purity neodymium oxide (Nd₂O₃) using a hybrid metallurgical process combining oxidative roasting and selective leaching. The sintered NdFeB waste was first ground to below 70 mesh prior to oxidative roasting at 550–800 °C to enable selective leaching of neodymium in the subsequent leaching step using 2.5 M H₂SO₄ over 24 hours. The leachate was then dried and roasted at 700–800 °C prior to precipitation with oxalic acid and calcination to obtain neodymium oxide as the recycled product. According to XRD analyses, increasing the oxidative roasting temperature led to an increasing amount of hematite (Fe₂O₃) as the main phase, with a smaller amount of magnetite (Fe₃O₄). Peaks of neodymium oxide (Nd₂O₃) were also observed in lesser amounts. Furthermore, neodymium iron oxide (NdFeO₃) was present, and its XRD peaks became more pronounced at higher oxidative roasting temperatures.
After acid leaching and drying, iron sulfate and neodymium sulfate were mainly obtained. After the roasting step prior to water leaching, iron sulfate was converted to hematite as the main compound, while neodymium sulfate remained. However, a small amount of magnetite was still detected by XRD. The higher roasting temperature of 800 °C resulted in a greater Fe₂O₃ to Nd₂(SO₄)₃ ratio, indicating a more effective roasting temperature. Iron oxides were subsequently removed by water leaching and filtration, while the solution contained mainly neodymium sulfate. Therefore, an oxidative roasting temperature not exceeding 600 °C followed by acid leaching and roasting at 800 °C gave the optimum condition for the subsequent precipitation and calcination steps to finally achieve neodymium oxide.
Keywords: NdFeB magnet waste, oxidative roasting, recycling, selective leaching
Procedia PDF Downloads 177
2580 A Comparative Case Study of the Impact of Square and Yurt-Shape Buildings on Energy Efficiency
Authors: Valeriya Tyo, Serikbolat Yessengabulov
Abstract:
Regions with extreme climate conditions, such as Astana city, require energy-saving measures to increase the energy performance of buildings, which are responsible for more than 40% of total energy consumption. Identification of optimal building geometry is one of the key factors to be considered. The architectural form of a building has an impact on space heating and cooling energy use; however, the interrelationship between geometry and resultant energy use is not always readily apparent. This paper presents a comparative case study of two prototypical buildings with compact building shapes to assess the impact of shape on energy performance.
Keywords: building geometry, energy efficiency, heat gain, heat loss
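The link between compact geometry and energy use can be illustrated with a simple footprint comparison: for equal floor area and height, wall (envelope) area scales with the footprint perimeter, and a circular (yurt-like) plan has a shorter perimeter than a square one. A sketch with idealized shapes, not the study's simulation:

```python
import math

def footprint_perimeter(shape, area):
    """Perimeter of the building footprint for a given floor area;
    with equal height, wall area (and hence envelope heat loss)
    scales with this perimeter."""
    if shape == "square":
        return 4.0 * math.sqrt(area)
    if shape == "circle":  # idealized yurt footprint
        return 2.0 * math.sqrt(math.pi * area)
    raise ValueError(shape)

area = 100.0  # m^2, identical floor area for both forms
square_wall = footprint_perimeter("square", area)
yurt_wall = footprint_perimeter("circle", area)
```

The circle is the isoperimetric optimum, so the yurt-like plan always yields the smaller envelope for a given floor area; actual energy use also depends on orientation, glazing, and solar gains.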
Procedia PDF Downloads 499
2579 Sweepline Algorithm for Voronoi Diagram of Polygonal Sites
Authors: Dmitry A. Koptelov, Leonid M. Mestetskiy
Abstract:
The Voronoi Diagram (VD) of a finite set of disjoint simple polygons, called sites, is a partition of the plane into loci (one locus per site) – regions consisting of the points that are closer to a given site than to all others. A set of polygons is a universal model for many applications in engineering, geoinformatics, design, computer vision, and graphics. Construction of the VD of polygons is usually done by reduction to the task of constructing the VD of segments, for which there are efficient O(n log n) algorithms for n segments. The reduction also includes preprocessing – constructing segments from the polygons' sides – and postprocessing – constructing each polygon's locus by merging the loci of its sides. This approach does not take into account two specific properties of the resulting segment sites. Firstly, all these segments are connected in pairs at the vertices of the polygons. Secondly, on one side of each segment lies the interior of the polygon; the polygon is obviously included in its own locus. Using these properties in the VD construction algorithm is a way to reduce computation. This article proposes an algorithm for the direct construction of the VD of polygonal sites. The algorithm is based on the sweepline paradigm, which allows these properties to be exploited effectively. The solution is again performed via reduction: preprocessing constructs the set of sites from the vertices and edges of the polygons, with each site oriented so that the interior of the polygon lies to its left. The proposed algorithm constructs the VD for the set of oriented sites with the sweepline paradigm. Postprocessing selects the edges of this VD formed by the centers of empty circles touching different polygons. The improved efficiency of the proposed sweepline algorithm over the general Fortune algorithm is achieved through the following fundamental solutions: 1. The algorithm constructs only those VD edges which lie outside the polygons.
The concept of oriented sites makes it possible to avoid constructing VD edges located inside the polygons. 2. The event list in the sweepline algorithm has a special property: the majority of events are connected with 'medium' polygon vertices, where one incident polygon side lies behind the sweepline and the other in front of it. The proposed algorithm processes such events in constant time, not in logarithmic time as in the general Fortune algorithm. The proposed algorithm is fully implemented and tested on a large number of examples. Its high reliability and efficiency are also confirmed by computational experiments with complex sets of several thousand polygons. It should be noted that, despite the considerable time that has passed since the publication of Fortune's algorithm in 1986, a full-scale implementation of this algorithm for an arbitrary set of segment sites has not been made. The proposed algorithm fills this gap for an important special case – a set of sites formed by polygons.
Keywords: Voronoi diagram, sweepline, polygon sites, Fortune's algorithm, segment sites
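For intuition, the locus definition above (each point of the plane belongs to the nearest polygon site) can be checked by brute force, measuring point-to-boundary distance over all sides. This only illustrates the closest-site partition, not the sweepline algorithm itself:

```python
import math

def point_segment_distance(p, a, b):
    """Euclidean distance from point p to segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Parameter of the closest point on the infinite line, clamped to [0, 1]
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def distance_to_polygon(p, polygon):
    """Distance to the polygon boundary: minimum over its sides."""
    n = len(polygon)
    return min(point_segment_distance(p, polygon[i], polygon[(i + 1) % n])
               for i in range(n))

def nearest_site(p, polygons):
    """Index of the polygon site whose locus contains p."""
    return min(range(len(polygons)),
               key=lambda i: distance_to_polygon(p, polygons[i]))

# Two hypothetical square sites; sample points fall in different loci
sites = [[(0, 0), (1, 0), (1, 1), (0, 1)],
         [(5, 0), (6, 0), (6, 1), (5, 1)]]
```

The sweepline algorithm computes the boundaries between such loci directly, in O(n log n) rather than per query point.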
Procedia PDF Downloads 177
2578 Efficacy and Safety of Probiotic Treatment in Patients with Liver Cirrhosis: A Systematic Review and Meta-Analysis
Authors: Samir Malhotra, Rajan K. Khandotra, Rakesh K. Dhiman, Neelam Chadha
Abstract:
There is a paucity of data about the safety and efficacy of probiotic treatment on patient outcomes in cirrhosis. Specifically, it is important to know whether probiotics can improve mortality, hepatic encephalopathy (HE), number of hospitalizations, ammonia levels, quality of life, and adverse events. Probiotics may improve outcomes in patients with acute or chronic HE, but it is also important to know whether probiotics can prevent the development of HE in patients who do not have acute HE at the time of administration, i.e., whether they are useful as primary prophylaxis of HE. We aimed to conduct an updated systematic review and meta-analysis to evaluate the safety and efficacy of probiotics in patients with cirrhosis. We searched PubMed, the Cochrane Library, Embase, Scopus, SCI, Google Scholar, conference proceedings, and the references of included studies up to June 2017 to identify randomised clinical trials comparing probiotics with other treatments in cirrhotics. Data were analyzed using MedCalc. Probiotics had no effect on mortality but significantly reduced HE (14 trials, 1073 patients, OR 0.371; 95% CI 0.282 to 0.489). There were not enough data to conduct a meta-analysis on outcomes such as hospitalizations and quality of life. The effect on plasma ammonia levels was not significant (SMD -0.429; 95% CI -1.034 to 0.177). There was no difference in adverse events. To conclude, although the included studies had a high risk of bias, the available evidence does suggest a beneficial effect on HE. Larger studies with longer periods of follow-up are needed to determine if probiotics can reduce all-cause mortality.
Keywords: cirrhosis, hepatic encephalopathy, meta-analysis, probiotic
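A pooled odds ratio with its 95% CI, like the HE result above, is the kind of estimate produced by inverse-variance pooling of per-trial log odds ratios. A minimal fixed-effect sketch (illustrative only; the actual analysis used MedCalc, and the trial-level data are not reproduced here):

```python
import math

def pooled_odds_ratio(trials):
    """Fixed-effect (inverse-variance) pooling of per-trial odds ratios.

    Each trial is (or_value, ci_low, ci_high); the standard error of
    log(OR) is recovered from the 95% CI width:
    SE = (ln(hi) - ln(lo)) / (2 * 1.96)."""
    num = den = 0.0
    for or_val, lo, hi in trials:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # inverse-variance weight
        num += w * math.log(or_val)
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(pooled - 1.96 * se_pooled),
          math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci

# With a single "trial" the pooled estimate reproduces the input;
# the values are the review's summary OR used as a worked example.
pooled, ci = pooled_odds_ratio([(0.371, 0.282, 0.489)])
```

A random-effects version would add a between-trial variance term (e.g., DerSimonian-Laird) to each weight.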
Procedia PDF Downloads 201
2577 Molecular Dynamics Simulations on Richtmyer-Meshkov Instability of Li-H2 Interface at Ultra High-Speed Shock Loads
Authors: Weirong Wang, Shenghong Huang, Xisheng Luo, Zhenyu Li
Abstract:
Material mixing processes and related dynamic issues under extreme compression conditions have gained more and more attention in the last ten years because of their engineering appeal in inertial confinement fusion (ICF) and hypervelocity aircraft development. However, models and methods that can handle fully coupled turbulent material mixing and complex fluid evolution in the high energy density regime have so far been lacking. In terms of macro hydrodynamics, three numerical methods - direct numerical simulation (DNS), large eddy simulation (LES) and the Reynolds-averaged Navier-Stokes equations (RANS) - have reached relatively acceptable consensus under conditions of the low energy density regime. Under conditions of the high energy density regime, however, they cannot be applied directly due to the occurrence of dissociation, ionization, dramatic changes in the equation of state, thermodynamic properties, etc., which may make the governing equations invalid in some coupled situations. In view of the micro/meso scale regime, methods based on Molecular Dynamics (MD) as well as Monte Carlo (MC) models have proved to be promising and effective ways to investigate such issues. In this study, both classical MD and first-principle-based electron force field MD (eFF-MD) methods are applied to investigate the Richtmyer-Meshkov Instability (RMI) of the metal lithium and gas hydrogen (Li-H2) interface at shock loading speeds ranging from 3 km/s to 30 km/s. It is found that: 1) The classical MD method, based on predefined potential functions, has limits in application to extreme conditions, since it cannot simulate the ionization process and its potential functions are not suitable for all conditions, while the eFF-MD method can correctly simulate the ionization process due to its 'ab initio' character; 2) Due to computational cost, the eFF-MD results are also influenced by the choice of simulation domain dimensions, boundary conditions, relaxation times, etc.
A series of tests was conducted to determine the optimized parameters. 3) Ionization induced by strong shock compression has important effects on the RMI evolution of the Li-H2 interface, indicating a new micromechanism of RMI under conditions of the high energy density regime.
Keywords: first-principle, ionization, molecular dynamics, material mixture, Richtmyer-Meshkov instability
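The classical MD approach discussed in point 1) integrates Newton's equations with a predefined pair potential. A minimal sketch of that idea, one degree of freedom in a Lennard-Jones potential with velocity-Verlet integration, in reduced units (all parameters hypothetical; real shock simulations involve vastly more atoms and far more elaborate potentials):

```python
def lj_force(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones force along the separation axis, from the
    predefined pair potential U(r) = 4*eps*((s/r)^12 - (s/r)^6);
    positive means repulsive."""
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

def velocity_verlet(r, v, dt=1e-3, steps=2000, m=1.0):
    """Integrate the separation coordinate of a bound pair
    (reduced units, effective mass m) with velocity Verlet."""
    f = lj_force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * (f / m) * dt * dt
        f_new = lj_force(r)
        v += 0.5 * (f + f_new) / m * dt
        f = f_new
    return r, v

# Start slightly compressed relative to the LJ minimum at 2^(1/6)*sigma,
# so the pair oscillates; a symplectic integrator keeps energy bounded.
r_final, v_final = velocity_verlet(r=1.10, v=0.0)
```

The limitation noted in the abstract is visible here: `lj_force` is fixed a priori, so no electronic process such as ionization can emerge, which is what eFF-MD addresses.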
Procedia PDF Downloads 225
2576 Quality of Care of Medical Male Circumcisions: A Non-Negotiable for Right to Care
Authors: Nelson Igaba, C. Onaga, S. Hlongwane
Abstract:
Background: Medical Male Circumcision (MMC) is part of a comprehensive HIV prevention strategy. The quality of MMC done at Right to Care (RtC) sites is maintained by Continuous Quality Improvement (CQI) based on the findings of assessments by internal and independent external assessors, who evaluate such parameters as the quality of the surgical procedure, infection control, etc. There are 12 RtC MMC teams in Mpumalanga, two of which are headed by Medical Officers and 10 by Clinical Associates (Clin A). Objectives: To compare the quality (i) of care rendered at doctor-headed sites (DHS) versus Clin A-headed sites (CHS); (ii) of CQI assessments (external versus internal). Methodology: A retrospective review of data from RightMax™ (a novel RtC data management system) and CQI reports (external and internal) was done. CQI assessment scores from October 2015 and October 2016 were taken as the baseline and latest scores, respectively. Four sites with 745-810 circumcisions per annum were purposively selected: the two DHS (group A) and two CHS (group B). Statistical analyses were conducted using R (2017 version). Results: There was no significant difference in latest CQI scores between the two groups (DHS and CHS) (ANOVA, F = 1.97, df = 1, P = 0.165), between internal and external CQI assessment scores (ANOVA, F = 2.251, df = 1, P = 0.139), or among the individual sites (ANOVA, F = 1.095, df = 2, P = 0.341). Of the 16 adverse events reported by the four sites in the 12 months reviewed (all infections), there was no statistical evidence that the documented severity of infection differed between DHS and CHS (Fisher's exact test, p-value = 0.269).
Conclusion: At RtC VMMC sites in Mpumalanga, internal and external/independent CQI assessments are comparable, and the quality of care of VMMC is standardized, with the performance of well-supervised clinical associates comparing well with that of medical officers.
Keywords: adverse events, Right to Care, medical male circumcision, continuous quality improvement
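The one-way ANOVA F statistics quoted above compare between-group to within-group variance; a minimal sketch with hypothetical CQI percentage scores (not the study's data, which were analyzed in R):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical CQI scores for doctor-headed vs Clin A-headed sites
dhs_scores = [92.0, 95.0, 90.0, 94.0]
chs_scores = [91.0, 89.0, 93.0, 90.0]
f_stat = one_way_anova_f([dhs_scores, chs_scores])
```

The P-value would then come from the F distribution with the corresponding (between, within) degrees of freedom.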
Procedia PDF Downloads 1762575 Quantifying Multivariate Spatiotemporal Dynamics of Malaria Risk Using Graph-Based Optimization in Southern Ethiopia
Authors: Yonas Shuke Kitawa
Abstract:
Background: Although malaria incidence has fallen sharply over the past few years, the rate of decline varies by district, time, and malaria type. Despite this decline, malaria remains a major public health threat in various districts of Ethiopia. Consequently, the present study is aimed at developing a predictive model that helps to identify the spatio-temporal variation in malaria risk by multiple Plasmodium species. Methods: We propose a multivariate spatio-temporal Bayesian model to obtain a more coherent picture of the temporally varying spatial variation in disease risk. The spatial autocorrelation in such a data set is typically modeled by a set of random effects that are assigned a conditional autoregressive (CAR) prior distribution. However, the autocorrelation considered in such cases depends on a binary neighborhood matrix specified through the border-sharing rule. Here, we propose a graph-based optimization algorithm for estimating the neighborhood matrix that more faithfully represents the spatial correlation, by treating the areal units as the vertices of a graph and the neighbor relations as its set of edges. We used aggregated malaria counts from southern Ethiopia from August 2013 to May 2019. Results: We found that precipitation, temperature, and humidity are positively associated with malaria risk in the area, whereas enhanced vegetation index, nighttime light (NTL), and distance from coastal areas are negatively associated. Moreover, nonlinear relationships were observed between malaria incidence and precipitation, temperature, and NTL. Additionally, lagged effects of temperature and humidity have a significant effect on malaria risk for either species. An elevated risk of P. falciparum was observed following the rainy season, and unstable transmission of P. vivax was observed in the area. Finally, P. vivax risk is less sensitive to environmental factors than that of P. falciparum. 
Conclusion: Improved inference was gained by employing the proposed approach in comparison to the commonly used border-sharing rule. Additionally, different covariates, including delayed effects, were identified, and elevated risks of both species were observed in districts in the central and western regions. As malaria transmission operates in a spatially continuous manner, a spatially continuous model should be employed when it is computationally feasible.Keywords: disease mapping, MSTCAR, graph-based optimization algorithm, P. falciparum, P. vivax, weighting matrix
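The core idea above, areal units as graph vertices and neighbor relations as edges feeding a CAR prior's neighborhood matrix, can be sketched briefly. The districts and edges below are hypothetical; in the abstract the graph itself is estimated by an optimization algorithm rather than fixed by border sharing.

```python
# Minimal sketch: build a binary neighborhood (adjacency) matrix W from a
# graph of areal units, then row-standardize it as commonly done for CAR
# weights. Districts and edges are hypothetical placeholders.
import numpy as np

districts = ["A", "B", "C", "D"]
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "C")]  # hypothetical graph

idx = {d: i for i, d in enumerate(districts)}
W = np.zeros((len(districts), len(districts)))
for u, v in edges:
    W[idx[u], idx[v]] = W[idx[v], idx[u]] = 1.0  # symmetric: u and v are neighbors

# Row-standardize so each row sums to one (a common CAR weighting scheme)
W_std = W / W.sum(axis=1, keepdims=True)
print(W_std)
```

Under the border-sharing rule, `edges` would be fixed by which districts touch; the graph-based approach instead searches over candidate edge sets to find the one that best captures the observed spatial correlation.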
Procedia PDF Downloads 772574 Next Generation UK Storm Surge Model for the Insurance Market: The London Case
Authors: Iacopo Carnacina, Mohammad Keshtpoor, Richard Yablonsky
Abstract:
Non-structural protection measures against flooding are becoming increasingly popular flood risk mitigation strategies. In particular, coastal flood insurance affects not only private citizens but also insurance and reinsurance companies, who may require it to retain solvency and better understand the risks they face from a catastrophic coastal flood event. In this context, a framework is presented to assess coastal flood risk across the UK. The area has a long history of catastrophic flood events, including the Great Flood of 1953 and the 2013 Cyclone Xaver storm, both of which led to significant loss of life and property. The framework leverages a hydrodynamic model (Delft3D Flexible Mesh). This flexible mesh technology, coupled with a calibration technique, allows for better utilisation of computational resources, leading to higher resolution and more detailed results. The generation of a stochastic set of extratropical cyclone (ETC) events supports the evaluation of financial losses for the whole area, also accounting for correlations between different locations in different scenarios. Finally, the solution shows a detailed analysis for the Thames River, leveraging the information available on flood barriers and levees. Two realistic disaster scenarios for the Greater London area are simulated: In the first scenario, the storm surge intensity is not high enough to fail London’s flood defences, but in the second scenario, London’s flood defences fail, highlighting the potential losses from a catastrophic coastal flood event.Keywords: storm surge, stochastic model, levee failure, Thames River
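Evaluating financial losses over a stochastic event set, as described above, typically means pairing each simulated event with an annual occurrence rate and a modelled loss, then rate-weighting. A minimal sketch of that aggregation follows; the rates and loss figures are invented for illustration and do not come from the study.

```python
# Hedged sketch of event-set loss aggregation: average annual loss (AAL) as
# the rate-weighted sum of per-event losses. All numbers are hypothetical.
rates = [0.01, 0.005, 0.002]      # annual occurrence rate of each ETC event
losses = [1.0e8, 5.0e8, 2.0e9]    # modelled loss per event, GBP (hypothetical)

aal = sum(r * l for r, l in zip(rates, losses))
print(f"Average annual loss: {aal:.2e} GBP")  # 7.50e+06 GBP for these inputs
```

The same event-loss table also supports exceedance-probability curves, which insurers use alongside the AAL to price tail risk such as the defence-failure scenario described for Greater London.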
Procedia PDF Downloads 232