Search results for: storm events
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2245

1795 A Lower Dose of Topiramate with Enough Antiseizure Effect: A Realistic Therapeutic Range of Topiramate

Authors: Seolah Lee, Yoohyk Jang, Soyoung Lee, Kon Chu, Sang Kun Lee

Abstract:

Objective: The International League Against Epilepsy (ILAE) currently suggests a topiramate serum level range of 5-20 mg/L. However, numerous institutions have observed substantial drug response at lower levels. This study aims to investigate the correlation between topiramate serum levels, drug responsiveness, and adverse events to establish a more accurate and tailored therapeutic range. Methods: We retrospectively analyzed topiramate serum samples collected between January 2017 and January 2022 at Seoul National University Hospital. Clinical data, including serum levels, antiseizure regimens, seizure frequency, and adverse events, were collected. Patient responses were categorized as "insufficient" (reduction in seizure frequency <50%) or "sufficient" (reduction ≥ 50%). Within the "sufficient" group, further subdivisions included seizure-free and tolerable seizure subgroups. A population pharmacokinetic model estimated serum levels from spot measurements. ROC curve analysis determined the optimal serum level cut-off. Results: A total of 389 epilepsy patients, with 555 samples, were reviewed, having a mean dose of 178.4±117.9 mg/day and a serum level of 3.9±2.8 mg/L. Out of the samples, only 5.6% (n=31) exhibited insufficient response, with a mean serum level of 3.6±2.5 mg/L. In contrast, 94.4% (n=524) of samples demonstrated sufficient response, with a mean serum level of 4.0±2.8 mg/L. This difference was not statistically significant (p = 0.45). Among the 78 reported adverse events, logistic regression analysis identified a significant association between ataxia and serum concentration (p = 0.04), with an optimal cut-off value of 6.5 mg/L. In the subgroup of patients receiving monotherapy, those in the tolerable seizure group exhibited a significantly higher serum level compared to the seizure-free group (4.8±2.0 mg/L vs 3.4±2.3 mg/L, p < 0.01). Notably, patients in the tolerable seizure group displayed a higher likelihood of progressing into drug-resistant epilepsy during follow-up visits compared to the seizure-free group. Significance: This study proposed an optimal therapeutic concentration for topiramate based on the patient's responsiveness to the drug and the incidence of adverse effects. We employed a population pharmacokinetic model and analyzed topiramate serum levels to recommend a serum level below 6.5 mg/L to mitigate the risk of ataxia-related side effects. Our findings also indicated that topiramate dose elevation is unnecessary for suboptimal responders, as the drug's effectiveness plateaus at minimal doses.
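The abstract does not state the criterion used to select the 6.5 mg/L cut-off; one common choice is Youden's J on the ROC curve. The sketch below is illustrative only (simulated data, hypothetical variable names), not the authors' analysis:

# Minimal sketch (not the authors' code): choosing a serum-level cut-off for an
# adverse event (ataxia) from an ROC curve using Youden's J statistic.
# `levels` and `ataxia` are hypothetical stand-ins for the study variables.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
levels = rng.gamma(shape=2.0, scale=2.0, size=555)              # serum levels, mg/L
ataxia = (levels + rng.normal(0, 2, size=555) > 7).astype(int)  # 1 = ataxia reported

fpr, tpr, thresholds = roc_curve(ataxia, levels)
best = np.argmax(tpr - fpr)            # Youden's J = sensitivity + specificity - 1
print(f"optimal cut-off ~ {thresholds[best]:.1f} mg/L")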

Keywords: topiramate, therapeutic range, low dose, antiseizure effect

Procedia PDF Downloads 50
1794 Big Data and Cardiovascular Healthcare Management: Recent Advances, Future Potential and Pitfalls

Authors: Maariyah Irfan

Abstract:

Introduction: Current cardiovascular (CV) care faces challenges such as low budgets and high hospital admission rates. This review aims to evaluate Big Data in CV healthcare management through the use of wearable devices in atrial fibrillation (AF) detection. AF may present intermittently, so it is difficult for a healthcare professional to capture and diagnose a symptomatic rhythm. Methods: The iRhythm ZioPatch, AliveCor portable electrocardiogram (ECG), and Apple Watch were chosen for review due to their involvement in controlled clinical trials and their integration with smartphones. The cost-effectiveness and AF detection of these devices were compared against the 12-lead ambulatory ECG (Holter monitor) that the NHS currently employs for the detection of AF. Results: The Zio patch was found to detect more arrhythmic events than the Holter monitor over a 2-week period. When patients presented to the emergency department with palpitations, AliveCor portable ECGs detected 6-fold more symptomatic events compared to the standard care group over 3 months. Based on preliminary results from the Apple Heart Study, only 0.5% of participants received irregular pulse notifications from the Apple Watch. Discussion: The Zio Patch and AliveCor devices have promising potential to be implemented into the standard duty of care offered by the NHS, as they compare well to current routine measures. Nonetheless, companies must address the discrepancy between their target population and current consumers, as those who could benefit the most from the innovation may be left out due to cost and access.

Keywords: atrial fibrillation, big data, cardiovascular healthcare management, wearable devices

Procedia PDF Downloads 128
1793 Building Collapse: Factors and Resisting Mechanisms: A Review of Case Studies

Authors: Genevieve D. Fernandes, Nisha P. Naik

Abstract:

Throughout the ages and in all human civilizations, people have engaged in construction activity, not only to build dwellings and house their activities, but also roads and bridges to facilitate transport and communication. The main concern in this activity has been to ensure safety and reduce the risk of collapse of buildings and other structures. But even after taking all precautions, it is impossible to guarantee safety and prevent collapse entirely, because of unforeseen factors such as faulty construction, design errors, overloading, soil liquefaction, gas explosions, material degradation and terrorist attacks, with economic factors also contributing to collapse. It is also uneconomical to design a structure for unforeseen events unless they have a reasonable chance of occurrence. In order to ensure safety and prevent collapse, many guidelines have been framed by local bodies and government authorities in many countries, such as those of the United States Department of Defense (DOD), the United States General Services Administration (GSA) and the Eurocodes in European nations. Other practices are followed to incorporate redundancy in the structure, such as detailing, ductile design, tying of elements at particular locations, and provision of hinges and interconnections. It must also be admitted that a foolproof, safe design for accidental events cannot be prepared and implemented, as it is uneconomical and the chances of such occurrences are low. This paper reviews past case studies of the collapse of structures with the aim of developing an understanding of the collapse mechanism. This study will help to bring about detailed improvements in design that maximise the quality of construction at minimal cost.

Keywords: unforeseen factors, progressive collapse, collapse resisting mechanisms, column removal scenario

Procedia PDF Downloads 127
1792 Evaluation of the Beach Erosion Process in Varadero, Matanzas, Cuba: Effects of Different Hurricane Trajectories

Authors: Ana Gabriela Diaz, Luis Fermín Córdova, Jr., Roberto Lamazares

Abstract:

The island of Cuba, the largest of the Greater Antilles, is located in the tropical North Atlantic. It is affected every year by numerous weather events, which have caused severe damage to its coastal areas. Like many other coastlines around the world, the beautiful beaches of the Hicacos Peninsula also suffer from erosion, which leads to a structural regression of the coastline. If measures are not taken, the hotels will be exposed to the advance of the sea, and this will be a serious problem for the economy. With the aim of studying the intensity of this type of activity, specialists from the coastal and marine engineering group at CIH, in the framework of the research conducted within the project MEGACOSTAS 2, carried out simulations of extreme events to assess their impact on coastal areas, mainly regarding the definition of flood volumes and morphodynamic changes in sandy beaches. The main objective of this work is the evaluation of the erosion process at Varadero beach (a coastal sector with an important impact on the country's economy) on the Hicacos Peninsula for different hurricane trajectories. The mathematical model XBeach, which was integrated into the coastal engineering system introduced by the MEGACOSTAS 2 project, was applied to determine the most critical areas and profiles for the hurricane trajectories under study. The results have shown that the central area is the most dynamic area in the simulation of the three hurricane trajectories under study, showing high erosion volumes and the greatest average regression of the coastline, from 15 to 22 m.

Keywords: beach, erosion, mathematical model, coastal areas

Procedia PDF Downloads 222
1791 A Molecular Dynamics Study on Intermittent Plasticity and Dislocation Avalanche Emissions in FCC and BCC Crystals

Authors: Javier Varillas, Jorge Alcalá

Abstract:

We investigate dislocation avalanche phenomena in face-centered cubic (FCC) and body-centered cubic (BCC) crystals using massive, large-scale molecular dynamics (MD) simulations. The analysis is focused on the intermittent development of dense dislocation arrangements subjected to uniaxial tensile straining under displacement control. We employ a novel computational scheme that allows us to inject an entangled dislocation structure into periodic MD domains. We assess the emission of plastic bursts (or dislocation avalanches) in terms of the sharp stress drops detected in the stress-strain curve. The plastic activity corresponds to the sporadic operation of specific dislocation glide processes exhibiting quiescent periods between successive avalanche events. We find that the plastic intermittencies in our simulations do not overlap in time under sufficiently low strain rates, as dissipation operates faster than driving; the dense dislocation networks then evolve through the emission of dislocation avalanche events whose carried slip adheres to self-organized power-law distributions. These findings enable the extension of the slip distributions obtained from strict displacement-controlled micropillar compression experiments towards smaller values of slip size. Our results furnish further understanding of the development of entangled dislocation networks in metal plasticity, including specific mechanisms of dislocation propagation and annihilation, along with the evolution of specific dislocation populations through dislocation density analyses.
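The power-law character of the slip distributions can be illustrated with a standard estimator. The sketch below (synthetic data, not simulation output from this work) applies the continuous power-law maximum-likelihood estimate of the exponent for avalanche sizes above a cut-off x_min:

# Illustrative sketch only: maximum-likelihood estimate of a power-law exponent
# for avalanche (slip) sizes above a chosen cut-off x_min, using the
# continuous-power-law estimator alpha = 1 + n / sum(ln(x_i / x_min)).
import numpy as np

rng = np.random.default_rng(1)
x_min, alpha_true = 0.1, 1.8
u = rng.random(5000)
slips = x_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))   # samples from a power law

tail = slips[slips >= x_min]
alpha_hat = 1.0 + tail.size / np.log(tail / x_min).sum()
print(f"estimated exponent: {alpha_hat:.2f} (true {alpha_true})")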

Keywords: dislocations, intermittent plasticity, molecular dynamics, slip distributions

Procedia PDF Downloads 136
1790 The Greek Revolution through the Foreign Press: The Case of the Newspaper "The London Times" in the Period 1821-1828

Authors: Euripides Antoniades

Abstract:

In 1821 the Greek Revolution, under the political influence of the French Revolution and the corresponding movements in Italy, Germany and America, sought the liberation of the nation and the establishment of an independent national state. Topics published in the British press regarding the Greek Revolution focused on: a) the right of the Greeks to claim their freedom from Turkish domination in order to establish an independent state based on the principle of national autonomy, b) criticism of Turkish rule as illegal and of the power of the Ottoman Sultan as arbitrary, c) the recognition of the Greek identity and its distinction from the Turkish one, and d) the endorsement of the Greeks as descendants of the ancient Greeks. The advantage of the newspaper as a medium is that it shares information and ideas and deals with issues in greater depth and detail than other media such as radio or television. The London Times is a print publication that presents, in chronological or thematic order, news, opinions and announcements about the most important events that occurred in a place during a specified period of time. This paper draws on the rich archive of The London Times, quoting extracts from publications of the period, to convey the British public perspective on the Greek Revolution from its beginning until the London Protocol of 1828. Furthermore, it analyses the publications of the British newspaper in terms of the number of references to the Greek Revolution, front-page and editorial references, as well as the size of publications on the revolution during the period 1821-1828. A combination of qualitative and quantitative content analysis was applied. An attempt was made to record references to the Greek Revolution along with the usage of specific words and expressions that contribute to the representation of the historical events and their exposure to the reading public. Key findings of this research reveal that a) The London Times carried frequent and passionate daily articles concerning the events in Greece, notable for their length and context, b) British public opinion was influenced by this particular newspaper, and c) the newspaper published various news items about the revolution, adopting the role of animator of the Greek struggle. For instance, war events and the battles of Wallachia and Moldavia, Hydra, Crete, Psara, Messolonghi and the Peloponnese were presented not only to inform readers but to promote the essential need for freedom and the establishment of an independent Greek state. In fact, this type of news was the main substance of The London Times' coverage, establishing a positive image of the Greek Revolution and contributing to European diplomatic developments, such as the standpoint of France, which did not wish to be detached from the conclusions regarding the English loans, and the death of Alexander I of Russia and his succession by the ambitious Nicholas. These factors brought about a change in the attitude of the British and the Russians, respectively, towards a more positive approach to Greece. The Great Powers maintained a neutral position in the Greek-Ottoman conflict while at the same time strengthening the Greek side by offering aid.

Keywords: Greece, revolution, newspaper, The London Times, London, Great Britain, mass media

Procedia PDF Downloads 84
1789 How the Current Opioid Crisis Differs from the Heroin Epidemic of the 1960s-1970s: An Analysis of Drugs and Demographics

Authors: Donna L. Roberts

Abstract:

Heroin has appeared on the drug scene before. Yet the current opioid crisis differs in significant ways. In order to address the grave challenges this epidemic poses, the unique precipitating and sustaining conditions must be thoroughly examined. This research explored the various aspects of the political, economic, and social conditions that created a 'perfect storm' for the evolution and maintenance of the current opioid crisis. Specifically, the epidemiology, demographics, and progression of addiction inherent in the current crisis were compared to the patterns of past opioid use. Additionally, the role of pharmaceutical companies and prescribing physicians, the nature and pharmaceutical properties of the available substances and the changing socioeconomic climate were considered. Results indicated that the current crisis differs significantly with respect to its evolution, magnitude, prevalence, and widespread societal effects. Precipitated by a proliferation of prescription medication and sustained by the availability of cheaper, more potent street drugs, including new versions of synthetic opioids, the current crisis presents unprecedented challenges affecting a wider and more diverse segment of society. The unique aspects of this epidemic demand unique approaches to addressing the problem. Understanding these differences is a key step in working toward a practical and enduring solution.

Keywords: addiction, drug abuse, opioids, opioid crisis

Procedia PDF Downloads 146
1788 Application of Simulation of Discrete Events in Resource Management of Massive Concreting

Authors: Mohammad Amin Hamedirad, Seyed Javad Vaziri Kang Olyaei

Abstract:

Project planning and control are among the most critical issues in the management of construction projects. Traditional methods of project planning and control, such as the critical path method or the Gantt chart, are not well suited to planning projects with discrete and repetitive activities, and one of the problems facing project managers is planning the implementation process and the optimal allocation of its resources. Massive concreting is likewise a project type with discrete and repetitive activities. This study uses discrete-event simulation to manage resources: it finds the optimal number of resources under various constraints, such as limitations of machinery, equipment and human resources, as well as technical, time and implementation constraints, using analysis of resource consumption rates, project completion time and critical points of the implementation process. For this purpose, discrete-event simulation has been used to model the different stages of implementation. After reviewing the various scenarios, the optimal allocation for each resource is determined so as to reach the maximum utilization rate and also to reduce the project completion time or its cost under the existing constraints. The results showed that with the optimal allocation of resources, the project completion time could be reduced by 90%, and the resulting costs could be reduced by up to 49%. Thus, allocating the optimal number of project resources using this method will reduce both its time and its cost.
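As a toy illustration of the approach (not the authors' model; all activity counts and durations are invented), the sketch below runs a simple event-driven simulation of repetitive pours served by a limited number of crews and compares completion times across resource scenarios:

# Toy discrete-event sketch: N repetitive concreting activities served by a
# limited number of crews/pumps. Varying the resource count shows how the
# completion time responds, which is the kind of scenario comparison the
# abstract describes. Durations and counts are made up.
import heapq, random

def simulate(num_pours=40, num_crews=3, seed=42):
    random.seed(seed)
    durations = [random.uniform(4, 8) for _ in range(num_pours)]  # hours per pour
    free_at = [0.0] * num_crews        # event list: times at which crews become free
    heapq.heapify(free_at)
    finish = 0.0
    for d in durations:
        start = heapq.heappop(free_at)   # next event: the earliest crew becomes available
        end = start + d
        finish = max(finish, end)
        heapq.heappush(free_at, end)
    return finish

for crews in (2, 3, 4, 6):
    print(crews, "crews -> completion time", round(simulate(num_crews=crews), 1), "h")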

Keywords: simulation, massive concreting, discrete event simulation, resource management

Procedia PDF Downloads 142
1787 Identification Algorithm of Critical Interface, Modelling Perils on Critical Infrastructure Subjects

Authors: Jiří J. Urbánek, Hana Malachová, Josef Krahulec, Jitka Johanidisová

Abstract:

The paper deals with the investigation and modelling of crisis situations within critical infrastructure organizations. Every crisis situation originates in the occurrence of an emergency event, especially in organizations of the energy critical infrastructure. Such emergency events can be either expected events, for which crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities to cope with them, or unexpected events (the Black Swan effect), for which no pre-prepared scenario exists and which require operational coping with the crisis situation. The forms, characteristics, behaviour and utilization of crisis scenarios vary in quality, depending on the prevention and training processes of the real critical infrastructure organization. The aim is always better organizational security and continuity. The objective of this paper is to find and investigate critical/crisis zones and functions in crisis situation models of critical infrastructure organizations. The DYVELOP (Dynamic Vector Logistics of Processes) method is able to identify problematic critical zones and functions, displaying critical interfaces among the actors of crisis situations on DYVELOP maps named Blazons. To realize this ability, it is first necessary to derive and create an identification algorithm for critical interfaces. The locations of critical interfaces are the flags of a crisis situation in a real critical infrastructure organization. Finally, the model of critical interfaces is demonstrated for a real Czech energy critical infrastructure organization in a blackout peril environment. The Blazons require a live PowerPoint presentation for better comprehension of this paper's mission.

Keywords: algorithm, crisis, DYVELOP, infrastructure

Procedia PDF Downloads 405
1786 Comparison of Low Velocity Impact Test on Coir Fiber Reinforced Polyester Composites

Authors: Ricardo Mendoza, Jason Briceño, Juan F. Santa, Gabriel Peluffo, Mauricio Márquez, Beatriz Cardozo, Carlos Gutiérrez

Abstract:

The most common controlled method of obtaining the impact strength of composite materials is the Charpy impact test, which consists of a pendulum with calibrated mass and length released from a known height. In service, composite components experience impact events in normal operation, such as when a tool is dropped on them or a foreign object strikes them. These events are categorized as low velocity impact (LVI), which typically occurs at velocities below 10 m/s. In this study, the major aim was to calculate the energy absorbed during the impact. Tests were performed on three types of composite panels: fiberglass laminated panels, coir fiber reinforced polyester, and coir fiber reinforced polyester subjected to water immersion for 48 hours. Coir fibers were obtained from local plantations on the Caribbean coast of Colombia. They were alkali treated in 5% aqueous NaOH solution for 2 h periods. Three impactor shapes were used in the drop-weight impact tests: hemispherical, ogive and pointed. Failure mechanisms and failure modes of the specimens were examined using an optical microscope. Results demonstrate a reduction in absorbed energy correlated with the increase in water absorption of the panels. For each level of absorbed energy, it was possible to associate a different fracture state. This study compares the absorbed-energy results obtained from the two impact test methods.
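For reference, the energy absorbed in a Charpy pendulum test is usually computed from the drop and rise of the hammer; this standard relation is given here for context and is not quoted from the paper:

E_absorbed = m · g · L · (cos β − cos α)

where m is the pendulum mass, L the arm length, α the release angle from the vertical and β the rise angle after fracture. The drop-weight tests similarly use the potential energy E = m · g · h of the falling impactor released from height h.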

Keywords: coir fiber, polyester composites, low velocity impact, Charpy impact test, drop-weight impact test

Procedia PDF Downloads 448
1785 Patient Agitation and Violence in Medical-Surgical Settings at BronxCare Hospital Before and During the COVID-19 Pandemic: A Retrospective Chart Review

Authors: Soroush Pakniyat-Jahromi, Jessica Bucciarelli, Souparno Mitra, Neda Motamedi, Ralph Amazan, Samuel Rothman, Jose Tiburcio, Douglas Reich, Vicente Liz

Abstract:

Violence is defined as an act of physical force that is intended to cause harm and may lead to physical and/or psychological damage. Violence toward healthcare workers (HCWs) is more common in psychiatric settings, emergency departments, and nursing homes; however, healthcare workers in medical settings are not spared from such events. Workplace violence places a huge burden on the healthcare industry and has a major impact on the physical and mental wellbeing of staff. The purpose of this study is to compare the prevalence of patient agitation and violence in medical-surgical settings at BronxCare Hospital (BCH), Bronx, New York, one year before and during the COVID-19 pandemic. Data collection occurred between June 2021 and August 2021, while the sampling period was from 2019 to 2021. The data were separated into two time categories: pre-COVID-19 (03/2019-03/2020) and COVID-19 (03/2020-03/2021). We created frequency tables for 19 variables and used a chi-square test to determine each variable's statistical significance. We tested all variables against "restraint type", which indicates whether a patient was violent or became violent enough to restrain. The restraint types were "chemical", "physical", or both. This analysis was also used to determine whether there was a statistical difference between the pre-COVID-19 and COVID-19 timeframes. Our data show that there was an increase in incidents of violence in the COVID-19 era (03/2020-03/2021), with a total of 194 (62.8%) reported events, compared to the pre-COVID-19 era (03/2019-03/2020) with 115 (37.2%) events (p = 0.01). Our final analysis, completed using a chi-square test, examined the difference in violence between the pre-COVID-19 and COVID-19 eras; we then tested the violence marker against restraint type, and the result was statistically significant (p = 0.01). This is the first paper to systematically review the prevalence of violence in medical-surgical units in a hospital in New York, pre-COVID-19 and during the COVID-19 era. Our data are in line with the global trend of an increased prevalence of patient agitation and violence in medical settings during the COVID-19 pandemic. Violence and its management are a challenge in healthcare settings, and the COVID-19 pandemic has brought to bear a complexity of circumstances which may have increased its incidence. It is important to identify and teach healthcare workers the best preventive approaches to dealing with patient agitation, to decrease the number of restraints in medical settings, and to create a less restrictive environment in which to deliver care.
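The core comparison described above is a chi-square test of independence on a contingency table of era by restraint use. The sketch below shows the general form of such a test; the cell counts are invented for illustration, and only the row totals (115 and 194 events) come from the abstract:

# Illustrative only: chi-square test of independence on a 2x2 table of
# era (pre-COVID-19 vs COVID-19) by restraint use. Cell counts are invented.
from scipy.stats import chi2_contingency

#            restrained, not restrained
table = [[40, 75],    # pre-COVID-19 era  (row total 115)
         [95, 99]]    # COVID-19 era      (row total 194)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")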

Keywords: COVID-19 pandemic, patient agitation, restraints, violence

Procedia PDF Downloads 140
1784 A Corporate Social Responsibility View on Bribery Control in Business Relationships

Authors: Irfan Ameer

Abstract:

Bribery control in developing countries is the biggest challenge for multinational enterprises (MNEs). Bribery practices are socially embedded and institutionalized, and may therefore achieve collective legitimacy in the society. MNEs often have stricter norms, codes and standards about such corrupt practices. Bribery in B2B sales relationships has been researched, but studies focusing on the role of the firm in controlling bribery are scarce. The main objective of this paper is to explore MNEs' strategies to control bribery in an environment where bribery is institutionalized. This qualitative study uses a narrative approach and focuses on key events, actors and their roles in controlling bribery in B2B sales relationships. The context of this study is the pharmaceutical industry of Pakistan, and data were collected through 23 episodic interviews supported by secondary data. The corporate social responsibility (CSR) literature, e.g. the CSR three-domain model and the CSR pyramid, is used to make sense of MNEs' strategies to control bribery in developing countries. Results show that MNEs' bribery control strategies are largely emergent, shaped by the roles of some key stakeholders and events. Five key bribery control strategies were found through which MNEs can control both the demand and the supply side of bribery: developing bribery-related codes; implementing bribery-related codes; focusing on competitive advantage; finding mutually beneficial ethical solutions; and collaborating with ethical stakeholders. The results also highlight the problems associated with each strategy. The study is unique in the sense that it focuses on stakeholders with unethical interests and provides guidelines to MNEs for controlling bribery practices in B2B sales relationships.

Keywords: bribery, developing countries, CSR, narrative research, B2B sales, MNEs

Procedia PDF Downloads 370
1783 Fake Accounts Detection in Twitter Based on Minimum Weighted Feature Set

Authors: Ahmed ElAzab, Amira M. Idrees, Mahmoud A. Mahmoud, Hesham Hefny

Abstract:

Social networking sites such as Twitter and Facebook attract over 500 million users across the world, and for those users social life, and even practical life, has become intertwined with these platforms. Their interaction with social networking has affected their lives forever. Accordingly, social networking sites have become among the main channels responsible for the vast dissemination of different kinds of information during real-time events. This popularity of social networking has led to different problems, including the possibility of exposing users to incorrect information through fake accounts, which results in the spread of malicious content during live events. This situation can cause huge damage in the real world to society in general, including citizens, business entities, and others. In this paper, we present a classification method for detecting fake accounts on Twitter. The study determines the minimized set of the main factors that influence the detection of fake accounts on Twitter; the determined factors are then applied using different classification techniques, a comparison of the results of these techniques is performed, and the most accurate algorithm is selected according to the accuracy of the results. The study has been compared with recent research in the same area, and this comparison has confirmed the accuracy of the proposed study. We claim that this study can be continuously applied on the Twitter social network to automatically detect fake accounts; moreover, the study can be applied to different social networking sites such as Facebook with minor changes according to the nature of the social network, which are discussed in this paper.
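A minimal sketch of the general pipeline described above, training several classifiers on a reduced feature set and keeping the most accurate one, is given below. It is illustrative only: the features are synthetic and the classifier list is an assumption, not the set used in the paper:

# Sketch of the general approach (not the authors' pipeline): train several
# classifiers on a reduced feature set and keep the most accurate one.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier

# 7 synthetic account features; y = fake (1) or genuine (0)
X, y = make_classification(n_samples=2000, n_features=7, n_informative=5, random_state=0)

candidates = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
scores = {name: cross_val_score(clf, X, y, cv=5).mean() for name, clf in candidates.items()}
best = max(scores, key=scores.get)
print(scores, "-> selected:", best)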

Keywords: fake accounts detection, classification algorithms, twitter accounts analysis, features based techniques

Procedia PDF Downloads 403
1782 Two-Sided Information Dissemination in Takeovers: Disclosure and Media

Authors: Eda Orhun

Abstract:

Purpose: This paper analyzes a target firm's decision to voluntarily disclose information during a takeover event and the effect of such disclosures on the outcome of the takeover. Such voluntary disclosures, especially in the form of earnings forecasts made around takeover events, may affect shareholders' decisions about the target firm's value and, in turn, the takeover result. This study aims to shed light on this question. Design/methodology/approach: The paper examines the role of voluntary disclosures by target firms during a takeover event in the likelihood of takeover success, both theoretically and empirically. A game-theoretical model is set up to analyze the voluntary disclosure decision of a target firm to inform shareholders about its real worth. The empirical implication of the model is tested by employing binary outcome models where the disclosure variable is obtained by identifying the target firms in the sample that provide positive news by issuing increasing management earnings forecasts. Findings: The model predicts that a voluntary disclosure of positive information by the target decreases the likelihood that the takeover succeeds. The empirical analysis confirms this prediction by showing that positive earnings forecasts by target firms during takeover events increase the probability of takeover failure. Overall, it is shown that information dissemination through voluntary disclosures by target firms is an important factor affecting takeover outcomes. Originality/Value: This study is, to the author's knowledge, the first to study the impact of voluntary disclosures by the target firm during a takeover event on the likelihood of takeover success. The results contribute to the information economics, corporate finance and M&A literatures.
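A minimal sketch of a binary outcome (logit) model of takeover success on a voluntary-disclosure indicator is shown below. The data are simulated and the variable names are hypothetical; this shows the general form of the estimation, not the paper's specification:

# Minimal sketch, not the paper's estimation: logit model of takeover success
# on a disclosure indicator plus one control variable, using simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
disclosure = rng.integers(0, 2, n)          # 1 = target issued a positive forecast
premium = rng.normal(0.3, 0.1, n)           # control: offer premium
logit_p = 0.5 - 1.0 * disclosure + 2.0 * premium
success = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(np.column_stack([disclosure, premium]))
model = sm.Logit(success, X).fit(disp=0)
print(model.params)   # a negative disclosure coefficient mirrors the paper's prediction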

Keywords: takeovers, target firm, voluntary disclosures, earnings forecasts, takeover success

Procedia PDF Downloads 313
1781 Natural Law in Mu'tazilite Theology

Authors: Samaneh Khalili

Abstract:

Natural law theory, in moral philosophy, refers to a system of unchanging values held to be common to all humans and discoverable through reason. Natural law theory is commonly associated with Western philosophers; by contrast, discussions of notions of natural law in Islamic intellectual history have been relatively rare. This paper aims to show that the moral theory developed by the Mu'tazilite thinkers can be classified among the ideas of natural law. In doing so, this study will demonstrate that objective and unchanging values, according to the Mu'tazilite theologians, provide the guidelines for assessing the rules of Islamic law in the field of human coexistence. The focus of the paper lies on ʿAbd al-Ǧabbār, who was the most influential thinker in the late epoch of the Muʿtazila. Although ʿAbd al-Ǧabbār did not leave a text with a systematic discussion of natural law, his teachings on nature, human reason, and the moral values of actions are scattered throughout his work 'al-Muġnī fī abwāb at-tawḥīd wa-l-'adl'. It is necessary to focus on ʿAbd al-Ǧabbār's theories of reason, nature, and ethics, since natural law revolves around the basic concepts of nature, reason, and moral value. While analyzing the concept of nature, the paper will attempt to answer how he explains the world's physical structure and God's relationship to natural events. Moreover, from ʿAbd al-Ǧabbār's point of view, is nature a self-determined system that follows its inner principles in every kind of change, or is nature guided by an external power? Does causality govern natural events? Regarding the concept of reason, an attempt is made to examine how human reason, according to ʿAbd al-Ǧabbār, conceives moral attributes. Finally, the author will discuss the concepts of objective values and the place of rights and duties derived from Islamic law in ʿAbd al-Ǧabbār's thought.

Keywords: Islamic law, Mu'tazilite theology, natural law in Islamic theology, objective and unchanging values

Procedia PDF Downloads 88
1780 Rupture Termination of the 1950 C.E. Earthquake and Recurrence Interval of Great Earthquakes in the North Eastern Himalaya, India

Authors: Rao Singh Priyanka, Jayangondaperumal R.

Abstract:

The Himalayan active fault has the potential to generate great earthquakes in the future, posing a major existential threat to humans in the Himalayan and adjacent regions. Quantitative evaluation of accumulated and released interseismic strain is crucial to assess the magnitude and spatio-temporal variability of future great earthquakes along the Himalayan arc. To mitigate the destruction and hazards associated with such earthquakes, it is important to understand their recurrence cycle. The eastern Himalayan and Indo-Burman plate boundary systems exhibit oblique convergence across two orthogonal plate boundaries, resulting in a zone of distributed deformation both within and away from the plate boundary and in clockwise rotation of fault-bounded blocks. This seismically active region has a poorly documented historical archive of past large earthquakes. Paleoseismological studies therefore confirm the surface rupture evidence of great continental earthquakes (Mw ≥ 8) along the Himalayan Frontal Thrust (HFT), which, together with geodetic studies, collectively provide the crucial information needed to understand and assess the seismic potential. These investigations reveal the rupture of three-quarters of the HFT during great events since medieval time, but with debated timings due to unclear evidence, neglect of transverse segment boundaries, and a lack of detailed studies. Recent paleoseismological investigations in the eastern Himalaya and Mishmi ranges confirm the primary surface ruptures of the 1950 C.E. great earthquake (M > 8). However, a seismic gap exists between the 1714 C.E. and 1950 C.E. Assam earthquakes that has not slipped since the 1697 C.E. event. Unlike the latest large blind 2015 Gorkha earthquake (Mw 7.8), the 1950 C.E. event was not triggered by the large 1947 C.E. event that occurred near the western edge of the great upper Assam rupture. Moreover, the western segment of the eastern Himalaya has not witnessed any surface-breaking earthquake along the HFT for over 300 years. The frontal fault excavations reveal that during the 1950 earthquake, a ~3.1-m-high scarp along the HFT was formed by a co-seismic slip of 5.5 ± 0.7 m at Pasighat in the Eastern Himalaya, and a 10-m-high scarp at Kamlang Nagar along the Mishmi Thrust in the Eastern Himalayan Syntaxis is the outcome of a dip-slip displacement of 24.6 ± 4.6 m along a 25 ± 5°E dipping fault. This event ruptured along the two orthogonal fault systems with an oblique thrust faulting mechanism. Approximately 130 km west of the Pasighat site, the Himebasti village has witnessed two earthquakes, the historical 1697 Sadiya earthquake and the 1950 event, with a cumulative dip-slip displacement of 15.32 ± 4.69 m. At the Niglok site, Arunachal Pradesh, a cumulative slip of ~12.82 m during at least three events since pre-19585 B.P. has produced a ~6.2-m-high scarp, while the youngest scarp, ~2.4 m high, was produced during the 1697 C.E. event. The site preserves two deformational events along the eastern HFT, suggesting serial ruptures at an interval of ~850 years, while successive surface-rupturing earthquakes are lacking in the Mishmi Range, preventing an estimate of the recurrence cycle there.

Keywords: paleoseismology, surface rupture, recurrence interval, Eastern Himalaya

Procedia PDF Downloads 80
1779 Configuring Systems to Be Viable in a Crisis: The Role of Intuitive Decision-Making

Authors: Ayham Fattoum, Simos Chari, Duncan Shaw

Abstract:

Volatile, uncertain, complex, and ambiguous (VUCA) conditions threaten system viability with emerging and novel events requiring immediate and localized responses. Such responsiveness is only possible through devolved freedom and emancipated decision-making. The Viable System Model (VSM) recognizes this need and suggests maximizing autonomy to localize decision-making and minimize residual complexity. However, exercising delegated autonomy under VUCA requires confidence and knowledge to use intuition, and guidance to maintain systemic coherence. This paper explores the role of intuition as an enabler of emancipated decision-making and autonomy under VUCA. Intuition allows decision-makers to use their knowledge and experience to respond rapidly to novel events. This paper offers three contributions to the VSM. First, it designs a system model that illustrates the role of intuitive decision-making in managing complexity and maintaining viability. Second, it takes a black-box approach to theory development in the VSM to model the role of autonomy and intuition. Third, the study uses a multi-stage discovery-oriented approach (DOA) to develop theory, with each stage combining literature, data analysis, and model/theory development and identifying further questions for the subsequent stage. We synthesize literature (e.g., VSM, complexity management) with seven months of field-based insights (interviews, workshops, and observation of a live disaster exercise) to develop an intuitive complexity management framework and VSM models. The results have practical implications for enhancing the resilience of organizations and communities.

Keywords: intuition, complexity management, decision-making, viable system model

Procedia PDF Downloads 64
1778 Epigenetics and Archeology: A Quest to Re-Read Humanity

Authors: Salma A. Mahmoud

Abstract:

Epigenetics, the alteration of gene expression by extragenetic factors, has emerged as one of the most promising areas for addressing gaps in our current understanding of patterns of human variation. In the last decade, research investigating epigenetic mechanisms in many fields has flourished and witnessed significant progress. It has paved the way for a new era of integrated research, especially between anthropology/archeology and the life sciences. Skeletal remains are considered the most significant source of information for studying human variation across history, and by utilizing these valuable remains we can interpret past events, cultures and populations. In addition to their archeological, historical and anthropological importance, studying bones has great implications for other fields such as medicine and science. Bones can also hold within them the secrets of the future, as they can act as predictive tools for health, societal characteristics and dietary requirements. Bones in their basic form are composed of cells (osteocytes) that are affected by both genetic and environmental factors, which can explain only a small part of their variability. The primary objective of this project is to examine the epigenetic landscape/signature within bones of archeological remains as a novel marker that could reveal new ways to conceptualize chronological events, gender differences, social status and ecological variations. We attempt here to address discrepancies in common variants such as the methylome as well as novel epigenetic regulators such as chromatin remodelers, which to the best of our knowledge have not yet been investigated by anthropologists/paleoepigenetists, using a plethora of techniques (biological, computational, and statistical). Moreover, extracting epigenetic information from bones will highlight the importance of osseous material as a vector for studying human beings in several contexts (social, cultural and environmental), and strengthen its essential role as a model system that can be used to investigate and reconstruct various cultural, political and economic events. We also address all the steps required to plan and conduct an epigenetic analysis of bone material (modern and ancient), as well as discussing the key challenges facing researchers aiming to investigate this field. In conclusion, this project will serve as a primer for bioarcheologists/anthropologists and human biologists interested in incorporating epigenetic data into their research programs. Understanding the roles of epigenetic mechanisms in bone structure and function will be very helpful for a better comprehension of bone biology and will highlight its value as an interdisciplinary vector and key material in archeological research.

Keywords: epigenetics, archeology, bones, chromatin, methylome

Procedia PDF Downloads 104
1777 The Development of an Anaesthetic Crisis Manual for Acute Critical Events: A Pilot Study

Authors: Jacklyn Yek, Clara Tong, Shin Yuet Chong, Yee Yian Ong

Abstract:

Background: While emergency manuals and cognitive aids (CA) have been used in high-hazard industries for decades, this has been a nascent field in healthcare. CAs can potentially offset the large cognitive load involved in crisis resource management and facilitate the efficient performance of key steps in treatment. A crisis manual was developed based on local guidelines and the latest evidence-based information and introduced to a tertiary hospital setting in Singapore. The objective of this study is to evaluate the effectiveness of the crisis manual in guiding the response to and management of critical events. Methods: 7 surgical teams were recruited to participate in a series of simulated emergencies in a high-fidelity operating room simulator over the period of April to June 2018. All teams consisted of a surgical consultant and medical officer/registrar, an anesthesia consultant and medical officer/registrar, as well as a circulating, scrub and anesthetic nurse. Each team performed a simulated operation in which 1 or more crisis events occurred. The teams were randomly assigned to a scenario from the crisis manual, and all teams were deemed equal in experience and knowledge. Before the simulation, teams were instructed on proper checklist use, but use of the checklist was optional. Results: 7 simulation sessions were performed, consisting of the following scenarios: Airway Fire, Massive Transfusion Protocol, Malignant Hyperthermia, Eclampsia, and Difficult Airway. Out of the 7 surgical teams, 2 teams made use of the crisis manual, both of which had encountered a 'Malignant Hyperthermia' scenario. These team members reflected that the crisis manual helped them to work as a team, in particular by allowing them to involve the surgical doctors, who were unfamiliar with the condition and its management. A run chart showed a possible upward trend, suggesting that with increasing awareness and training, staff would become more likely to initiate use of the crisis manual. Conclusion: Despite the high volume load in this tertiary hospital, certain crises remain rare and clinicians are often caught unprepared. A crisis manual is an effective tool and easy-to-use repository that can improve patient outcomes and encourage teamwork. With training, familiarity would allow clinicians to become increasingly comfortable with reaching for the crisis manual. More simulation training needs to be conducted to determine its effectiveness.

Keywords: crisis resource management, high fidelity simulation training, medical errors, visual aids

Procedia PDF Downloads 120
1776 Sweepline Algorithm for Voronoi Diagram of Polygonal Sites

Authors: Dmitry A. Koptelov, Leonid M. Mestetskiy

Abstract:

The Voronoi Diagram (VD) of a finite set of disjoint simple polygons, called sites, is a partition of the plane into loci (one locus for each site) – regions consisting of points that are closer to a given site than to all others. A set of polygons is a universal model for many applications in engineering, geoinformatics, design, computer vision, and graphics. Construction of the VD of polygons is usually done by reduction to the task of constructing the VD of segments, for which there are efficient O(n log n) algorithms for n segments. Preprocessing – constructing segments from the polygons' sides – and postprocessing – construction of each polygon's locus by merging the loci of its sides – are also included in the reduction. This approach does not take into account two specific properties of the resulting segment sites. Firstly, all these segments are connected in pairs at the vertices of the polygons. Secondly, on one side of each segment lies the interior of the polygon; the polygon is obviously included in its own locus. Using these properties in the VD construction algorithm is a resource for reducing computation. The article proposes an algorithm for the direct construction of the VD of polygonal sites. The algorithm is based on the sweepline paradigm, which allows these properties to be exploited effectively. The solution is still performed as a reduction. Preprocessing is the construction of a set of sites from the vertices and edges of the polygons; each site is given an orientation such that the interior of the polygon lies to its left. The proposed algorithm constructs the VD for the set of oriented sites with the sweepline paradigm. Postprocessing is the selection of the edges of this VD formed by the centers of empty circles touching different polygons. The improved efficiency of the proposed sweepline algorithm compared with the general Fortune algorithm is achieved through the following fundamental solutions: 1. The algorithm constructs only those VD edges that lie outside the polygons; the concept of oriented sites makes it possible to avoid constructing VD edges located inside the polygons. 2. The list of events in the sweepline algorithm has a special property: the majority of events are connected with "medium" polygon vertices, where one incident polygon side lies behind the sweepline and the other in front of it. The proposed algorithm processes such events in constant time rather than in logarithmic time, as in the general Fortune algorithm. The proposed algorithm is fully implemented and tested on a large number of examples. The high reliability and efficiency of the algorithm is also confirmed by computational experiments with complex sets of several thousand polygons. It should be noted that, despite the considerable time that has passed since the publication of Fortune's algorithm in 1986, a full-scale implementation of that algorithm for an arbitrary set of segment sites has not been made. The proposed algorithm fills this gap for an important special case – a set of sites formed by polygons.
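The preprocessing step described above, orienting each polygon's edges so that the interior lies to the left, can be sketched in a few lines. This is an illustration of the idea only, not the authors' implementation, and the function names are made up:

# Sketch: turn a simple polygon into oriented segment sites so that the
# polygon's interior lies to the left of every directed edge. Counter-clockwise
# orientation is enforced via the signed area.
def signed_area(poly):
    # poly: list of (x, y) vertices of a simple polygon
    return 0.5 * sum(x1 * y2 - x2 * y1
                     for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]))

def oriented_sites(poly):
    if signed_area(poly) < 0:          # clockwise -> reverse to counter-clockwise
        poly = poly[::-1]
    # each directed edge (p, q) now has the interior on its left
    return list(zip(poly, poly[1:] + poly[:1]))

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
for p, q in oriented_sites(square):
    print(p, "->", q)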

Keywords: Voronoi diagram, sweepline, polygon sites, Fortune's algorithm, segment sites

Procedia PDF Downloads 171
1775 Efficacy and Safety of Probiotic Treatment in Patients with Liver Cirrhosis: A Systematic Review and Meta-Analysis

Authors: Samir Malhotra, Rajan K. Khandotra, Rakesh K. Dhiman, Neelam Chadha

Abstract:

There is a paucity of data about the safety and efficacy of probiotic treatment on patient outcomes in cirrhosis. Specifically, it is important to know whether probiotics can improve mortality, hepatic encephalopathy (HE), number of hospitalizations, ammonia levels, quality of life, and adverse events. Probiotics may improve outcomes in patients with acute or chronic HE. However, it is also important to know whether probiotics can prevent the development of HE, even in situations where patients do not have acute HE at the time of administration, and whether probiotics are useful as primary prophylaxis of HE. We aimed to conduct an updated systematic review and meta-analysis to evaluate the safety and efficacy of probiotics in patients with cirrhosis. We searched PubMed, the Cochrane Library, Embase, Scopus, SCI, Google Scholar, conference proceedings, and the references of included studies up to June 2017 to identify randomised clinical trials comparing probiotics with other treatments in cirrhotics. Data were analyzed using MedCalc. Probiotics had no effect on mortality but significantly reduced HE (14 trials, 1073 patients, OR 0.371; 95% CI 0.282 to 0.489). There were not enough data to conduct a meta-analysis on outcomes such as hospitalizations and quality of life. The effect on plasma ammonia levels was not significant (SMD -0.429; 95% CI -1.034 to 0.177). There was no difference in adverse events. To conclude, although the included studies had a high risk of bias, the available evidence does suggest a beneficial effect on HE. Larger studies with longer periods of follow-up are needed to determine whether probiotics can reduce all-cause mortality.
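For context, a pooled odds ratio and confidence interval such as those quoted above are typically obtained by inverse-variance weighting of per-trial log odds ratios. The sketch below shows a fixed-effect version with invented trial counts, not data from the included studies:

# Illustration only: fixed-effect inverse-variance pooling of log odds ratios.
import math

# (events_treatment, n_treatment, events_control, n_control) per trial (invented)
trials = [(8, 50, 18, 52), (5, 40, 12, 41), (10, 60, 22, 58)]

weights, wlogs = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = math.log((a * d) / (b * c))
    var = 1/a + 1/b + 1/c + 1/d          # variance of the log OR
    weights.append(1 / var)
    wlogs.append(log_or / var)

pooled = sum(wlogs) / sum(weights)
se = math.sqrt(1 / sum(weights))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled OR = {math.exp(pooled):.3f} (95% CI {math.exp(lo):.3f}-{math.exp(hi):.3f})")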

Keywords: cirrhosis, hepatic encephalopathy, meta-analysis, probiotic

Procedia PDF Downloads 196
1774 Quality of Care of Medical Male Circumcisions: A Non-Negotiable for Right to Care

Authors: Nelson Igaba, C. Onaga, S. Hlongwane

Abstract:

Background: Medical Male Circumcision (MMC) is part of a comprehensive HIV prevention strategy. The quality of MMC done at Right to Care (RtC) sites is maintained by Continuous Quality Improvement (CQI), based on the findings of assessments by internal and independent external assessors who evaluate such parameters as the quality of the surgical procedure, infection control, etc. There are 12 RtC MMC teams in Mpumalanga, two of which are headed by Medical Officers and 10 by Clinical Associates (Clin A). Objectives: To compare the quality (i) of care rendered at doctor-headed sites (DHS) versus Clin A-headed sites (CHS); (ii) of CQI assessments (external versus internal). Methodology: A retrospective review of data from RightMax™ (a novel RtC data management system) and CQI reports (external and internal) was done. CQI assessment scores of October 2015 and October 2016 were taken as the baseline and latest scores, respectively. Four sites with 745-810 circumcisions per annum were purposively selected: the two DHS (group A) and two CHS (group B). Statistical analyses were conducted using R (2017 version). Results: There was no significant difference in the latest CQI scores between the two groups (DHS and CHS) (ANOVA, F = 1.97, df = 1, P = 0.165), between internal and external CQI assessment scores (ANOVA, F = 2.251, df = 1, P = 0.139), or among the individual sites (ANOVA, F = 1.095, df = 2, P = 0.341). Of the total of 16 adverse events reported by the four sites in the 12 months reviewed (all were infections), there was no statistical evidence that the documented severity of infection differed between DHS and CHS (Fisher's exact test, p-value = 0.269). Conclusion: At RtC VMMC sites in Mpumalanga, internal and external/independent CQI assessments are comparable, and the quality of care of VMMC is standardized, with the performance of well-supervised clinical associates comparing well with that of medical officers.
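The two comparisons reported above (a one-way ANOVA on CQI scores between site types, and Fisher's exact test on adverse-event severity) take the general form sketched below; all scores and counts are hypothetical:

# Toy illustration: one-way ANOVA on CQI scores and Fisher's exact test on
# a 2x2 severity table. Numbers are invented, not the study data.
from scipy.stats import f_oneway, fisher_exact

dhs_scores = [88, 91, 90, 87]       # hypothetical CQI scores, doctor-headed sites
chs_scores = [86, 89, 92, 88]       # hypothetical CQI scores, Clin A-headed sites
print(f_oneway(dhs_scores, chs_scores))

#              mild, severe          (hypothetical infection severities)
severity = [[6, 2],                  # DHS
            [7, 1]]                  # CHS
print(fisher_exact(severity))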

Keywords: adverse events, Right to Care, male medical circumcision, continuous quality improvement

Procedia PDF Downloads 168
1773 Extreme Value Theory Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety and monetary costs. There are established ways to calculate reliability, unreliability, failure density and failure rate. In this paper, the reliability of diesel generator fans was calculated through Extreme Value Theory. Extreme Value Theory is not widely used in the engineering field; its usage is well known in other areas such as hydrology, meteorology and finance. The significance of this theory lies in the fact that, unlike other statistical methods, it focuses on rare and extreme values rather than on averages. It should be noted that the theory is not designed exclusively for extreme events, but for extreme values in any event. Therefore, this is a good opportunity to apply the theory and test whether it can be applied in this situation. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that there is no need for technical details, and it can be applied to any part for which we need to know the time to failure, in order to plan appropriate maintenance but also to maximize usage and minimize costs. In this case, the calculations have been made for diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with higher-quality fans to prevent future failures. The results of this method give an approximation of the time for which the fans will work as they should, and the probability that the fans will work longer than a certain estimated time. Extreme Value Theory can be applied not only to rare and extreme events, but to any event whose values can be considered extreme.
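One way to carry out the calculation described above is to fit an extreme-value lifetime distribution to the observed failure times and read off the survival probability beyond a chosen time. The sketch below uses the Weibull distribution (the type III extreme-value distribution for minima, a common time-to-failure model); the failure times are synthetic and the distributional choice is an assumption, not necessarily the one used in the paper:

# Illustrative sketch (synthetic failure times, not the field data): fit a
# Weibull lifetime model and compute the probability of surviving beyond t.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
failure_hours = weibull_min.rvs(c=1.5, scale=8000, size=70, random_state=rng)

shape, loc, scale = weibull_min.fit(failure_hours, floc=0)   # fix location at 0
t = 6000.0
p_survive = weibull_min.sf(t, shape, loc=loc, scale=scale)
print(f"P(fan runs more than {t:.0f} h) ~ {p_survive:.2f}")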

Keywords: extreme value theory, lifetime, reliability analysis, statistic, time to failure

Procedia PDF Downloads 323
1772 Agricultural Water Consumption Estimation in the Helmand Basin

Authors: Mahdi Akbari, Ali Torabi Haghighi

Abstract:

The Hamun Lakes, located in the Helmand Basin and consisting of four water bodies, were the largest (>8500 km2) freshwater bodies on the Iranian plateau but have almost entirely desiccated over the last 20 years. The desiccation of the lakes has caused dust storms in the region, with huge economic and health consequences for the inhabitants. The flow of the Hirmand (or Helmand) River, the most important feeding river, has decreased from 4 to 1.9 km3 downstream due to anthropogenic activities. In this basin, water is mainly consumed for farming. Due to the lack of in-situ data in the basin, this research utilizes remote-sensing data to show how croplands, and consequently the water consumed in the agricultural sector, have changed. Based on Landsat NDVI, we suggest using a threshold of around 0.35-0.4 to detect croplands in the basin. Croplands in this basin have doubled since 1990, especially downstream of the Kajaki Dam (the biggest dam in the basin). Using PML V2 Actual Evapotranspiration (AET) data and considering irrigation efficiency (≈0.3), we estimate the consumed water (CW) for farming. We found that CW increased from 2.5 to over 7.5 km3 between 2002 and 2017 in this basin. Also, the annual average Potential Evapotranspiration (PET) of the basin has shown a negative trend in recent years, although the AET over croplands has an increasing trend. In this research, using remote-sensing data, we addressed the lack of data in the study area and highlighted the upstream anthropogenic activities that led to the desiccation of the lakes downstream.
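One plausible reading of the calculation described above is: mask cropland with the NDVI threshold, sum AET over cropland pixels, and divide by the irrigation efficiency to estimate consumed water. The back-of-the-envelope sketch below uses placeholder numbers only:

# Back-of-the-envelope sketch (all values are placeholders, not basin data):
# NDVI-masked cropland, summed AET, divided by irrigation efficiency.
import numpy as np

ndvi = np.random.default_rng(5).uniform(0.05, 0.8, size=(100, 100))
aet_mm = np.full(ndvi.shape, 900.0)          # annual AET per pixel, mm
pixel_area_km2 = 0.25                        # e.g. ~500 m pixels
efficiency = 0.3                             # irrigation efficiency

cropland = ndvi >= 0.35                      # NDVI threshold from the abstract
aet_km3 = (aet_mm[cropland] * 1e-6 * pixel_area_km2).sum()   # mm depth -> km3 volume
consumed_water_km3 = aet_km3 / efficiency
print(f"cropland pixels: {cropland.sum()}, estimated CW ~ {consumed_water_km3:.2f} km3")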

Keywords: Afghanistan-Iran transboundary Basin, Iran-Afghanistan water treaty, water use, lake desiccation

Procedia PDF Downloads 125
1771 [Keynote Talk]: Unlocking Transformational Resilience in the Aftermath of a Flood Disaster: A Case Study from Cumbria

Authors: Kate Crinion, Martin Haran, Stanley McGreal, David McIlhatton

Abstract:

Past research has demonstrated that disasters are continuing to escalate in frequency and magnitude worldwide, representing a key concern for the global community. Understanding and responding to the increasing risk posed by disaster events has become a central task for disaster managers. An emerging trend within the literature acknowledges the need to move beyond a state of coping and reinstatement of the status quo, towards incremental adaptive change and transformational actions for long-term sustainable development. As such, a growing strand of research concerns understanding the change required to address ever-increasing and unpredictable disaster events. Capturing transformational capacity and resilience, however, is not without its difficulties, which explains the dearth of attempts to capture this capacity. Adopting a case study approach, this research seeks to enhance awareness of transformational resilience by identifying key components and indicators that determine the resilience of flood-affected communities within Cumbria. Grounding and testing a theoretical resilience framework within the case studies permits the identification of how perceptions of risk influence community resilience actions. Further, it assesses how levels of social capital and connectedness impact the extent of interplay between the resources and capacities that drive transformational resilience. Thus, this research seeks to expand the existing body of knowledge by enhancing awareness of resilience in post-disaster affected communities, by investigating indicators of community capacity building and resilience actions that facilitate transformational resilience during the recovery and reconstruction phase of a flood disaster.

Keywords: capacity building, community, flooding, transformational resilience

Procedia PDF Downloads 285
1770 Assessing the Efficiency of Sports Stadiums in India: An Explorative Study of Socio-Economic Sustainability

Authors: Shivam Adhikary

Abstract:

Sports stadiums are not merely public amenities for entertainment and recreation in a city. They are buildings with extremely high construction investment and running costs that carry a responsibility for social integration, nation building, and the economic upliftment of the community, beyond their primary purpose of hosting and promoting sport. Yet the present state of sporting performance at international events and the growing physical inactivity among young people in India show that sports facilities are far from achieving these goals. A pilot study of the Indira Gandhi Sports Complex in Vijayawada, Andhra Pradesh indicated that sports stadia in India are underutilized, pointing to a pressing need to assess the present usage and functioning of the country's major (non-cricketing) sports facilities. This paper assesses the sustainability of stadiums built for national and international sporting (non-cricket) events in terms of sporting, socio-cultural, and financial sustainability, focusing mainly on their usage on non-event days. Stadiums within the country are assessed and compared using the World Stadium Index (WSI) and GDI (Gross Domestic Income), while comparison with international counterparts uses the WSI and GNI (Gross National Income). The pilot case of the Indira Gandhi Sports Complex in Vijayawada is investigated further for a deeper understanding of its present usage, the existing reasons for its underutilization, and at least some of the ways forward for it to reach its sustainable potential. The paper concludes by discussing whether sports stadiums are being utilized to their financial potential and whether they are at par with their international counterparts.

Keywords: economic sustainability, social sustainability, sports infrastructure, stadium efficiency

Procedia PDF Downloads 192
1769 Chance One’s Arm: Critical Evaluation on Laws of Sports Gambling in India

Authors: Archen Sara Vincent

Abstract:

Gambling is the practice or act of betting or wagering on uncertain events with the hope of winning money or other valuable assets. Nowadays, the practice of gambling can be seen in almost every kind of event, especially in sports, where it is commonly known as sports betting. The history of gambling can be traced back about 2,000 years: it originated with the Greeks, passed from the Greeks to the Romans, and then to England, where betting on horse races was popular among the elites. The evolution of gambling in sports has had a major impact in the modern era. In India, the legality of gambling in sports is regulated by The Public Gambling Act 1867, which prohibits gambling activities in public places. The major drawback of this statute is that it contains no specific provisions on online sports gambling. Section 30 of The Indian Contract Act 1872 considers wagering agreements void; however, there are certain exceptions to this section, namely (1) state-owned lotteries and (2) wagering on horse races for a sum of Rupees 500 or upwards. Under the Indian Constitution, the rules regarding sports gambling fall within the powers of the state legislatures, and some states have enacted their own laws that explicitly permit or prohibit gambling within their jurisdiction. Recently, the Tamil Nadu Gaming Act was amended in 2021 to completely ban online gambling and betting. Moreover, the Central Government has introduced the Online Gaming and Prevention of Fraud Bill, 2018, to legalize and regulate sports betting in India; however, this bill has not yet been passed into law. As the Indian legal system does not have a specific rule regarding online sports gambling, sports betting companies exploit this gap and attract people to gambling and betting apps by advertising with well-known sports players and other celebrities. This paper aims to critically evaluate gambling in sports and the laws relating to it in India.

Keywords: history of gambling, The Public Gambling Act 1867, state legislation, gambling in India

Procedia PDF Downloads 73
1768 Comparing Field Displacement History with Numerical Results to Estimate Geotechnical Parameters: Case Study of Arash-Esfandiar-Niayesh under Passing Tunnel, 2.5 Traffic Lane Tunnel, Tehran, Iran

Authors: A. Golshani, M. Gharizade Varnusefaderani, S. Majidian

Abstract:

Underground structures are among those structures whose design procedures involve significant uncertainty, owing to the complexity of the surrounding soil conditions, and underpassing tunnels are similarly affected. Despite geotechnical site investigations, many uncertainties remain in soil properties due to unknown events; as a result, settlements computed in numerical analysis may conflict with the values recorded in the project. This paper reports a case study of an underpassing tunnel constructed by the New Austrian Tunnelling Method (NATM) in Iran. The tunnel has an overburden of about 11.3 m, a height of 12.2 m, and a width of 14.4 m, accommodating 2.5 traffic lanes. The numerical model was developed in a 2D finite element program (PLAXIS Version 8). Comparing displacement histories at the ground surface during the entire installation of the initial lining, the estimated surface settlement was about four times the value recorded in the field, indicating that unknown local events affected that value. The displacement ratios also differed considerably between the numerical and field data. Consequently, by running several numerical back-analyses using laboratory and field test data, the geotechnical parameters were revised to match the monitoring data. Finally, it was found that soil parameter values are usually underestimated by up to 40 percent through conservative, typical engineering judgment. The discrepancy could also be attributed to inappropriate constitutive models applied to the specific soil conditions.
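
The back-analysis idea described above can be illustrated with a minimal sketch. The actual study used PLAXIS 2D (Version 8), whose solver is not reproduced here; the stand-in settlement model, parameter names, and numerical values below are hypothetical assumptions used only to show how a soil stiffness could be calibrated against a monitored settlement.

```python
# Conceptual sketch of numerical back-analysis: adjust a soil stiffness E until the
# computed surface settlement matches the monitored value. The settlement relation
# below is a placeholder, not a PLAXIS result; all numbers are illustrative.
from scipy.optimize import minimize_scalar

MEASURED_SETTLEMENT_MM = 12.0          # hypothetical monitored surface settlement
INITIAL_E_MPA = 40.0                   # hypothetical design-stage soil stiffness

def settlement_model(e_mpa: float) -> float:
    """Stand-in for the FE model: settlement roughly inversely proportional to stiffness."""
    return 800.0 / e_mpa               # mm; placeholder relation

def misfit(e_mpa: float) -> float:
    """Squared difference between computed and monitored settlement."""
    return (settlement_model(e_mpa) - MEASURED_SETTLEMENT_MM) ** 2

result = minimize_scalar(misfit, bounds=(10.0, 200.0), method="bounded")
e_back = result.x
print(f"Design-stage stiffness:   {INITIAL_E_MPA:.1f} MPa")
print(f"Back-analysed stiffness:  {e_back:.1f} MPa")
print(f"Apparent underestimation: {(1 - INITIAL_E_MPA / e_back) * 100:.0f} %")
```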

Keywords: NATM, surface displacement history, numerical back-analysis, geotechnical parameters

Procedia PDF Downloads 191
1767 Heat Vulnerability Index (HVI) Mapping in Extreme Heat Days Coupled with Air Pollution Using Principal Component Analysis (PCA) Technique: A Case Study of Amiens, France

Authors: Aiman Mazhar Qureshi, Ahmed Rachid

Abstract:

Extreme heat events are an emerging environmental health concern in dense urban areas due to anthropogenic activities. Heat maps of high spatial and temporal resolution are important for urban heat adaptation and mitigation, helping to indicate hotspots that require the attention of city planners. The Heat Vulnerability Index (HVI) is an important approach used by decision-makers and urban planners to identify heat-vulnerable communities and areas that require heat-stress mitigation strategies. Amiens is a medium-sized French city where the average temperature has increased by about 1°C since the year 2000, and extreme heat events were recorded in July in three consecutive years: 2018, 2019, and 2020. Poor air quality, especially ground-level ozone, was observed mainly during the same hot periods. In this study, we evaluated the HVI in Amiens during the extreme heat days recorded in those three years (2018-2020). The Principal Component Analysis (PCA) technique was used for fine-scale vulnerability mapping. The main data considered to develop the HVI model were (a) socio-economic and demographic data; (b) air pollution; (c) land use and land cover; (d) elderly heat-related illness; (e) social vulnerability; and (f) remote sensing data (land surface temperature (LST), mean elevation, NDVI, and NDWI). The output maps identified the hot zones through comprehensive GIS analysis. The resulting map shows that high HVI occurs in three typical areas: (1) areas where population density is high and vegetation cover is sparse; (2) artificial surfaces (built-up areas); and (3) industrial zones that release thermal energy and ground-level ozone, while low-HVI areas are located in natural landscapes such as rivers and grasslands. The study also presents a causal diagram, derived from the data analysis, in which anthropogenic activities and air pollution appear in correspondence with extreme heat events in the city. The suggested index can be a useful tool to guide urban planners, municipalities, decision-makers, and public health professionals in targeting areas at high risk of extreme heat and air pollution for future adaptation and mitigation measures.
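
A minimal sketch of a PCA-based vulnerability index of the kind described above is given below. The indicator names and synthetic values are placeholders rather than the study's data, and weighting the retained components by their explained variance is one common convention, not necessarily the exact scheme used here.

```python
# Minimal sketch of a PCA-based heat vulnerability index: standardise indicators,
# retain the leading principal components, and combine their scores into one index.
# Indicators and values are synthetic placeholders, not the Amiens dataset.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical per-district indicators (rows = districts)
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "pop_density":   rng.uniform(500, 9000, 50),
    "elderly_share": rng.uniform(0.05, 0.30, 50),
    "ozone_ug_m3":   rng.uniform(40, 120, 50),
    "lst_celsius":   rng.uniform(24, 42, 50),
    "ndvi":          rng.uniform(0.05, 0.7, 50),
    "ndwi":          rng.uniform(-0.3, 0.4, 50),
})
# Vegetation and water indices usually lower vulnerability, so invert their sign
df[["ndvi", "ndwi"]] *= -1

z = StandardScaler().fit_transform(df)       # standardise all indicators
pca = PCA(n_components=0.90)                 # keep components explaining 90% of variance
scores = pca.fit_transform(z)

# HVI = variance-weighted sum of retained component scores, classified into quintiles
hvi = scores @ pca.explained_variance_ratio_
df["HVI"] = hvi
df["HVI_class"] = pd.qcut(hvi, 5, labels=["very low", "low", "medium", "high", "very high"])
print(df[["HVI", "HVI_class"]].head())
```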

Keywords: heat vulnerability index, heat mapping, heat health-illness, remote sensing, urban heat mitigation

Procedia PDF Downloads 142
1766 Inter-Annual Variations of Sea Surface Temperature in the Arabian Sea

Authors: K. S. Sreejith, C. Shaji

Abstract:

Though both the Arabian Sea and its counterpart, the Bay of Bengal, are forced primarily by the semi-annually reversing monsoons, the spatio-temporal variation of the surface waters is much stronger in the Arabian Sea than in the Bay of Bengal. This study focuses on the inter-annual variability of Sea Surface Temperature (SST) in the Arabian Sea by analysing the ERSST dataset, which covers 152 years of SST (January 1854 to December 2002) based on ICOADS in situ observations. To capture the dominant SST oscillations and to understand the inter-annual SST variations in various local regions of the Arabian Sea, wavelet analysis was performed on this long SST time series. This tool has an advantage over other signal-analysis tools such as Fourier analysis in that it unfolds a time series (signal) in both the frequency and time domains, making it easier to determine the dominant modes of variability and to explain how those modes vary in time. The analysis revealed that pentadal SST oscillations predominate in most of the analysed regions of the Arabian Sea. From the time information of the wavelet analysis, it was interpreted that cold and warm events of large amplitude occurred during the periods 1870-1890, 1890-1910, 1930-1950, 1980-1990, and 1990-2005. SST oscillations with periods of ~2-4 years were found to be significant in the central and eastern regions of the Arabian Sea, indicating that the inter-annual SST variation in the Indian Ocean is affected by El Niño-Southern Oscillation (ENSO) and Indian Ocean Dipole (IOD) events.
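
The kind of continuous wavelet analysis described above can be sketched briefly. The example below applies a Morlet wavelet transform to a synthetic monthly SST-anomaly series containing a pentadal and a 2-4 year component; the series, scales, and period ranges are illustrative placeholders rather than the ERSST data or the authors' exact configuration.

```python
# Illustrative continuous wavelet transform (Morlet) of a monthly SST-anomaly series,
# used to locate dominant oscillation periods in both time and frequency.
# The synthetic series below is a placeholder, not ERSST/ICOADS data.
import numpy as np
import pywt

dt_years = 1.0 / 12.0                                   # monthly sampling
t = np.arange(0, 150, dt_years)                         # ~150 years of monthly data
sst_anom = (0.4 * np.sin(2 * np.pi * t / 5.0)           # pentadal oscillation
            + 0.3 * np.sin(2 * np.pi * t / 3.0)         # ENSO-band (~2-4 yr) oscillation
            + 0.1 * np.random.default_rng(2).standard_normal(t.size))

scales = np.arange(8, 512)                              # wavelet scales to analyse
coefs, freqs = pywt.cwt(sst_anom, scales, "morl", sampling_period=dt_years)
periods = 1.0 / freqs                                   # corresponding periods in years

# Global wavelet spectrum: time-averaged power at each period
power = (np.abs(coefs) ** 2).mean(axis=1)
dominant_period = periods[np.argmax(power)]
print(f"Dominant period in the synthetic series: {dominant_period:.1f} years")
```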

Keywords: Arabian Sea, ICOADS, inter-annual variation, pentadal oscillation, SST, wavelet analysis

Procedia PDF Downloads 272