Search results for: international standards on juvenile justice
1478 An Exploration of the Place of Buddhism in the Tham Luang Cave Rescue and Its Aftermath
Authors: Hamish de Nett
Abstract:
On 23rd June 2018, twelve young footballers from the Wild Boar Academy and their coach went to explore the Tham Luang cave in the Doi Nang Non mountain range in Chiang Rai Province, Northern Thailand. Whilst they were inside the cave, monsoon rains hit, and the complex became partially flooded. In the following days, Thai Navy SEALs and an international team of expert divers assembled at the cave complex in order to rescue the boys. Although it was only marginally reported in the Western press, Buddhism and ritual activities played a major role in the rescue and its aftermath. This paper utilises numerous news articles and books written by reporters who covered the cave rescue to uncover the place of Buddhism in the Tham Luang cave rescue. This paper initially sets out the development of Thai Buddhism and the Thai nation state, paying particular attention to the tension in Thai Buddhism between Buddhism as it is popularly practised and normative, state-favoured Buddhism. Secondly, this paper demonstrates that, during the Tham Luang cave rescue, Buddhism helped people cope with the disaster, provided an explanation for its occurrence, and allowed bystanders some efficacy in the process. Thirdly, this paper discusses how Buddhism helped people to give thanks after the rescue, achieve reconciliation, and gain closure. Finally, this paper analyses how the government and the political sphere utilised Buddhism during the rescue. The conclusion reached is that the Buddhism practised during the Tham Luang cave rescue and its aftermath is representative of the wider tension between popular Buddhism and normative, state-favoured Buddhism that is currently present within Thai Buddhism and has been for centuries.
Keywords: cave rescue, contemporary Buddhism, lived religion, Thai Buddhism, Tham Luang cave rescue
Procedia PDF Downloads 132
1477 Exploration of a Blockchain Assisted Framework for Through Baggage Interlining: Blocklining
Authors: Mary Rose Everan, Michael McCann, Gary Cullen
Abstract:
International travel journeys, by their nature, incorporate elements provided by multiple service providers such as airlines, rail carriers, airports, and ground handlers. Data needs to be stored by and exchanged between these parties in the process of managing the journey. The fragmented nature of this shared management of mutual clients is a limiting factor in the development of a seamless, hassle-free, end-to-end travel experience. Traditional interlining agreements attempt to facilitate many separate aspects of co-operation between service providers, typically between airlines and, to some extent, intermodal travel operators, including schedules, fares, ticketing, through check-in, and baggage handling. These arrangements rely on pre-agreement. The development of Virtual Interlining - that is, interlining facilitated by a third party (often but not always an airport) without formal pre-agreement by the airlines or rail carriers - demonstrates an underlying demand for a better quality end-to-end travel experience. Blockchain solutions are being explored in a number of industries and offer, at first sight, an immutable, single source of truth for this data, avoiding data conflicts and misinterpretation. Combined with Smart Contracts, they seemingly offer a more robust and dynamic platform for multi-stakeholder ventures, and even perhaps the ability to join and leave consortia dynamically. Applying blockchain to the intermodal interlining space – termed Blocklining in this paper - is complex and multi-faceted because of the many aspects of cooperation outlined above. To explore its potential, this paper concentrates on one particular dimension, that of through baggage interlining.Keywords: aviation, baggage, blocklining, intermodal, interlining
Procedia PDF Downloads 148
1476 Study of Expatriation as Countermeasure to Citizenship-Based Taxation
Authors: Gabriele Palumbo
Abstract:
This research empirically examines some of the reasons why the number of people giving up their American citizenship for tax purposes has recently increased drastically. The United States jurisdiction is unique in its practice of taxing worldwide income not only of residents of the United States but also of U.S. citizens living abroad. Worldwide income taxation also affects people defined as “Accidental Americans” who are unaware that they are U.S. citizens. Those people are considered Americans even though they have never been to the United States. Americans resident abroad can rely on United States income tax treaties and some national law provisions, such as the exclusion of foreign income and foreign tax credits, which are designed specifically to avoid double taxation. However, this mechanism may prove unsatisfactory for people who no longer have ties, or have never had ties, to the United States. U.S. citizens who are determined to cut all of the ties between themselves and the United States, especially those that involve tax implications, can renounce their U.S. citizenship through the expatriation procedure. The expatriation process represents the extrema ratio and involves several steps which must be followed carefully. This paper shows the complexity of the procedure that a U.S. citizen resident in a foreign country would have to follow to relinquish U.S. citizenship for tax purposes. The mechanism is intended to discourage people from renouncing. Going beyond the question of whether U.S. tax regulation is fair or not, this principle is nowadays a popular topic that many scholars and lawyers are discussing. The outcome provides interesting implications that could induce Congress to rethink the definition of citizenship for both fiscal and nationality law purposes. Indeed, even though a system of checks and balances is meant to discourage the renunciation of U.S. citizenship, more and more U.S. citizens wish to give up their citizenship.
Keywords: double taxation, expatriation tax, international taxation, relinquishment of United States citizenship
Procedia PDF Downloads 114
1475 Calibration of 2D and 3D Optical Measuring Instruments in Industrial Environments at Submillimeter Range
Authors: Alberto Mínguez-Martínez, Jesús de Vicente y Oliva
Abstract:
Modern manufacturing processes have led to the miniaturization of systems and, as a result, parts at the micro- and nanoscale are produced. This trend seems likely to become increasingly important in the near future. Besides, as a requirement of Industry 4.0, the digitalization of models of production and processes makes it very important to ensure that the dimensions of newly manufactured parts meet the specifications of the models. Therefore, it is possible to reduce scrap and the cost of non-conformities while ensuring the stability of production. To ensure the quality of manufactured parts, it becomes necessary to carry out traceable measurements at scales lower than one millimeter. Providing adequate traceability to the SI unit of length (the meter) for 2D and 3D measurements at this scale is a problem that does not have a unique solution in industrial environments. Researchers in the field of dimensional metrology all around the world are working on this issue. A solution for industrial environments, even if it is not complete, will enable working with some traceability. At this point, we believe that the study of surfaces could provide a first approximation to a solution. Among the different options proposed in the literature, areal topography methods may be the most relevant because they can be compared to measurements performed using Coordinate Measuring Machines (CMMs). These measuring methods give (x, y, z) coordinates for each point, expressed in one of two ways: either the z coordinate as a function of x, denoted z(x), for each Y-axis coordinate, or as a function of both x and y, denoted z(x, y). Among others, optical measuring instruments, mainly microscopes, are extensively used to carry out measurements at scales lower than one millimeter because optical measurement is non-destructive. In this paper, the authors propose a calibration procedure for the scales of optical measuring instruments, particularized for a confocal microscope, using material standards that are easy to find and calibrate in metrology and quality laboratories in industrial environments. Confocal microscopes are measuring instruments capable of filtering out-of-focus reflected light so that, when the light reaches the detector, it is possible to take pictures of the part of the surface that is in focus. By taking pictures at different Z levels of focus, specialized software interpolates between the different planes and can reconstruct the surface geometry as a 3D model. As is easy to deduce, it is necessary to give traceability to each axis. As a complementary result, the roughness Ra parameter will be traced to the reference. Although the solution is designed for a confocal microscope, it may be used for the calibration of other optical measuring instruments by applying minor changes.
Keywords: industrial environment, confocal microscope, optical measuring instrument, traceability
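As an illustration of the kind of post-processing behind a traced Ra value (a minimal sketch, not taken from the paper; the profile values and the correction factor are invented), the following computes the arithmetic-mean roughness Ra from a measured profile z(x) after applying a Z-scale correction obtained from a calibrated material standard:

    import numpy as np

    # Hypothetical profile z(x) in micrometres, as measured by a confocal microscope
    z_measured = np.array([0.12, 0.35, 0.28, 0.41, 0.22, 0.30, 0.18, 0.39])

    # Hypothetical Z-axis amplification coefficient obtained by calibrating against a
    # step-height material standard (certified height / measured height)
    z_scale_correction = 1.02

    z = z_measured * z_scale_correction

    # Arithmetic-mean roughness Ra: mean absolute deviation from the mean line
    ra = np.mean(np.abs(z - np.mean(z)))
    print(f"Ra = {ra:.3f} um")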
Procedia PDF Downloads 158
1474 Advanced Techniques in Semiconductor Defect Detection: An Overview of Current Technologies and Future Trends
Authors: Zheng Yuxun
Abstract:
This review critically assesses the advancements and prospective developments in defect detection methodologies within the semiconductor industry, an essential domain that significantly affects the operational efficiency and reliability of electronic components. As semiconductor devices continue to decrease in size and increase in complexity, the precision and efficacy of defect detection strategies become increasingly critical. Tracing the evolution from traditional manual inspections to the adoption of advanced technologies employing automated vision systems, artificial intelligence (AI), and machine learning (ML), the paper highlights the significance of precise defect detection in semiconductor manufacturing by discussing various defect types, such as crystallographic errors, surface anomalies, and chemical impurities, which profoundly influence the functionality and durability of semiconductor devices, underscoring the necessity for their precise identification. The narrative transitions to the technological evolution in defect detection, depicting a shift from rudimentary methods like optical microscopy and basic electronic tests to more sophisticated techniques including electron microscopy, X-ray imaging, and infrared spectroscopy. The incorporation of AI and ML marks a pivotal advancement towards more adaptive, accurate, and expedited defect detection mechanisms. The paper addresses current challenges, particularly the constraints imposed by the diminutive scale of contemporary semiconductor devices, the elevated costs associated with advanced imaging technologies, and the demand for rapid processing that aligns with mass production standards. A critical gap is identified between the capabilities of existing technologies and the industry's requirements, especially concerning scalability and processing velocities. Future research directions are proposed to bridge these gaps, suggesting enhancements in the computational efficiency of AI algorithms, the development of novel materials to improve imaging contrast in defect detection, and the seamless integration of these systems into semiconductor production lines. By offering a synthesis of existing technologies and forecasting upcoming trends, this review aims to foster the dialogue and development of more effective defect detection methods, thereby facilitating the production of more dependable and robust semiconductor devices. This thorough analysis not only elucidates the current technological landscape but also paves the way for forthcoming innovations in semiconductor defect detection.Keywords: semiconductor defect detection, artificial intelligence in semiconductor manufacturing, machine learning applications, technological evolution in defect analysis
Procedia PDF Downloads 54
1473 Enriching Post-Colonial Discourse: An Appraisal of Doms Pagliawan’s Fire Extinguisher
Authors: Robertgie L. Pianar
Abstract:
Post-colonial theory, post-colonialism, or Poco is a recently established literary theory. Consequently, not many literary works, local and international, have been subjected to its criticism. To help intellectualize local literary texts, in particular, through post-colonial discourse, this qualitative inquiry unfolded. Textual analysis was employed to describe, analyse, and interpret Doms Pagliawan’s Fire Extinguisher, a regional work of literature, grounded on the postcolonial concepts of Edward Said’s Otherness, Homi Bhabha’s Unhomeliness or Paralysis, and Frantz Fanon’s Cultural Resistance. The in-depth reading affirmed that the story contains those postcolonial attributes, revealing the following; (A) the presence of the colonizer, who successfully established colonial control over the colonized, the other, was found; (B) through power superimposition, the colonized character was silenced or paralyzed; and, (C) forms of cultural resistance from the colonized character were shown but no matter how its character avoids ‘postcolonial acts’, the struggle just intensifies, hence inevitable. Pagliawan’s Fire Extinguisher is thus a post-colonial text realizer between two differing cultures, the colonizer and the other. Results of this study may substantiate classroom discussions, both undergraduate and graduate classes, specifically in Philippine and World literature, 21st Century literature, readings in New English literatures, and literary theory and criticism courses, scaffolding learners’ grasp of post-colonialism as a major literary theory drawing classic exemplifications from this regional work.Keywords: cultural resistance, otherness, post-colonialism, textual analysis, unhomeliness/paralysis
Procedia PDF Downloads 266
1472 The Impact of Artificial Intelligence on Digital Crime
Authors: Á. L. Bendes
Abstract:
By the end of the second decade of the 21st century, artificial intelligence (AI) had become an unavoidable part of everyday life and has aroused the interest of researchers in almost every field of science. This is no different in the case of jurisprudence, whose task is not only to create its own theoretical paradigm related to AI; the need to create legal frameworks suitable for the future application of the law is of similar importance. Perhaps the biggest impact of AI is on digital crime. The prognosis that AI can reshape the practical application of law and, ultimately, the entire legal life is also of considerable importance. In the past, criminal law was created essentially to sanction the criminal acts of a person, so applying its concepts, with their original content, to AI-related violations is not expected to be sufficient in the future. Taking this into account, it is necessary to rethink the basic elements of criminal law, such as the act and factuality; in connection with barriers to criminality and criminal sanctions as well, several new aspects have appeared that challenge both the criminal law researcher and the legislator. It is recommended to continuously monitor technological changes relevant to criminal law, since both the legal and scientific frameworks will need to be re-created in order to correctly assess the events related to them, which may require a criminal law response. Artificial intelligence has completely reshaped the world of digital crime. New crimes have appeared which the legal systems of many countries do not regulate, or do not regulate adequately. It is important to investigate and sanction these digital crimes. The primary goal is prevention, for which we need a comprehensive picture of the intertwining of artificial intelligence and digital crime. The goal of this paper is to explore these problems, present them, and make comprehensive proposals that support legal certainty.
Keywords: artificial intelligence, chat forums, defamation, international criminal cooperation, social networking, virtual sites
Procedia PDF Downloads 90
1471 Study of Open Spaces in Urban Residential Clusters in India
Authors: Renuka G. Oka
Abstract:
From chowks to streets to verandahs to courtyards, residential open spaces occupy a significant place in the traditional urban neighborhoods of India. At various levels of intersection, the open spaces, with their attributes like juxtaposition with the built fabric, scale, climate sensitivity and response, multi-functionality, etc., reflect and respond to the patterns of human interactions. Also, these spaces tend to be quite well utilized. On the other hand, it is a common sight to see an imbalanced utilization of open spaces in newly/recently planned residential clusters. This may be due to a lack of activity generators around them, wrong locations, excess provision, or improper incorporation of the aforementioned design attributes. These casual observations suggest the necessity for a systematic study of current residential open spaces. The exploratory study thus attempts to draw lessons through a structured inspection of residential open spaces to understand the effective environment as revealed through their use patterns. Here, residential open spaces are considered in a wider sense to incorporate all the un-built fabric around. These thus include both use spaces and access spaces. For the study, open spaces in ten exemplary housing clusters/societies built during the last ten years across India are studied. A threefold inquiry is attempted in this direction. The first relates to identifying and determining the effects of various physical functions like space organization, size, hierarchy, thermal and optical comfort, etc. on the performance of residential open spaces. The second part sets out to understand socio-cultural variations in values, lifestyle, and beliefs which determine activity choices and behavioral preferences of users for respective residential open spaces. The third inquiry further observes the application of these research findings to the design process to derive meaningful and qualitative design advice. However, the study also emphasizes developing a suitable framework of analysis and carving out appropriate methods and approaches to probe into these aspects of the inquiry. Given this emphasis, a considerable portion of the research details the conceptual framework for the study. This framework is supported by an in-depth search of available literature. The findings are worked out for design solutions which integrate the open space systems with the overall design process for residential clusters. The open spaces in residential areas present great complexities both in terms of their use patterns and the determinants of their functional responses. The broad aim of the study is, therefore, to arrive at a reconsideration of the standards and qualitative parameters used by designers, on the basis of a more substantial inquiry into the use patterns of open spaces in residential areas.
Keywords: open spaces, physical and social determinants, residential clusters, use patterns
Procedia PDF Downloads 150
1470 Chromium (VI) Removal from Aqueous Solutions by Ion Exchange Processing Using Eichrom 1-X4, Lewatit Monoplus M800 and Lewatit A8071 Resins: Batch Ion Exchange Modeling
Authors: Havva Tutar Kahraman, Erol Pehlivan
Abstract:
In recent years, environmental pollution by wastewater has risen critically. Effluents discharged from various industries cause this challenge. Different types of pollutants, such as organic compounds, oxyanions, and heavy metal ions, create this threat to human beings and all other living things. Heavy metals are considered one of the main pollutant groups in wastewater. Therefore, there is a great need to apply and enhance water treatment technologies. Among the adopted treatment technologies, adsorption is one of the methods gaining more and more attention because of its ease of operation, simplicity of design and versatility. The ion exchange process is one of the preferred methods for the removal of heavy metal ions from aqueous solutions and has found widespread application in water remediation technologies during the past several decades. Therefore, the purpose of this study is the removal of hexavalent chromium, Cr(VI), from aqueous solutions. Cr(VI) is a well-known, highly toxic metal which modifies the DNA transcription process and causes significant chromosomal aberrations. The treatment and removal of this heavy metal have received great attention in order to maintain the legally allowed standards. The present paper attempts to investigate some aspects of the use of three anion exchange resins: Eichrom 1-X4, Lewatit Monoplus M800 and Lewatit A8071. Batch adsorption experiments were carried out to evaluate the adsorption capacity of these three commercial resins in the removal of Cr(VI) from aqueous solutions. The chromium solutions used in the experiments were synthetic solutions. The parameters that affect adsorption (solution pH, adsorbent concentration, contact time, and initial Cr(VI) concentration) were studied at room temperature. High adsorption rates of metal ions for the three resins were reported at the onset, and then plateau values were gradually reached within 60 min. The optimum pH for Cr(VI) adsorption was found to be 3.0 for all three resins. Adsorption decreases with increasing pH for the three anion exchangers. The suitability of the Freundlich, Langmuir and Scatchard models was investigated for the Cr(VI)-resin equilibrium. The results obtained in this study demonstrate good comparability between the three anion exchange resins, indicating that Eichrom 1-X4 is the most effective, showing the highest adsorption capacity for the removal of Cr(VI) ions. The anion exchange resins investigated in this study can be used for the efficient removal of chromium from water and wastewater.
Keywords: adsorption, anion exchange resin, chromium, kinetics
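For readers unfamiliar with the equilibrium models named above, a minimal sketch (not the authors' code; the Ce/qe data points and starting guesses are invented) of fitting the Langmuir isotherm, qe = qm*KL*Ce/(1 + KL*Ce), and the linearised Freundlich isotherm to batch data could look like this:

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical batch equilibrium data: Ce = equilibrium Cr(VI) concentration (mg/L),
    # qe = amount adsorbed per gram of resin (mg/g)
    Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
    qe = np.array([12.0, 21.0, 33.0, 45.0, 55.0])

    def langmuir(ce, qm, kl):
        # qe = qm * KL * Ce / (1 + KL * Ce)
        return qm * kl * ce / (1.0 + kl * ce)

    (qm, kl), _ = curve_fit(langmuir, Ce, qe, p0=[60.0, 0.05])
    print(f"Langmuir: qm = {qm:.1f} mg/g, KL = {kl:.3f} L/mg")

    # Freundlich isotherm qe = KF * Ce**(1/n), linearised as ln(qe) = ln(KF) + (1/n) ln(Ce)
    slope, intercept = np.polyfit(np.log(Ce), np.log(qe), 1)
    print(f"Freundlich: KF = {np.exp(intercept):.2f}, 1/n = {slope:.2f}")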
Procedia PDF Downloads 260
1469 The Taxonomic and Functional Diversity in Edaphic Microbial Communities from Antarctic Dry Valleys
Authors: Sean T. S. Wei, Joy D. Van Nostrand, Annapoorna Maitrayee Ganeshram, Stephen B. Pointing
Abstract:
The McMurdo Dry Valleys are a largely ice-free polar desert protected by international treaty as an Antarctic Specially Managed Area. The terrestrial landscape is dominated by oligotrophic mineral soil with extensive rocky outcrops. Several environmental stresses (low temperature, lack of liquid water, UV exposure and oligotrophic substrates) restrict the major biotic component to microorganisms. The bacterial diversity and the putative physiological capacity of microbial communities of quartz rocks (hypoliths) and soil of a maritime-influenced Dry Valley were interrogated by two metagenomic approaches: 454 pyrosequencing and the GeoChip DNA microarray. The most abundant phylum in hypoliths was Cyanobacteria (46%), whereas in soils Actinobacteria (31%) were most abundant. The Proteobacteria and Bacteroidetes were the only other phyla to comprise >10% of both communities. Carbon fixation was indicated by photoautotrophic and chemoautotrophic pathways for both hypolith and soil communities. The fungi accounted for polymer carbon transformations, particularly for aromatic compounds. Complete nitrogen cycling was observed in both communities. The fungi in particular displayed pathways related to ammonification. Environmental stress response pathways were common among bacteria, whereas nutrient stress response pathways were more widely present in bacteria, archaea and fungi. The diversity of bacteriophages was also surveyed by GeoChip. Data suggested that different substrates supported different viral families: Leviviridae, Myoviridae, Podoviridae and Siphoviridae were ubiquitous. However, Corticoviridae and Microviridae only occurred in wetter soils.
Keywords: Antarctica, hypolith, soil, dry valleys, geochip, functional diversity, stress response
Procedia PDF Downloads 452
1468 Application of Bayesian Model Averaging and Geostatistical Output Perturbation to Generate Calibrated Ensemble Weather Forecast
Authors: Muhammad Luthfi, Sutikno Sutikno, Purhadi Purhadi
Abstract:
Weather forecasting has necessarily been improved to provide communities with accurate and objective predictions. To this end, numerical weather forecasting has been extensively developed to reduce the subjectivity of forecasts. Yet Numerical Weather Prediction (NWP) outputs are unfortunately issued without taking dynamic weather behavior and local terrain features into account. Thus, NWP outputs are not able to accurately forecast weather quantities, particularly for medium- and long-range forecasts. The aim of this research is to aid and extend the development of ensemble forecasting for the Meteorology, Climatology, and Geophysics Agency of Indonesia. The ensemble method is an approach combining various deterministic forecasts to produce a more reliable one. However, such a forecast is biased and uncalibrated due to its underdispersive or overdispersive nature. As a parametric method, Bayesian Model Averaging (BMA) generates a calibrated ensemble forecast and constructs a predictive PDF for a specified period. This method is able to utilize an ensemble of any size but does not take spatial correlation into account. Spatial dependencies, however, involve the site of interest and nearby sites, influenced by dynamic weather behavior. Meanwhile, Geostatistical Output Perturbation (GOP) accounts for spatial correlation to generate future weather quantities and, though built on a single deterministic forecast, is also able to generate an ensemble of any size. This research applies both BMA and GOP to generate calibrated ensemble forecasts of daily temperature at a few meteorological sites near an Indonesian international airport.
Keywords: Bayesian Model Averaging, ensemble forecast, geostatistical output perturbation, numerical weather prediction, temperature
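As a rough illustration of the BMA idea described above (a sketch under simplifying assumptions, not the authors' implementation; the member forecasts, weights and spread are invented, and in practice the weights and spread would be estimated by EM on training data), the predictive PDF for temperature is a weighted mixture of normal densities centred on the bias-corrected ensemble members:

    import numpy as np
    from scipy.stats import norm

    # Hypothetical bias-corrected NWP ensemble member forecasts for one site (deg C)
    members = np.array([27.4, 28.1, 26.9])
    # Hypothetical BMA weights (sum to 1) and common predictive spread
    weights = np.array([0.5, 0.3, 0.2])
    sigma = 1.2

    def bma_pdf(t):
        """BMA predictive density: weighted mixture of normals around the members."""
        return np.sum(weights * norm.pdf(t, loc=members, scale=sigma))

    grid = np.linspace(23, 32, 181)
    pdf = np.array([bma_pdf(t) for t in grid])
    mean_forecast = np.sum(weights * members)
    print(f"BMA mean forecast: {mean_forecast:.2f} deg C")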
Procedia PDF Downloads 283
1467 Pill-Box Dispenser as a Strategy for Therapeutic Management: A Qualitative Evaluation
Authors: Bruno R. Mendes, Francisco J. Caldeira, Rita S. Luís
Abstract:
Population ageing is directly correlated with an increase in medicine consumption. Beyond the latter and the polymedicated profile of the elderly, it is possible to see a need for pharmacotherapeutic monitoring due to cognitive and physical impairment. In this sense, the tracking, organization and administration of medicines become a daily challenge, and the pill-box dispenser system a solution. The pill-box dispenser system consists of a small compartmentalized container for unit-dose organization, that is, a container able to correlate the patient’s prescribed dose regimen with the time schedule of intake. In many European countries, this system is part of the pharmacist’s role in clinical pharmacy. Despite this simple solution, therapy compliance is only possible if the patient adheres to the system, so it is important to carry out a qualitative and quantitative analysis of the patient’s perception of the benefits and risks of the pill-box dispenser, as well as to identify the ideal system. The analysis was conducted through an observational study, based on the application of a standardized questionnaire structured with a 5-level Likert scale and previously validated on the population. The study was performed during a limited period of time on a randomized sample of 188 participants. The questionnaire consisted of 22 questions: 6 background measures and 16 specific measures. The standards for the final comparative analysis were obtained from the state of the art on the subject. The study carried out using the Likert scale afforded a degree of agreement and discordance between measures (Sample vs. Standard) of 56.25% and 43.75%, respectively. It was concluded that the pill-box dispenser has greater acceptance among a younger population, which was not the initial target of the system; however, this points to high adherence in the future. Additionally, it was noted that the cost associated with this service is not a limiting factor for its use. The pill-box dispenser system, as currently implemented, demonstrates an important weakness regarding the quality and effectiveness of the medicines, which is not understood by patients, revealing a significant lack of literacy where medicines are concerned. The characteristics of an ideal system remain unchanged, which means that the size, appearance and availability of information in the pill-box continue to be indispensable elements for compliance with the system. The pill-box dispenser remains unsuitable regarding container size and the type of treatment to which it applies. Despite that, it might be a future standard for clinical pharmacy, allowing a differentiation of the pharmacist’s role, as well as a wider range of applications to other age groups and treatments.
Keywords: clinical pharmacy, medicines, patient safety, pill-box dispenser
Procedia PDF Downloads 199
1466 Analysis of the Content of Sugars, Vitamin C, Preservatives, Synthetic Dyes, Sweeteners, Sodium and Potassium and Microbiological Purity in Selected Products Made From Fruit and Vegetables in Small Regional Factories and in Large Food Corporations
Authors: Katarzyna Miśkiewicz, Magdalena Lasoń-Rydel, Małgorzata Krępska, Katarzyna Sieczyńska, Iwona Masłowska-Lipowicz, Katarzyna Ławińska
Abstract:
The aim of the study was to analyse a selection of 12 pasteurised products made from fruit and vegetables, such as fruit juices, fruit drinks, jams, marmalades and jam produced by small regional factories as well as large food corporations. The research was carried out as part of the project "Innovative system of healthy and regional food distribution", funded by the Ministry of Education and Science (Poland), which aims to create an economically and organisationally strong agri-food industry in Poland through effective cooperation between scientific and socio-economic actors. The main activities of the project include support for the creation of new distribution channels for regional food products and their easy access to a wide group of potential customers while maintaining the highest quality standards. One of the key areas of the project is food quality analyses conducted to indicate the competitive advantage of regional products. Presented here are studies on the content of sugars, vitamin C, preservatives, synthetic colours, sweeteners, sodium and potassium, as well as studies on the microbiological purity of selected products made from fruit and vegetables. The composition of products made from fruit and vegetables varies greatly and depends on both the type of raw material and the way it is processed. Of the samples tested, fruit drinks contained the least amount of sugars, and jam and marmalade made by large producers and bought in large chain stores contained the most. However, the low sugar content of some fruit drinks is due to the presence of the sweetener sucralose in their composition. The vitamin C content of the samples varied, being higher in products where it was added during production. All products made in small local factories were free of food additives such as preservatives, sweeteners and synthetic colours, indicating their superiority over products made by large producers. Products made in small local factories were characterised by a relatively high potassium content. The microbiological purity of commercial products was confirmed - no Salmonella spp. were detected, and the number of mesophilic bacteria, moulds, yeasts, and β-glucuronidase-positive E. coli was below the limit of quantification.Keywords: fruit and vegetable products, sugars, food additives, HPLC, ICP-OES
Procedia PDF Downloads 96
1465 Spatio-Temporal Changes of Rainfall in São Paulo, Brazil (1973-2012): A Gamma Distribution and Cluster Analysis
Authors: Guilherme Henrique Gabriel, Lucí Hidalgo Nunes
Abstract:
An important feature of rainfall regimes is the variability, which is subject to the atmosphere’s general and regional dynamics, geographical position and relief. Despite being inherent to the climate system, it can harshly impact virtually all human activities. In turn, global climate change has the ability to significantly affect smaller-scale rainfall regimes by altering their current variability patterns. In this regard, it is useful to know if regional climates are changing over time and whether it is possible to link these variations to climate change trends observed globally. This study is part of an international project (Metropole-FAPESP, Proc. 2012/51876-0 and Proc. 2015/11035-5) and the objective was to identify and evaluate possible changes in rainfall behavior in the state of São Paulo, southeastern Brazil, using rainfall data from 79 rain gauges for the last forty years. Cluster analysis and gamma distribution parameters were used for evaluating spatial and temporal trends, and the outcomes are presented by means of geographic information systems tools. Results show remarkable changes in rainfall distribution patterns in São Paulo over the years: changes in shape and scale parameters of gamma distribution indicate both an increase in the irregularity of rainfall distribution and the probability of occurrence of extreme events. Additionally, the spatial outcome of cluster analysis along with the gamma distribution parameters suggest that changes occurred simultaneously over the whole area, indicating that they could be related to remote causes beyond the local and regional ones, especially in a current global climate change scenario.Keywords: climate change, cluster analysis, gamma distribution, rainfall
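To make the shape/scale discussion concrete, a minimal sketch (with simulated rainfall series, not the study's 79-gauge data set; all parameter values are invented) of fitting a gamma distribution to each gauge and clustering gauges on the fitted parameters might look like this:

    import numpy as np
    from scipy import stats
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Hypothetical monthly rainfall totals (mm) for 5 gauges over 40 years
    gauges = [stats.gamma.rvs(a, scale=s, size=480, random_state=rng)
              for a, s in [(1.2, 90), (1.5, 80), (2.0, 60), (0.9, 120), (1.8, 70)]]

    params = []
    for series in gauges:
        # Fit a gamma distribution; loc fixed at 0 so only shape and scale are estimated
        shape, loc, scale = stats.gamma.fit(series, floc=0)
        params.append([shape, scale])

    # Group gauges with similar regimes (shape = irregularity, scale = magnitude)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(np.array(params))
    print(np.round(params, 2), labels)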
Procedia PDF Downloads 323
1464 Energy Transition and Investor-State Disputes: Scientific Knowledge as a Solution to the Burden for Climate Policy-Making
Authors: Marina E. Konstantinidi
Abstract:
It is now well-established that the fight against climate change and its consequences, which are a threat to mankind and to life on the planet Earth, requires that global temperature rise be kept under 1.5°C. It is also well-established that this requires humanity to put an end to the use of fossil fuels in the next decades, at the latest. However, investors in the fossil energy sector have brought or threatened to bring investment arbitration claims against States which put an end to their activities in order to reach their climate change policy objectives. Examples of such claims are provided by the cases of WMH v. Canada, Lone Pine v. Canada, Uniper v. Netherlands and RWE v. Netherlands. Irrespective of the outcome of the arbitration proceedings, the risk of being ordered to pay very substantial damages may have a ‘chilling effect’ on States, meaning that they may hesitate to implement the energy transition measures needed to fight climate change and its consequences. Although mitigation action is a relatively recent phenomenon, knowledge about the negative impact of fossil fuels has existed for a long time. In this paper, it is argued that structured documentation of evidence of knowledge about climate change may influence the adjudication of investment treaty claims and, consequently, affect the content of energy transition regulations that will be implemented. For example, as concerns investors, evidence that changes in the regulatory framework towards environmental protection could have been predicted would refute arguments concerning legitimate expectations of legislative stability. By reference to relevant case law, the paper explores how pre-existing knowledge about climate change can be used in the adjudication of investor-State disputes resulting from green energy transition policies.
Keywords: climate change, energy transition, international investment law, knowledge
Procedia PDF Downloads 102
1463 Beyond Juridical Approaches: The Role of Sociological Approach in Promoting Human Rights of Migrants
Authors: Ali Aghahosseini Dehaghani
Abstract:
Every year in this globalized world, thousands of migrants leave their countries hoping to find a better life in other parts of the world. In this regard, many questions, from a human rights point of view, have been raised about how this phenomenon should be managed in the host countries. Although legal approaches such as legislation and litigation are indispensable for respecting the human rights of migrants, there is an increasing consensus that a strictly juridical approach is inadequate to protect those rights and to prevent their violation. Indeed, given the multiplicity of factors that affect and shape the application of these rights and considering the fact that law is a social phenomenon, what is needed is an interdisciplinary approach, which combines juridical approaches with perspectives from other disciplines. In this respect, a sociological approach is important because it shows the social processes through which the human rights of migrants have been constructed or violated in particular social situations. Sociologists who study international migration ask questions such as how many people migrate, who migrates, why people migrate, what happens to them once they arrive in the host country, how migration affects sending and receiving communities, the extent to which migrants help the economy, the effects of migration on crime, and how migrants change local communities. This paper is an attempt to show how sociology can promote the human rights of migrants. To this end, the article first explores the usefulness and value of an interdisciplinary approach to realize how and to what extent sociology may improve and promote the human rights of migrants in the destination country. It then examines mechanisms which help to reach a systematic integration of law and sociology to advance migrants’ rights, as well as to encourage legal scholars to consider the implications of societal structures in their work.
Keywords: human rights, migrants, sociological approach, interdisciplinary study
Procedia PDF Downloads 456
1462 Prediction of Covid-19 Cases and Current Situation of Italy and Its Different Regions Using Machine Learning Algorithm
Authors: Shafait Hussain Ali
Abstract:
The Covid-19 disease is caused by the coronavirus SARS-CoV-2, which first broke out in China. Italy was the first Western country to be severely affected, and the first country to take drastic measures to control the disease. In early December 2019, a sudden outbreak of coronavirus disease caused by a new severe acute respiratory syndrome coronavirus (SARS-CoV-2) occurred in the Chinese city of Wuhan. The World Health Organization declared the epidemic a public health emergency of international concern on January 30, 2020. By February 14, 2020, 49,053 laboratory-confirmed cases and 1,481 deaths had been reported worldwide. The threat of the disease has forced most governments to implement various control measures. It therefore becomes necessary to analyze the Italian data very carefully, in particular to investigate the present situation and the number of infected persons in the form of positive cases, deaths, hospitalizations and other features of infected persons, and to present these clearly and in simple form. The model used should clearly show the real facts and figures and be understandable to every reader, who can gain some real benefit after reading it. The model must include all features (total positive cases, current positive cases, hospitalized patients, deaths, and recovery frequency rates) that explain and clarify this wide range of facts in a very simple form and are helpful to the administration of the country.
Keywords: machine learning tools and techniques, rapid miner tool, Naive-Bayes algorithm, predictions
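The abstract names the Naive Bayes algorithm in RapidMiner; as a loose, purely illustrative analogue in Python (the feature values, labels and regions are all invented, not the study's data), a Gaussian Naive Bayes classifier could flag regions as high- or low-risk from features such as current positive cases, hospitalised patients and deaths:

    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    # Hypothetical per-region features: [current positive cases, hospitalised, deaths]
    X_train = np.array([
        [12000, 900, 300],   # high-risk examples
        [15000, 1100, 420],
        [800,   40,  10],    # low-risk examples
        [1200,  65,  15],
    ])
    y_train = np.array(["high", "high", "low", "low"])

    model = GaussianNB().fit(X_train, y_train)
    # Predict the risk class of two unseen (hypothetical) regions
    print(model.predict([[9000, 700, 250], [500, 20, 5]]))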
Procedia PDF Downloads 108
1461 High-Performance Thin-layer Chromatography (HPTLC) Analysis of Multi-Ingredient Traditional Chinese Medicine Supplement
Authors: Martin Cai, Khadijah B. Hashim, Leng Leo, Edmund F. Tian
Abstract:
Analysis of traditional Chinese medicinal (TCM) supplements has always been a laborious task, particularly in the case of multi‐ingredient formulations. Traditionally, herbal extracts are analysed using one or few markers compounds. In the recent years, however, pharmaceutical companies are introducing health supplements of TCM active ingredients to cater to the needs of consumers in the fast-paced society in this age. As such, new problems arise in the aspects of composition identification as well as quality analysis. In most cases of products or supplements formulated with multiple TCM herbs, the chemical composition, and nature of each raw material differs greatly from the others in the formulation. This results in a requirement for individual analytical processes in order to identify the marker compounds in the various botanicals. Thin-layer Chromatography (TLC) is a simple, cost effective, yet well-regarded method for the analysis of natural products, both as a Pharmacopeia-approved method for identification and authentication of herbs, and a great analytical tool for the discovery of chemical compositions in herbal extracts. Recent technical advances introduced High-Performance TLC (HPTLC) where, with the help of automated equipment and improvements on the chromatographic materials, both the quality and reproducibility are greatly improved, allowing for highly standardised analysis with greater details. Here we report an industrial consultancy project with ONI Global Pte Ltd for the analysis of LAC Liver Protector, a TCM formulation aimed at improving liver health. The aim of this study was to identify 4 key components of the supplement using HPTLC, following protocols derived from Chinese Pharmacopeia standards. By comparing the TLC profiles of the supplement to the extracts of the herbs reported in the label, this project proposes a simple and cost-effective analysis of the presence of the 4 marker compounds in the multi‐ingredient formulation by using 4 different HPTLC methods. With the increasing trend of small and medium-sized enterprises (SMEs) bringing natural products and health supplements into the market, it is crucial that the qualities of both raw materials and end products be well-assured for the protection of consumers. With the technology of HPTLC, science can be incorporated to help SMEs with their quality control, thereby ensuring product quality.Keywords: traditional Chinese medicine supplement, high performance thin layer chromatography, active ingredients, product quality
Procedia PDF Downloads 281
1460 Role and Impact of Artificial Intelligence in Sales and Distribution Management
Authors: Kiran Nair, Jincy George, Suhaib Anagreh
Abstract:
Artificial intelligence (AI) in a marketing context is a deterministic tool designed to optimize and enhance marketing tasks, research tools, and techniques. It is on the verge of transforming marketing roles and revolutionizing the entire industry. This paper aims to explore the current dissemination of AI applications in the marketing mix, reviewing the scope and application of AI in various aspects of sales and distribution management. The paper also aims to identify the areas where AI has a strong impact on factors of sales and distribution management such as distribution channels, purchase automation, customer service, merchandising automation, and shopping experiences. This is a qualitative research paper that aims to examine the impact of AI on the sales and distribution management of 30 multinational brands in six different industries, namely: airline; automobile; banking and insurance; education; information technology; retail and telecom. Primary data is collected by means of interviews and questionnaires from a sample of 100 marketing managers selected using a convenience sampling method. The data is then analyzed using descriptive statistics, correlation analysis and multiple regression analysis. The study reveals that AI applications are extensively used in sales and distribution management, with a strong impact on various factors such as identifying new distribution channels, automation in merchandising, customer service, and purchase automation, as well as sales processes. International brands have already integrated AI extensively in their day-to-day operations for better efficiency and improved market share, while others are investing heavily in new AI applications to gain competitive advantage.
Keywords: artificial intelligence, sales and distribution, marketing mix, distribution channel, customer service
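For the analysis pipeline mentioned (descriptive statistics, correlation analysis and multiple regression), a bare-bones sketch with simulated Likert-style responses (not the study's survey data; the variable names and relationships are invented) could be:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 100  # sample size matching the study's 100 marketing managers
    # Hypothetical 1-5 ratings of AI impact on four factors, plus overall perceived impact
    df = pd.DataFrame({
        "distribution_channel": rng.integers(1, 6, n),
        "purchase_automation":  rng.integers(1, 6, n),
        "customer_service":     rng.integers(1, 6, n),
        "merchandising":        rng.integers(1, 6, n),
    })
    df["overall_impact"] = (0.4 * df["customer_service"] + 0.3 * df["purchase_automation"]
                            + rng.normal(0, 0.5, n))

    print(df.describe())            # descriptive statistics
    print(df.corr().round(2))       # correlation analysis
    X = sm.add_constant(df.drop(columns="overall_impact"))
    print(sm.OLS(df["overall_impact"], X).fit().summary())  # multiple regression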
Procedia PDF Downloads 158
1459 Vapour Liquid Equilibrium Measurement of CO₂ Absorption in Aqueous 2-Aminoethylpiperazine (AEP)
Authors: Anirban Dey, Sukanta Kumar Dash, Bishnupada Mandal
Abstract:
Carbon dioxide (CO2) is a major greenhouse gas responsible for global warming, and fossil fuel power plants are the main emitting sources. Therefore, the capture of CO2 is essential to keep emission levels within the standards. Carbon capture and storage (CCS) is considered an important option for stabilizing atmospheric greenhouse gases and minimizing global warming effects. There are three approaches to CCS: pre-combustion capture, where carbon is removed from the fuel prior to combustion; oxy-fuel combustion, where coal is combusted with oxygen instead of air; and post-combustion capture, where the fossil fuel is combusted to produce energy and CO2 is removed from the flue gases left after the combustion process. Post-combustion technology offers some advantages, as existing combustion technologies can still be used without major changes. A number of separation processes could be utilized as part of post-combustion capture technology. These include (a) physical absorption, (b) chemical absorption, (c) membrane separation and (d) adsorption. Chemical absorption is one of the most extensively used technologies for large-scale CO2 capture systems. The industrially important solvents used are primary amines like monoethanolamine (MEA) and diglycolamine (DGA), secondary amines like diethanolamine (DEA) and diisopropanolamine (DIPA), and tertiary amines like methyldiethanolamine (MDEA) and triethanolamine (TEA). Primary and secondary amines react quickly and directly with CO2 to form stable carbamates, while tertiary amines do not react directly with CO2: in aqueous solution, they catalyze the hydrolysis of CO2 to form a bicarbonate ion and a protonated amine. Concentrated piperazine (PZ) has been proposed as a better solvent, as well as an activator, for CO2 capture from flue gas, with a 10% energy benefit compared to conventional amines such as MEA. However, the application of concentrated PZ is limited due to its low solubility in water at low temperature and lean CO2 loading. Following the performance of PZ, its derivative 2-aminoethylpiperazine (AEP), a cyclic amine, can be explored as an activator for the absorption of CO2. Vapour liquid equilibrium (VLE) in CO2 capture systems is an important factor for the design of separation equipment and gas treating processes. For proper thermodynamic modeling, accurate equilibrium data for the solvent system over a wide range of temperatures, pressures and compositions are essential. The present work focuses on the determination of VLE data for the (AEP + H2O) system at 40 °C over a range of compositions.
Keywords: absorption, aminoethyl piperazine, carbondioxide, vapour liquid equilibrium
Procedia PDF Downloads 270
1458 Oral Fluency: A Case Study of L2 Learners in Canada
Authors: Maaly Jarrah
Abstract:
Oral fluency in the target language is what many second language learners hope to achieve by living abroad. Research in the past has demonstrated the role informal environments play in improving L2 learners' oral fluency. However, living in the target country and being part of its community does not ensure the development of oral fluency skills. L2 learners' desire to communicate and access to speaking opportunities in the host community are key in achieving oral fluency in the target language. This study attempts to identify differences in oral fluency, specifically speech rate, between learners who communicate in the L2 outside the classroom and those who do not. In addition, as the desire to communicate is a crucial factor in developing oral fluency, this study investigates whether or not learners' desire to speak the L2 outside the classroom plays a role in their frequency of L2 use outside the classroom. Finally, given the importance of the availability of speaking opportunities for L2 learners in order to practice their speaking skills, this study reports on the participants' perceptions of the speaking opportunities accessible to them in the target community while probing whether or not their perceptions differed based on their oral fluency level and their desire to communicate. The results suggest that exposure to the target language and daily communication with the native speakers is strongly related to the development of learners' oral fluency. Moreover, the findings suggest that learners' desire to communicate affects their frequency of communication in their L2 outside the classroom. At the same time, all participants, regardless of their oral fluency level and their desire to communicate, asserted that speaking opportunities beyond the classroom are very limited. Finally, the study finds there are marked differences in the perceptions learners have regarding opportunities for learning offered by the same language program. After reporting these results, the study concludes with recommendations for ESL programs that serve international students.Keywords: ESL programs, L2 Learners, oral fluency, second language
Procedia PDF Downloads 480
1457 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions
Authors: Joel Niklaus, Matthias Sturmer
Abstract:
The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be made accountable by ensuring that justice is made in accordance with the laws. Equally important is privacy, as a fundamental human right (Article 12 in the Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test for the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimate). Consequently, many Swiss courts only publish a fraction of their decisions. An automated anonymization system reduces these costs substantially, freeing up capacity to publish court decisions much more comprehensively. For the re-identification system, topic modeling with latent Dirichlet allocation is used to cluster over 500K Swiss court decisions into meaningful, related categories. A comprehensive knowledge base with publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design ESRA will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus introduces a more comprehensive publication practice.
Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling
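As a toy illustration of the clustering step described above (a generic scikit-learn sketch, not the ESRA codebase; the four snippets merely stand in for the roughly 500K decisions), latent Dirichlet allocation over a document-term matrix assigns each decision to a topic category:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical snippets standing in for the ~500K Swiss court decisions
    decisions = [
        "Der Beschwerdefuehrer verlangt Schadenersatz von der Versicherung",
        "Die Staatsanwaltschaft wirft dem Angeklagten Betrug vor",
        "Das Arbeitsgericht beurteilt die Kuendigung als missbraeuchlich",
        "Die Patentinhaberin klagt gegen die Pharmafirma wegen Verletzung",
    ]

    vectorizer = CountVectorizer(max_df=0.95, min_df=1)
    dtm = vectorizer.fit_transform(decisions)

    # Cluster decisions into a small number of topics (categories)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)
    doc_topics = lda.transform(dtm)          # per-document topic distribution
    print(doc_topics.argmax(axis=1))         # hard assignment to a category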
Procedia PDF Downloads 150
1456 The Analysis of Personalized Low-Dose Computed Tomography Protocol Based on Cumulative Effective Radiation Dose and Cumulative Organ Dose for Patients with Breast Cancer with Regular Chest Computed Tomography Follow up
Authors: Okhee Woo
Abstract:
Purpose: The aim of this study is to evaluate the 2-year cumulative effective radiation dose and cumulative organ dose from regular follow-up computed tomography (CT) scans in patients with breast cancer and to establish a personalized low-dose CT protocol. Methods and Materials: A retrospective study was performed on patients with breast cancer who were diagnosed and managed consistently on the basis of a routine breast cancer follow-up protocol between January 2012 and June 2016. Based on ICRP (International Commission on Radiological Protection) 103, the cumulative effective radiation doses of each patient for the 2-year follow-up were analyzed using commercial radiation dose management software (Radimetrics, Bayer Healthcare). The personalized effective doses to each organ were analyzed in detail using the software's Monte Carlo simulation. Results: A total of 3822 CT scans in 490 patients were evaluated (age: 52.32±10.69 years). The mean scan number for each patient was 7.8±4.54. Each patient was exposed to 95.54±63.24 mSv of radiation over 2 years. The cumulative CT radiation dose was significantly higher in patients with lymph node metastasis (p = 0.00). HER-2-positive patients were exposed to more radiation than estrogen or progesterone receptor-positive patients (p = 0.00). There was no difference in the cumulative effective radiation dose between different age groups. Conclusion: Knowing how much radiation a patient is exposed to is the starting point for managing radiation exposure in patients with long-term CT follow-up. A precise and personalized protocol, as well as iterative reconstruction, may reduce the hazard from unnecessary radiation exposure.
Keywords: computed tomography, breast cancer, effective radiation dose, cumulative organ dose
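For context on how an effective dose is built up from organ doses under ICRP 103 (E = sum over tissues of w_T x H_T), a minimal sketch with a subset of the ICRP 103 tissue weighting factors and invented organ doses (not the study's Radimetrics output) might be:

    # ICRP 103 tissue weighting factors (subset shown; the full set sums to 1.0)
    W_T = {"breast": 0.12, "lung": 0.12, "stomach": 0.12, "colon": 0.12,
           "red_bone_marrow": 0.12, "liver": 0.04, "thyroid": 0.04, "skin": 0.01}

    # Hypothetical equivalent organ doses H_T (mSv) for one chest CT scan
    H_T = {"breast": 13.0, "lung": 15.0, "stomach": 6.0, "colon": 1.0,
           "red_bone_marrow": 5.0, "liver": 9.0, "thyroid": 3.0, "skin": 4.0}

    # Effective dose contribution from these tissues: E = sum_T w_T * H_T
    effective_dose = sum(W_T[t] * H_T[t] for t in W_T)
    print(f"Partial effective dose: {effective_dose:.2f} mSv per scan")

    # Cumulative dose over a 2-year follow-up of, e.g., 8 scans
    print(f"Cumulative (8 scans): {8 * effective_dose:.1f} mSv")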
Procedia PDF Downloads 200
1455 Analysis of Ancient and Present Lightning Protection Systems of Large Heritage Stupas in Sri Lanka
Authors: J.R.S.S. Kumara, M.A.R.M. Fernando, S.Venkatesh, D.K. Jayaratne
Abstract:
Protection of heritage monuments against lightning has become extremely important as far as their historical value is concerned. When such structures are large and tall, the risk of lightning initiated from both cloud and ground can be high. This paper presents a lightning risk analysis of three giant stupas of the Anuradhapura era (fourth century BC onwards) in Sri Lanka. The three stupas are Jethawaaramaya (269-296 AD), Abayagiriya (88-76 BC) and Ruwanweliseya (161-137 BC), the third, fifth and seventh largest ancient structures in the world. These stupas are solid brick structures consisting of a base, a near-hemispherical dome and a conical spire on the top. The ancient construction, with a dielectric crystal at the top connected to the ground through a conducting material, was taken as the hypothesis for their original lightning protection technique. However, at present, all three stupas are protected with Franklin-rod-type air termination systems located on top of the spire. First, a risk analysis was carried out according to IEC 62305 by considering the isokeraunic level of the area and the height of the stupas. Then the standard protective angle method and the rolling sphere method were used to locate the possible touching points on the surface of the stupas. The study was extended to estimate the critical current which could strike the unprotected areas of the stupas. The equations proposed by Uman (2001) and Cooray (2007) were used to find the striking distances. A modified version of the rolling sphere method was also applied to see the effects of upward leaders. All these studies were carried out for two scenarios: with the original (i.e., ancient) lightning protection system and with the present (i.e., new) air termination system. The field distribution on the surface of the stupa in the presence of a downward leader was obtained using the finite-element-based commercial software COMSOL Multiphysics for further investigation of lightning risks. The obtained results were analyzed and compared with each other to evaluate the performance of the ancient and new lightning protection methods and to identify suitable methods for designing lightning protection systems for stupas. According to IEC standards, all three stupas, with both the new and the ancient lightning protection systems, have Level IV protection as per the protection angle method. However, according to the rolling sphere method applied with Uman's equation, the protection level is III. The same method applied with Cooray's equation always shows a higher risk than Uman's equation. It was found that there is a risk of lightning strikes on the dome and square chamber of the stupa, and the corresponding critical current values differed depending on the equations used in the rolling sphere method and the modified rolling sphere method.
Keywords: Stupa, heritage, lightning protection, rolling sphere method, protection level
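For readers unfamiliar with the rolling sphere method, the conventional electrogeometric relation r = 10 * I^0.65 (striking distance r in metres, peak current I in kA) links the sphere radius prescribed for each IEC 62305 protection level to the smallest interceptable current; the sketch below (illustrative only, not the paper's calculation, which also uses Uman's and Cooray's expressions) simply inverts that relation:

    # IEC 62305 rolling-sphere radii (m) for lightning protection levels I-IV
    sphere_radius = {"I": 20, "II": 30, "III": 45, "IV": 60}

    def striking_distance(i_peak_ka: float) -> float:
        """Electrogeometric relation r = 10 * I**0.65 (r in m, I in kA)."""
        return 10.0 * i_peak_ka ** 0.65

    def critical_current(radius_m: float) -> float:
        """Smallest peak current whose striking distance equals the sphere radius."""
        return (radius_m / 10.0) ** (1.0 / 0.65)

    for level, r in sphere_radius.items():
        print(f"LPL {level}: sphere radius {r} m -> minimum current ~{critical_current(r):.1f} kA")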
Procedia PDF Downloads 256
1454 Human Rights to Environment: The Constitutional and Judicial Perspective in India
Authors: Varinder Singh
Abstract:
Primitive man knew nothing of human rights. In the later centuries of human progress, with the development of scientific and technological knowledge, the growth of population and tremendous changes in the human environment, the laws of nature that maintained the eco-balance crumbled. The race for a better and more comfortable life landed mankind in a vicious circle. It created environmental imbalance, unplanned and uneven development, the breakdown of the self-sustaining village economy, the mushrooming of shanty towns and slums, a widening chasm between the rich and the poor, over-exploitation of natural resources, desertification of arable lands, pollution of different kinds, heating up of the earth and depletion of the ozone layer. Modern international life has been deeply marked and transformed by current endeavours to meet the needs and fulfil the requirements of the protection of the human person and of the environment. Such endeavours have been encouraged by the widespread recognition that protection of human beings and the environment reflects common superior values and constitutes a common concern of mankind. The parallel evolutions of human rights protection and environmental protection disclose some close affinities. Both underwent a process of internationalization, the former beginning with the 1948 Universal Declaration of Human Rights, the latter with the 1972 Stockholm Declaration on the Human Environment. It is now well established that it is the basic human right of every individual to live in a pollution-free environment with full human dignity. The judiciary has so far pronounced a number of judgments in this regard. The Supreme Court, in view of the various laws relating to environmental protection and the constitutional provisions, has held that the right to a pollution-free environment is part of the right to life. Article 21 is the heart of the fundamental rights and has received expanded meanings from time to time. Keywords: human rights, law, environment, polluter
Procedia PDF Downloads 225
1453 Bank Internal Controls and Credit Risk in Europe: A Quantitative Measurement Approach
Authors: Ellis Kofi Akwaa-Sekyi, Jordi Moreno Gené
Abstract:
Managerial actions which negatively profile banks and impair corporate reputation are addressed through effective internal control systems. Disregard for acceptable standards and procedures for granting credit has affected bank loan portfolios and could be cited as a factor in the crises in some European countries. The study intends to determine the effectiveness of internal control systems, to investigate whether perceived agency problems exist on the part of board members, and to establish the relationship between internal controls and credit risk among listed banks in the European Union. Drawing theoretical support from the behavioural compliance and agency theories, about seventeen internal control variables (drawn from the revised COSO framework) together with bank-specific, country, stock market and macro-economic variables will be involved in the study. A purely quantitative approach will be employed to model internal control variables covering the control environment, risk management, control activities, information and communication, and monitoring. Panel data from 2005-2014 on listed banks from 28 European Union countries will be used for the study. Hypotheses will be tested and Generalized Least Squares (GLS) regression will be run to establish the relationship between the dependent and independent variables. The Hausman test will be used to decide whether a random or fixed effects model should be used. It is expected that listed banks will have sound internal control systems, but their effectiveness cannot be confirmed. A perceived agency problem on the part of the board of directors is expected to be confirmed. The study expects a significant effect of internal controls on credit risk. The study will uncover another perspective on internal controls as not only an operational risk issue but a credit risk issue too. Banks will be made aware that observing effective internal control systems is an ethical and socially responsible act, since the collapse of financial institutions as a result of excessive default is a major source of contagion. This study deviates from the usual primary data approach to measuring internal control variables and instead models internal control variables quantitatively within the panel data. Thus, a grey area in applying the revised COSO framework for internal controls is opened for further research. Most bank failures and crises could be averted if effective internal control systems were religiously adhered to. Keywords: agency theory, credit risk, internal controls, revised COSO framework
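A minimal sketch of the planned fixed- versus random-effects choice is given below, assuming the Python linearmodels package; the variable names and input file are hypothetical placeholders, and the Hausman statistic is computed manually since it is not built into the library.

```python
# Minimal sketch, assuming the 'linearmodels' package; column names and the
# input file are hypothetical placeholders for the study's panel variables.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects

def hausman(fe_res, re_res):
    """H = (b_FE - b_RE)' [V_FE - V_RE]^-1 (b_FE - b_RE), chi-squared, k dof."""
    common = fe_res.params.index.intersection(re_res.params.index)
    b_diff = (fe_res.params[common] - re_res.params[common]).values
    v_diff = (fe_res.cov.loc[common, common] - re_res.cov.loc[common, common]).values
    return float(b_diff @ np.linalg.inv(v_diff) @ b_diff), len(common)

# Panel of listed EU banks, 2005-2014, indexed by (bank, year)
df = pd.read_csv("eu_banks_panel.csv").set_index(["bank", "year"])
y = df["credit_risk"]
X = df[["control_environment", "monitoring", "gdp_growth"]]

fe = PanelOLS(y, X, entity_effects=True).fit()   # fixed effects
re = RandomEffects(y, X).fit()                   # random effects
stat, dof = hausman(fe, re)
print(f"Hausman statistic = {stat:.2f} on {dof} dof "
      "(large values favour the fixed effects specification)")
```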
Procedia PDF Downloads 321
1452 Development and Validation of Selective Methods for Estimation of Valaciclovir in Pharmaceutical Dosage Form
Authors: Eman M. Morgan, Hayam M. Lotfy, Yasmin M. Fayez, Mohamed Abdelkawy, Engy Shokry
Abstract:
Two simple, selective, economical, safe, accurate, precise and environmentally friendly methods were developed and validated for the quantitative determination of valaciclovir (VAL) in the presence of its related substances R1 (acyclovir) and R2 (guanine) in bulk powder and in the commercial pharmaceutical product containing the drug. Method A is a colorimetric method in which VAL selectively reacts with ferric hydroxamate and the developed colour is measured at 490 nm over a concentration range of 0.4-2 mg/mL, with a percentage recovery of 100.05 ± 0.58 and a correlation coefficient of 0.9999. Method B is a reversed-phase ultra-performance liquid chromatographic (UPLC) technique, which is considered superior to high-performance liquid chromatography with respect to speed, resolution, solvent consumption, time, and cost of analysis. Efficient separation was achieved on an Agilent Zorbax CN column using ammonium acetate (0.1%) and acetonitrile as the mobile phase in a linear gradient program. The elution time for the separation was less than 5 min, and ultraviolet detection was carried out at 256 nm over a concentration range of 2-50 μg/mL, with a mean percentage recovery of 100.11 ± 0.55 and a correlation coefficient of 0.9999. The proposed methods were fully validated as per International Conference on Harmonization specifications and effectively applied to the analysis of valaciclovir in pure form and in tablet dosage form. Statistical comparison of the results obtained by the proposed and official or reported methods revealed no significant difference in the performance of these methods regarding accuracy and precision, respectively. Keywords: hydroxamic acid, related substances, UPLC, valaciclovir
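The linearity and recovery figures reported above rest on standard calibration arithmetic; the sketch below illustrates it with hypothetical peak-area data over the stated UPLC working range (the values are not the paper's measurements).

```python
# Minimal sketch of the calibration / recovery arithmetic behind the reported
# validation figures; the calibration points are hypothetical, not measured data.
import numpy as np
from scipy import stats

conc = np.array([2, 5, 10, 20, 35, 50])                            # ug/mL
peak_area = np.array([41.2, 103.5, 206.1, 411.9, 721.3, 1030.8])   # hypothetical

slope, intercept, r, _, _ = stats.linregress(conc, peak_area)
print(f"Calibration: area = {slope:.2f} * conc + {intercept:.2f}, r = {r:.4f}")

# Percentage recovery of a spiked sample back-calculated from the curve
nominal = 25.0            # ug/mL spiked (hypothetical)
measured_area = 517.4     # hypothetical instrument response
found = (measured_area - intercept) / slope
print(f"Recovery: {100 * found / nominal:.2f} %")
```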
Procedia PDF Downloads 248
1451 Effect on the Integrity of the DN300 Pipe and Valves in the Cooling Water System Imposed by the Pipes and Ventilation Pipes above in an Earthquake Situation
Authors: Liang Zhang, Gang Xu, Yue Wang, Chen Li, Shao Chong Zhou
Abstract:
Presently, more and more nuclear power plants are facing the issue of life extension. When a nuclear power plant applies for an extension of life, its condition needs to meet the current design standards, which is not the case for all old reactors, typically with respect to seismic design. Seismic-grade equipment in nuclear power plants is now generally placed separately from non-seismic-grade equipment, but this was not strictly required before. Therefore, it is very important to study whether non-seismic-grade equipment will affect seismic-grade equipment when it drops during an earthquake, which is relevant to the safety of nuclear power plants and to future life extension applications. This research used a cooling water system in which seismic-grade and non-seismic-grade equipment are installed together as an example, to study whether non-seismic-grade equipment such as DN50 fire pipes and ventilation pipes arranged above will damage the DN300 pipes and valves arranged below when an earthquake occurs. In the study, the simulation was carried out in ANSYS/LS-DYNA, with Johnson-Cook used as the material model and failure model. For the experiments, the relative positions of objects in the room were reproduced at 1:1 scale. In the experiment, the pipes and valves were filled with water at a pressure of 0.785 MPa. The pressure-holding performance of the pipe was used as the damage criterion. In addition to pressure-holding performance, the opening torque was also considered for the valves. The research results show that when the 10-meter-long DN50 pipe was dropped from a height of 8 meters and the 8-meter-long air pipe was dropped from a height of 3.6 meters, they did not affect the integrity of the DN300 pipe below. No failure phenomenon appeared in the simulation either. After the experiment, the pressure drop over two hours for the pipe was less than 0.1%. The main body of the valve did not fail either; the change in opening torque after the experiment was less than 0.5%, but the handwheel of the valve may break, which affects the opening action. In summary, the impact of upper pipes and ventilation pipes dropping onto the DN300 pipes and valves below in the cooling water system of a typical second-generation nuclear power plant under an earthquake was studied. As a result, the functionality of the DN300 pipeline and the valves themselves is not significantly affected, but the handwheel of the valve or similar parts can probably be broken, which requires attention. Keywords: cooling water system, earthquake, integrity, pipe and valve
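The Johnson-Cook material model named above has the standard flow stress form σ = (A + B·εpⁿ)(1 + C·ln ε̇*)(1 − T*ᵐ); the sketch below evaluates it with placeholder constants rather than the calibrated values used in the simulation.

```python
# Minimal sketch of the Johnson-Cook flow stress used as the material model;
# the constants below are placeholders, not the calibrated values of the study.
import math

def johnson_cook_stress(eps_p, eps_rate, temp,
                        A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                        eps_rate_ref=1.0, t_room=293.0, t_melt=1793.0):
    """sigma = (A + B*eps_p**n) * (1 + C*ln(rate ratio)) * (1 - T*^m), SI units."""
    rate_term = 1.0 + C * math.log(max(eps_rate / eps_rate_ref, 1e-12))
    t_star = max((temp - t_room) / (t_melt - t_room), 0.0)
    return (A + B * eps_p ** n) * rate_term * (1.0 - t_star ** m)

# Flow stress at 5% plastic strain, impact-level strain rate, room temperature
print(f"{johnson_cook_stress(0.05, 1.0e3, 293.0) / 1e6:.0f} MPa")
```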
Procedia PDF Downloads 113
1450 Impact of Climate Change on Sea Level Rise along the Coastline of Mumbai City, India
Authors: Chakraborty Sudipta, A. R. Kambekar, Sarma Arnab
Abstract:
Sea-level rise is one of the most important impacts of anthropogenic climate change, resulting from global warming and the melting of ice at the Arctic and Antarctic; the investigations carried out by various researchers, both on the Indian coast and elsewhere, during the last decade are reviewed in this paper. The paper aims to ascertain how consistently the different suggested methods predict future sea level rise along the coast of Mumbai. Case studies on the east coast, the southern tip, and the west and south-west coasts of India have been reviewed. The Coastal Vulnerability Index of several important international locations has been compared and found to match Intergovernmental Panel on Climate Change forecasts. The application of Geographic Information System mapping and the use of remote sensing technology, with both Multispectral Scanner and Thematic Mapper data from Landsat classified through the Iterative Self-Organizing Data Analysis Technique, to arrive at high, moderate and low Coastal Vulnerability Index values for various important coastal cities has been observed. Instead of a purely data-driven, hindcast-based forecast of significant wave height, incorporating the additional impact of sea level rise has been suggested. The efficacy and limitations of numerical methods vis-à-vis Artificial Neural Networks have been assessed, and the importance of the root mean square error of numerical results is noted. Among the various computational methods compared, forecast results obtained from MIKE 21 are considered more reliable than those from the Delft3D model. Keywords: climate change, Coastal Vulnerability Index, global warming, sea level rise
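For reference, the widely used USGS-style Coastal Vulnerability Index combines ranked physical variables as the square root of their product divided by their number; the sketch below applies that formulation to one shoreline segment with hypothetical ranks (1 = very low to 5 = very high vulnerability).

```python
# Minimal sketch of the USGS-style Coastal Vulnerability Index,
# CVI = sqrt(product of ranked variables / number of variables).
# The ranks below are hypothetical for a single shoreline segment.
import math

def coastal_vulnerability_index(ranks):
    return math.sqrt(math.prod(ranks) / len(ranks))

segment_ranks = {
    "geomorphology": 4,
    "coastal_slope": 3,
    "relative_sea_level_change": 4,
    "shoreline_erosion_rate": 3,
    "mean_tide_range": 2,
    "mean_wave_height": 3,
}
print(f"CVI = {coastal_vulnerability_index(list(segment_ranks.values())):.1f}")
# Higher CVI values indicate relatively higher vulnerability
```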
Procedia PDF Downloads 133
1449 Delhi Metro: A Race towards Zero Emission
Authors: Pramit Garg, Vikas Kumar
Abstract:
In December 2015, all the members of the United Nations Framework Convention on Climate Change (UNFCCC) unanimously adopted the historic Paris Agreement. Under the agreement, 197 countries have agreed to reduce the use of fossil fuels and to cut carbon emissions in order to reach net carbon neutrality by 2050 and to limit the rise in global temperature to 2°C by the year 2100. Globally, transport accounts for 23% of the energy-related CO2 that feeds global warming. Decarbonization of the transport sector is an essential step towards achieving India’s nationally determined contributions and net zero emissions by 2050. Metro rail systems are playing a vital role in the decarbonization of the transport sector, as they create metro cities for the “21st-century world” that can ensure “mobility, connectivity, productivity, safety and sustainability” for the populace. Metro rail was introduced in Delhi in 2002 to decarbonize the Delhi-National Capital Region and to provide a sustainable mode of public transportation. Metro rail projects contribute significantly to pollution reduction and are thus a prerequisite for sustainable development. The Delhi Metro is the first metro system in the world to earn carbon credits from Clean Development Mechanism (CDM) projects registered under the United Nations Framework Convention on Climate Change. A good metro project with reasonable network coverage attracts a modal shift from various private modes and hence puts fewer vehicles on the road, thus restraining pollution at the source. The absence of greenhouse gas emissions from the vehicles of modal-shift passengers and the lower emissions due to decongested roads contribute to the reduction in greenhouse gas emissions and hence to an overall reduction in atmospheric pollution. The reduction in emissions over the horizon from 2002 to 2019 has been estimated using emission standards and deterioration factor(s) for different categories of vehicles. Our results indicate that the Delhi Metro system has reduced motorized road trips by approximately 17.3%, resulting in a significant emission reduction. Overall, the Delhi Metro, with an immediate catchment area covering 17% of the National Capital Territory of Delhi (NCTD), currently helps to avoid 387 tonnes of emissions per day and 141.2 ktonnes of emissions yearly. The findings indicate that the metro rail system is driving cities towards a more livable environment. Keywords: Delhi metro, GHG emission, sustainable public transport, urban transport
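The avoided-emission estimate described above multiplies modal-shift trips by trip length, category-wise emission factors and deterioration factors; the sketch below reproduces that bookkeeping with hypothetical inputs, not the study's figures.

```python
# Minimal sketch of the avoided-emission bookkeeping: shifted daily trips per
# vehicle category * average trip length * emission factor * deterioration
# factor. All numbers below are hypothetical placeholders, not study values.

CATEGORIES = {
    # category: (daily trips shifted, avg trip km, g CO2-eq/km, deterioration factor)
    "car":           (150_000, 10.0, 150.0, 1.10),
    "two_wheeler":   (400_000,  7.0,  35.0, 1.10),
    "auto_rickshaw": (100_000,  6.0,  60.0, 1.10),
}

def avoided_emissions_tonnes_per_day(categories):
    grams = sum(trips * km * ef * det for trips, km, ef, det in categories.values())
    return grams / 1e6  # grams -> tonnes

daily = avoided_emissions_tonnes_per_day(CATEGORIES)
print(f"Avoided emissions: {daily:.0f} t/day, {daily * 365 / 1000:.1f} kt/year")
```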
Procedia PDF Downloads 129