Search results for: real rational transfer functions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10491


1401 Detection of Safety Goggles on Humans in an Industrial Environment Using Faster Region-Based Convolutional Neural Network with Rotated Bounding Box

Authors: Ankit Kamboj, Shikha Talwar, Nilesh Powar

Abstract:

To deliver products to the market successfully, employees need a safe working environment, especially in industrial and manufacturing settings. Failure to wear safety glasses while working in industrial plants poses a high risk to employees; hence the need to develop a real-time automatic detection system that detects persons (violators) not wearing safety glasses. In this study, a convolutional neural network (CNN) algorithm, faster region-based CNN (Faster RCNN) with a rotated bounding box, is used for detecting safety glasses on persons; the algorithm has the advantage of detecting safety glasses at different orientation angles. The proposed method first detects a person in the images and then determines whether the person is wearing safety glasses. The video data is captured at the entrance of restricted zones of the industrial environment (manufacturing plant) and converted into images at 2 frames per second. In the first step, a CNN with weights pre-trained on the COCO dataset is used for person detection, and the detections are cropped as images. The safety goggles are then labelled on the cropped images using the image labelling tool roLabelImg, which annotates the ground truth of rotated objects more accurately; the annotations obtained are further converted into the four corner coordinates of the rectangular bounding box. Next, Faster RCNN with a rotated bounding box is used to detect safety goggles and is compared with the traditional bounding box Faster RCNN in terms of detection accuracy (average precision), which shows the effectiveness of the proposed method for detecting rotated objects. The deep learning benchmarking is done on a Dell workstation with a 16 GB Nvidia GPU.
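
The annotation-conversion step described above, from a roLabelImg-style rotated box (centre, size, angle) to the four corner coordinates of the rectangular bounding box, can be sketched as follows; the function name, field order, and degree convention are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def rotated_box_to_corners(cx, cy, w, h, angle_deg):
    """Convert a rotated box (centre, size, rotation in degrees) to its
    four corner coordinates, listed counter-clockwise."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    # Half-extents of the axis-aligned box before rotation.
    half = np.array([[-w / 2, -h / 2],
                     [ w / 2, -h / 2],
                     [ w / 2,  h / 2],
                     [-w / 2,  h / 2]])
    return half @ rot.T + np.array([cx, cy])

# Example: a 100x40 px goggle box centred at (320, 240), rotated 30 degrees.
print(rotated_box_to_corners(320, 240, 100, 40, 30))
```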

Keywords: CNN, deep learning, faster RCNN, roLabelImg, rotated bounding box, safety goggle detection

Procedia PDF Downloads 131
1400 The Politics of Foreign Direct Investment for Socio-Economic Development in Nigeria: An Assessment of the Fourth Republic Strategies (1999 - 2014)

Authors: Muritala Babatunde Hassan

Abstract:

In the contemporary global political economy, foreign direct investment (FDI) is gaining currency on a daily basis. Notably, the end of the Cold War has brought about the dominance of neoliberal ideology with its mantra of a private-sector-led economy. As such, nation-states now see FDI attraction as an important element in their approach to national development. Governments and policy makers are preoccupied with unraveling the best strategies not only to attract more FDI but also to attain the desired socio-economic development status. In Nigeria, the perceived development potential of FDI has brought about an aggressive hunt for foreign investors, most especially since the transition to civilian rule in May 1999. A series of liberal and market-oriented strategies has been adopted not only to attract foreign investors but largely to stimulate private sector participation in the economy. It is on this premise that this study interrogates the politics of FDI attraction for domestic development in Nigeria between 1999 and 2014, with the ultimate aim of examining the nexus between regime type and the ability of a state to attract and benefit from FDI. Building its analysis within the framework of institutional utilitarianism, the study posits that the essential FDI strategies for achieving the greatest happiness for the greatest number of Nigerians are political, not economic. Both content analysis and descriptive survey methodology were employed in carrying out the study. Content analysis involved a desk review of the literature that culminated in the development of the study’s conceptual and theoretical framework of analysis. The study finds no significant relationship between the transition to democracy and FDI inflows in Nigeria, as most of the investment attracted during the period of the study was market- and resource-seeking, as was the case during the military regime, thereby contributing minimally to the socio-economic development of the country. It is also found that the country placed much emphasis on liberalization and incentives for FDI attraction while neglecting to improve the domestic investment environment. Consequently, the poor state of infrastructure, weak institutional capability, and insecurity were identified as major factors seriously hindering Nigeria's success in exploiting FDI for domestic development. Given that FDI remains a vector of economic globalization and that Nigeria is following a private-sector-led approach to development, it is recommended that emphasis be placed on measures aimed at improving infrastructural facilities, building a solid institutional framework, enhancing skill and technology transfer, and coordinating FDI promotion activities by different agencies and at different levels of government.

Keywords: foreign capital, politics, socio-economic development, FDI attraction strategies

Procedia PDF Downloads 165
1399 Multi-Data Management Systems in a Cluster Randomized Trial in a Poor-Resource Setting: The Pneumococcal Vaccine Schedules Trial

Authors: Abdoullah Nyassi, Golam Sarwar, Sarra Baldeh, Mamadou S. K. Jallow, Bai Lamin Dondeh, Isaac Osei, Grant A. Mackenzie

Abstract:

A randomized controlled trial is the "gold standard" for evaluating the efficacy of an intervention. Large-scale, cluster-randomized trials, however, are expensive and difficult to conduct. To guarantee the validity and generalizability of findings, high-quality, dependable, and accurate data management systems are necessary. Robust data management systems are crucial for optimizing and validating the quality, accuracy, and dependability of trial data. There is a scarcity of literature on the difficulties of data collection in clinical trials in low-resource settings, which may raise concerns. Effective data management systems and implementation goals should be part of trial procedures. Publicizing the creative clinical data management techniques used in clinical trials should boost public confidence in a study's conclusions and encourage replication. This report details the development and deployment of multiple data management systems and methodologies in the ongoing pneumococcal vaccine schedules trial in rural Gambia. We implemented six different data management, synchronization, and reporting systems using Microsoft Access, REDCap, SQL, Visual Basic, Ruby, and ASP.NET. Additionally, data synchronization tools were developed to integrate data from these systems into the central server for the reporting systems. Clinician, laboratory, and field data validation systems and methodologies are the main topics of this report. Our process development efforts across all domains were driven by the complexity of research project data collected in real time, online reporting, data synchronization, and methods for cleaning and verifying data. Consequently, we effectively used multiple data management systems, demonstrating the value of creative approaches in enhancing the consistency, accuracy, and reporting of trial data in a poor-resource setting.
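
As a rough illustration of the kind of synchronization tool described above (pulling site-level records into a central reporting database), here is a minimal sketch using SQLite; the table layout and key name are assumptions, and the trial's actual systems (Microsoft Access, REDCap, SQL, etc.) are not reproduced.

```python
import sqlite3

def sync_records(source_db, central_db, table, key="record_id"):
    """Copy rows from a site-level database into the central reporting
    database, overwriting any stale copies of the same record."""
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(central_db)
    rows = src.execute(f"SELECT * FROM {table}").fetchall()
    cols = [c[1] for c in src.execute(f"PRAGMA table_info({table})")]
    placeholders = ",".join("?" * len(cols))
    dst.execute(f"CREATE TABLE IF NOT EXISTS {table} "
                f"({','.join(cols)}, PRIMARY KEY ({key}))")
    dst.executemany(f"INSERT OR REPLACE INTO {table} VALUES ({placeholders})", rows)
    dst.commit()
    src.close()
    dst.close()
```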

Keywords: data management, data collection, data cleaning, cluster-randomized trial

Procedia PDF Downloads 29
1398 Investigation of Online Child Sexual Abuse: An Account of Covert Police Operations Across the Globe

Authors: Shivalaxmi Arumugham

Abstract:

Child sexual abuse (CSA) has taken several forms, particularly with the advent of internet technologies that provide pedophiles anonymous access to their targets at an affordable rate. To combat CSA, which has far-reaching consequences for the physical and psychological health of victims, a special act, the Protection of Children from Sexual Offences (POCSO) Act, was formulated alongside the existing laws. With its latest amendment in 2019 criminalizing various online activities involving child pornography, also known as child sexual abuse material, tremendous pressure is expected on law enforcement to identify offenders online. Effective investigation of CSA cases helps not only to detect perpetrators but also to prevent the re-victimization of children. Given the vulnerability of the child population and the fact that offenders continue to develop stealthier strategies to operate, it is high time that traditional investigation, where the focus is on apprehending and prosecuting the offender, makes a paradigm shift toward proactive investigation that prevents victimization in the first place. One of the proactive policing techniques involves understanding the psychology of offenders and children and operating undercover to catch the criminals before a real child is victimized. Using a fundamentally descriptive approach to research, the article attempts to identify the multitude of issues associated with the investigation of child sexual abuse cases as currently practised in India. The article then contextualizes the various covert operations carried out by numerous law enforcement agencies across the globe. To provide this comprehensive overview, the paper examines various reports, websites, guidelines, protocols, judicial pronouncements, and research articles. Finally, the paper presents the challenges and ethical issues that are to be considered before undertaking undercover operations, either in the guise of a pedophile or as a child. The research hopes to contribute to the making of standard operating protocols for investigating officers and other relevant policymakers in this regard.

Keywords: child sexual abuse, cybercrime against children, covert police operations, investigation of CSA

Procedia PDF Downloads 98
1397 Submarine Topography and Beach Survey of Gang-Neung Port in South Korea, Using Multi-Beam Echo Sounder and Shipborne Mobile Light Detection and Ranging System

Authors: Won Hyuck Kim, Chang Hwan Kim, Hyun Wook Kim, Myoung Hoon Lee, Chan Hong Park, Hyeon Yeong Park

Abstract:

We conducted a submarine topography and beach survey between December 2015 and January 2016 using a multi-beam echo sounder, the EM3001 (Kongsberg), and a shipborne mobile LiDAR system. Our survey area was Anmok Beach in Gangneung, South Korea. We assembled the shipborne mobile LiDAR system for this survey. It comprises a LiDAR (RIEGL LMS-420i), an IMU (Inertial Measurement Unit, MAGUS Inertial+), and an RTK GNSS (Real Time Kinematic Global Navigation Satellite System, Leica GS15/GS25) for beach measurement, motion compensation of the LiDAR, and precise positioning. The shipborne mobile LiDAR system scans the beach with a laser from the moving vessel. We mounted the system on top of the vessel. Before the beach survey, we conducted an IMU calibration survey of eight circles to stabilize the IMU heading. The survey track should be as close as possible to the beach, but our vessel could not approach closer because of obstacles in the water. At the same time, we conducted a submarine topography survey using the EM3001 multi-beam echo sounder, a device that observes and records the submarine topography using sound waves. We mounted the multi-beam echo sounder on the port side of the vessel. The vessel was equipped with a motion sensor, a DGNSS (Differential Global Navigation Satellite System), and an SV (sound velocity) sensor for motion compensation, vessel positioning, and the sound velocity of seawater. The shipborne mobile LiDAR system reduced the time required for the beach survey compared with previous conventional beach survey methods.
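
The motion compensation and precise positioning described above can be illustrated with a minimal georeferencing sketch: LiDAR points in the sensor frame are rotated by the IMU attitude and translated by the RTK GNSS position. Boresight and lever-arm calibration are omitted, and the rotation order is an assumption.

```python
import numpy as np

def georeference(points_sensor, roll, pitch, yaw, antenna_pos):
    """Rotate LiDAR points from the sensor frame into a local level frame
    using IMU attitude (radians), then translate by the RTK GNSS position."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx            # yaw-pitch-roll rotation order (assumed)
    return points_sensor @ R.T + antenna_pos

pts = np.array([[12.4, -3.1, 0.8]])          # metres, sensor frame
print(georeference(pts, 0.02, -0.01, 1.57,
                   np.array([411250.0, 4183300.0, 23.5])))
```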

Keywords: Anmok, beach survey, Shipborne Mobile LiDAR System, submarine topography

Procedia PDF Downloads 430
1396 Stabilization of Metastable Skyrmion Phase in Polycrystalline Chiral β-Mn Type Co₇Zn₇Mn₆ Alloy

Authors: Pardeep, Yugandhar Bitla, A. K. Patra, G. A. Basheed

Abstract:

Topologically protected, nanosized, particle-like swirling spin textures, "skyrmions," have been observed in various ferromagnets with chiral crystal structures such as MnSi, FeGe, and Cu₂OSeO₃; however, the magnetic ordering in these systems takes place at very low temperatures. For skyrmion-based spintronic devices, the skyrmion phase needs to be stabilized over a wide temperature-field (T-H) region. The equilibrium skyrmion phase (SkX) in the Co₇Zn₇Mn₆ alloy exists in a narrow T-H region just below the transition temperature (TC ~ 215 K) and can be quenched by field cooling into a metastable skyrmion phase (MSkX) below the SkX region. To realize a robust MSkX at 110 K, field-sweep ac susceptibility χ(H) measurements were performed after zero-field-cooling (ZFC) and field-cooling (FC) processes. In the ZFC process, the sample was cooled from 320 K to 110 K in zero applied magnetic field, and then a field-sweep measurement was performed (up to 2 T) in the positive direction (black curve). The real part of the ac susceptibility (χ′(H)) at 110 K in the positive field direction after ZFC confirms the helical-to-conical phase transition at a low field HC₁ (= 42 mT) and the conical-to-ferromagnetic (FM) transition at a higher field HC₂ (= 300 mT). After ZFC, FC measurements were performed: the sample was initially cooled in zero field from 320 to 206 K and then field cooled in the presence of a 15 mT field down to 110 K. After the FC process, isothermal χ(H) was measured in the positive (+H, red curve) and negative (-H, blue curve) field directions with increasing and decreasing field up to 2 T. The hysteresis in χ′(H) measured after the ZFC and FC processes indicates the stabilization of the MSkX at 110 K, in close agreement with the literature. Also, the asymmetry between the field-increasing curves measured after the FC process on both sides confirms the stabilization of the MSkX. On returning from the high-field polarized FM state, the helical state below HC₁ is destroyed and only the conical state is observed. Thus, a robust MSkX state is stabilized below the SkX phase over a much wider T-H region by field cooling in the polycrystalline Co₇Zn₇Mn₆ alloy.

Keywords: skyrmions, magnetic susceptibility, metastable phases, topological phases

Procedia PDF Downloads 107
1395 Implementing the WHO Air Quality Guideline for PM2.5 Worldwide Can Prevent Millions of Premature Deaths per Year

Authors: Despina Giannadaki, Jos Lelieveld, Andrea Pozzer, John Evans

Abstract:

Outdoor air pollution by fine particles ranks among the top ten global health risk factors that can lead to premature mortality. Epidemiological cohort studies, mainly conducted in the United States and Europe, have shown that long-term exposure to PM2.5 (particles with an aerodynamic diameter of less than 2.5 μm) is associated with increased mortality from cardiovascular and respiratory diseases and lung cancer. Fine particulates can cause health impacts even at very low concentrations; previously, no concentration level had been defined below which health damage can be fully prevented. The World Health Organization ambient air quality guidelines suggest an annual mean PM2.5 concentration limit of 10 μg/m³. Populations in large parts of the world, especially in East and Southeast Asia and in the Middle East, are exposed to levels of fine particulate pollution that far exceed the World Health Organization guidelines. The aim of this work is to evaluate the implementation of recent air quality standards for PM2.5 in the EU, the US and other countries worldwide and to estimate what measures will be needed to substantially reduce premature mortality. We investigated premature mortality attributable to fine particulate matter (PM2.5) among adults (≥ 30 years) and children (< 5 years), applying a high-resolution global atmospheric chemistry model combined with epidemiological concentration-response functions. The latter are based on the methodology of the Global Burden of Disease for 2010, assuming a ‘safe’ annual mean PM2.5 threshold of 7.3 μg/m³. We estimate global premature mortality due to PM2.5 at 3.15 million/year in 2010. China is the leading country with about 1.33 million, followed by India with 575 thousand and Pakistan with 105 thousand. For the European Union (EU) we estimate 173 thousand and for the United States (US) 52 thousand in 2010. Based on sensitivity calculations, we tested the gains from PM2.5 control by applying the air quality guidelines (AQG) and standards of the World Health Organization (WHO), the EU, the US and other countries. To estimate potential reductions in mortality rates, we take into consideration the deaths that cannot be avoided after the implementation of PM2.5 upper limits, due to the contribution of natural sources (mainly airborne desert dust) to total PM2.5 and therefore to mortality. The annual mean EU limit of 25 μg/m³ would reduce global premature mortality by 18%, while within the EU the effect is negligible, indicating that the standard is largely met and that stricter limits are needed. The new US standard of 12 μg/m³ would reduce premature mortality by 46% worldwide, 4% in the US and 20% in the EU. Implementing the WHO AQG of 10 μg/m³ would reduce global premature mortality by 54%, 76% in China and 59% in India. In the EU and US, mortality would be reduced by 36% and 14%, respectively. Hence, following the WHO guideline would prevent 1.7 million premature deaths per year. Sensitivity calculations indicate that even small changes to the lower PM2.5 standards can have major impacts on global mortality rates.
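
The attribution arithmetic can be illustrated with a simplified log-linear concentration-response function and the usual attributable-fraction formula; the coefficient below is purely illustrative and is not the integrated exposure-response model of the Global Burden of Disease 2010 actually used in the study.

```python
import numpy as np

def attributable_deaths(pm25, baseline_deaths, beta=0.006, threshold=7.3):
    """Premature deaths attributable to PM2.5 above a 'safe' threshold,
    using RR = exp(beta * excess concentration) and AF = (RR - 1) / RR."""
    excess = max(pm25 - threshold, 0.0)
    rr = np.exp(beta * excess)
    af = (rr - 1.0) / rr
    return af * baseline_deaths

# Illustrative numbers only: 60 ug/m3 annual mean, 5 million baseline deaths.
print(round(attributable_deaths(60.0, 5_000_000)))
```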

Keywords: air quality guidelines, outdoor air pollution, particulate matter, premature mortality

Procedia PDF Downloads 310
1394 Application of Host Factors as Biomarker in Early Diagnosis of Pulmonary Tuberculosis

Authors: Ambrish Tiwari, Sudhasini Panda, Archana Singh, Kalpana Luthra, S. K. Sharma

Abstract:

Introduction: On the basis of the available literature, we know that various host factors play a role in the outcome of tuberculosis (TB) infection by modulating innate immunity. One such factor is the inducible nitric oxide synthase enzyme (iNOS), which helps in the production of nitric oxide (NO), an antimicrobial agent. Expression of iNOS is controlled by various host factors, one of which is vitamin D together with its nuclear receptor, the vitamin D receptor (VDR). Vitamin D, acting through its receptor, also induces cathelicidin (an antimicrobial agent). With this background, we attempted to investigate the levels of vitamin D and NO along with their associated molecules in tuberculosis patients and household contacts as compared to healthy controls, and to assess the implication of these findings for susceptibility to TB. Study subjects and methods: 100 active TB patients, 75 household contacts, and 70 healthy controls were enrolled. VDR and iNOS mRNA levels were studied using real-time PCR. Serum VDR, cathelicidin, and iNOS levels were measured using ELISA. Serum vitamin D levels were measured using a chemiluminescence-based immunoassay. NO was measured using a colorimetry-based kit. Results: VDR and iNOS mRNA levels were found to be lower in the active TB group compared to household contacts and healthy controls (P = 0.0001 and 0.005, respectively). Serum levels of vitamin D were also lower in the active TB group compared to healthy controls (P = 0.001). Levels of cathelicidin and NO were higher in the patient group compared to the other groups (P = 0.01 and 0.5, respectively). However, the expression of VDR and iNOS and the levels of vitamin D were significantly (P < 0.05) higher in household contacts compared to both the active TB and healthy control groups. Inference: Higher levels of vitamin D along with VDR and iNOS expression in household contacts as compared to patients suggest that vitamin D might have a protective role against TB, preventing activation of the disease. From our data, we conclude that decreased vitamin D levels could be implicated in disease progression and that cathelicidin and NO could be used as biomarkers for the early diagnosis of pulmonary tuberculosis.
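
The relative mRNA quantification mentioned above is commonly reported with the 2^-ΔΔCt (Livak) method; the sketch below assumes that method and uses invented Ct values, since the abstract does not state its quantification approach.

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Relative mRNA expression by the 2^-ddCt (Livak) method."""
    d_ct_sample = ct_target_sample - ct_ref_sample    # e.g. VDR vs. a housekeeping gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2 ** (-dd_ct)

# Illustrative Ct values: the target amplifies two cycles later in the patient sample.
print(fold_change(26.0, 18.0, 24.0, 18.0))   # 0.25 -> four-fold lower expression
```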

Keywords: vitamin D, VDR, iNOS, tuberculosis

Procedia PDF Downloads 304
1393 Controlling Deforestation in the Densely Populated Region of Central Java Province, Banjarnegara District, Indonesia

Authors: Guntur Bagus Pamungkas

Abstract:

As a tropical country that is naturally rich in forest land, Indonesia has always been in the world's spotlight due to its significantly increasing rate of deforestation. On the one hand, its forests are a mainstay for maintaining the sustainability of the earth's ecosystem functions. On the other hand, they also cover various potential sources of the global economy and therefore attract investors of different scales who exploit them excessively; it is no wonder that disasters of various characteristics keep emerging. The phenomenon of deforestation occurs not only in forest areas on the main islands of Indonesia but also on Java, one of the most densely populated islands in the world. Due to its long history of deforestation, Java retains only about 9.8% of Indonesia's total forest land, especially in Central Java Province, the most densely populated area on the island. Not surprisingly, this province is among the areas with the highest frequency of deforestation-related disasters, landslides in particular. One of the areas that frequently experiences them is Banjarnegara District, especially in mountainous areas lying between 1,000 and 3,000 meters above sea level, where remnants of forest land can still easily be found. Some of these remnants are relatively untouched tropical rain forest whose area extends into the neighboring district of Pekalongan and is regarded as a remaining little paradise on Earth. The district's landscape is indeed beautiful, especially in the Dieng area, a major tourist destination in Central Java Province after Borobudur Temple. However, landslide hazards threaten this district every year; a tragic event a few decades ago even buried a settlement together with its inhabitants. This research aims to contribute to a concept of effective forest management by monitoring the remaining forest areas in this district. Deforestation rates were monitored using the Stochastic Cellular Automata-Markov Chain (SCA-MC) method, which provides a spatial simulation of land use and land cover changes (LULCC). This geospatial process uses the Landsat-8 OLI image product with Thermal Infrared Sensor (TIRS) Band 10 for 2020 and Landsat 5 TM with thermal Band 6 for 2010. The analysis is then integrated with physical and social geography issues using QGIS 2.18.11 with the MOLUSCE plugin, which calculates the area of land use and land cover, especially in forest areas, and the rate of forest reduction in Banjarnegara District over 2010-2020. Since the area's dependence on forest land use is quite high, concepts and preventive actions are needed, such as rehabilitation and reforestation of critical lands supported by proper monitoring and targeted forest management, in order to restore its ecosystem in the future.
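
The Markov-chain half of the SCA-MC workflow can be sketched as follows: a transition probability matrix is estimated by cross-tabulating the 2010 and 2020 class maps and then applied to project class areas one decade ahead. The class labels and toy rasters are assumptions for illustration only.

```python
import numpy as np

classes = ["forest", "agriculture", "built-up"]

def transition_matrix(map_t1, map_t2, n_classes):
    """Row-normalised transition probabilities from cross-tabulating two maps."""
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(map_t1.ravel(), map_t2.ravel()):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Toy 2010 and 2020 class rasters (0 = forest, 1 = agriculture, 2 = built-up).
m2010 = np.array([[0, 0, 1], [0, 1, 1], [0, 1, 2]])
m2020 = np.array([[0, 1, 1], [0, 1, 2], [1, 1, 2]])
P = transition_matrix(m2010, m2020, 3)
area_2020 = np.bincount(m2020.ravel(), minlength=3)
print(P)
print(dict(zip(classes, area_2020 @ P)))   # projected class areas for 2030
```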

Keywords: deforestation, populous area, LULCC method, proper control and effective forest management

Procedia PDF Downloads 136
1392 Developing Scaffolds for Tissue Regeneration using Low Temperature Plasma (LTP)

Authors: Komal Vig

Abstract:

Cardiovascular disease (CVD)-related deaths occur in 17.3 million people globally each year, accounting for 30% of all deaths worldwide, with the annual number of deaths predicted to reach 23.3 million globally by 2030. Autologous bypass grafts remain an important therapeutic option for the treatment of CVD, but the poor quality of the donor patient’s blood vessels, the invasiveness of the resection surgery, and postoperative movement restrictions create issues. The present study aims to improve endothelialization of the intimal surface of the graft by using low temperature plasma (LTP) to increase cell attachment and proliferation. Polytetrafluoroethylene (PTFE) was treated with LTP. Air was used as the feed gas, and the pressure in the plasma chamber was kept at 800 mTorr. Scaffolds were also modified with gelatin and collagen by a dipping method. Human umbilical vein endothelial cells (HUVEC) were plated on the developed scaffolds, and cell proliferation was determined by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl tetrazolium bromide (MTT) assay and by microscopy. mRNA expression levels of different cell markers were investigated using quantitative real-time PCR (qPCR). XPS confirmed the introduction of oxygenated functionalities by LTP. HUVEC showed 80% seeding efficiency on the scaffold. Microscopic and MTT assays indicated an increase in cell viability on LTP-treated scaffolds, especially when treated with gelatin or collagen, compared to untreated scaffolds. Gene expression studies showed enhanced expression of the cell adhesion marker integrin-α5 after LTP treatment. LTP-treated scaffolds exhibited better cell proliferation and viability compared to untreated scaffolds, and protein treatment of the scaffold further increased cell proliferation. Based on these initial results, more scaffold alternatives will be developed and investigated for cell growth and vascularization studies. Acknowledgments: This work is supported by the NSF EPSCoR RII-Track-1 Cooperative Agreement OIA-2148653.
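
The MTT readout behind the viability comparison is usually expressed as a percentage of the untreated control; a minimal sketch with invented absorbance values is shown below (the study does not report its exact calculation).

```python
def viability_percent(abs_treated, abs_control, abs_blank=0.05):
    """Relative cell viability from MTT absorbance readings (e.g. at 570 nm),
    expressed as a percentage of the untreated control."""
    return 100.0 * (abs_treated - abs_blank) / (abs_control - abs_blank)

# Illustrative absorbances: LTP + collagen scaffold vs. untreated scaffold.
print(round(viability_percent(0.82, 0.61), 1))
```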

Keywords: LTP, HUVEC cells, vascular graft, endothelialization

Procedia PDF Downloads 72
1391 Resale Housing Development Board Price Prediction Considering Covid-19 through Sentiment Analysis

Authors: Srinaath Anbu Durai, Wang Zhaoxia

Abstract:

Twitter sentiment has been used as a predictor of price values and trends in both the stock market and the housing market. The pioneering works in this stream of research drew upon behavioural economics to show that sentiment or emotions impact economic decisions. The latest works in this stream focus on the algorithm used as opposed to the data used. A literature review of works in this stream, through the lens of the data used, shows that there is a paucity of work that considers the impact of sentiment caused by an external factor on either the stock or the housing market. This is despite an abundance of works in behavioural economics showing that sentiment or emotions caused by an external factor impact economic decisions. To address this gap, this research studies the impact of Twitter sentiment pertaining to the Covid-19 pandemic on resale Housing Development Board (HDB) apartment prices in Singapore. It leverages SNSCRAPE to collect tweets pertaining to Covid-19 for sentiment analysis; the lexicon-based tools VADER and TextBlob are used for sentiment analysis; Granger causality is used to examine the relationship between Covid-19 cases and the sentiment score; and neural networks are leveraged as prediction models. Twitter sentiment pertaining to Covid-19 as a predictor of HDB prices in Singapore is studied in comparison with the traditional predictors of housing prices, i.e., structural and neighbourhood characteristics. The results indicate that using Twitter sentiment pertaining to Covid-19 leads to better prediction than using only the traditional predictors, and that it performs better than two of the traditional predictors. Hence, Twitter sentiment pertaining to an external factor should be considered as important as traditional predictors. This paper demonstrates the real-world economic applications of sentiment analysis of Twitter data.
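
A minimal sketch of the sentiment-scoring and Granger-causality steps is shown below, using the VADER compound score and statsmodels; the tweets and daily series are synthetic stand-ins, since the study's SNSCRAPE corpus and HDB price data are not reproduced here.

```python
import numpy as np
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer
from statsmodels.tsa.stattools import grangercausalitytests

analyzer = SentimentIntensityAnalyzer()
tweets = [
    "Cases rising again, worried about another lockdown",
    "Vaccination centre was quick and well organised",
]
scores = [analyzer.polarity_scores(t)["compound"] for t in tweets]
print(scores)   # one compound score in [-1, 1] per tweet

# Daily aggregation and the Granger test sketched with synthetic series.
rng = np.random.default_rng(0)
daily_cases = rng.poisson(30, size=120).astype(float)
daily_sentiment = -0.01 * np.roll(daily_cases, 1) + rng.normal(0, 0.05, 120)
# Null hypothesis: cases (2nd column) do not Granger-cause sentiment (1st column).
grangercausalitytests(np.column_stack([daily_sentiment, daily_cases]), maxlag=2)
```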

Keywords: sentiment analysis, Covid-19, housing price prediction, tweets, social media, Singapore HDB, behavioral economics, neural networks

Procedia PDF Downloads 118
1390 Displacement and Cultural Capital in East Harlem: Use of Community Space in Affordable Artist Housing

Authors: Jun Ha Whang

Abstract:

As New York City weathers a swelling 'affordability crisis' marked by rapid transformation in land development and urban culture, much of the associated scholarly debate has turned to questions of the underlying mechanisms of gentrification. Though classically approached from the point of view of urban planning, these questions have increasingly been addressed with an eye to understanding the role of cultural capital in neighborhood valuation. This paper examines the construction of an artist-specific affordable housing development in the Spanish Harlem neighborhood of Manhattan in order to identify and discuss several cultural parameters of gentrification. The study's goal is not to argue that the development in question, Artspace PS109, straightforwardly increases or decreases the rate of gentrification in Spanish Harlem, but rather to study the dynamics present in its construction as a case study considered against the broader landscape of gentrification in New York, particularly with respect to the impact of artist communities on housing supply. In the end, what Artspace PS109 most valuably offers us is a reference point for a comparative analysis of the affordable housing strategies currently being pursued within municipal government. Our study of Artspace PS109 has allowed us to examine a microcosm of the city's response and evaluate its overall strategy accordingly. As a baseline, the city must aggressively pursue an affordability strategy specifically suited to the needs of each of its neighborhoods. It must also do so in such a way as not to undermine its own efforts by rendering them susceptible to the exploitative involvement of real estate developers seeking to identify successive waves of trendy neighborhoods. Though Artspace PS109 offers an invaluable resource for the city's legitimate aim of preserving its artist communities, with such a high inclusion rate of artists from outside the community the project risks additional displacement, strongly suggesting the need for further study of the implications of sites of cultural capital for neighborhood planning.

Keywords: artist housing, displacement, east Harlem, urban planning

Procedia PDF Downloads 163
1389 An Appraisal of Mitigation and Adaptation Measures under Paris Agreement 2015: Developing Nations' Pie

Authors: Olubisi Friday Oluduro

Abstract:

The Paris Agreement 2015, the result of negotiations under the United Nations Framework Convention on Climate Change (UNFCCC) after the expiration of the Kyoto Protocol, sets a long-term goal of limiting the increase in the global average temperature to well below 2 degrees Celsius above pre-industrial levels and of pursuing efforts to limit this temperature increase to 1.5 degrees Celsius. An advancement on the erstwhile Kyoto Protocol, which set commitments for only a limited number of Parties to reduce their greenhouse gas (GHG) emissions, it includes the goal of increasing the ability to adapt to the adverse impacts of climate change and of making finance flows consistent with a pathway towards low GHG emissions. To achieve these goals, the Agreement requires all Parties to undertake efforts towards reaching global peaking of GHG emissions as soon as possible and towards achieving a balance between anthropogenic emissions by sources and removals by sinks in the second half of the twenty-first century. In addition to climate change mitigation, the Agreement aims at enhancing adaptive capacity, strengthening resilience, and reducing vulnerability to climate change in different parts of the world. It acknowledges the importance of addressing loss and damage associated with the adverse impacts of climate change. The Agreement also contains comprehensive provisions on support to be provided to developing countries, including finance, technology transfer and capacity building. To ensure that such support and actions are transparent, the Agreement contains a number of reporting provisions, requiring Parties to choose the efforts and measures that suit them best (Nationally Determined Contributions) and providing for a mechanism of assessing progress and increasing global ambition over time through a regular global stocktake. Despite the global outlook of the Agreement, it has been fraught with manifold limitations threatening its very capability to produce any meaningful result. Some of these limitations were the very cause of the failure of its predecessor, the Kyoto Protocol, such as the non-participation of the United States and the non-payment of funds into the various coffers for appropriate strategic purposes, among others. These have left the developing countries, which are more vulnerable than the developed countries that are largely responsible for the climate change scourge, even more threatened. The paper seeks to examine the mitigation and adaptation measures under the Paris Agreement 2015, appraise the present situation since the Agreement was concluded, ascertain whether developing countries have been better or worse off since its conclusion, and examine why and how, while projecting a way forward in the present circumstances. It concludes with recommendations towards ameliorating the situation.

Keywords: mitigation, adaptation, climate change, Paris agreement 2015, framework

Procedia PDF Downloads 158
1388 Trend of Foot and Mouth Disease and Adopted Control Measures in Limpopo Province during the Period 2014 to 2020

Authors: Temosho Promise Chuene, T. Chitura

Abstract:

Background: Foot and mouth disease is a real challenge in South Africa. The disease is a serious threat to the viability of livestock farming initiatives and affects local and international livestock trade. In Limpopo Province, the Kruger National Park and other game reserves are home to the African buffalo (Syncerus caffer), a notorious reservoir of the picornavirus that causes foot and mouth disease. Out of the virus’s seven (7) distinct serotypes, Southern African Territories (SAT) 1, 2, and 3 are commonly endemic in South Africa. The broad objective of the study was to establish the trend of foot and mouth disease in Limpopo Province over a seven-year period (2014-2020), as well as the adoption and comprehensive reporting of the measures taken to contain disease outbreaks in the study area. Methods: The study used secondary data from the World Organisation for Animal Health (WOAH) on reported cases of foot and mouth disease in South Africa. Descriptive analysis (frequencies and percentages) and analysis of variance (ANOVA) were used to present and analyse the data. Results: The year 2020 had the highest prevalence of foot and mouth disease (3.72%), while 2016 had the lowest prevalence (0.05%). Serotype SAT 2 was the most endemic, followed by SAT 1. Findings from the study demonstrated the seasonal nature of foot and mouth disease in the study area, as most disease cases were reported in the summer seasons. Slaughter of diseased and at-risk animals was the only documented disease control strategy, and information was missing for some of the years. Conclusion: The study identified serious underreporting of the control strategies adopted following disease outbreaks. Adoption of comprehensive disease control strategies, coupled with thorough reporting, can help reduce outbreaks of foot and mouth disease and prevent losses to the livestock farming sector of South Africa and Limpopo Province in particular.
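
The prevalence figures reported above reduce to simple case-over-population arithmetic; a small sketch with invented denominators is shown below (the WOAH case counts and populations at risk are not reproduced here).

```python
def prevalence_percent(cases, animals_at_risk):
    """Annual prevalence of foot and mouth disease as a percentage."""
    return 100.0 * cases / animals_at_risk

# Illustrative counts only; the study uses WOAH-reported figures.
yearly = {2016: (12, 24_000), 2020: (930, 25_000)}
for year, (cases, at_risk) in yearly.items():
    print(year, round(prevalence_percent(cases, at_risk), 2), "%")
```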

Keywords: livestock farming, African buffalo, prevalence, serotype, slaughter

Procedia PDF Downloads 66
1387 Mordechai Vanunu: “The Atomic Spy” as a Nuclear Threat to Discourse in Israeli Society

Authors: Ada Yurman

Abstract:

Using the case of the Israeli atomic spy Mordechai Vanunu as an example, this study sought to examine the social response to political deviance and how that response can be mobilized in order to achieve social control. Mordechai Vanunu, a junior technician in the Dimona Atomic Research Center, played a normative role in the militaristic discourse while working for many years in the "holy shrine" of the Israeli defense system. At a certain stage, however, Vanunu decided to detach himself from this collective and launched an assault on this top-secret circle. Israeli society in general and the security establishment in particular found this attack intolerable and unforgivable. They presented Vanunu as a ticking time bomb, delegitimized him, and portrayed him as "other". In addition, Israeli enforcement authorities imposed myriad prohibitions and sanctions on Vanunu even after his release from prison – "as will be done to him who desecrates holiness." The social response to Vanunu at the time of his capture and trial was studied by conducting a content analysis of six contemporary daily newspapers. The analysis focused on the use of language and forms of expression. In contrast with traditional content analysis methodology, this study did not just look at the frequency of expressions of ideas and terms in the text and covert content; rather, the text was analyzed as a structural whole, including examination of style, tone, unusual use of imagery, and more, in order to uncover hidden messages within the text. The social response in this case was extraordinarily intense, not only because in this instance of political deviance, involving espionage and treason, Vanunu's actions posed a real potential threat to the country, but also because of the threat his behavior posed to the symbolic universe of society. Therefore, the response to this instance of political deviance can be seen as part of a mechanism of social control aiming to protect the worldview of society as a whole, as well as to punish the criminal.

Keywords: militarism, political deviance, social construction, social control

Procedia PDF Downloads 113
1386 The Effect of Lead(II) Lone Electron Pair and Non-Covalent Interactions on the Supramolecular Assembly and Fluorescence Properties of Pb(II)-Pyrrole-2-Carboxylato Polymer

Authors: M. Kowalik, J. Masternak, K. Kazimierczuk, O. V. Khavryuchenko, B. Kupcewicz, B. Barszcz

Abstract:

Recently, the growing interest of chemists in metal-organic coordination polymers (MOCPs) is primarily derived from their intriguing structures and potential applications in catalysis, gas storage, molecular sensing, ion exchange, nonlinear optics, luminescence, etc. Currently, we are devoting considerable effort to finding the proper method of synthesizing new coordination polymers containing S- or N-heteroaromatic carboxylates as linkers and characterizing the obtained Pb(II) compounds according to their structural diversity, luminescence, and thermal properties. The choice of Pb(II) as the central ion of MOCPs was motivated by several reasons mentioned in the literature: i) a large ionic radius allowing for a wide range of coordination numbers, ii) the stereoactivity of the 6s² lone electron pair leading to a hemidirected or holodirected geometry, iii) a flexible coordination environment, and iv) the possibility of forming secondary bonds and unusual non-covalent interactions, such as classic hydrogen bonds and π···π stacking interactions, as well as nonconventional hydrogen bonds and rarely reported tetrel bonds, Pb(lone pair)···π interactions, C–H···Pb agostic-type interactions or hydrogen bonds, and chelate ring stacking interactions. Moreover, the construction of coordination polymers requires the selection of proper ligands acting as linkers, because we are looking for materials exhibiting different network topologies and fluorescence properties, which point to potential applications. The reaction of Pb(NO₃)₂ with 1H-pyrrole-2-carboxylic acid (2prCOOH) leads to the formation of a new tetranuclear Pb(II) polymer, [Pb₄(2prCOO)₈(H₂O)]ₙ, which has been characterized by CHN, FT-IR, TG, PL, and single-crystal X-ray diffraction methods. In view of the primary Pb–O bonds, Pb1 and Pb2 show hemidirected pentagonal pyramidal geometries, while Pb2 and Pb4 display hemidirected octahedral geometries. The topology of the strongest Pb–O bonds was determined as the (4·8²) fes topology. Taking the secondary Pb–O bonds into account, the coordination numbers of the Pb centres increase: Pb1 exhibits a hemidirected monocapped pentagonal pyramidal geometry, Pb2 and Pb4 exhibit a holodirected tricapped trigonal prismatic geometry, and Pb3 exhibits a holodirected bicapped trigonal prismatic geometry. Moreover, the stereoactivity of the Pb(II) lone pair was confirmed by DFT calculations. The 2D structure is expanded into 3D through non-covalent O/C–H···π and Pb···π interactions, which was confirmed by Hirshfeld surface analysis. The above-mentioned interactions improve the rigidity of the structure and facilitate charge and energy transfer between metal centres, making the polymer a promising luminescent compound.

Keywords: coordination polymers, fluorescence properties, lead(II), lone electron pair stereoactivity, non-covalent interactions

Procedia PDF Downloads 145
1385 Phytoremediation-A Plant Based Cleansing Method to Obtain Quality Medicinal Plants and Natural Products

Authors: Hannah S. Elizabeth, D. Gnanasekaran, M. R. Manju Gowda, Antony George

Abstract:

Phytoremediation is a new technology for remediating contaminated soil, water, and air using plants, and it serves as a green, environmentally friendly approach. The main aim of this technique is the cleansing and detoxification of organic compounds, organophosphorus pesticides, heavy metals such as arsenic, iron, cadmium, and gold, and radioactive elements, which cause teratogenic and life-threatening diseases in humans and animals that consume food crops, vegetables, fruits, cereals, and millets grown on contaminated soil. These contaminants may also directly damage genetic material, thereby altering the biosynthetic pathways of secondary metabolites and other phytoconstituents with pharmacological activities, causing them to lose their efficacy and potency. This can manifest as mutagenicity, drug resistance, and disturbance of other antagonistic properties of normal metabolism. Phytoremediation is a technology for the real clean-up of contaminated soils and of contaminants that are potentially toxic. It reduces the risks posed by contaminated soil by decreasing contaminant levels using plants. Its advantages are cost-effectiveness and less ecosystem disruption. Plants may also help to stabilize contaminants by accumulating and precipitating toxic trace elements in the roots. Organic pollutants can potentially be chemically degraded and ultimately mineralized into harmless biological compounds. Hence, the use of plants to revitalize contaminated sites is gaining more attention and is preferred for its cost-effectiveness when compared to chemical methods. The introduction of harmful substances into the environment has been shown to have many adverse effects on human health, agricultural productivity, and natural ecosystems. Because the costs of growing a crop are minimal compared to those of soil removal and replacement, the use of plants to remediate hazardous soils is seen as having great promise.

Keywords: cost effective, eco-friendly, phytoremediation, secondary metabolites

Procedia PDF Downloads 281
1384 A Comparative Analysis of Conventional and Organic Dairy Supply Chain: Assessing Transport Costs and External Effects in Southern Sweden

Authors: Vivianne Aggestam

Abstract:

Purpose: Organic dairy products have steadily increased in consumer popularity in Sweden in recent years, generating more transport activity. The main aim of this study was to compare the transport costs and the environmental emissions of organic and conventional dairy production in Sweden. The objective was to evaluate the differences and environmental impacts of transport between the two production systems, allowing a more transparent understanding of the real impact of transport within the supply chain. Methods: A partial attributional life cycle assessment was conducted, based on a comprehensive survey of Swedish farmers, dairies, and consumers regarding their transport needs and costs. Interviews addressed the farmers and dairies; consumers were targeted through an online survey. Results: Higher transport inputs from conventional dairy production arise mainly from feed and soil management at the farm level. The regional organic milk brand showed a lower initial transport burden at the farm level; however, after leaving the farm, it had equal or higher transportation requirements. This was mainly due to the location of the dairy and shorter product expiry dates, which require more frequent retail deliveries. Organic consumers tend to use public transport more than private vehicles. Consumers using private vehicles for shopping trips primarily bought conventional products, for which price was the main deciding factor. Conclusions: Organic dairy products that emphasise their regional attributes do not ensure less transportation and may therefore not be a more “climate smart” option for the consumer. This suggests that the idea of localism needs to be analysed from a more systemic perspective. Fuel and regional feed efficiency can be further improved, mainly via fuel type and the types of vehicles used for transport.
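
The transport comparison rests on distance x payload x emission-factor arithmetic; a minimal sketch with illustrative numbers is shown below (the emission factor and leg distances are assumptions, not values from the study).

```python
def transport_emissions_kg(distance_km, payload_tonnes, ef_kg_per_tkm=0.11):
    """CO2-equivalent emissions for one haul: distance x payload x an
    emission factor per tonne-kilometre (the factor here is illustrative)."""
    return distance_km * payload_tonnes * ef_kg_per_tkm

# Compare a short regional collection leg with a longer centralised one.
legs = {
    "organic, farm to regional dairy": (45, 8),
    "conventional, farm to central dairy": (160, 8),
}
for name, (km, tonnes) in legs.items():
    print(name, round(transport_emissions_kg(km, tonnes), 1), "kg CO2e")
```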

Keywords: supply chains, distribution, transportation, organic food productions, conventional food production, agricultural fossil fuel use

Procedia PDF Downloads 455
1383 Ultrasonic Irradiation Synthesis of High-Performance Pd@Copper Nanowires/MultiWalled Carbon Nanotubes-Chitosan Electrocatalyst by Galvanic Replacement toward Ethanol Oxidation in Alkaline Media

Authors: Majid Farsadrouh Rashti, Amir Shafiee Kisomi, Parisa Jahani

Abstract:

Direct ethanol fuel cells (DEFCs) are contemplated as a promising energy source because, in addition to being used in portable electronic devices, they can also be used in electric vehicles. The synthesis of bimetallic nanostructures is attracting extensive attention due to their novel optical, catalytic, and electronic characteristics, which contrast sharply with those of their monometallic counterparts. Galvanic replacement (sometimes referred to as cementation or immersion plating) is an uncomplicated and effective technique for making nanostructures (such as core-shell structures) of different metals and semiconductors, and for their application in DEFCs. Galvanic replacement does not need any external power supply, unlike electrodeposition, and it also differs from electroless deposition in that no reducing agent is required. In this paper, a fast method is proposed for the synthesis of palladium (Pd) wire nanostructures with a large surface area through a galvanic replacement reaction utilizing copper nanowires (CuNWs) as a template, assisted by ultrasound at room temperature. To evaluate the morphology and composition of Pd@copper nanowires/multiwalled carbon nanotubes-chitosan, field emission scanning electron microscopy and energy-dispersive X-ray spectroscopy were applied. The phase structure of the electrocatalysts was determined by room-temperature X-ray powder diffraction (XRD) using an X-ray diffractometer. Various electrochemical techniques, including chronoamperometry and cyclic voltammetry, were utilized to study the electrocatalytic activity for ethanol electrooxidation and the durability in basic solution. The Pd@copper nanowires/multiwalled carbon nanotubes-chitosan catalyst demonstrated substantially enhanced performance and long-term stability for ethanol electrooxidation in basic solution in comparison to commercial Pd/C, demonstrating its potential as an efficient catalyst for ethanol oxidation. Noticeably, the Pd@copper nanowires/multiwalled carbon nanotubes-chitosan presented excellent catalytic activity, with a peak current density of 320.73 mA cm⁻², 9.5 times that of Pd/C (34.21 mA cm⁻²). Additionally, thermodynamic and kinetic evaluations revealed that the Pd@copper nanowires/multiwalled carbon nanotubes-chitosan catalyst has a lower activation energy than Pd/C, which leads to a lower energy barrier and an excellent charge transfer rate towards ethanol oxidation.

Keywords: core-shell structure, electrocatalyst, ethanol oxidation, galvanic replacement reaction

Procedia PDF Downloads 148
1382 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification

Authors: Hung-Sheng Lin, Cheng-Hsuan Li

Abstract:

Over the past few years, kernel-based algorithms have been widely used to extend linear feature extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE) to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and shows good performance in hyperspectral image classification. The DNP structure is an extension of the k-nearest-neighbor technique. For each sample, there are two corresponding nearest proportions of samples: the self-class nearest proportion and the other-class nearest proportion. The term “nearest proportion” used here considers both local information and more global information. With these settings, the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small-sample-size situations. Hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to the experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
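
KDNP itself is the authors' method and is not reproduced here; as a minimal sketch of the shared kernel machinery (Gram matrix, double centring, projection onto leading eigenvectors), a plain kernel PCA, one of the baselines compared against, is shown below with toy data.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=0.5):
    """Project samples onto the leading components of the centred kernel matrix."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one      # double centring
    eigval, eigvec = np.linalg.eigh(Kc)
    idx = np.argsort(eigval)[::-1][:n_components]
    alphas = eigvec[:, idx] / np.sqrt(eigval[idx])  # scale so feature-space axes have unit norm
    return Kc @ alphas

# Two toy "classes" of 30-band spectra standing in for hyperspectral samples.
X = np.vstack([np.random.default_rng(1).normal(0, 1, (20, 30)),
               np.random.default_rng(2).normal(3, 1, (20, 30))])
print(kernel_pca(X).shape)    # (40, 2)
```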

Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest feature extraction

Procedia PDF Downloads 346
1381 Numerical Investigation of Combustion Chamber Geometry on Combustion Performance and Pollutant Emissions in an Ammonia-Diesel Common Rail Dual-Fuel Engine

Authors: Youcef Sehili, Khaled Loubar, Lyes Tarabet, Mahfoudh Cerdoun, Clement Lacroix

Abstract:

As emissions regulations grow more stringent and traditional fuel sources become increasingly scarce, incorporating carbon-free fuels into the transportation sector emerges as a key strategy for mitigating greenhouse gas emissions. While the utilization of hydrogen (H2) presents significant technological challenges, as evident in the engine limitation known as knocking, ammonia (NH3) provides a viable alternative that overcomes this obstacle and offers convenient transportation, storage, and distribution. Moreover, a dual-fuel engine using ammonia as the primary gaseous fuel is promising, delivering both ecological and economic benefits. However, in this combustion mode, substituting ammonia at high rates adversely affects combustion performance and leads to elevated emissions of unburnt NH3, especially under high loads, so this combustion mode requires special treatment. This study aims to simulate combustion in a common rail direct injection (CRDI) dual-fuel engine, considering the baseline geometry of the combustion chamber as well as fifteen (15) alternative proposed geometries, to determine the configuration that exhibits superior engine performance under high-load conditions. The research presented here focuses on improving the understanding of the equations and mechanisms involved in the combustion of finely atomized jets of liquid fuel and on mastering the CONVERGE code, which facilitates the simulation of this combustion process. By analyzing the effect of piston bowl shape on the performance and emissions of a diesel engine operating in dual-fuel mode, this work combines knowledge of combustion phenomena with proficiency in the computational code. To select the optimal geometry, the swirl, tumble, and squish flow patterns were evaluated for the fifteen (15) studied geometries. Variations in in-cylinder pressure, heat release rate, turbulence kinetic energy, turbulence dissipation rate, and emission rates were observed, while thermal efficiency and specific fuel consumption were estimated as functions of crankshaft angle. To maximize thermal efficiency, a synergistic approach involving enrichment of the intake air with oxygen (O2) and enrichment of the primary fuel with hydrogen (H2) was implemented. Based on the results obtained, the proposed geometry (T8_b8_d0.6/SW_8.0) outperformed the others in terms of flow quality and reduction of emitted pollutants, with a reduction of more than 90% in unburnt NH3 and an impressive improvement in engine efficiency of more than 11%.
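
The swirl evaluation mentioned above can be illustrated with the usual mass-weighted definition: the charge's angular velocity about the cylinder axis divided by the crank angular velocity. The arrays below are stand-ins for per-cell values exported from a CFD solution, not actual CONVERGE output.

```python
import numpy as np

def swirl_ratio(mass, x, y, vx, vy, engine_rpm):
    """Mass-weighted swirl ratio: charge angular momentum about the cylinder
    axis over its moment of inertia, normalised by crank angular velocity."""
    ang_momentum = np.sum(mass * (x * vy - y * vx))
    inertia = np.sum(mass * (x**2 + y**2))
    omega_charge = ang_momentum / inertia
    omega_crank = 2 * np.pi * engine_rpm / 60.0
    return omega_charge / omega_crank

rng = np.random.default_rng(3)
n = 1000                                  # stand-in for CFD cell values
x, y = rng.uniform(-0.04, 0.04, (2, n))   # m
vx, vy = -120 * y, 120 * x                # solid-body-like tangential field, m/s
mass = np.full(n, 1e-6)                   # kg per cell
print(swirl_ratio(mass, x, y, vx, vy, engine_rpm=1500))
```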

Keywords: ammonia, hydrogen, combustion, dual-fuel engine, emissions

Procedia PDF Downloads 75
1380 Stress Perception, Social Supports and Family Function among Military Inpatients with Adjustment Disorders in Taiwan

Authors: Huey-Fang Sun, Wei-Kai Weng, Mei-Kuang Chao, Hui-Shan Hsu, Tsai-Yin Shih

Abstract:

Psychosocial stress is important in mental illness, and the presence of emotional or behavioral symptoms in response to an identifiable event is the central feature of adjustment disorders. However, whether patients with adjustment disorders have been raised in families with poor family function and social support, and whether they perceive higher stress than their peer group when both experience a similar stressful environment, remains unknown. The specific aims of the study were to investigate the correlations among family function, social support, and the level of stress perception, and to test the hypothesis that military patients with adjustment disorders would have lower family function, lower social support, and higher stress perception than their healthy colleagues recruited in the same cohort for military service, given their common exposure to similar stressful environments. Methods: The study was conducted in four hospitals in the northern part of Taiwan from July 1, 2015 to June 30, 2017, using a matched case-control design. The inclusion criteria for patient participants were psychiatric inpatients who served in the military during the study period and met the diagnosis of adjustment disorders. Patients who had been admitted to a psychiatric ward before or who were illiterate were excluded. A healthy military control sample matched by the same military service unit, gender, and recruitment cohort was invited to participate in the study as well. In total, 74 participants (37 patients and 37 controls) completed the consent forms and filled out the research questionnaires. Questionnaires used in the study included the Perceived Stress Scale (PSS) as a measure of stress perception, the Family APGAR as a measure of family function, and the Multidimensional Scale of Perceived Social Support (MSPSS) as a measure of social support. Pearson correlation analysis and t-tests were applied for statistical analysis. Results: The analysis showed that the PSS level was significantly negatively correlated with the three social support subscales (family subscale, r = -.37, P < .05; friend subscale, r = -.38, P < .05; significant other subscale, r = -.39, P < .05). The negative correlation between the PSS level and the Family APGAR reached only borderline significance (P = .06). The differences in PSS scores, Family APGAR levels, and the three MSPSS subscale scores between patient and control participants were all significant (P < .001, P < .05, P < .05, P < .05, P < .05, respectively), with patient participants having higher stress perception scores, lower social support, and lower family function scores than the healthy controls. Conclusions: Our study suggests that family function and social support are negatively correlated with patients’ subjective stress perception. Military patients with adjustment disorders tended to have higher stress perception and lower family function and social support than their military peers who remained healthy and continued to serve in their military units.
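
The correlation and group-comparison statistics reported above can be sketched with SciPy as follows; the scores are simulated stand-ins, since the questionnaire data are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Simulated scores only (the real instruments are the PSS, Family APGAR, MSPSS).
pss_patients = rng.normal(30, 5, 37)
pss_controls = rng.normal(22, 5, 37)
pss_all = np.concatenate([pss_patients, pss_controls])
family_support = 60 - 1.2 * pss_all + rng.normal(0, 4, 74)

r, p_r = stats.pearsonr(pss_all, family_support)
t, p_t = stats.ttest_ind(pss_patients, pss_controls)
print(f"PSS vs. family-support subscale: r = {r:.2f}, p = {p_r:.3g}")
print(f"patients vs. controls PSS: t = {t:.2f}, p = {p_t:.3g}")
```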

Keywords: adjustment disorders, family function, social support, stress perception

Procedia PDF Downloads 194
1379 Investigation of a Single Feedstock Particle during Pyrolysis in Fluidized Bed Reactors via X-Ray Imaging Technique

Authors: Stefano Iannello, Massimiliano Materazzi

Abstract:

Fluidized bed reactor technologies are one of the most valuable pathways for the thermochemical conversion of biogenic fuels due to their good operating flexibility. Nevertheless, there are still issues related to the mixing and separation of heterogeneous phases during operation with highly volatile feedstocks, including biomass and waste. At high temperatures, the volatile content of the feedstock is released in the form of so-called endogenous bubbles, which generally exert a “lift” effect on the particle itself by dragging it up to the bed surface. This phenomenon leads to a high release of volatile matter into the freeboard and limited mass and heat transfer with particles of the bed inventory. The aim of this work is to gain a better understanding of the behaviour of a single reacting particle in a hot fluidized bed reactor during the devolatilization stage. The analysis was undertaken at different fluidization regimes and temperatures to closely mirror the operating conditions of waste-to-energy processes. Beech wood and polypropylene particles were used to represent the biomass and plastic fractions present in waste materials, respectively. A non-invasive X-ray technique was coupled with particle tracking algorithms to characterize the motion of a single feedstock particle during devolatilization with high resolution. A high-energy X-ray beam passes through the vessel, where absorption occurs depending on the distribution and amount of solids and fluids along the beam path. A high-speed video camera is synchronised to the beam and provides frame-by-frame imaging of the flow patterns of fluids and solids within the fluidized bed at up to 72 frames per second. A comprehensive mathematical model has been developed to validate the experimental results. Beech wood and polypropylene particles showed very different dynamic behaviour during the pyrolysis stage. When the feedstock is fed from the bottom, the plastic material tends to spend more time within the bed than the biomass. This behaviour can be attributed to the endogenous bubbles, whose drag effect is more pronounced during the devolatilization of biomass, resulting in a shorter residence time of the particle within the bed. At the typical operating temperatures of thermochemical conversions, the synthetic polymer softens and melts, and the bed particles attach to its outer surface, generating a wet plastic-sand agglomerate. Consequently, this additional layer of sand may hinder the rapid evolution of volatiles in the form of endogenous bubbles, resulting in a weaker drag effect acting on the feedstock itself. Information about the mixing and segregation of solid feedstock is of prime importance for the design and development of more efficient industrial-scale operations.
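
The sketch below illustrates, in simplified form, how a feedstock particle can be tracked frame by frame in X-ray images: the denser particle attenuates the beam more than the surrounding bed, so thresholding each frame and taking the centroid of the dark region yields a trajectory. The threshold, synthetic frames, and velocity estimate are illustrative assumptions, not the particle tracking algorithms actually used in the study.

```python
import numpy as np

def track_particle(frames, threshold):
    """Return the (row, col) centroid of the low-intensity blob in each
    X-ray frame. A denser feedstock particle attenuates the beam more than
    the bed, so it appears as a dark region. Simplified stand-in for the
    tracking algorithms used in practice."""
    trajectory = []
    for frame in frames:
        mask = frame < threshold               # pixels attenuated below threshold
        if not mask.any():
            trajectory.append((np.nan, np.nan))
            continue
        rows, cols = np.nonzero(mask)
        trajectory.append((rows.mean(), cols.mean()))
    return np.array(trajectory)

if __name__ == "__main__":
    # Synthetic 72 fps sequence: a dark particle rising through a brighter bed.
    rng = np.random.default_rng(1)
    frames = []
    for t in range(72):
        img = rng.normal(200, 5, size=(128, 128))
        r = 110 - t                            # particle rises one pixel per frame
        img[r - 3:r + 3, 60:66] = 80           # low-intensity particle
        frames.append(img)
    traj = track_particle(frames, threshold=120)
    rise_px_per_frame = -np.diff(traj[:, 0]).mean()
    print(f"Mean rise velocity: {rise_px_per_frame:.2f} px/frame at 72 fps")
```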

Keywords: fluidized bed, pyrolysis, waste feedstock, X-ray

Procedia PDF Downloads 172
1378 Ganga Rejuvenation through Forestation and Conservation Measures in Riverscape

Authors: Ombir Singh

Abstract:

In spite of the religious and cultural predominance of the river Ganga in the Indian ethos, fragmentation and degradation of the river have continued down the ages. Recognizing the national concern over the environmental degradation of the river and its basin, the Ministry of Water Resources, River Development & Ganga Rejuvenation (MoWR, RD&GR), Government of India, has initiated a number of pilot schemes for the rejuvenation of the river Ganga under the ‘Namami Gange’ Programme. Considering the diversity, complexity, and intricacies of forest ecosystems, the pivotal multiple functions they perform, and their inter-connectedness with highly dynamic river ecosystems, the ministry has planned forestry interventions all along the river Ganga from its origin at Gaumukh, Uttarakhand to its mouth at Ganga Sagar, West Bengal. To this end, the Forest Research Institute (FRI), in collaboration with the National Mission for Clean Ganga (NMCG), has prepared a Detailed Project Report (DPR) on forestry interventions for the Ganga. The Institute adopted an extensive consultative process at the national and state levels involving various stakeholders relevant in the context of the river Ganga and employed a science-based methodology, including the use of remote sensing and GIS technologies for geo-spatial analysis, modeling, and prioritization of sites for the proposed forestation and conservation interventions. Four sets of field data formats were designed to obtain field-based information for the forestry interventions, mainly plantations and conservation measures along the river course. In response, the five stakeholder State Forest Departments submitted more than 8,000 data sheets to the Institute. In order to analyze the voluminous field data received from the five participating states, the Institute also developed software to collate and analyze the data and generate reports on the proposed sites in the Ganga basin. FRI has developed potential plantation and treatment models for the proposed forestry and other conservation measures in the three major types of landscape components visualized in the Ganga riverscape: (i) natural, (ii) agricultural, and (iii) urban landscapes. The suggested plantation models broadly vary between the Uttarakhand Himalayas and the Ganga plains in the five participating states. Besides extensive plantations in the three types of landscapes within the riverscape, various conservation measures such as soil and water conservation, riparian wildlife management, wetland management, bioremediation and bio-filtration, and supporting activities such as policy and law interventions, concurrent research, monitoring and evaluation, and mass awareness campaigns have been envisioned in the DPR. The DPR also incorporates details of the implementation mechanism and the budget provisioned for the different components of the project, besides the state-wise allocation of budget to the five implementing agencies, national partner organizations, and the nodal ministry.
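
As a rough illustration of the kind of geospatial prioritization mentioned (not the methodology actually used by FRI), the sketch below applies a simple weighted overlay to hypothetical raster criteria layers and flags the top-scoring cells as candidate sites for plantation or treatment.

```python
import numpy as np

# Illustrative raster criteria on a common grid (values scaled 0-1):
# proximity to the river, slope suitability, and degradation severity.
# In a real workflow these would come from remote-sensing / GIS layers.
rng = np.random.default_rng(42)
proximity = rng.random((100, 100))
slope_suitability = rng.random((100, 100))
degradation = rng.random((100, 100))

weights = {"proximity": 0.5, "slope": 0.2, "degradation": 0.3}  # assumed weights

priority = (weights["proximity"] * proximity
            + weights["slope"] * slope_suitability
            + weights["degradation"] * degradation)

# Flag the top 10% of cells as high-priority sites for plantation/treatment.
cutoff = np.quantile(priority, 0.9)
high_priority = priority >= cutoff
print(f"{high_priority.sum()} of {priority.size} cells flagged as high priority")
```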

Keywords: conservation, Ganga, river, water, forestry interventions

Procedia PDF Downloads 149
1377 Data Protection and Regulation Compliance on Handling Physical Child Abuse Scenarios: A Scoping Review

Authors: Ana Mafalda Silva, Rebeca Fontes, Ana Paula Vaz, Carla Carreira, Ana Corte-Real

Abstract:

Decades of research on the topic of interpersonal violence against minors highlight five main conclusions: 1) it causes harmful effects on children's development and health; 2) it is prevalent; 3) it violates children's rights; 4) it can be prevented; and 5) parents are the main aggressors. Child abuse scenarios are identified through clinical observation, administrative data, and self-reports. The most used instruments are self-reports; however, there are no valid and reliable self-report instruments for minors, so such reports consist of a retrospective interpretation of the situation by the victim, already in adulthood, and/or by the parents. Clinical observation and the collection of information, namely from the orofacial region, are essential in the early identification of these situations. The management of medical data, as personal data, must comply with the General Data Protection Regulation (GDPR) in Europe and with the General Law of Data Protection (LGPD) in Brazil. This review aims to answer the question: in a situation of medical assistance to a minor where interpersonal violence through mistreatment is suspected, is the guardians' consent necessary for the recording and sharing of personal data, namely medical data? A scoping review was carried out based on a search of the Web of Science and PubMed databases. Four papers and two documents from the grey literature were selected. As found, the process of identifying and reporting child abuse by the health professional, and the necessary early intervention in defense of the minor as a victim of abuse, comply with the guidelines expressed in the GDPR and LGPD. Accordingly, notification of maltreatment scenarios by health professionals should be a priority, and fear or anxiety about legal repercussions should not stand in the way of collecting and processing the data necessary for the reporting procedure that safeguards and promotes the welfare of children living with abuse.

Keywords: child abuse, disease notifications, ethics, healthcare assistance

Procedia PDF Downloads 96
1376 Motor Control Recovery Minigame

Authors: Taha Enes Kon, Vanshika Reddy

Abstract:

This project focuses on developing a gamified mobile application to aid in stroke rehabilitation by enhancing motor skills through interactive activities. The primary goal was to design a companion app for a passive haptic rehab glove, incorporating Google MediaPipe for gesture tracking and vibrotactile feedback. The app simulates farming activities, offering a fun and engaging experience while addressing the monotony of traditional rehabilitation methods. The prototype focuses on a single minigame, Flower Picking, which uses gesture recognition to interact with virtual elements, encouraging users to perform exercises that improve hand dexterity. The development process involved creating accessible and user-centered designs using Figma, integrating gesture recognition algorithms, and implementing Unity-based game mechanics. Real-time feedback and progressive difficulty levels ensured a personalized experience, motivating users to adhere to rehabilitation routines. The prototype achieved a gesture detection precision of 90%, effectively recognizing predefined gestures such as the Fist and OK symbols. Quantitative analysis highlighted a 40% increase in average session duration compared to traditional exercises, while qualitative feedback praised the app’s immersive design and ease of use. Despite its success, challenges included rigidity in gesture recognition, which required precise hand orientations, and limited gesture support. Future improvements include expanding gesture adaptability and incorporating additional minigames to target a broader range of exercises. The project demonstrates the potential of gamification in stroke rehabilitation, offering a scalable and accessible solution that complements clinical treatments, making recovery engaging and effective for users.
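
To make the gesture-recognition step more concrete, the sketch below uses the MediaPipe Hands solution in Python to flag a closed fist from hand landmarks, in the spirit of the app's Fist gesture; the landmark heuristic, webcam loop, and thresholds are illustrative assumptions and not the app's actual Unity/mobile implementation.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def is_fist(landmarks):
    """Heuristic fist check: each fingertip lies below (larger image-y than)
    its PIP joint, i.e. the fingers are curled. Simplified for illustration;
    a production app would also account for hand orientation."""
    tips = [8, 12, 16, 20]   # index, middle, ring, pinky fingertip landmarks
    pips = [6, 10, 14, 18]   # corresponding PIP joint landmarks
    return all(landmarks[t].y > landmarks[p].y for t, p in zip(tips, pips))

cap = cv2.VideoCapture(0)  # webcam stand-in for the phone camera feed
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            if is_fist(lm):
                print("Fist detected -> trigger the in-game 'pick' action")
        cv2.imshow("gesture demo", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```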

Keywords: stroke rehabilitation, haptic feedback, gamification, MediaPipe, motor control

Procedia PDF Downloads 7
1375 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
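
A minimal sketch of the ensemble idea described above, averaging the class probabilities of a logistic regression, a random forest, and a neural network via soft voting in scikit-learn, is shown below; the synthetic dataset and model settings are placeholders, not the RP4/EPA data or the framework's actual configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the real features (season, weekday/weekend,
# forecast weather, past pollutant levels) and a binary "poor air quality" label.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=8,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

logreg = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
forest = RandomForestClassifier(n_estimators=300, random_state=0)
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                                  random_state=0))

# Soft voting averages the predicted class probabilities of the three models,
# mirroring the framework's idea of averaging its top-performing models.
ensemble = VotingClassifier(
    estimators=[("logreg", logreg), ("forest", forest), ("mlp", mlp)],
    voting="soft")
ensemble.fit(X_train, y_train)
print(f"Ensemble accuracy: {ensemble.score(X_test, y_test):.3f}")
```

Soft voting is one simple way to combine heterogeneous models; it tends to smooth out the individual models' biases, which is the rationale the abstract gives for averaging its top performers.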

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 130
1374 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The process of verification and validation helps in qualifying the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters with the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the modeling team and the process design and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for its intended purpose, and the difficulties faced while coordinating the various activities.
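
As a simple illustration of the kind of quantitative check used when comparing simulated process parameters against reference plant data, the sketch below computes RMSE and maximum relative deviation against an assumed acceptance band; the tolerance and the placeholder temperature trace are illustrative, not the acceptance criteria actually applied to KALBR-SIM.

```python
import numpy as np

def validate_parameter(simulated, reference, tolerance=0.02):
    """Compare a simulated process parameter trace against reference data.

    Returns RMSE, maximum relative deviation, and whether every sample
    stays within the (illustrative) +/- tolerance acceptance band.
    """
    simulated = np.asarray(simulated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((simulated - reference) ** 2))
    rel_dev = np.abs(simulated - reference) / np.maximum(np.abs(reference), 1e-12)
    return rmse, rel_dev.max(), bool((rel_dev <= tolerance).all())

if __name__ == "__main__":
    # Placeholder steady-state trace of, e.g., a coolant outlet temperature [K].
    t = np.linspace(0, 100, 501)
    reference = 820 + 2.0 * np.sin(0.05 * t)
    simulated = reference + np.random.default_rng(3).normal(0, 0.5, t.size)
    rmse, max_dev, passed = validate_parameter(simulated, reference)
    print(f"RMSE = {rmse:.2f} K, max deviation = {100 * max_dev:.2f} %, "
          f"within band: {passed}")
```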

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state

Procedia PDF Downloads 266
1373 Impact of Transitioning to Renewable Energy Sources on Key Performance Indicators and Artificial Intelligence Modules of Data Center

Authors: Ahmed Hossam ElMolla, Mohamed Hatem Saleh, Hamza Mostafa, Lara Mamdouh, Yassin Wael

Abstract:

Artificial intelligence (AI) is reshaping industries, and its potential to revolutionize renewable energy and data center operations is immense. By harnessing AI's capabilities, we can optimize energy consumption, predict fluctuations in renewable energy generation, and improve the efficiency of data center infrastructure. This convergence of technologies promises a future where energy is managed more intelligently, sustainably, and cost-effectively. The integration of AI into renewable energy systems unlocks a wealth of opportunities. Machine learning algorithms can analyze vast amounts of data to forecast weather patterns, solar irradiance, and wind speeds, enabling more accurate energy production planning. AI-powered systems can optimize energy storage and grid management, ensuring a stable power supply even during intermittent renewable generation. Moreover, AI can identify maintenance needs for renewable energy infrastructure, preventing costly breakdowns and maximizing system lifespan. Data centers, which consume substantial amounts of energy, are prime candidates for AI-driven optimization. AI can analyze energy consumption patterns, identify inefficiencies, and recommend adjustments to cooling systems, server utilization, and power distribution. Predictive maintenance using AI can prevent equipment failures, reducing energy waste and downtime. Additionally, AI can optimize data placement and retrieval, minimizing energy consumption associated with data transfer. As AI transforms renewable energy and data center operations, modified Key Performance Indicators (KPIs) will emerge. Traditional metrics like energy efficiency and cost-per-megawatt-hour will continue to be relevant, but additional KPIs focused on AI's impact will be essential. These might include AI-driven cost savings, predictive accuracy of energy generation and consumption, and the reduction of carbon emissions attributed to AI-optimized operations. By tracking these KPIs, organizations can measure the success of their AI initiatives and identify areas for improvement. Ultimately, the synergy between AI, renewable energy, and data centers holds the potential to create a more sustainable and resilient future. By embracing these technologies, we can build smarter, greener, and more efficient systems that benefit both the environment and the economy.
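
As a back-of-the-envelope illustration of the modified KPIs discussed (not a prescribed metric set), the snippet below computes power usage effectiveness, renewable energy fraction, and the forecast error of a hypothetical AI demand-prediction model from placeholder figures.

```python
# Illustrative calculation of a few data-center KPIs of the kind discussed:
# PUE, renewable energy fraction, and forecast accuracy (MAPE) of an
# AI model predicting hourly energy demand. All numbers are placeholders.
total_facility_kwh = 1_250_000.0
it_equipment_kwh = 890_000.0
renewable_kwh = 600_000.0

pue = total_facility_kwh / it_equipment_kwh           # power usage effectiveness
renewable_fraction = renewable_kwh / total_facility_kwh

predicted_kwh = [52_000, 48_500, 61_000, 55_200]       # hypothetical AI forecasts
actual_kwh = [50_800, 49_900, 59_400, 56_100]          # hypothetical metered demand
mape = sum(abs(p - a) / a for p, a in zip(predicted_kwh, actual_kwh)) / len(actual_kwh)

print(f"PUE: {pue:.2f}")
print(f"Renewable fraction: {renewable_fraction:.1%}")
print(f"Forecast MAPE: {mape:.1%}")
```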

Keywords: data center, artificial intelligence, renewable energy, energy efficiency, sustainability, optimization, predictive analytics, energy consumption, energy storage, grid management, data center optimization, key performance indicators, carbon emissions, resiliency

Procedia PDF Downloads 36
1372 GBKMeans: A Genetic Based K-Means Applied to the Capacitated Planning of Reading Units

Authors: Anderson S. Fonseca, Italo F. S. Da Silva, Robert D. A. Santos, Mayara G. Da Silva, Pedro H. C. Vieira, Antonio M. S. Sobrinho, Victor H. B. Lemos, Petterson S. Diniz, Anselmo C. Paiva, Eliana M. G. Monteiro

Abstract:

In Brazil, the National Electric Energy Agency (ANEEL) establishes that electrical energy companies are responsible for measuring and billing their customers. Among these regulations, it is defined that a company must bill its customers within 27-33 days. If a relocation or a change of period is required, the consumer must be notified in writing, in advance of the billing period. To make it easier to organize a workday’s measurements, these companies create a reading plan. These plans consist of grouping customers into reading groups, which are visited by an employee responsible for measuring consumption and billing. Creating such a plan efficiently and optimally is a capacitated clustering problem with constraints related to homogeneity and compactness, that is, the employee’s working load and the geographical position of the consuming units. This process is currently carried out manually by several experts with experience in the geography of the region; it takes a large number of days to complete the final planning and, being a human activity, offers no guarantee of finding the best plan. In this paper, the GBKMeans method presents a technique based on K-Means and genetic algorithms for creating capacitated clusters that respect the established constraints in an efficient and balanced manner, minimizing the cost of relocating consumer units and the time required to create the final plan. The results obtained by the presented method are compared with the current planning of a real city, showing an improvement of 54.71% in the standard deviation of working load and 11.97% in the compactness of the groups.
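
To make the GBKMeans idea more concrete, the sketch below evolves K-Means seed sets with a simple genetic algorithm and scores each candidate by cluster compactness plus a workload-balance penalty after a capacity-aware assignment; the data, capacity slack, and GA settings are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic consumer units: 2-D coordinates and a per-unit reading workload.
# This is an illustrative stand-in for the real geographic billing data.
N, K = 300, 6
points = rng.random((N, 2)) * 100.0
workload = rng.uniform(1.0, 3.0, size=N)
capacity = 1.15 * workload.sum() / K        # assumed per-group workload cap

def capacitated_assign(centroids):
    """Greedy capacity-aware assignment: visit units by distance to their
    nearest centroid and place each in the closest group with spare capacity."""
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    order = np.argsort(d.min(axis=1))
    labels = np.full(N, -1)
    load = np.zeros(K)
    for i in order:
        for k in np.argsort(d[i]):
            if load[k] + workload[i] <= capacity:
                labels[i], load[k] = k, load[k] + workload[i]
                break
        else:                               # fall back to the least-loaded group
            k = int(np.argmin(load))
            labels[i], load[k] = k, load[k] + workload[i]
    return labels, load, d

def fitness(seed_idx):
    """Run a few K-Means-style refinement steps from the GA-chosen seeds and
    score compactness plus a workload-balance penalty (lower is better)."""
    centroids = points[seed_idx].copy()
    for _ in range(5):
        labels, load, d = capacitated_assign(centroids)
        for k in range(K):
            if (labels == k).any():
                centroids[k] = points[labels == k].mean(axis=0)
    compact = d[np.arange(N), labels].sum()
    return compact + 50.0 * load.std()

# Simple genetic algorithm over centroid seed sets.
pop = [rng.choice(N, size=K, replace=False) for _ in range(20)]
for gen in range(30):
    pop.sort(key=fitness)
    elite = pop[:10]
    children = []
    for _ in range(10):
        a, b = rng.choice(10, size=2, replace=False)
        child = np.unique(np.concatenate([elite[a][:K // 2], elite[b][K // 2:]]))
        while child.size < K:               # repair duplicates from crossover
            child = np.unique(np.append(child, rng.integers(N)))
        if rng.random() < 0.3:              # mutation: swap one seed at random
            child[rng.integers(K)] = rng.integers(N)
        children.append(child[:K])
    pop = elite + children

best = min(pop, key=fitness)
print(f"Best fitness (compactness + balance penalty): {fitness(best):.1f}")
```

The key design choice mirrored here is that the genetic search operates only on the seed sets, while the capacity constraint is enforced inside the assignment step, so every evaluated candidate already respects the working-load limit.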

Keywords: capacitated clustering, k-means, genetic algorithm, districting problems

Procedia PDF Downloads 199