Search results for: integrated network analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32574

28854 Survival Analysis after a First Ischaemic Stroke Event: A Case-Control Study in the Adult Population of England.

Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski

Abstract:

Stroke is associated with a significant risk of morbidity and mortality. There is a scarcity of research on long-term survival after first-ever ischaemic stroke (IS) events in England with regard to the effects of different medical therapies and comorbidities. The objective of this study was to model all-cause mortality after an IS diagnosis in the adult population of England. Using a retrospective case-control design, we extracted the electronic medical records of patients in England born in or before 1960 with a first-ever ischaemic stroke diagnosis from January 1986 to January 2017 from The Health Improvement Network (THIN) database. Participants with a history of ischaemic stroke were matched to three controls by sex, age at diagnosis and general practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a Weibull-Cox survival model that included both scale and shape effects and a shared random effect of general practice. The model included sex, birth cohort, socio-economic status, comorbidities and medical therapies. 20,250 patients with a history of IS (cases) and 55,519 controls were followed up to 30 years. From 2008 to 2015, one-year all-cause mortality for IS patients declined, with an absolute change of -0.5%. Preventive treatments prescribed to cases, including statins and antihypertensives, increased considerably over time. However, prescriptions for antiplatelet drugs have decreased in routine general practice since 2010. The survival model revealed a survival benefit of antiplatelet treatment for stroke survivors, with a hazard ratio (HR) of 0.92 (0.90-0.94). IS diagnosis had significant interactions with sex, age at entry and hypertension diagnosis. IS diagnosis was associated with a high risk of all-cause mortality, with HR = 3.39 (3.05-3.72) for cases compared to controls. Hypertension was associated with poor survival, with HR = 4.79 (4.49-5.09) for hypertensive cases relative to non-hypertensive controls, although the detrimental effect of hypertension did not reach significance for hypertensive controls, HR = 1.19 (0.82-1.56). This study of English primary care data showed that between 2008 and 2015, prescription rates of stroke preventive treatments increased and short-term all-cause mortality after IS declined. However, stroke resulted in poor long-term survival. Hypertension, a modifiable risk factor, was found to be associated with poor survival outcomes in IS patients, while antiplatelet drugs were found to be protective. Better efforts are required to reduce the burden of stroke through health service development and primary prevention.
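As a rough sketch of how a Weibull proportional-hazards model recovers a hazard ratio of the kind reported above, the following fits the model by maximum likelihood on simulated data (not the THIN records; the sample size, censoring time and parameter values are illustrative assumptions, with the true HR set to the paper's 3.39):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, beta_true, lam, k = 5000, np.log(3.39), 10.0, 1.3
x = rng.integers(0, 2, n)                       # 1 = stroke case, 0 = control
u = rng.uniform(size=n)
t = lam * (-np.log(u) / np.exp(beta_true * x)) ** (1.0 / k)  # Weibull-PH times
c = np.full(n, 30.0)                            # administrative censoring at 30 years
event = t <= c
tobs = np.minimum(t, c)

def nll(p):
    """Negative log-likelihood of a Weibull proportional-hazards model."""
    loglam, logk, beta = p
    lam_, k_ = np.exp(loglam), np.exp(logk)
    log_h = np.log(k_) - k_ * np.log(lam_) + (k_ - 1) * np.log(tobs) + beta * x
    cum_h = (tobs / lam_) ** k_ * np.exp(beta * x)
    return -(np.sum(log_h[event]) - np.sum(cum_h))

fit = minimize(nll, x0=[np.log(5.0), 0.0, 0.0], method="Nelder-Mead")
hr = np.exp(fit.x[2])
print(f"estimated hazard ratio: {hr:.2f}")
```

With a sample this large the estimate lands close to the simulated HR; the paper's actual model additionally includes covariate effects on the shape parameter and a shared practice-level frailty, which this sketch omits.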

Keywords: general practice, hazard ratio, The Health Improvement Network (THIN), ischaemic stroke, multiple imputation, Weibull-Cox model

Procedia PDF Downloads 180
28853 A Vision-Based Early Warning System to Prevent Elephant-Train Collisions

Authors: Shanaka Gunasekara, Maleen Jayasuriya, Nalin Harischandra, Lilantha Samaranayake, Gamini Dissanayake

Abstract:

One serious facet of the worsening human-elephant conflict (HEC) in nations such as Sri Lanka involves elephant-train collisions. Endangered Asian elephants are maimed or killed in such accidents, which also often leave elephants orphaned or disabled, contributing to the phenomenon of lone elephants. Lone elephants are more likely to attack villages and show aggressive behaviour, further exacerbating the overall HEC. Furthermore, railway services incur significant financial losses and service disruptions annually due to such accidents. Most elephant-train collisions occur due to a lack of adequate reaction time: trains require significant stopping distances because full braking force must be avoided to minimise the risk of derailment. Poor driver visibility at sharp turns, nighttime operation, and poor weather conditions are thus often contributing factors. Initial investigations also indicate that most collisions occur in localised “hotspots” where elephant pathways/corridors intersect railway tracks bordering grazing land and watering holes. Taking these factors into consideration, this work proposes leveraging recent developments in convolutional neural network (CNN) technology to detect elephants using an RGB/infrared-capable camera around known hotspots along the railway track. The CNN was trained using a curated dataset of elephant images collected on field visits to elephant sanctuaries and wildlife parks in Sri Lanka. With this vision-based detection system at its core, a prototype unit of an early warning system was designed and tested. This weatherised, waterproof unit consists of a Reolink security camera providing a wide field of view and long range, an Nvidia Jetson Xavier computing unit, a rechargeable battery, and a solar panel for self-sufficient operation. The prototype unit was designed as a low-cost, low-power, small-footprint device that can be mounted on infrastructure such as poles or trees. If an elephant is detected, an early warning message is communicated to the train driver over the GSM network; a mobile app was also designed to ensure the warning is clearly communicated. A centralised control station manages and communicates all information through the train station network to ensure coordination among key stakeholders. Initial results indicate that detection accuracy is sufficient under varying lighting conditions, provided comprehensive training datasets representing a wide range of challenging conditions are available. The overall hardware prototype proved robust and reliable. We envision that a network of such units may help reduce elephant-train collisions and has the potential to act as an important surveillance mechanism in addressing the broader issue of human-elephant conflict.
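One practical detail in a system like this is turning noisy per-frame CNN confidences into a single alert decision. The sketch below (an illustrative debouncing scheme, not the authors' implementation; thresholds and frame counts are assumed values) requires several consecutive high-confidence frames before raising an alert and uses a lower threshold to clear it:

```python
def alert_decision(confidences, on_thresh=0.8, off_thresh=0.5, min_frames=3):
    """Debounce per-frame detection confidences into ALERT/CLEAR events.

    An alert fires only after `min_frames` consecutive frames at or above
    `on_thresh`, and clears when confidence drops below `off_thresh`
    (hysteresis avoids flapping near the threshold).
    """
    events, active, streak = [], False, 0
    for i, c in enumerate(confidences):
        if not active:
            streak = streak + 1 if c >= on_thresh else 0
            if streak >= min_frames:
                active = True
                events.append(("ALERT", i))   # here the GSM message would be sent
        elif c < off_thresh:
            active, streak = False, 0
            events.append(("CLEAR", i))
    return events

frames = [0.2, 0.85, 0.9, 0.92, 0.88, 0.3, 0.1]
print(alert_decision(frames))  # → [('ALERT', 3), ('CLEAR', 5)]
```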

Keywords: computer vision, deep learning, human-elephant conflict, wildlife early warning technology

Procedia PDF Downloads 222
28852 A Bibliometric Analysis of Ukrainian Research Articles on SARS-COV-2 (COVID-19) in Compliance with the Standards of Current Research Information Systems

Authors: Sabina Auhunas

Abstract:

Open Science is developing rapidly in Ukraine for scientists of all branches, providing an opportunity to take a closer look at studies by foreign scientists as well as to deliver their own scientific data to national and international journals. However, when it comes to generalising data on the scientific activities of Ukrainian scientists, these data are often integrated into e-systems that operate on inconsistent and barely related information sources. To resolve these issues, developed countries productively use e-systems designed to store and manage research data, such as Current Research Information Systems, which enable combining uncompiled data obtained from different sources. An algorithm for selecting SARS-CoV-2 research articles was designed, by means of which we collected the set of papers published by Ukrainian scientists and uploaded by August 1, 2020. The resulting metadata (document type, open-access status, citation count, h-index, most cited documents, international research funding, author counts, and the bibliographic relationships of journals) were taken from the Scopus and Web of Science databases. The study also considered COVID-19/SARS-CoV-2-related documents published from December 2019 to September 2020 by authors with a territorial affiliation to Ukraine. These databases provide the information necessary for bibliometric analysis, including details such as copyright that may not be available in other databases (e.g., ScienceDirect). Search criteria and results for each online database were defined according to the WHO classification of the virus and the disease it causes (Table 1). We initially identified 89 research papers, which provided the final data set after consolidation and deduplication; however, only 56 papers were used for the analysis. The total number of documents returned from the WoS database was 21,641 (48 affiliated with Ukraine), and from the Scopus database 32,478 (41 affiliated with Ukraine). In the publication activity of Ukrainian scientists, the following areas prevailed: education and educational research (9 documents, 20.58%); social sciences, interdisciplinary (6 documents, 11.76%); and economics (4 documents, 8.82%). The highest publication activity by institution type was reported for the Ministry of Education and Science of Ukraine (36% of published papers, or 7 documents), followed by Danylo Halytsky Lviv National Medical University (5 documents, 15%) and the P. L. Shupyk National Medical Academy of Postgraduate Education (4 documents, 12%). Research activities by Ukrainian scientists were funded mainly by five entities: the Belgian Development Cooperation, the National Institutes of Health (NIH, U.S.), the United States Department of Health & Human Services, a grant from the Whitney and Betty MacMillan Center for International and Area Studies at Yale, and a grant from the Yale Women Faculty Forum. Based on the results of the analysis, we obtained a set of published articles and preprints to be assessed on a variety of features in upcoming studies, including citation count, most cited documents, bibliographic relationships of journals, and reference linking. Further research on the development of the national scientific e-database continues using new analytical methods.
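The consolidation-and-deduplication step described above (89 records reduced after merging Scopus and WoS results) can be sketched as follows. This is an illustrative scheme on invented records, not the authors' algorithm: deduplicate on DOI where present, otherwise on a normalised title.

```python
def consolidate(scopus, wos):
    """Merge records from two databases, deduplicating by DOI,
    falling back to a case-insensitive, whitespace-trimmed title."""
    seen, merged = set(), []
    for rec in scopus + wos:
        key = rec.get("doi") or rec["title"].casefold().strip()
        if key not in seen:
            seen.add(key)
            merged.append(rec)
    return merged

scopus = [{"doi": "10.1/a", "title": "COVID-19 in Ukraine"},
          {"doi": None, "title": "SARS-CoV-2 serology"}]
wos = [{"doi": "10.1/a", "title": "Covid-19 in Ukraine"},
       {"doi": None, "title": "SARS-COV-2 Serology "}]
print(len(consolidate(scopus, wos)))  # duplicates collapse → 2
```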

Keywords: content analysis, COVID-19, scientometrics, text mining

Procedia PDF Downloads 111
28851 Discovering the Effects of Meteorological Variables on the Air Quality of Bogota, Colombia, by Data Mining Techniques

Authors: Fabiana Franceschi, Martha Cobo, Manuel Figueredo

Abstract:

Bogotá, the capital of Colombia, is its largest city and one of the most polluted in Latin America due to fast economic growth over the last ten years. Bogotá has been affected by high-pollution events that led to high concentrations of PM10 and NO2 exceeding the local 24-hour legal limits (100 and 150 µg/m³, respectively). The most important pollutants in the city are PM10 and PM2.5 (which are associated with respiratory and cardiovascular problems), and it is known that their concentrations in the atmosphere depend on local meteorological factors. Therefore, it is necessary to establish the relationships between the meteorological variables and the concentrations of atmospheric pollutants such as PM10, PM2.5, CO, SO2, NO2 and O3. This study aims to determine the interrelations between meteorological variables and air pollutants in Bogotá using data mining techniques. Data from 13 monitoring stations were collected from the Bogotá Air Quality Monitoring Network for the period 2010-2015. The Principal Component Analysis (PCA) algorithm was applied to obtain preliminary relations between all the parameters, and afterwards the K-means clustering technique was implemented to corroborate those relations and to find patterns in the data. PCA was also applied on a per-shift basis (morning, afternoon, night and early morning) to check for variation of the previous trends, and on a per-year basis to verify that the identified trends persisted throughout the study period. Results demonstrated that wind speed, wind direction, temperature, and NO2 are the factors most influencing PM10 concentrations. Furthermore, it was confirmed that high-humidity episodes increased PM2.5 levels. It was also found that O3 levels are directly proportional to wind speed and radiation, while there is an inverse relationship between O3 levels and humidity. Concentrations of SO2 increase with the presence of PM10 and decrease with wind speed and wind direction. The analysis also showed a decreasing trend in pollutant concentrations over the last five years, and in rainy periods (March-June and September-December) some trends regarding precipitation were stronger. Results obtained with K-means demonstrated that it was possible to find patterns in the data, and they also showed similar conditions and data distributions among the Carvajal, Tunal and Puente Aranda stations, and between Parque Simón Bolívar and Las Ferias. It was verified by applying the same technique per year that the aforementioned trends prevailed during the study period. It was concluded that the PCA algorithm is useful for establishing preliminary relationships among variables, and K-means clustering for finding patterns in the data and understanding its distribution. The discovery of patterns in the data allows these clusters to be used as input to an artificial neural network prediction model.
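The PCA-then-K-means pipeline described above can be sketched in a few lines of NumPy. The data here are synthetic (a hypothetical wind-speed/PM10 relationship built in by construction, not the monitoring-network measurements), so only the mechanics carry over:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical hourly records: wind speed drives PM10 down, NO2 drives it up
wind = rng.uniform(0.5, 6.0, 300)
no2 = rng.uniform(10, 60, 300)
pm10 = 80 - 8 * wind + 0.5 * no2 + rng.normal(0, 5, 300)
X = np.column_stack([wind, rng.uniform(8, 22, 300), no2, pm10])  # + temperature

# PCA via SVD on standardised data
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
loadings = Vt[0]                       # first principal component
print("PC1 loadings (wind, temp, NO2, PM10):", np.round(loadings, 2))

# two-cluster K-means on the first two PC scores
scores = Z @ Vt[:2].T
centers = scores[rng.choice(len(scores), 2, replace=False)]
for _ in range(20):
    lbl = np.argmin(((scores[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([scores[lbl == j].mean(0) for j in (0, 1)])
```

The opposite signs of the wind and PM10 loadings on PC1 mirror the inverse wind-PM10 relationship reported in the abstract.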

Keywords: air pollution, air quality modelling, data mining, particulate matter

Procedia PDF Downloads 256
28850 Assessing Supply Chain Performance through Data Mining Techniques: A Case of Automotive Industry

Authors: Emin Gundogar, Burak Erkayman, Nusret Sazak

Abstract:

Providing effective management performance across the whole supply chain is a critical issue and hard to implement. Proper evaluation of integrated data can yield accurate information, and analysing supply chain data through OLAP (On-Line Analytical Processing) technologies can provide a consolidated, multi-angle view of operations. In this study, association rules and classification techniques are applied to measure the supply chain performance metrics of an automotive manufacturer in Turkey. The main criteria and important rules are determined, and a comparison of the results of the algorithms is presented.
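Association-rule mining of the kind applied here rests on two metrics, support and confidence. A minimal sketch on a toy transaction set (invented for illustration, not the manufacturer's data) is:

```python
# toy "transactions": attribute sets observed on individual orders
orders = [
    {"late_supplier", "stockout", "express_ship"},
    {"late_supplier", "stockout"},
    {"on_time", "full_stock"},
    {"late_supplier", "express_ship"},
    {"late_supplier", "stockout", "express_ship"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= o for o in orders) / len(orders)

def confidence(lhs, rhs):
    """P(rhs | lhs): support of the combined set over support of the antecedent."""
    return support(lhs | rhs) / support(lhs)

rule = (frozenset({"late_supplier"}), frozenset({"stockout"}))
print(f"support={support(rule[0] | rule[1]):.2f} "
      f"confidence={confidence(*rule):.2f}")
```

Here the rule "late_supplier → stockout" has support 0.6 and confidence 0.75; an Apriori-style miner enumerates candidate itemsets and keeps rules exceeding chosen thresholds on these two metrics.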

Keywords: supply chain performance, performance measurement, data mining, automotive

Procedia PDF Downloads 507
28849 Vulnerability Assessment of Reinforced Concrete Frames Based on Inelastic Spectral Displacement

Authors: Chao Xu

Abstract:

Selecting ground motion intensity measures reasonably is one of the most important issues affecting the selection of input ground motions and the reliability of vulnerability analysis results. In this paper, inelastic spectral displacement is used as an alternative intensity measure to characterise the damage potential of ground motions. The inelastic spectral displacement is calculated based on modal pushover analysis, and an incremental dynamic analysis based on inelastic spectral displacement is developed. Probabilistic seismic demand analyses of a six-story and an eleven-story RC frame are carried out through cloud analysis and advanced incremental dynamic analysis. The sufficiency and efficiency of inelastic spectral displacement are investigated by means of regression and residual analysis and compared with elastic spectral displacement. Vulnerability curves are developed based on inelastic spectral displacement. The study shows that inelastic spectral displacement reflects the impact on inelastic structural response of frequency components with periods larger than the fundamental period. The damage potential of ground motions on structures whose fundamental period elongates due to structural softening can be captured by inelastic spectral displacement. Compared with elastic spectral displacement, inelastic spectral displacement is a more sufficient and efficient intensity measure, which reduces the uncertainty of vulnerability analysis and the impact of input ground motion selection on its results.
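The quantity at the heart of the paper, inelastic spectral displacement, is the peak displacement of a yielding single-degree-of-freedom oscillator under a ground-motion record. A deliberately simplified sketch (elastic-perfectly-plastic spring, no damping, explicit integration, and a synthetic half-sine acceleration pulse standing in for a real record; all parameter values are assumptions):

```python
import numpy as np

# elastoplastic SDOF: mass, initial stiffness, yield force
m, k, fy = 1.0, 100.0, 1.5
dt, n = 0.002, 4000
ag = np.zeros(n)
ag[:500] = 4.0 * np.sin(np.linspace(0, np.pi, 500))   # synthetic pulse (m/s^2)

u = v = fs = u_pl = peak = 0.0
for i in range(n):
    a = -ag[i] - fs / m            # equation of motion (undamped, for clarity)
    v += a * dt                    # semi-implicit Euler step
    u += v * dt
    fs = k * (u - u_pl)            # elastic trial force about plastic offset
    if abs(fs) > fy:               # yielding: cap force, update plastic offset
        u_pl = u - np.sign(fs) * fy / k
        fs = np.sign(fs) * fy
    peak = max(peak, abs(u))

print(f"inelastic spectral displacement ≈ {peak:.3f}")
```

In the paper this role is played by a modal-pushover-derived oscillator rather than a bilinear toy; the point of the sketch is only that the intensity measure is a response peak of a nonlinear oscillator, so it "sees" period elongation due to softening, unlike elastic spectral displacement.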

Keywords: vulnerability, probability seismic demand analysis, ground motion intensity measure, sufficiency, efficiency, inelastic time history analysis

Procedia PDF Downloads 349
28848 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation to aiding autonomous driving, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors along roads monitor weather and road conditions and interact with each other to help the vehicle reach its destination safely and on time. IoT systems are predominantly dependent on the cloud for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they may also store sensitive information required by certain applications. Data management in fog nodes is strenuous because fog networks are dynamic in their availability and hardware capability. It becomes even more challenging when nodes in the network are short-lived, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary, as a conventional static way of managing the data does not work in fog networks. The proposed solution is a protocol that defines sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralise the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node by applying reinforcement learning, so that access to the data is determined dynamically based on the requests.
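The sensitivity-level idea can be made concrete with a small access-control sketch. Everything below is illustrative (the data keys, clearance levels, and `FogNode` class are invented, and the dynamic, learned policy of the paper is replaced by a static clearance check):

```python
SENSITIVITY = {"weather": 0, "road_state": 1, "vehicle_trace": 2}  # 0 = public

def can_access(node_clearance: int, key: str) -> bool:
    """A requester may read data only at or below its clearance level."""
    return node_clearance >= SENSITIVITY[key]

class FogNode:
    def __init__(self, clearance: int):
        self.clearance, self.store = clearance, {}

    def write(self, key, value):
        self.store[key] = value

    def read(self, requester_clearance: int, key):
        if not can_access(requester_clearance, key):
            raise PermissionError(f"clearance too low for '{key}'")
        return self.store[key]

edge = FogNode(clearance=2)
edge.write("vehicle_trace", [(12.97, 77.59)])
print(edge.read(requester_clearance=2, key="vehicle_trace"))
```

In the paper's model the mapping from request context to an allow/deny decision would be learned by a reinforcement-learning agent rather than read from a fixed table.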

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 57
28847 The Fabrication of Stress Sensing Based on Artificial Antibodies to Cortisol by Molecular Imprinted Polymer

Authors: Supannika Klangphukhiew, Roongnapa Srichana, Rina Patramanon

Abstract:

Cortisol is a well-known commercial stress biomarker. A homeostatic response to psychological stress is indicated by an increased level of cortisol produced in the hypothalamus-pituitary-adrenal (HPA) axis, and chronic psychological stress with a persistently high cortisol level is related to several health problems. In this study, a cortisol biosensor was fabricated that mimics natural receptors. The artificial antibodies were prepared using the molecularly imprinted polymer (MIP) technique, which can imitate the performance of a natural anti-cortisol antibody with high stability. The cortisol-MIP was obtained using a multi-step swelling and polymerisation protocol with cortisol as the target molecule, combining methacrylic acid and acrylamide (2:1) with bisacryloyl-1,2-dihydroxy-1,2-ethylenediamine and ethylenedioxy-N-methylamphetamine as cross-linkers. The cortisol-MIP was integrated into the sensor by coating it on a disposable screen-printed carbon electrode (SPCE) for portable electrochemical analysis. The physical properties of the cortisol-MIP were characterised by electron microscopy. Binding characteristics were evaluated via changes in covalent patterns in FTIR spectra, which were related to the voltammetric response. The performance of the cortisol-MIP-modified SPCE was investigated in terms of detection range and selectivity, with a detection limit of 1.28 ng/ml. The disposable cortisol biosensor represents an application of the MIP technique to recognise steroids by their structures with feasibility and cost-effectiveness, and it can be developed for point-of-care use.
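A detection limit like the 1.28 ng/ml quoted above is typically derived from a calibration curve. The sketch below shows the standard 3.3σ/slope estimate on invented calibration points (the concentrations and signals are illustrative, not the paper's measurements):

```python
import statistics

# hypothetical calibration: voltammetric signal (µA) vs cortisol (ng/ml)
conc = [0, 5, 10, 20, 40]
signal = [0.11, 1.02, 2.05, 4.1, 8.2]

mx, my = statistics.fmean(conc), statistics.fmean(signal)
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx
# standard deviation of residuals about the fitted line
resid_sd = statistics.stdev(y - (intercept + slope * x)
                            for x, y in zip(conc, signal))
lod = 3.3 * resid_sd / slope        # common 3.3*sigma/slope convention
print(f"slope = {slope:.3f} µA per ng/ml, LOD ≈ {lod:.2f} ng/ml")
```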

Keywords: stress biomarker, cortisol, molecular imprinted polymer, screen-printed carbon electrode

Procedia PDF Downloads 270
28846 Requirement Analysis for Emergency Management Software

Authors: Tomáš Ludík, Jiří Barta, Sabina Chytilová, Josef Navrátil

Abstract:

Emergency management is the discipline of dealing with and avoiding risks. Appropriate emergency management software allows better management of these risks and has a direct influence on reducing potential negative impacts. Although there are several emergency management software products in the Czech Republic, they cover user requirements from the emergency management field only partially. Therefore, the paper focuses on the issues of requirement analysis in the development of emergency management software. An analysis of the current state describes the basic features and properties of user requirements for software development, as well as basic methods and approaches for gathering these requirements. The paper then presents more specific mechanisms for requirement analysis based on the chosen software development approach: structured, object-oriented or agile. Based on this experience, a new methodology for requirement analysis is designed. The methodology describes how to map user requirements comprehensively in the field of emergency management and thus reduce misunderstanding between the software analyst and the emergency manager. The proposed methodology was consulted with a fire brigade department and has been applied in the requirements analysis for their current emergency management software. The methodology is general in character and can also be used for requirement analysis in other specific areas.

Keywords: emergency software, methodology, requirement analysis, stakeholders, use case diagram, user stories

Procedia PDF Downloads 537
28845 The Effect of Technology-Facilitated Lesson Study on Teachers’ Computer-Assisted Language Learning Competencies

Authors: Yi-Ning Chang

Abstract:

With the rapid advancement of technology, it has become crucial for educators to integrate technology adeptly into their teaching and to develop robust computer-assisted language learning (CALL) competency. Addressing this need, the present study adopted a technology-based lesson study approach to assess its impact on the CALL competency and professional capabilities of EFL teachers. Additionally, the study examined teachers' perceptions of the benefits of participating in the creation of technologically integrated lesson plans. The iterative process of technology-based lesson study facilitated ample peer discussion, enabling teachers to flexibly design and implement lesson plans incorporating various technological tools. This 15-week study included 10 in-service teachers from a university of science and technology in central Taiwan. The data collected included pre- and post-lesson-planning scores, pre- and post-TPACK survey scores, classroom observation forms, designed lesson plans, and reflective essays. The pre- and post-lesson-planning and TPACK survey scores were analysed using paired-samples t-tests; the reflective essays were analysed using content analysis. The findings revealed that the teachers' lesson-planning ability and CALL competencies improved. Teachers perceived a better understanding of integrating technology with teaching subjects, more effective teaching skills, and a deeper understanding of technology. Pedagogical implications and future studies are also discussed.
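The paired-samples t-test used for the pre/post comparison reduces to a one-sample t-test on the score differences. A stdlib-only sketch on invented scores for ten teachers (the numbers are illustrative, not the study's data):

```python
import math
import statistics

pre  = [62, 58, 70, 65, 60, 68, 55, 72, 61, 66]   # hypothetical pre-test scores
post = [70, 66, 75, 72, 64, 77, 63, 78, 69, 71]   # hypothetical post-test scores

d = [b - a for a, b in zip(pre, post)]            # per-teacher improvement
n = len(d)
t = statistics.fmean(d) / (statistics.stdev(d) / math.sqrt(n))
print(f"paired t({n - 1}) = {t:.2f}")
```

The statistic is then compared against the t distribution with n-1 degrees of freedom; a value this large would be significant at any conventional level.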

Keywords: CALL, language learning, lesson study, lesson plan

Procedia PDF Downloads 34
28844 Cinematic Transgression and Sexuality: A Study of Rituparno Ghosh's ‘Queer Trilogy’

Authors: Sudipta Garai

Abstract:

Film, as a cultural and social practice, remains a dominant space for the creation and destruction of ideologies and practices, which makes the sociological viewing, analysis, and interpretation of cinema a complex affair. It remains the doorway between the interpretations and understandings of the writer/director and the reader/viewer. In India's multilingual culture, cinema plays a more intriguing role than newspapers, books, stories, novels or any other medium of expression. Known as the largest democracy, the State seems to guarantee and safeguard people's choices and a life of dignity through its Fundamental Rights and Directives. However, the law contradicts itself in IPC 377, which criminalises everything except peno-vaginal sexual intercourse, restricting alternative sexual preferences and practices and calling into question the State's sense of 'democracy'. In this context, the issue of homosexuality has come up in bits and pieces through various representations in 'popular' cinema, mostly as sudden references of mockery and laughter, with explicit narratives of the 'queer' largely missing. Rituparno Ghosh, an eminent filmmaker of Bengal, emerged as the 'queer' face of Kolkata, particularly through his 'queer' trilogy (Memories in March, 2010; Arekti Premer Golpo, 2010; Chitrangada: A Crowning Wish, 2012), coming out of his own closet and speaking about his own sexual choices not only through explicit narratives in films but also in person, which made these films an important point of departure in Bengali film history. A sociological reading of these films through discourse analysis is carried out with the critical questions of 'choice', 'freedom', 'love and marriage' and, most importantly, 'change'. This study focuses not only on the films and an analysis of their content but also engages with their audience, queer and otherwise, in order to extend beyond the art form into the actual vulnerabilities of life and experience, through informal interviews, focused group discussions and engagement with real-life narratives. Research of this kind is often looked upon as a medium of change, hoping for a better world that wipes away the discrimination and 'shame' the 'queer' face in their everyday lives; yet social science research is limited by its 'time' and academic boundaries, where the hope of change may be initiated but not fulfilled. The experiences and reflections of the 'queer' redefined not only the narratives of the films but also me as a researcher. The perspectives of the 'hetero-normative' informants gave a broader picture of the study and of the socio-cultural complications intertwined with the ideas of resistance and change. The issues of subjectivity, power, and position cannot be wiped out of a study of this kind, as politics and aesthetics become integrated with each other in the creation of any art form, be it film or research.

Keywords: cinema, alternative sexualities, narratives, sexual choices, state and society

Procedia PDF Downloads 371
28843 Functional Polyhedral Oligomeric Silsesquioxane Nano-Spacer to Boost Quantum Resistive Vapour Sensors’ Sensitivity and Selectivity

Authors: Jean-Francois Feller

Abstract:

The analysis of the volatolome emitted by the human body with a sensor array (e-nose) is a method full of promise for clinical applications, producing an olfactory fingerprint characteristic of a person's health state. However, the amounts of volatile organic compounds (VOCs) to detect, in the range of parts per billion (ppb), and their diversity (several hundred) justify developing ever more sensitive and selective vapour sensors to improve the discrimination ability of the e-nose. Quantum resistive vapour sensors (vQRS) made with nanostructured conductive polymer nanocomposite transducers have shown great versatility in both fabrication and operation for detecting volatiles of interest such as cancer biomarkers. However, it has been shown that their chemo-resistive response is highly dependent on the quality of the inter-particle junctions in the percolated architecture. The present work investigates the effectiveness of polyhedral oligomeric silsesquioxane (POSS) acting as a nano-spacer to amplify the disconnectability of the conducting network and thus maximise the vQRS's sensitivity to VOCs.

Keywords: volatolome, quantum resistive vapour sensor, nanostructured conductive polymer nanocomposites, olfactive diagnosis

Procedia PDF Downloads 12
28842 Potentiality of the Wind Energy in Algeria

Authors: C. Benoudjafer, M. N. Tandjaoui, C. Benachaiba

Abstract:

The use of the kinetic energy of the wind is rising rapidly around the world and is beginning, albeit timidly, to become known in Algeria. One or more aero-generators can be installed to produce electricity, for example in isolated locations or places not connected to the electrical supply network. To use the wind as an energy source, it is first necessary to know the energy needs of the population and to study the wind's intensity, speed, frequency and direction.
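The link between measured wind speed and the electrical output of an aero-generator is the standard cubic power law, P = Cp · ½ · ρ · A · v³. A quick sketch (rotor diameter, air density and power coefficient are illustrative assumptions, not site data):

```python
import math

def wind_power(v, rotor_diameter=20.0, rho=1.225, cp=0.35):
    """Electrical power (W) from wind speed v (m/s): P = cp * 0.5 * rho * A * v^3.

    rho  : air density at sea level (kg/m^3)
    cp   : overall power coefficient (Betz limit is ~0.593)
    """
    area = math.pi * (rotor_diameter / 2) ** 2   # swept rotor area (m^2)
    return cp * 0.5 * rho * area * v ** 3

print(f"{wind_power(8.0) / 1000:.1f} kW at 8 m/s")
```

The cubic dependence on v is why the frequency distribution of wind speeds, not just the mean speed, must be studied before siting a turbine.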

Keywords: Algeria, renewable energies, wind, wind power, aero-generators, wind energetic potential

Procedia PDF Downloads 424
28841 Improving the Penalty-free Multi-objective Evolutionary Design Optimization of Water Distribution Systems

Authors: Emily Kambalame

Abstract:

Water distribution networks require large construction investments, prompting researchers to seek cost reductions and efficient design solutions; optimization techniques are employed to address these challenges. In this context, the penalty-free multi-objective evolutionary algorithm (PFMOEA) coupled with pressure-dependent analysis (PDA) was utilized to develop a multi-objective evolutionary search for the optimization of water distribution systems (WDSs). The aim of this research was to find out whether the computational efficiency of the PFMOEA for WDS optimization could be enhanced. This was done by applying real-coded representation and by retaining different percentages of feasible and infeasible solutions close to the Pareto front in the elitism step of the optimization. Two benchmark network problems, the Two-looped and Hanoi networks, were utilized in the study, and a comparative analysis was conducted to assess the performance of the real-coded PFMOEA in relation to other approaches described in the literature. By implementing real coding, the algorithm demonstrated competitive performance on the two benchmark networks. The real-coded PFMOEA achieved the best-known solutions ($419,000 and $6.081 million) and a zero pressure deficit for the two networks, requiring fewer function evaluations than the binary-coded PFMOEA. In previous PFMOEA studies, elitism retained by default 30% of the least-cost feasible solutions while excluding all infeasible solutions. This study found that replacing 10% and 15% of the feasible solutions with infeasible ones that are close to the Pareto front, with minimal pressure-deficit violations, significantly enhanced the computational efficiency of the PFMOEA. The configuration of 15% feasible and 15% infeasible solutions outperformed other retention allocations by identifying the optimal solution with the fewest function evaluations.
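The modified elitism step (keeping some near-feasible solutions instead of discarding all infeasible ones) can be sketched as follows. This is a simplified illustration of the retention idea only: the solution dictionaries, cost values and sort keys are invented, and the real algorithm ranks by Pareto dominance rather than a single cost scalar.

```python
def elitist_retention(feasible, infeasible, pop_size, infeasible_frac=0.25):
    """Keep the best feasible solutions plus a fraction of infeasible ones
    with the smallest pressure-deficit violations (near the Pareto front)."""
    n_inf = int(pop_size * infeasible_frac)
    keep = sorted(feasible, key=lambda s: s["cost"])[: pop_size - n_inf]
    keep += sorted(infeasible, key=lambda s: s["deficit"])[:n_inf]
    return keep

feasible = [{"cost": c, "deficit": 0.0} for c in (450, 419, 430, 500)]
infeasible = [{"cost": 400, "deficit": 0.2},   # cheap, tiny violation: kept
              {"cost": 390, "deficit": 1.5}]   # larger violation: dropped
elite = elitist_retention(feasible, infeasible, pop_size=4)
print([s["cost"] for s in elite])  # → [419, 430, 450, 400]
```

Admitting low-violation infeasible solutions keeps the search pressure near the feasibility boundary, which is where the study found the least-cost designs.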

Keywords: design optimization, multi-objective evolutionary, penalty-free, water distribution systems

Procedia PDF Downloads 57
28840 Quantification of E-Waste: A Case Study in Federal University of Espírito Santo, Brazil

Authors: Andressa S. T. Gomes, Luiza A. Souza, Luciana H. Yamane, Renato R. Siman

Abstract:

The segregation of waste electrical and electronic equipment (WEEE) at the generating source, its quali-quantitative characterization and the identification of its origin, besides being integral parts of classification reports, are crucial steps for the success of its integrated management. The aim of this paper was to quantify WEEE generation at the Federal University of Espírito Santo (UFES), Brazil, as well as to identify sources, temporary storage sites, main transportation routes and destinations, the most generated WEEE types and their recycling potential. Quantification of the WEEE generated at the University between 2010 and 2015 was performed using data provided by UFES's asset management sector. Information on EEE and WEEE flows on the campuses was obtained through questionnaires applied to University workers. A total of 6,028 units of data-processing WEEE disposed of by the University between 2010 and 2015 were recorded. Among this waste, the most generated items were CRT screens, desktops, keyboards and printers. Furthermore, it was observed that these WEEE are temporarily stored in inappropriate places on the University campuses. In general, the WEEE units are donated to NGOs in the city or sold through auctions (2010 and 2013). As for recycling potential, the primary processing and subsequent sale of printed circuit boards (PCBs) from the computers could yield up to US$ 27,839.23. The results highlight the importance of a WEEE management policy at the University.

Keywords: solid waste, waste of electrical and electronic equipment, waste management, institutional solid waste generation

Procedia PDF Downloads 256
28839 Six Years Antimicrobial Resistance Trends among Bacterial Isolates in Amhara National Regional State, Ethiopia

Authors: Asrat Agalu Abejew

Abstract:

Background: Antimicrobial resistance (AMR) is a silent tsunami and one of the top global threats to health care and public health. It is a common agenda globally and in Ethiopia. Emerging AMR will be a double burden for Ethiopia, which already faces serious infectious disease morbidity and mortality. In Ethiopia, although there are attempts to document AMR in healthcare institutions, a comprehensive and all-inclusive analysis is still lacking. Thus, this study aimed to determine trends in AMR from 2016 to 2021. Methods: A retrospective analysis of secondary data recorded in the Amhara Public Health Institute (APHI) from 2016 to 2021 G.C was conducted. Blood, urine, stool, swabs, discharge, body effusions, and other microbiological specimens were collected from each study participant, and bacterial identification and resistance tests were done using standard microbiologic procedures. Data were extracted from Excel in August 2022, trends in AMR were analyzed, and the results were described. In addition, the chi-square (X²) test and binary logistic regression were used, and a p-value < 0.05 was used to determine a significant association. Results: During the 6-year period, there were 25,143 culture and susceptibility tests. Overall, 265 (46.2%) bacterial isolates were resistant to 2-4 antibiotics, 253 (44.2%) to 5-7 antibiotics, and 56 (9.7%) to ≥8 antibiotics. Among gram-negative bacteria, 166 (43.9%), 155 (41.5%), and 55 (14.6%) were resistant to 2-4, 5-7, and ≥8 antibiotics, respectively, whereas 99 (50.8%), 96 (49.2%) and 1 (0.5%) of gram-positive bacteria were resistant to 2-4, 5-7 and ≥8 antibiotics, respectively. K. pneumoniae 3,783 (15.67%) and E. coli 3,199 (13.25%) were the most commonly isolated bacteria, and the overall prevalence of AMR was 2,605 (59.9%), with K. pneumoniae 743 (80.24%), E. cloacae 196 (74.81%), and A. baumannii 213 (66.56%) being the most commonly resistant bacteria to the antibiotics tested.
Except for a slight decline during 2020 (6,469 (25.4%)), the overall trend of AMR is rising from year to year, with peaks in 2019 (8,480 (33.7%)) and 2021 (7,508 (29.9%)). The year of study explained 78% of the variation in AMR (R² = 0.7799). Common bacteria were resistant to ampicillin, Augmentin, ciprofloxacin, cotrimoxazole, tetracycline, and tobramycin in almost all tests. Conclusion: AMR has increased linearly over the last 6 years. If left without appropriate intervention, after 15 years (2030 E.C) AMR will have increased by 338.7%. The growing number of multi-drug-resistant bacteria is an alarm to wake policymakers and those concerned to intervene before it is too late. This calls for a periodic, integrated, and continuous system to determine the prevalence of AMR to commonly used antibiotics.
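The linear-trend reasoning behind the R² figure and the 2030 extrapolation can be sketched with an ordinary least-squares fit. The yearly counts below are illustrative placeholders, not the study's raw data; only the method (fit a line over years, read off R², extrapolate) mirrors the abstract.

```python
# Least-squares trend on yearly resistant-isolate counts (hypothetical numbers).
def fit_line(xs, ys):
    """Return (slope, intercept, R^2) of the ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

years = [2016, 2017, 2018, 2019, 2020, 2021]
resistant = [3000, 4200, 5600, 8480, 6469, 7508]   # illustrative yearly counts
slope, intercept, r2 = fit_line(years, resistant)
projected_2030 = intercept + slope * 2030          # naive linear extrapolation
```

A high R² on such a fit is what licenses the "if left un-intervened" projection, though a linear extrapolation over 9 further years is, as the abstract implies, a worst-case warning rather than a forecast.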

Keywords: AMR, trend, pattern, MDR

Procedia PDF Downloads 73
28838 An Analysis of the Representation of the Translator and Translation Process into Brazilian Social Networking Groups

Authors: Érica Lima

Abstract:

In the digital era, in which we face an avalanche of information, it is nothing new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs and cultures, the web is a space of political-ideological dimensions where people (who often do not know each other) interact and create representations, deconstruct stereotypes, and redefine identities. Currently, the translator needs to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact his or her professional life. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure. Such exposure is due to the visibility each participant achieves not only on their personal profile page, but also in each comment or post the person makes in the groups. The objective of this paper is to study the representations of translators and the translation process on the Internet, more specifically in publications in two Brazilian groups of great influence on Facebook: "Translators/Interpreters" and "Translators, Interpreters and Curious". These chosen groups reflect the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator as opposed to what translators seem to think about themselves as a professional class.
The results of the analysis lead to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator's work easy and therefore not deserving of good remuneration; on the other hand, translators know how complex the translation process is and how much it takes to be a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but a more active role from the translator is required to achieve greater appreciation of the profession and more recognition of the translator's role, especially in the face of the increasing development of automatic translation programs.

Keywords: Facebook, social representation, translation, translator

Procedia PDF Downloads 146
28837 An Architectural Model for APT Detection

Authors: Nam-Uk Kim, Sung-Hwan Kim, Tai-Myoung Chung

Abstract:

Typical security management systems are not suitable for detecting APT attacks because they cannot draw the big picture from the trivial events reported by individual security solutions. Although SIEM solutions include a security analysis engine for this purpose, their security analysis mechanisms still need to be verified in the academic field. This paper proposes an architectural model for APT detection; correlation analysis mechanisms will be studied further in future work.

Keywords: advanced persistent threat, anomaly detection, data mining

Procedia PDF Downloads 523
28836 Ophthalmic Hashing Based Supervision of Glaucoma and Corneal Disorders Imposed on Deep Graphical Model

Authors: P. S. Jagadeesh Kumar, Yang Yung, Mingmin Pan, Xianpei Li, Wenli Hu

Abstract:

Glaucoma is driven by optic nerve damage, typically presenting as cupping and visual field injury, frequently with an arcuate pattern of mid-peripheral loss, secondary to retinal ganglion cell damage and death. Glaucoma is the second leading cause of blindness and the chief cause of permanent blindness worldwide. Consequently, comprehensive study of the analysis and understanding of glaucoma is underway, leading to deep learning based neural network approaches to this substantial optic neuropathy. This paper advances an ophthalmic hashing based supervision of glaucoma and corneal disorders built on a deep graphical model. Ophthalmic hashing is a newly proposed method extending the efficacy of visual hash-coding to predict glaucoma and corneal disorder matching, and it is faster than existing methods. The deep graphical model is capable of learning interior representations of corneal disorders in satisfactory time, solving hard combinatorial problems using deep Boltzmann machines.
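The abstract does not define ophthalmic hashing, but visual hash-coding retrieval generally means comparing compact binary codes by Hamming distance. The sketch below is a generic illustration of that matching step under that assumption; the hash codes and case labels are entirely made up.

```python
# Generic binary hash-code matching (assumed mechanism; codes are hypothetical).
def hamming(a, b):
    """Hamming distance between two equal-length binary hash codes."""
    return sum(x != y for x, y in zip(a, b))

def nearest_match(query, database):
    """Return the database key whose hash code is closest to the query."""
    return min(database, key=lambda k: hamming(query, database[k]))

db = {
    "glaucoma_case": [1, 0, 1, 1, 0, 0, 1, 0],
    "corneal_case":  [0, 1, 0, 0, 1, 1, 0, 1],
    "normal":        [1, 1, 0, 1, 0, 1, 0, 0],
}
query = [1, 0, 1, 1, 0, 1, 1, 0]
match = nearest_match(query, db)
```

The speed claim in the abstract rests on exactly this property: comparing short binary codes is far cheaper than comparing raw high-dimensional image features.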

Keywords: corneal disorders, deep Boltzmann machines, deep graphical model, glaucoma, neural networks, ophthalmic hashing

Procedia PDF Downloads 248
28835 Critical Analysis of Heat Exchanger Cycle for its Maintainability Using Failure Modes and Effect Analysis and Pareto Analysis

Authors: Sayali Vyas, Atharva Desai, Shreyas Badave, Apurv Kulkarni, B. Rajiv

Abstract:

Failure Modes and Effect Analysis (FMEA) is an efficient evaluation technique to identify potential failures in products, processes, and services. FMEA is designed to identify and prioritize failure modes. It proves to be a useful method for identifying and correcting possible failures at the earliest possible stage so that the consequences of poor performance can be avoided. In this paper, the FMEA tool is used to detect failures of various components of the heat exchanger cycle and to identify critical failures that may hamper the system's performance. Further, a detailed Pareto analysis is done to find the most critical components of the cycle, the causes of their failures, and recommended corrective actions. This paper can be used as a checklist to help in the maintainability of the system.
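The FMEA-plus-Pareto workflow can be sketched numerically: score each failure mode with a Risk Priority Number (RPN = severity × occurrence × detection, each rated 1-10), then rank modes and keep the "vital few" that account for roughly 80% of the total risk. The failure modes and ratings below are illustrative, not the paper's actual worksheet.

```python
# FMEA scoring and Pareto ranking (failure modes and ratings are hypothetical).
def rpn(severity, occurrence, detection):
    """Risk Priority Number = Severity x Occurrence x Detection (each on a 1-10 scale)."""
    return severity * occurrence * detection

failure_modes = {
    "tube fouling":     rpn(7, 6, 4),
    "gasket leakage":   rpn(8, 4, 3),
    "baffle corrosion": rpn(6, 3, 5),
    "shell crack":      rpn(9, 2, 6),
}

# Pareto analysis: rank by RPN, keep the "vital few" covering ~80% of total risk.
ranked = sorted(failure_modes.items(), key=lambda kv: kv[1], reverse=True)
total = sum(failure_modes.values())
cumulative, vital_few = 0, []
for mode, score in ranked:
    cumulative += score
    vital_few.append(mode)
    if cumulative / total >= 0.8:
        break
```

Maintenance effort is then concentrated on the modes in `vital_few`, which is what turns the FMEA table into an actionable checklist.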

Keywords: FMEA, heat exchanger cycle, Ishikawa diagram, Pareto analysis, RPN (Risk Priority Number)

Procedia PDF Downloads 398
28834 Artificial Intelligence Based Predictive Models for Short Term Global Horizontal Irradiation Prediction

Authors: Kudzanayi Chiteka, Wellington Makondo

Abstract:

The whole world is on a drive to go green owing to the negative effects of burning fossil fuels, so there is an immediate need to identify and utilise alternative renewable energy sources. Among these energy sources, solar energy is one of the most dominant in Zimbabwe. Solar power plants used to generate electricity are entirely dependent on solar radiation. For planning purposes, solar radiation values should be known in advance so that necessary arrangements can be made to minimise the negative effects of the absence of solar radiation due to cloud cover and other naturally occurring phenomena. This research focused on the prediction of Global Horizontal Irradiation (GHI) values for the sixth day given values for the past five days. Artificial intelligence techniques were used in this research. Three models were developed, based on Support Vector Machines, Radial Basis Function networks, and Feed-Forward Back-Propagation artificial neural networks. Results revealed that Support Vector Machines give the best results compared to the other two, with a mean absolute percentage error (MAPE) of 2%, a Mean Absolute Error (MAE) of 0.05 kWh/m²/day, a root mean square (RMS) error of 0.15 kWh/m²/day and a coefficient of determination of 0.990. The other predictive models had prediction accuracies with MAPEs of 4.5% and 6% for the Radial Basis Function and Feed-Forward Back-Propagation artificial neural network, respectively. These two models also had coefficients of determination of 0.975 and 0.970, respectively. It was found that prediction of GHI values for future days is possible using artificial intelligence-based predictive models.
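The four accuracy measures the abstract reports (MAPE, MAE, RMS error, coefficient of determination) have standard definitions that can be computed directly. The day-6 GHI values below are hypothetical; only the metric formulas are standard.

```python
# Standard forecast-accuracy metrics (GHI values are illustrative placeholders).
import math

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def mae(actual, predicted):
    """Mean absolute error, in the data's own units."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean square error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# Hypothetical day-6 GHI observations (kWh/m^2/day) vs. a model's predictions.
actual = [5.2, 6.1, 4.8, 5.9, 6.4]
predicted = [5.1, 6.0, 4.9, 6.1, 6.3]
```

Reporting all four together, as the study does, guards against a model that scores well on one metric (e.g. MAE) while hiding large relative errors on low-irradiation days (caught by MAPE).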

Keywords: solar energy, global horizontal irradiation, artificial intelligence, predictive models

Procedia PDF Downloads 269
28833 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments

Authors: Skyler Kim

Abstract:

An early diagnosis of leukemia has always been a challenge to doctors and hematologists. Worldwide, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia was time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnosis tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods. One of these methods is the AI approach, which has become a major trend in recent years, and several research groups have been working on developing such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We selected acute lymphocytic leukemia to develop our diagnostic system since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia; the results from this work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15,135 images in total, of which 8,491 are images of abnormal cells and 5,398 are images of normal cells. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger datasets. The proposed diagnostic system detects and classifies leukemia. Different from other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50.
Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image. In our approach, fusing the features from specific abstraction layers can be deemed auxiliary features and leads to further improvement of the classification accuracy. Features extracted from the lower levels are combined into higher-dimension feature maps to help improve the discriminative capability of intermediate features and also to overcome the problem of network gradients vanishing or exploding. By comparing VGG19, ResNet50 and the proposed hybrid model, we concluded that the hybrid model has a significant advantage in accuracy. The detailed results of each model's performance and their pros and cons will be presented at the conference.
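The core of the hybrid architecture, concatenating feature vectors from two independent extractors into one higher-dimension representation, can be shown with toy stand-ins. Real backbones (VGG19, ResNet50) would return large feature maps; here two trivial functions play their role, so everything below is illustrative only.

```python
# Toy stand-ins for two pretrained backbones; real models (VGG19, ResNet50)
# would return high-dimensional feature maps instead of these scalar summaries.
def backbone_a(image):
    return [sum(image) / len(image), max(image)]           # e.g. mean, max "features"

def backbone_b(image):
    return [min(image), sum(x * x for x in image) ** 0.5]  # e.g. min, L2-norm "features"

def fused_features(image):
    """Concatenate features from both extractors into one higher-dimension vector,
    which a downstream classifier head would then consume."""
    return backbone_a(image) + backbone_b(image)

image = [0.1, 0.4, 0.3, 0.2]   # flattened toy "image"
features = fused_features(image)
```

The fused vector carries complementary views of the same input, which is the mechanism the abstract credits for the hybrid model's accuracy advantage over either backbone alone.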

Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning

Procedia PDF Downloads 183
28832 A Bayesian Network Approach to Customer Loyalty Analysis: A Case Study of Home Appliances Industry in Iran

Authors: Azam Abkhiz, Abolghasem Nasir

Abstract:

To achieve sustainable competitive advantage in the market, it is necessary to provide and improve customer satisfaction and loyalty. To reach this objective, companies need to identify and analyze their customers, so it is critical to measure the level of customer satisfaction and loyalty very carefully. This study attempts to build a conceptual model that provides clear insights into customer loyalty. Using Bayesian networks (BNs), a model is proposed to evaluate customer loyalty and its consequences, such as repurchase and positive word-of-mouth. A BN is a probabilistic approach that predicts the behavior of a system based on observed stochastic events. The most relevant determinants of customer loyalty were identified through a literature review: perceived value, service quality, trust, corporate image, satisfaction, and switching costs are the most important variables that explain customer loyalty. The data were collected through a questionnaire-based survey of 1,430 customers of a home appliances manufacturer in Iran. Four scenarios and sensitivity analyses were performed to run the model and analyze the impact of different determinants on customer loyalty. The proposed model allows businesses not only to set their targets but also to proactively manage their customers' behavior.
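The Bayesian-network mechanics can be illustrated with a minimal two-node network (satisfaction → loyalty). The conditional probability tables below are invented for illustration; the study's actual network links many more determinants (perceived value, trust, switching costs, etc.).

```python
# Minimal two-node Bayesian network with hypothetical CPTs.
p_sat = {True: 0.7, False: 0.3}               # prior P(satisfied)
p_loyal_given_sat = {True: 0.9, False: 0.2}   # P(loyal=True | satisfied)

def p_loyal():
    """Marginal P(loyal=True), summing over satisfaction states."""
    return sum(p_sat[s] * p_loyal_given_sat[s] for s in (True, False))

def p_sat_given_loyal():
    """Posterior P(satisfied=True | loyal=True) via Bayes' rule."""
    return p_sat[True] * p_loyal_given_sat[True] / p_loyal()
```

Scenario and sensitivity analysis, as in the study, amounts to perturbing entries of these tables (e.g. raising satisfaction) and observing how the marginal loyalty probability responds.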

Keywords: customer satisfaction, customer loyalty, Bayesian networks, home appliances industry

Procedia PDF Downloads 135
28831 Comparison between Pushover Analysis Techniques and Validation of the Simplified Modal Pushover Analysis

Authors: N. F. Hanna, A. M. Haridy

Abstract:

One of the main drawbacks of the Modal Pushover Analysis (MPA) is the need to perform nonlinear time-history analysis, which complicates the analysis and lengthens its time. A simplified version of the MPA has been proposed based on the concept of the inelastic deformation ratio. Furthermore, the effect of the higher modes of vibration is considered by assuming linearly elastic responses, which enables the use of standard elastic response spectrum analysis. In this study, the simplified MPA (SMPA) method is applied to determine the target global drift and the inter-story drifts of a steel frame building. The effect of the higher vibration modes is considered within the framework of the SMPA. A comprehensive survey of the inelastic deformation ratio is presented; a suitable expression from the literature is then selected for the inelastic deformation ratio and implemented in the SMPA. The estimated seismic demands from the SMPA, such as target drift, base shear, and inter-story drifts, are compared with the seismic responses determined by applying the standard MPA. The accuracy of the estimated seismic demands is validated against the results obtained by nonlinear time-history analysis using real earthquake records.
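The simplification's core step is scaling an elastic spectral displacement by an inelastic deformation ratio to get the inelastic target displacement. The abstract does not state which literature expression was selected; the sketch below uses the FEMA 440-style form C₁ = 1 + (R − 1)/(a·T²) purely as an example, with an illustrative site coefficient.

```python
# Target-displacement sketch using a FEMA 440-style inelastic deformation ratio.
# The expression and coefficient 'a' are illustrative, not the study's chosen ones.
import math

def c1_fema440(R, T, a=90.0):
    """Inelastic deformation ratio C1 = 1 + (R - 1) / (a * T^2);
    'a' depends on site class (90 is an illustrative value)."""
    return 1.0 + (R - 1.0) / (a * T * T)

def target_displacement(Sa, T, R):
    """Inelastic target displacement (m) from elastic spectral acceleration
    Sa (m/s^2), period T (s), and strength-reduction factor R."""
    u_elastic = (T / (2.0 * math.pi)) ** 2 * Sa   # elastic spectral displacement
    return c1_fema440(R, T) * u_elastic

u = target_displacement(Sa=4.0, T=1.0, R=3.0)
```

This is what replaces the nonlinear time-history run in the SMPA: one algebraic scaling per mode instead of an integration of the equations of motion.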

Keywords: modal analysis, pushover analysis, seismic performance, target displacement

Procedia PDF Downloads 360
28830 Sedimentary, Diagenesis and Evaluation of High Quality Reservoir of Coarse Clastic Rocks in Nearshore Deep Waters in the Dongying Sag; Bohai Bay Basin

Authors: Kouassi Louis Kra

Abstract:

The nearshore deep-water gravity flow deposits on the northern steep slope of the Dongying depression, Bohai Bay basin, have been acknowledged as important reservoirs in the rift lacustrine basin. These deep strata, termed coarse clastic sediments and deposited at the foot of the slope, underwent complex depositional processes and a wide range of diagenetic events, which makes high-quality reservoir prediction complex. Based on an integrated study of seismic interpretation, sedimentary analysis, petrography, core samples, wireline logging data, 3D seismic and lithological data, the reservoir formation mechanism was deciphered. The Geoframe software was used to analyze 3D seismic data to interpret the stratigraphy and build a sequence-stratigraphic framework. Thin-section identification and point counts were performed to assess the reservoir characteristics. The software PetroMod 1D of Schlumberger was utilized for the simulation of burial history. CL and SEM analyses were performed to reveal diagenesis sequences. Backscattered electron (BSE) images were recorded to define the textural relationships between diagenetic phases. The results showed that the nearshore steep-slope deposits mainly consist of conglomerate, gravelly sandstone, pebbly sandstone and fine sandstone interbedded with mudstone. The reservoir is characterized by low porosity and ultra-low permeability. The diagenetic reactions include compaction, precipitation of calcite, dolomite, kaolinite and quartz cement, and dissolution of feldspars and rock fragments. The main types of reservoir space are primary intergranular pores, residual intergranular pores, intergranular dissolved pores, and fractures. There are three obvious anomalous high-porosity zones in the reservoir. Overpressure and early hydrocarbon filling are the main reasons for abnormal secondary pore development.
Sedimentary facies control the formation of the high-quality reservoir, and oil and gas filling preserves secondary pores from late carbonate cementation.

Keywords: Bohai Bay, Dongying Sag, deep strata, formation mechanism, high-quality reservoir

Procedia PDF Downloads 132
28829 Failure Analysis and Fatigue Life Estimation of a Shaft of a Rotary Draw Bending Machine

Authors: B. Engel, Sara Salman Hassan Al-Maeeni

Abstract:

Human consumption of the Earth's resources increases the need for sustainable development as an important ecological, social, and economic theme. Re-engineering of machine tools, in terms of design and failure analysis, is defined as the steps performed on an obsolete machine to return it to a new machine with a warranty that matches the customer's requirements. To understand the future fatigue behavior of used machine components, it is important to investigate the possible causes of machine part failure through design, surface, and material inspections. In this study, the failure modes of the shaft of a rotary draw bending machine are inspected. Furthermore, stress and deflection analyses of the shaft subjected to combined torsion and bending loads are carried out by an analytical method and compared with a finite element analysis. The theoretical fatigue strength, correction factors, and fatigue life sustained by the shaft before damage are estimated by creating a stress-life (S-N) diagram. In conclusion, the shaft can serve a second life, but it needs surface treatments to increase its reliability and fatigue life.
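The analytical side of such a study typically combines the bending and torsional stresses on a solid circular shaft into an equivalent (von Mises) stress and reads fatigue life off an S-N relation. The sketch below uses the standard solid-shaft formulas and a Basquin-form S-N curve with illustrative material coefficients; the loads and shaft diameter are hypothetical, not the paper's measured values.

```python
# Combined bending/torsion stress and Basquin S-N life (illustrative values).
import math

def equivalent_stress(M_bend, T_torque, d):
    """Max von Mises stress (Pa) on a solid circular shaft under a bending
    moment M_bend (N*m) and a torque T_torque (N*m); d = diameter (m)."""
    sigma = 32.0 * M_bend / (math.pi * d ** 3)   # bending normal stress
    tau = 16.0 * T_torque / (math.pi * d ** 3)   # torsional shear stress
    return math.sqrt(sigma ** 2 + 3.0 * tau ** 2)

def basquin_life(stress_amp, sigma_f=900e6, b=-0.09):
    """Cycles to failure N from Basquin's relation S = sigma_f * (2N)^b
    (sigma_f and b are illustrative coefficients, not measured values)."""
    return 0.5 * (stress_amp / sigma_f) ** (1.0 / b)

s = equivalent_stress(M_bend=500.0, T_torque=300.0, d=0.05)
N = basquin_life(s)
```

In the actual study the S-N curve would additionally be shifted by the correction factors (surface finish, size, reliability) mentioned in the abstract before the life is read off.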

Keywords: failure analysis, fatigue life, FEM analysis, shaft, stress analysis

Procedia PDF Downloads 293
28828 A Proposed Approach for Emotion Lexicon Enrichment

Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees

Abstract:

Document analysis is an important research field that aims to gather information by analyzing the data in documents. Since an important goal in many fields is to understand what people actually want, sentiment analysis has become one of the vital fields tightly related to document analysis. This research focuses on analyzing text documents to classify each document according to its opinion. The aim is to detect the emotions in text documents by enriching the lexicon, adapting its content based on semantic pattern extraction. The proposed approach is presented, and different experiments are applied from different perspectives to reveal the positive impact of the proposed approach on the classification results.
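Lexicon-based emotion detection with enrichment can be sketched as follows. The tiny word-to-emotion table is in the spirit of the NRC lexicon the keywords mention, but the entries and the enrichment rule (copy labels from a semantically similar seed word) are illustrative, not the paper's actual method.

```python
# Tiny emotion lexicon in the NRC style (word -> set of emotion labels);
# "enrichment" here adds a new word with the labels of a semantically similar seed.
lexicon = {
    "happy": {"joy"}, "delighted": {"joy"},
    "angry": {"anger"}, "furious": {"anger"},
}

def enrich(lexicon, new_word, like_word):
    """Add new_word with the same emotion labels as an existing seed word."""
    lexicon[new_word] = set(lexicon[like_word])

def detect_emotions(document, lexicon):
    """Count emotion-label hits over the document's tokens."""
    counts = {}
    for token in document.lower().split():
        for emotion in lexicon.get(token, ()):
            counts[emotion] = counts.get(emotion, 0) + 1
    return counts

enrich(lexicon, "ecstatic", like_word="happy")
counts = detect_emotions("She was ecstatic and delighted", lexicon)
```

Without the enrichment step, "ecstatic" would be an out-of-vocabulary miss; this is the coverage gain the proposed approach targets.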

Keywords: document analysis, sentiment analysis, emotion detection, WEKA tool, NRC lexicon

Procedia PDF Downloads 439
28827 Application of Fractional Model Predictive Control to Thermal System

Authors: Aymen Rhouma, Khaled Hcheichi, Sami Hafsi

Abstract:

This article presents an application of Fractional Model Predictive Control (FMPC) to a fractional-order thermal system using a Controlled Auto-Regressive Integrated Moving Average (CARIMA) model obtained by discretization of a continuous fractional differential equation. Moreover, the output deviation approach is exploited to design the K-step-ahead output predictor, and the corresponding control law is obtained by minimizing a quadratic cost function. Experimental results on a thermal system are presented to emphasize the performance and effectiveness of the proposed predictive controller.
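The predictor-plus-quadratic-cost structure can be shown in a drastically reduced form: a first-order integer-order model, a one-step horizon, and a closed-form minimizer of the cost. This is a didactic sketch only; the paper's CARIMA model, fractional dynamics, and K-step predictor are far richer, and all values below are hypothetical.

```python
# One-step predictive control on a toy first-order model (all values hypothetical).
def predict(y, u, a=0.8, b=0.5):
    """One-step-ahead output prediction for the model y[k+1] = a*y[k] + b*u[k]."""
    return a * y + b * u

def mpc_control(y, setpoint, a=0.8, b=0.5, lam=0.1):
    """Minimise J = (y[k+1] - r)^2 + lam*u^2 over u; closed-form optimum."""
    return b * (setpoint - a * y) / (b * b + lam)

y, r = 0.0, 1.0
trajectory = []
for _ in range(30):
    u = mpc_control(y, r)
    y = predict(y, u)
    trajectory.append(y)
```

The loop settles below the setpoint because the control-weighting term lam penalizes effort; CARIMA-based designs remove this steady-state offset through the integrated (incremental) control formulation.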

Keywords: fractional model predictive control, fractional order systems, thermal system, predictive control

Procedia PDF Downloads 406
28826 Hydroinformatics of Smart Cities: Real-Time Water Quality Prediction Model Using a Hybrid Approach

Authors: Elisa Coraggio, Dawei Han, Weiru Liu, Theo Tryfonas

Abstract:

Water is one of the most important resources for human society. The world is currently undergoing a wave of urban growth, and pollution problems have a great impact. Monitoring water quality is a key task for the future of the environment and the human species. In recent times, researchers using Smart Cities technologies are trying to mitigate the problems generated by population growth in urban areas. The availability of huge amounts of data collected by a pervasive urban IoT can increase the transparency of decision making. Several services have already been implemented in Smart Cities, and more services will be involved in the future. Water quality monitoring can successfully be implemented in the urban IoT. The combination of water quality sensors, cloud computing, smart city infrastructure, and IoT technology can lead to a bright future for environmental monitoring. In past decades, much effort has been put into monitoring and predicting water quality using traditional approaches based on manual collection and laboratory-based analysis, which are slow and laborious. The present study proposes a methodology for implementing a water quality prediction model using artificial intelligence techniques and compares the results obtained with different algorithms. Furthermore, a 3D numerical model will be created using the software D-Water Quality, and simulation results will be used as a training dataset for the artificial intelligence algorithm. This study derives the methodology and demonstrates its implementation based on information and data collected at the floating harbour in the city of Bristol (UK). The city of Bristol is blessed with the Bristol-Is-Open infrastructure, which includes a Wi-Fi network and virtual machines. It was also named the UK's smartest city in 2017.

Keywords: artificial intelligence, hydroinformatics, numerical modelling, smart cities, water quality

Procedia PDF Downloads 182
28825 Smart Laboratory for Clean Rivers in India - An Indo-Danish Collaboration

Authors: Nikhilesh Singh, Shishir Gaur, Anitha K. Sharma

Abstract:

Climate change and anthropogenic stress have severely affected ecosystems all over the globe. Indian rivers are under immense pressure, facing challenges like pollution, encroachment, extreme fluctuations in the flow regime, local ignorance and lack of coordination between stakeholders. To counter all these issues, a holistic river rejuvenation plan is needed that tests, innovates and implements sustainable solutions in the river space for sustainable river management. The Smart Laboratory for Clean Rivers (SLCR), an Indo-Danish collaboration project, provides a living-lab setup that brings all the stakeholders (government agencies, academic and industrial partners and locals) together to engage, learn, co-create and experiment for a clean and sustainable river that lasts for ages. Just as every mega project requires piloting, SLCR has opted for a small catchment of the Varuna River, located in the Middle Ganga Basin in India. Taking an integrated approach to river rejuvenation, SLCR embraces various techniques and upgrades. For maintaining flow in the channel during the lean period, Managed Aquifer Recharge (MAR) is a proven technology; in SLCR, Floa-TEM high-resolution lithological data are used in MAR models to support decision-making on MAR structures near the river and enhance river-aquifer exchanges. Furthermore, water quality in the river is a major concern. A city like Varanasi, located in the last stretch of the river, generates almost 260 MLD of domestic wastewater in the catchment. The existing STP system is working at full capacity. Instead of installing new STPs for the future, SLCR is upgrading the existing ones with an IoT-based system that optimizes operation according to the nutrient load and energy consumption. SLCR also advocates nature-based solutions, such as reed beds, for drains with lower flows.
In search of micropollutants, SLCR uses fingerprint analysis, which employs advanced techniques like chromatography and mass spectrometry to create unique chemical profiles. However, rejuvenation attempts are not possible without involving the entire catchment: a holistic water management plan includes storm management, water harvesting structures to efficiently manage the flow of water in the catchment, and the installation of several buffer zones to restrict pollutants entering the river. Similarly, carbon (emission and sequestration) is also an important parameter for the catchment; by adopting eco-friendly practices, a ripple effect positively influences the catchment's water dynamics and aids in the revival of river systems. SLCR has adopted 4 villages to make them carbon-neutral and water-positive. Moreover, for 24×7 monitoring of the river and the catchment, robust IoT devices will be installed to observe river and groundwater quality, groundwater level, river discharge and carbon emissions in the catchment, ultimately providing fuel for data analytics. On its completion, SLCR will deliver a river restoration manual, which will set out the detailed plan and means of implementation for stakeholders. Lastly, the entire process is planned so that it will be managed by local administrations and stakeholders, supported by capacity-building activities. This holistic approach makes SLCR unique in the field of river rejuvenation.

Keywords: sustainable management, holistic approach, living lab, integrated river management

Procedia PDF Downloads 52