Search results for: output buffer amplifier
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2429

29 Analyzing the Effectiveness of Elderly Design and the Impact on Sustainable Built Environment

Authors: Tristance Kee

Abstract:

With an unprecedented increase in the elderly population around the world, the severe lack of quality housing and health-and-safety provisions to serve this cohort cannot be ignored any longer. Many elderly citizens, especially singletons, live in unsafe housing conditions with poorly executed planning and design. Some suffer from deteriorating mobility, sight and general alertness, and their sub-standard living conditions further hinder their daily existence. This research explains how concepts such as Universal Design and Co-Design operate in a high-density city such as Hong Kong, China, where innovative design can become an alternative solution where government and the private sector fail to provide quality elderly-friendly facilities to promote a sustainable urban development. Unlike other elderly research which focuses more on housing policies, nursing care and theories, this research takes a more progressive approach by providing an in-depth impact assessment on how innovative design can be a practical solution for creating a more sustainable built environment. The research objectives are to: 1) explain the relationship between innovative design for the elderly and a healthier and sustainable environment; 2) evaluate the impact of human ergonomics with the use of universal design; and 3) explain how innovation can enhance the sustainability of a city by improving citizens’ sight, sound, walkability and safety within the ageing population. The research adopts both qualitative and quantitative methodologies to examine ways to improve the elderly population’s relationship to our built environment. In particular, the research utilizes data collected from a questionnaire survey and focus group discussions to obtain inputs from various stakeholders, including designers, operators and managers related to public housing, community facilities and overall urban development. In addition to feedback from end-users and stakeholders, a thorough analysis of existing elderly housing facilities and Universal Design provisions is conducted to evaluate their adequacy. To echo the theme of this conference on Innovation and Sustainable Development, this research examines the effectiveness of innovative design in a risk-benefit factor assessment. To test the hypothesis that innovation can cater for a sustainable development, the research evaluated the health improvement of a sample of 150 elderly persons over a period of eight months. Their health performance, including mobility, speech and memory, was monitored and recorded on a regular basis to assess whether the use of innovation has an impact on improving health and home safety for an elderly cohort. This study was supported by district community centers under the auspices of the Home Affairs Bureau, which provided respondents for the questionnaire survey, a standardized evaluation mechanism, and professional health care staff for evaluating the performance impact. The research findings will be integrated to formulate design solutions such as innovative home products to improve elderly daily experience and safety, with a particular focus on the enhancement of sight, sound and mobility safety. Some policy recommendations and architectural planning recommendations related to Universal Design will also be incorporated into the research output for future planning of elderly housing and amenity provisions.

Keywords: elderly population, innovative design, sustainable built environment, universal design

Procedia PDF Downloads 201
28 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions

Authors: Joel Niklaus, Matthias Sturmer

Abstract:

The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be held accountable by ensuring that justice is administered in accordance with the law. Equally important is privacy, as a fundamental human right (Article 12 of the Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test for the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimate). Consequently, many Swiss courts only publish a fraction of their decisions. An automated anonymization system reduces these costs substantially, further leading to more capacity for publishing court decisions much more comprehensively. For the re-identification system, topic modeling with Latent Dirichlet Allocation (LDA) is used to cluster over 500K Swiss court decisions into meaningful related categories. A comprehensive knowledge base with publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design, ESRA, will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus enables a more comprehensive publication practice.
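
To make the two pipeline stages concrete, the minimal sketch below clusters decisions by dominant LDA topic and replaces recognized entity spans by their category plus an identifier. It assumes scikit-learn is available; the sample texts, topic count, and the (start, end, label) span format are illustrative placeholders, not the actual ESRA configuration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical decision texts standing in for the 500K-document corpus.
decisions = [
    "Urteil betreffend ein Pharmaunternehmen und dessen Arzneimittel ...",
    "Entscheid betreffend Mietrecht und Kuendigung ...",
    "Urteil betreffend Sozialversicherung und Rentenanspruch ...",
]

# Bag-of-words representation of the decisions.
vectorizer = CountVectorizer(max_features=50_000)
X = vectorizer.fit_transform(decisions)

# Cluster the decisions into latent topics (categories) with LDA.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(X)
categories = doc_topics.argmax(axis=1)  # dominant topic per decision

# Anonymization step: replace recognized entity spans by their category plus an
# identifier to preserve context. `entities` is a hypothetical NER output of
# (start, end, label) spans; real spans would come from the trained NER model.
def anonymize(text, entities):
    out, cursor, counters = [], 0, {}
    for start, end, label in sorted(entities):
        counters[label] = counters.get(label, 0) + 1
        out.append(text[cursor:start])
        out.append(f"[{label}_{counters[label]}]")
        cursor = end
    out.append(text[cursor:])
    return "".join(out)

print(anonymize("Hans Muster klagt gegen Pharma AG.", [(0, 11, "PERSON"), (24, 33, "ORG")]))
```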

Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling

Procedia PDF Downloads 108
27 Multi-Model Super Ensemble Based Advanced Approaches for Monsoon Rainfall Prediction

Authors: Swati Bhomia, C. M. Kishtawal, Neeru Jaiswal

Abstract:

Traditionally, monsoon forecasts have encountered many difficulties that stem from numerous issues such as lack of adequate upper-air observations, the mesoscale nature of convection, proper resolution, radiative interactions, planetary boundary layer physics, mesoscale air-sea fluxes, representation of orography, etc. Uncertainties in any of these areas lead to large systematic errors. Global circulation models (GCMs), which are developed independently at different institutes and each of which carries a somewhat different representation of the above processes, can be combined to reduce the collective local biases in space, time, and for different variables from different models. This is the basic concept behind the multi-model superensemble, which comprises a training phase and a forecast phase. The training phase learns from the recent past performances of the models and is used to determine statistical weights from a least-squares minimization via a simple multiple regression. These weights are then used in the forecast phase. The superensemble forecasts carry the highest skill compared to the simple ensemble mean, the bias-corrected ensemble mean and the best model among the participating member models. This approach is a powerful post-processing method for the estimation of weather forecast parameters, reducing the direct model output errors. Although it can be applied successfully to continuous parameters like temperature, humidity, wind speed, mean sea level pressure, etc., in this paper the approach is applied to rainfall, a parameter quite difficult to handle with standard post-processing methods due to its high temporal and spatial variability. The present study aims at the development of advanced superensemble schemes comprising 1-5 day daily precipitation forecasts from five state-of-the-art global circulation models (GCMs), i.e., the European Centre for Medium-Range Weather Forecasts (Europe), the National Center for Environmental Prediction (USA), the China Meteorological Administration (China), the Canadian Meteorological Centre (Canada) and the U.K. Meteorological Office (U.K.), obtained from the THORPEX Interactive Grand Global Ensemble (TIGGE), which is one of the most complete data sets available. The novel approaches include the dynamical model selection approach, in which the superior models are selected from the participating member models at each grid point and for each forecast step in the training period. A multi-model superensemble trained using similar conditions is also discussed in the present study; it is based on the assumption that training with similar types of conditions may provide better forecasts than the sequential training used in the conventional multi-model ensemble (MME) approaches. Further, a variety of methods available in the literature that incorporate a 'neighborhood' around each grid point to allow for spatial error or uncertainty have also been tested in combination with the above-mentioned approaches. The comparison of these schemes against observations verifies that the newly developed approaches provide a more unified and skillful prediction of the summer monsoon (viz. June to September) rainfall compared to the conventional multi-model approach and the member models.
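
The core of the training phase can be illustrated with the minimal sketch below, which removes each member model's training-period bias and fits one regression weight per model by least squares, then applies the weights in the forecast phase. It assumes NumPy; the synthetic arrays stand in for TIGGE forecasts and observations, and the single-grid-point setup is a simplification of the full scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_models = 120, 5                           # training period length, member models
forecasts = rng.random((n_days, n_models)) * 20.0   # member rainfall forecasts (mm/day)
observed = rng.random(n_days) * 20.0                # observed rainfall (mm/day)

# Training phase: remove each model's training-period mean and the observed mean,
# then obtain one weight per member model by least-squares multiple regression.
f_anom = forecasts - forecasts.mean(axis=0)
o_anom = observed - observed.mean()
weights, *_ = np.linalg.lstsq(f_anom, o_anom, rcond=None)

# Forecast phase: apply the weights to new, bias-removed member forecasts.
new_forecasts = rng.random(n_models) * 20.0
superensemble = observed.mean() + (new_forecasts - forecasts.mean(axis=0)) @ weights
print(f"superensemble rainfall forecast: {superensemble:.2f} mm/day")
```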

Keywords: multi-model superensemble, dynamical model selection, similarity criteria, neighborhood technique, rainfall prediction

Procedia PDF Downloads 111
26 Challenges, Practices, and Opportunities of Knowledge Management in Industrial Research Institutes: Lessons Learned from Flanders Make

Authors: Zhenmin Tao, Jasper De Smet, Koen Laurijssen, Jeroen Stuyts, Sonja Sioncke

Abstract:

Today, the quality of knowledge management (KM) has become one of the underpinning factors in the success of an organization, as it determines the effectiveness of capitalizing on the organization’s knowledge. Overall, KM in an organization consists of five aspects: (knowledge) creation, validation, presentation, distribution, and application. Among others, KM in research institutes is considered a cornerstone, as their activities cover all five aspects. Furthermore, KM in a research institute facilitates the steering committee in envisioning the future roadmap, identifying knowledge gaps, and making decisions on future research directions. KM is even more challenging in industrial research institutes. From a technical perspective, technology advancement in the past decades calls for combinations of breadth and depth in expertise, which poses challenges in talent acquisition and, therefore, knowledge creation. From a regulatory perspective, the strict intellectual property protection required by industry collaborators and/or the contractual agreements made by possible funding authorities form extra barriers to knowledge validation, presentation, and distribution. From a management perspective, seamless KM activities are only guaranteed by inter-disciplinary talents that combine technical background knowledge, management skills, and leadership, let alone international vision. From a financial perspective, the long feedback period of new knowledge, together with the massive upfront investment costs and low reusability of the fixed assets, leads to a low RORC (return on research capital) that jeopardizes KM practice. In this study, we aim to address the challenges, practices, and opportunities of KM in Flanders Make – a leading European research institute specialized in the manufacturing industry. In particular, the analyses encompass an internal KM project which involves functionalities ranging from management to technical domain experts. This wide range of functionalities provides comprehensive empirical evidence on the challenges and practices w.r.t. the abovementioned KM aspects. We then ground our analysis in the critical dimensions of KM – individuals, socio-organizational processes, and technology. The analyses have three steps: First, we lay the foundation and define the environment of this study by describing the KM roles played by different functionalities in Flanders Make. Second, we zoom in on the CoreLab MotionS, where the KM project is located. In this step, given the technical domains covered by MotionS products, the challenges in KM are addressed w.r.t. the five KM aspects and three critical dimensions. Third, by detailing the objectives, practices, results, and limitations of the MotionS KM project, we justify the practices and opportunities derived in the execution of KM w.r.t. the challenges addressed in the second step. The results of this study are twofold: First, a KM framework that consolidates past knowledge is developed. A library based on this framework can therefore 1) provide an overview of past research output, 2) accelerate ongoing research activities, and 3) inform the envisioning of future research projects. Second, the challenges in KM on both the individual level (actions) and the socio-organizational level (e.g., interactions between individuals) are identified. By doing so, suggestions and guidelines are provided for KM in the context of an industrial research institute. Finally, the results of this study are reflected against the findings in the existing literature.

Keywords: technical knowledge management framework, industrial research institutes, individual knowledge management, socio-organizational knowledge management

Procedia PDF Downloads 83
25 Hanta Virus Infection in a Child and Sequelae

Authors: Vijay Samuel, Tina Thekkekkara, Shoma Ganguly

Abstract:

There have been no reported Seoul hantavirus infections in children in the UK so far, making it quite challenging for clinicians to diagnose the infection and to predict and prognosticate its outcome to patients and parents. We report a case of a ten-year-old girl who presented with pyrexia associated with headache, photophobia and abdominal pain. The family had acquired two pet rats six weeks earlier. She appeared flushed with peri-oral pallor, a coated strawberry tongue, inflamed tonsils and bilateral cervical lymphadenopathy. Her liver and splenic edges were palpable. Investigations showed that she was thrombocytopenic with deranged renal and liver functions. An abdominal ultrasound demonstrated a mildly enlarged spleen, a peripancreatic lymph node and acalculous cholecystitis. In view of her clinical presentation, a diagnosis of leptospirosis was considered and she was commenced on intravenous benzylpenicillin. The following day she became oliguric, developed significant proteinuria and her renal function deteriorated. Following conservative management, her urine output gradually improved along with her renal function, proteinuria and thrombocytopaenia. Serology for leptospirosis and various other viruses was negative. Following discussion with the Rare and Imported Pathogens Laboratory at Porton, hantavirus serology was requested and found to be strongly positive for Seoul hantavirus. Following discharge she developed palpitations, fatigue, severe headache and cognitive difficulties including memory loss and difficulties in spelling, reading and mathematics. Extensive investigations including ECG, brain MRI and CSF studies were performed and revealed no significant abnormalities. Since 2012, there have been six cases of acute kidney injury due to hantavirus infection in the UK. Two cases were from the Humber region and involved exposure to wild rats, and the other four involved exposure to specially bred pet fancy rats. Hantavirus infections can cause mild flu-like symptoms, but two clinical syndromes are associated with severe disease: haemorrhagic fever with renal syndrome, which may be associated with thrombocytopenia, and hantavirus cardiopulmonary syndrome. Neuropsychological impairments have been reported following hantavirus pulmonary syndrome and following Puumala virus infection. Minor white matter lesions were found in about half of the patients investigated with brain MRI. Seoul virus has a global distribution owing to the dispersal of its carrier host, rats, through global trade. Several ports in the region could explain the possible establishment of Seoul virus in local populations of rats in the Yorkshire and Humber region. The risk of infection for occupationally exposed groups is 1-3%, compared to 32.9% for specialist pet rat owners. The report highlights the importance of routinely asking about pets in the family. We hope to raise awareness of the emergence of hantavirus infection in the UK, particularly in the Yorkshire and Humber region. Clinicians should consider hantavirus infection as a potential cause of febrile illness causing renal impairment in children. Awareness of the possible neuro-cognitive sequelae would help clinicians offer appropriate information and support to children and their families. The Rare and Imported Pathogens Laboratory at Porton is a useful resource for clinicians in the UK when they consider unusual infections.

Keywords: Seoul hantavirus, child, Porton, UK, acute kidney injury

Procedia PDF Downloads 269
24 Planning Railway Assets Renewal with a Multiobjective Approach

Authors: João Coutinho-Rodrigues, Nuno Sousa, Luís Alçada-Almeida

Abstract:

Transportation infrastructure systems are fundamental in modern society and the economy. However, they need modernizing, maintaining, and reinforcing interventions which require large investments. In many countries, accumulated intervention delays arise from aging and intense use, magnified by financial constraints of the past. The decision problem of managing the renewal of large backlogs is common to several types of important transportation infrastructure (e.g., railways, roads). This problem requires considering financial aspects as well as operational constraints under a multidimensional framework. The present research introduces a linear programming multiobjective model for managing railway infrastructure asset renewal. The model aims at minimizing three objectives: (i) the yearly investment peak, by evenly spreading investment throughout multiple years; (ii) the total cost, which includes extra maintenance costs incurred from renewal backlogs; (iii) priority delays related to work start postponements on the higher-priority railway sections. Operational constraints ensure that passenger and freight services are not excessively delayed by having railway line sections under intervention. Achieving a balanced annual investment plan, without compromising the total financial effort or excessively postponing the execution of the priority works, was the motivation for pursuing the research which is now presented. The methodology, inspired by a real case study and tested with real data, reflects aspects of the practice of an infrastructure management company and is generalizable to different types of infrastructure (e.g., railways, highways). It was conceived for treating renewal interventions in infrastructure assets, which in a railway network may be rails, ballast, sleepers, etc.; while a section is under intervention, trains must run at reduced speed, causing delays in services. The model cannot, therefore, allow for an accumulation of works on the same line, which may cause excessively large delays. Similarly, the lines do not all have the same socio-economic importance or service intensity, making it necessary to prioritize the sections to be renewed. The model takes these issues into account, and its output is an optimized works schedule for the renewal project, translatable into Gantt charts. The infrastructure management company provided all the data for the first test case study and validated the parameterization. This case consists of several sections, belonging to 17 lines, to be renewed over 5 years. A large instance was also generated, reflecting a problem of a size similar to the USA railway network (considered the largest one in the world), so it is not expected that considerably larger problems will appear in real life; an average backlog of 25 years and a project horizon of ten years were considered. Despite the very large increase in the number of decision variables (200 times as large), the computational time cost did not increase very significantly. It is thus expectable that just about any real-life problem can be treated on a modern computer, regardless of size. The trade-off analysis shows that if the decision maker allows some increase in the maximum yearly investment (i.e., degradation of objective i), solutions improve considerably in the remaining two objectives.
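
A toy sketch of the underlying optimization idea is given below, assuming the PuLP library. The sections, costs, priority weights, the single line-occupation constraint and the weighted-sum scalarization (objective ii, total cost, is omitted) are illustrative simplifications, not the paper's full model.

```python
import pulp

years = range(5)                       # planning horizon (years)
sections = {                           # section -> (line, renewal cost, priority weight)
    "S1": ("L1", 10.0, 3.0),
    "S2": ("L1", 6.0, 1.0),
    "S3": ("L2", 8.0, 2.0),
    "S4": ("L2", 5.0, 1.0),
}

prob = pulp.LpProblem("renewal_plan", pulp.LpMinimize)
x = pulp.LpVariable.dicts("start", (sections, years), cat="Binary")
peak = pulp.LpVariable("peak_investment", lowBound=0)

# Every section is renewed exactly once within the horizon.
for s in sections:
    prob += pulp.lpSum(x[s][y] for y in years) == 1

# Service-delay constraint: at most one section under intervention per line and year.
for line in {v[0] for v in sections.values()}:
    for y in years:
        prob += pulp.lpSum(x[s][y] for s in sections if sections[s][0] == line) <= 1

# Objective (i): yearly investment peak.
for y in years:
    prob += pulp.lpSum(sections[s][1] * x[s][y] for s in sections) <= peak

# Objective (iii): priority delay, i.e. priority-weighted start years.
delay = pulp.lpSum(sections[s][2] * y * x[s][y] for s in sections for y in years)

# Weighted-sum scalarization of the two objectives kept in this toy instance.
prob += 1.0 * peak + 0.1 * delay
prob.solve(pulp.PULP_CBC_CMD(msg=False))

schedule = {s: next(y for y in years if x[s][y].value() > 0.5) for s in sections}
print("start years:", schedule, "| peak investment:", peak.value())
```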

Keywords: transport infrastructure, asset renewal, railway maintenance, multiobjective modeling

Procedia PDF Downloads 121
23 IoT Continuous Monitoring Biochemical Oxygen Demand Wastewater Effluent Quality: Machine Learning Algorithms

Authors: Sergio Celaschi, Henrique Canavarro de Alencar, Claaudecir Biazoli

Abstract:

Effluent quality is of the highest priority for compliance with the permit limits of environmental protection agencies and ensures the protection of the local water system. Of the pollutants monitored, biochemical oxygen demand (BOD) poses one of the greatest challenges. This work presents a solution for wastewater treatment plants (WWTPs). Delayed BOD5 results from the lab, which take 7 to 8 analysis days, hinder the WWTP’s ability to react to different situations and meet treatment goals. Reducing BOD turnaround time from days to hours is our quest. The proposed solution is based on a system of two BOD bioreactors associated with Digital Twin (DT) and Machine Learning (ML) methodologies via an Internet of Things (IoT) platform to monitor and control a WWTP and support decision making. A DT is a virtual and dynamic replica of a production process. The DT requires the ability to collect and store real-time sensor data related to the operating environment. Furthermore, it integrates and organizes the data on a digital platform and applies analytical models, allowing a deeper understanding of the real process to catch anomalies sooner. In our system for continuous-time monitoring of the BOD removed by the effluent treatment process, the DT algorithm for analyzing the data uses ML on a parameterized chemical kinetic model. The continuous BOD monitoring system, capable of providing results in a fraction of the time required by BOD5 analysis, is composed of two thermally isolated batch bioreactors. Each bioreactor contains input/output access for wastewater samples (influent and effluent), hydraulic conduction tubes, pumps and valves for the batch sample and dilution water, an air supply for dissolved oxygen (DO) saturation, a cooler/heater for sample thermal stability, an optical DO sensor based on fluorescence quenching, pH, ORP, temperature, and atmospheric pressure sensors, and a local PLC/CPU for the TCP/IP data transmission interface. The dynamic BOD system monitoring range covers 2 mg/L < BOD < 2,000 mg/L. In addition to the BOD monitoring system, there are many other operational WWTP sensors. The CPU data is transmitted to/received from the digital platform, which in turn performs analyses at periodic intervals, aiming to feed the learning process. BOD bulletins and their credibility intervals are made available to web users at 12-hour intervals. The chemical kinetics ML algorithm is composed of a coupled system of four first-order ordinary differential equations for the molar masses of DO, the organic material present in the sample, biomass, and the products (CO₂ and H₂O) of the reaction. This system is solved numerically subject to its initial conditions: DO (saturated) and zero initial products of the kinetic oxidation process (CO₂ = H₂O = 0). The initial values for organic matter and biomass are estimated by minimizing the mean square deviations. A real case of continuous monitoring of BOD wastewater effluent quality is being conducted by deploying an IoT application on a large wastewater purification system located in São Paulo, Brazil.
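
The structure of such a four-equation kinetic model can be sketched as below, using SciPy. The rate law, yield factors and initial estimates are illustrative assumptions for the sake of the example, not the calibrated digital-twin parameters; in the real system the initial organic matter and biomass are fitted by minimizing the mean square deviations, as described.

```python
import numpy as np
from scipy.integrate import solve_ivp

k = 5e-4          # illustrative rate constant (L/(mg*h))
y_do = 0.6        # illustrative O2 demand per unit of organic matter oxidized
y_bm = 0.2        # illustrative biomass yield per unit of organic matter oxidized

def kinetics(t, y):
    do, org, bio, prod = y           # dissolved O2, organic matter, biomass, products
    r = k * org * bio                # oxidation rate, first order in substrate and biomass
    return [-y_do * r,               # DO consumed by oxidation
            -r,                      # organic matter degraded
            y_bm * r,                # biomass growth
            (1.0 - y_bm) * r]        # products (CO2 + H2O) formed

# Initial conditions: saturated DO, estimated organic matter and biomass, zero products.
y0 = [8.0, 150.0, 5.0, 0.0]
sol = solve_ivp(kinetics, (0.0, 12.0), y0, dense_output=True)

# Oxygen consumed over 12 h as a fast surrogate for the BOD5 bottle test.
bod_estimate = y0[0] - sol.y[0, -1]
print(f"estimated oxygen uptake after 12 h: {bod_estimate:.2f} mg/L")
```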

Keywords: effluent treatment, biochemical oxygen demand, continuous monitoring, IoT, machine learning

Procedia PDF Downloads 44
22 Participatory Action Research for Sustainability with Special Focus on Student Initiatives

Authors: Soni T. L.

Abstract:

Environmental stress is a major sustainability concern which needs immediate attention. This paper is an attempt to present participatory action research for sustainable agriculture. Being the first and best culture, agriculture protects and improves the natural environment and the social and economic conditions of people, and safeguards the health and welfare of all groups. Over the course of time, agriculture turned into agribusiness, and then these values were not safeguarded. Moreover, in today’s busy life many people do not make the effort to take part in agricultural production. Children therefore do not get the opportunity to understand agriculture and farming practices, so student initiatives are vital to make them aware. Here, the programmes structured by the researcher come under the auspices of the National Service Scheme, a student-centered educational programme organized by the Ministry of Youth Affairs, Government of India. The twin objectives of the study are to examine the role of student initiatives for sustainable agriculture and the role of participatory action research in student initiatives. A SWOT analysis is made to study strengths, weaknesses, opportunities and threats. The methodology adopted is participatory action research. The method is participatory in the sense that there is collaboration through participation. It is action in that there are real lab-to-land experiences. It is research in that there are documented lessons and the creation of new knowledge. The plan of action covers the measures adopted and strategies taken, i.e., bhavana – kalpana – yojana – sadhana. Through team effort, the team was successful in converting more than 10 hectares of barren land into cultivable land within and outside the campus. The students’ team efforts saved a huge amount of labour cost and produced a huge quantity of organic output, and the team was also successful in creating 1,000 rain pits on the premises of the College for rainwater harvesting. The findings include conveyance of the message that food production is superior to food donation. Moreover, the study fostered a good work ethic and social responsibility among students. Students undertook innovative programmes addressing social and environmental issues, and participants got increased opportunities to interact with local and less privileged people and acquired increased awareness of real-life experiences, which makes them confident in interacting with people. This resulted in the strengthening of social capital – cooperation, team spirit and social commitment – among students. Participants promoted sustainable domestic efforts, and ultimately environmental protection is ensured. Finally, there was recognition for the team, the institution and the researcher at the university level, state level and national level. The lessons learned are that if the approach is good, the response is good, and success generates success. Participatory action research is an empowering experience for practitioners; by focusing the combined time, energy and creativity of a committed group, we can lead many programmes which make the institution a centre of excellence. Authorities should take the necessary steps for the inclusion of community development activities in the curriculum. Action research is problem-, client- and action-centered. So, we must adapt and adopt, coordinate and correlate measures which preserve and conserve the environment.

Keywords: participatory action research, student initiatives, sustainable development, sustainability

Procedia PDF Downloads 134
21 Optimizing Productivity and Quality through the Establishment of a Learning Management System for an Agency-Based Graduate School

Authors: Maria Corazon Tapang-Lopez, Alyn Joy Dela Cruz Baltazar, Bobby Jones Villanueva Domdom

Abstract:

The requirement for an organization implementing a quality management system to sustain its compliance with the requirements and its commitment to continuous improvement is even higher. It is expected that the offices and units have high and consistent compliance with the established processes and procedures. The Development Academy of the Philippines (DAP) has been operating under project management, for which it has a quality management certification. To further realize its mandate as a think-tank and capacity builder of the government, DAP expanded its operations and started to grant graduate degrees through its Graduate School of Public and Development Management (GSPDM). As the academic arm of the Academy, GSPDM offers graduate degree programs on public management and productivity & quality aligned to the institutional thrusts. For a time, the documented procedures and processes of project management seemed to fit the Graduate School. However, there has been significant growth in the operations of the GSPDM in terms of the graduate programs offered, which directly increases the number of students. There is an apparent necessity to align the project management system with a more educational system; otherwise, it will no longer be responsive to the developments that are taking place. The Academy strongly advocates and encourages its students to pursue internal and external improvement to cope with the challenges of providing quality service to their own clients and to our country. If innovation does not take root in the grounds of GSPDM, then how will it serve the purpose of “walking the talk”? This research was conducted to assess the diverse flow of the existing internal operations and processes of the DAP’s project management and GSPDM’s school management, which will serve as the basis for developing a system that harmonizes the two into one: the Learning Management System. The study documented the existing processes of GSPDM, following the project management phases of conceptualization & development, negotiation & contracting, mobilization, implementation, and closure, into different flow charts of the key activities. The primary sources of information, as respondents, were the different groups involved in the delivery of the graduate programs – the executive, the learning management team and the administrative support offices. The Learning Management System (LMS) shall capture the unique and critical processes of the GSPDM as a degree-granting unit of the Academy. The LMS is the harmonized project management and school management system that shall serve as the standard system and procedure for all the programs within the GSPDM. The unique processes cover the three important areas of school management – student, curriculum, and faculty. The required processes of these main areas, such as enrolment, course syllabus development, and faculty evaluation, were appropriately placed within the phases of the project management system. Further, the research shall identify critical reports and generate manageable documents and records to ensure accuracy, consistency and reliable information. The researchers carried out an in-depth review of the DAP-GSPDM’s mandate, analyzed the various documents, and conducted a series of focused group discussions. A comprehensive review of the prior flow chart system and of various models of school management systems was made. The final output of the research is a work instructions manual that will be presented to the Academy’s Quality Management Council and will eventually form an additional scope for ISO certification. The manual shall include documented forms, iterative flow charts and a program Gantt chart, alongside which automated systems will be developed in parallel.

Keywords: productivity, quality, learning management system, agency-based graduate school

Procedia PDF Downloads 292
20 An Explorative Analysis of Effective Project Management of Research and Research-Related Projects within a Recently Formed Multi-Campus Technology University

Authors: Àidan Higgins

Abstract:

Higher education will be crucial in the coming decades in helping to make Ireland a nation known for innovation, competitive enterprise, and ongoing academic success, as well as a desirable location to live and work, with a high quality of life, vibrant culture, and inclusive social structures. Higher education institutions will actively connect with each student, community, society, and business; they will help students develop a sense of place and identity in Ireland and provide the tools they need to contribute significantly to the global community. Higher education will also serve as a catalyst for novel ideas through research, many of which will become the foundation for long-lasting inventive businesses in the future. As part of this, the 2030 National Strategy on Education focuses on change and on developing our education system, with a focus on how we carry out research. The emphasis is central to knowledge transfer and a consistent research framework, to exploiting opportunities, and to having the necessary expertise. The newly formed Technological Universities (TUs) in Ireland are based on a government initiative to create a new type of higher education institution that focuses on applied and industry-focused research and education. The basis of the TU is to bring together two or more existing institutes of technology to create a larger and more comprehensive institution that offers a wider range of programs and services to students and industry partners. The TU model aims to promote collaboration between academia, industry, and community organizations to foster innovation, research, and economic development. The TU model also aims to enhance the student experience by providing a more seamless pathway from undergraduate to postgraduate studies, as well as greater opportunities for work placements and engagement with industry partners. Additionally, the TUs are designed to place a greater emphasis on applied research, technology transfer, and entrepreneurship, with the goal of fostering innovation and contributing to economic growth. A project is a collection of organised tasks carried out precisely to produce a singular output (product or service) within a given time frame. Project management is a set of activities that facilitates the successful implementation of a project. The significant differences between research and development projects are the (lack of) precise requirements and (the inability to) plan an outcome from the beginning of the project. The evaluation criteria for a research project must consider these and other "particularities" of such work; for instance, proving that something cannot be done may be a successful outcome. This study intends to explore how a newly established multi-campus technological university manages research projects effectively. The study will identify the potential and difficulties of managing research projects, the tools, resources and processes available in a multi-campus Technological University context, and the methods and approaches employed to deal with these difficulties. Key stakeholders such as project managers, academics, and administrators will be surveyed as part of the study, which will also involve an explorative investigation of current literature and data. The findings of this study will contribute significantly to creating best practices for project management in this setting and offer insightful information about the efficient management of research projects within a multi-campus technological university.

Keywords: project management, research and research-related projects, multi-campus technology university, processes

Procedia PDF Downloads 37
19 Auto Rickshaw Impacts with Pedestrians: A Computational Analysis of Post-Collision Kinematics and Injury Mechanics

Authors: A. J. Al-Graitti, G. A. Khalid, P. Berthelson, A. Mason-Jones, R. Prabhu, M. D. Jones

Abstract:

Motor-vehicle-related pedestrian road traffic collisions are a major road safety challenge, since they are a leading cause of death and serious injury worldwide, contributing to a third of the global disease burden. The auto rickshaw, which is a common form of urban transport in many developing countries, plays a major transport role, both as a vehicle for hire and for private use. The most common auto rickshaws are quite unlike a ‘typical’ four-wheel motor vehicle, being typically characterised by three wheels, a non-tilting sheet-metal body or open frame construction, a canvas roof and side curtains, a small drivers’ cabin, handlebar controls and a passenger space at the rear. Given the propensity, in developing countries, for auto rickshaws to be used in mixed cityscapes, where pedestrians and vehicles share the roadway, the potential for auto rickshaw impacts with pedestrians is relatively high. Whilst auto rickshaws are used in some Western countries, their limited number and spatial separation from pedestrian walkways, as a result of city planning, have not resulted in significant accident statistics. Thus, auto rickshaws have not been subject to the vehicle-impact-related pedestrian crash kinematic analyses and/or injury mechanics assessments typically associated with motor vehicle development in Western Europe, North America and Japan. This study presents a parametric analysis of auto rickshaw related pedestrian impacts by computational simulation, using a Finite Element model of an auto rickshaw and an LS-DYNA 50th percentile male Hybrid III Anthropometric Test Device (dummy). Parametric variables include auto rickshaw impact velocity, auto rickshaw impact region (front, centre or offset) and relative pedestrian impact position (front, side and rear). The output data of each impact simulation were correlated against reported injury metrics – Head Injury Criterion (front, side and rear), Neck Injury Criterion (front, side and rear), Abbreviated Injury Scale and reported risk level – adding greater understanding to the issue of auto rickshaw related pedestrian injury risk. The parametric analyses suggest that pedestrians are subject to a relatively high risk of injury during impacts with an auto rickshaw at velocities of 20 km/h or greater, which during some of the impact simulations may even risk fatalities. The present study provides valuable evidence for informing a series of recommendations and guidelines for making the auto rickshaw safer during collisions with pedestrians. Whilst it is acknowledged that the present research findings are based in the field of safety engineering and may over-represent injury risk compared to “Real World” accidents, many of the simulated interactions produced injury response values significantly greater than current threshold curves and thus justify their inclusion in the study. To reduce the injury risk level and increase the safety of the auto rickshaw, there should be a reduction in the velocity of the auto rickshaw and/or consideration of engineering solutions, such as retrofitting injury mitigation technologies to those auto rickshaw contact regions which present the greatest risk of producing pedestrian injury.
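
For context on how one of the reported injury metrics is evaluated from a simulated acceleration history, a minimal sketch of the standard Head Injury Criterion (HIC15) calculation is given below, assuming NumPy. The sampled acceleration pulse is synthetic and merely stands in for LS-DYNA dummy head output.

```python
import numpy as np

dt = 1e-4                                              # s, sampling interval
t = np.arange(0.0, 0.10, dt)                           # 100 ms of signal
accel_g = 60.0 * np.exp(-((t - 0.03) / 0.01) ** 2)     # synthetic resultant head acceleration (g)

def hic(time, a, max_window=0.015):
    """Head Injury Criterion over all windows no longer than max_window seconds."""
    best = 0.0
    cum = np.concatenate(([0.0], np.cumsum(a) * dt))   # running integral of a(t) in g*s
    for i in range(len(time)):
        for j in range(i + 1, len(time)):
            width = time[j] - time[i]
            if width > max_window:
                break
            avg = (cum[j] - cum[i]) / width            # average acceleration over the window
            best = max(best, width * avg ** 2.5)
    return best

print(f"HIC15 = {hic(t, accel_g):.1f}")
```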

Keywords: auto rickshaw, finite element analysis, injury risk level, LS-DYNA, pedestrian impact

Procedia PDF Downloads 167
18 Renewable Energy Micro-Grid Control Using Microcontroller in LabVIEW

Authors: Meena Agrawal, Chaitanya P. Agrawal

Abstract:

Power systems are transforming and becoming smarter with innovations in technology, enabling them to address simultaneously sustainable energy needs, rising environmental concerns, economic benefits and quality requirements. The advantages provided by the interconnection of renewable energy resources are becoming more viable and dependable with smart controlling technologies. The limitations of most renewable resources – their diversity and intermittency, which cause problems in power quality, grid stability, reliability, security, etc. – are being addressed by these efforts. A need for optimal energy management by intelligent Micro-Grids at the distribution end of the power system has been recognized in order to accommodate sustainable renewable Distributed Energy Resources on a large scale across the power grid. All over the world, Smart Grids are now emerging as infrastructure upgrade programs of foremost concern. The hardware setup includes an NI cRIO 9022 Compact Reconfigurable Input Output microcontroller board connected to the PC on a LAN router, with three hardware modules. The Real-Time Embedded Controller is a reconfigurable controller device consisting of an embedded real-time processor for communication and processing, a reconfigurable chassis housing the user-programmable FPGA, eight hot-swappable I/O module slots, and graphical LabVIEW system design software. It has been employed for signal analysis, control, acquisition, and logging of the renewable sources with LabVIEW Real-Time applications. The employed cRIO chassis controls the timing for the modules and handles communication with the PC over the USB, Ethernet, or 802.11 Wi-Fi buses. It combines modular I/O, real-time processing, and NI LabVIEW programmability. In the presented setup, five channels of the Analog Input Module NI 9205 have been used for input analog voltage signals from the renewable energy sources, and four channels of the NI 9227 have been used for input analog current signals of the renewable sources. For switching actions based on the programming logic developed in software, a 4-channel module with electrically isolated electromechanical relays (single-pole single-throw) and an LED indicating the state of each channel has been used for isolating the renewable sources on fault occurrence, as decided by the logic in the program. The module for the Ethernet-based data acquisition interface, an ENET 9163 Ethernet Carrier connected to the LAN router for data acquisition from a remote source over Ethernet, also has the module NI 9229 installed. The LabVIEW platform has been employed for efficient data acquisition, monitoring and control. The control logic utilized in the program for operating the hardware switching related to the fault relays has been portrayed as a flowchart. A communication system has been successfully developed amongst the sources and loads connected on different computers using the Hypertext Transfer Protocol (HTTP) or the Ethernet local area network TCP/IP protocol. There are two main I/O interfacing clients controlling the switching of the renewable energy sources over the internet or intranet. The paper presents experimental results of the described setup for intelligent control of the micro-grid for renewable energy sources, with data acquisition and control hardware based on a microcontroller and a visual program developed in LabVIEW.
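
As a rough text-language illustration of that fault-isolation logic (the actual program is a graphical LabVIEW application running on the cRIO), the sketch below polls voltage and current readings per source and opens the relay of any source that violates its limits. The thresholds, channel mapping and the read_channel/set_relay helpers are hypothetical stand-ins, not the deployed I/O calls.

```python
import random

V_MIN, V_MAX = 180.0, 260.0   # acceptable source voltage window (V), illustrative
I_MAX = 15.0                  # over-current limit per source (A), illustrative

def read_channel(module, channel):
    """Simulated analog read standing in for the NI 9205 (voltage) / NI 9227 (current) modules."""
    return random.uniform(170.0, 270.0) if module == "NI9205" else random.uniform(0.0, 20.0)

def set_relay(channel, closed):
    """Simulated relay drive standing in for one channel of the 4-channel relay module."""
    state = "closed" if closed else "open (source isolated)"
    print(f"relay {channel}: {state}")

def scan_sources(n_sources=4):
    """One pass of the monitoring loop: isolate faulty sources, keep healthy ones connected."""
    for ch in range(n_sources):
        v = read_channel("NI9205", ch)   # source voltage
        i = read_channel("NI9227", ch)   # source current
        fault = not (V_MIN <= v <= V_MAX) or i > I_MAX
        set_relay(ch, closed=not fault)

scan_sources()
```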

Keywords: data acquisition and control, LabVIEW, microcontroller cRIO, Smart Micro-Grid

Procedia PDF Downloads 298
17 Design, Control and Implementation of 3.5 kW Bi-Directional Energy Harvester for Intelligent Green Energy Management System

Authors: P. Ramesh, Aby Joseph, Arya G. Lal, U. S. Aji

Abstract:

Integration of distributed green renewable energy sources, together with battery energy storage, is an inevitable requirement in a smart grid environment. To achieve this, an Intelligent Green Energy Management System (i-GEMS) needs to be incorporated to ensure coordinated operation between supply and load demand based on the hierarchy of Renewable Energy Sources (RES), battery energy storage and the distribution grid. A bi-directional energy harvester is an integral component facilitating the Intelligent Green Energy Management System (i-GEMS), and it is required to meet the following technical challenges: (1) capability for bi-directional mode of operation (buck/boost), (2) reduction of circuit parasitics to suppress voltage spikes, (3) the converter startup problem, (4) high-frequency magnetics, (5) higher power density, and (6) mode transition issues during battery charging and discharging. This paper is focused on addressing the above-mentioned issues and targeted at the design, development and implementation of a bi-directional energy harvester with galvanic isolation. In this work, the hardware architecture for a bi-directional energy harvester rated at 3.5 kW is developed with Isolated Full Bridge Boost Converter (IFBBC) as well as Dual Active Bridge (DAB) converter configurations, using modular power electronics hardware which is identical for both the solar PV array and the battery energy storage. In the IFBBC converter, the current-fed full bridge circuit is enabled and the voltage-fed full bridge circuit is disabled through Pulse Width Modulation (PWM) pulses for the boost mode of operation, and vice versa for the buck mode of operation. In the DAB converter, all the switches are in the active state so as to adjust the phase-shift angle between the primary full bridge and the secondary full bridge, which in turn decides the power flow direction depending on the mode (boost/buck) of operation. Here, the control algorithm is developed to ensure the regulation of the common DC link voltage and maximum power extraction from the renewable energy sources depending on the selected mode (buck/boost) of operation. The circuit analysis and simulation study are conducted using PSIM 9.0 in three scenarios: (1) IFBBC with passive clamp, (2) IFBBC with active clamp, and (3) DAB converter. In this work, a common hardware prototype for the bi-directional energy harvester with a 3.5 kW rating is built for the IFBBC and DAB converter configurations. The power circuit is equipped with the right choice of MOSFETs, gate drivers with galvanic isolation, a high-frequency transformer, filter capacitors, and a filter boost inductor. The experiment was conducted for the IFBBC converter with passive clamp under boost mode, and the prototype confirmed the simulation results, showing a measured efficiency of 88% at 2.5 kW output power. The digital controller hardware platform is developed using the floating-point microcontroller TMS320F2806x from Texas Instruments. The firmware governing the operation of the bi-directional energy harvester is written in C and developed using Code Composer Studio. Comprehensive analyses of the power circuit design, the control strategy for battery charging/discharging under buck/boost modes, and a comparative performance evaluation using simulation and experimental results will be presented.
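
For the DAB configuration, the way transferred power depends on the phase-shift angle can be illustrated with the standard idealized (lossless, single-phase-shift) relation sketched below. The voltages, turns ratio, switching frequency and leakage inductance are illustrative values, not the parameters of the 3.5 kW prototype or its actual control law.

```python
import math

def dab_power(v1, v2, phi, f_sw, l_lk, n=1.0):
    """Ideal DAB power transfer for phase shift phi (rad); the sign gives the direction."""
    return (n * v1 * v2 * phi * (math.pi - abs(phi))) / (2 * math.pi**2 * f_sw * l_lk)

def phase_shift_for_power(p_ref, v1, v2, f_sw, l_lk, n=1.0):
    """Invert the relation, taking the smaller of the two phase-shift solutions."""
    k = (n * v1 * v2) / (2 * math.pi**2 * f_sw * l_lk)
    phi = (math.pi - math.sqrt(math.pi**2 - 4 * abs(p_ref) / k)) / 2
    return math.copysign(phi, p_ref)

# Illustrative ratings: 380 V DC link, 48 V battery side, 8:1 transformer,
# 100 kHz switching frequency, 12 uH leakage inductance.
p = dab_power(380.0, 48.0, math.radians(12), 100e3, 12e-6, n=8.0)
phi = phase_shift_for_power(3500.0, 380.0, 48.0, 100e3, 12e-6, n=8.0)
print(f"P at 12 deg shift: {p:.0f} W | shift for 3.5 kW: {math.degrees(phi):.1f} deg")
```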

Keywords: bi-directional energy harvester, dual active bridge, isolated full bridge boost converter, intelligent green energy management system, maximum power point tracking, renewable energy sources

Procedia PDF Downloads 104
16 Braille Lab: A New Design Approach for Social Entrepreneurship and Innovation in Assistive Tools for the Visually Impaired

Authors: Claudio Loconsole, Daniele Leonardis, Antonio Brunetti, Gianpaolo Francesco Trotta, Nicholas Caporusso, Vitoantonio Bevilacqua

Abstract:

Unfortunately, many people still do not have access to communication, with specific regard to reading and writing. Among them, people who are blind or visually impaired have several difficulties in getting access to the world, compared to the sighted. Indeed, despite technology advancement and cost reduction, assistive devices are still expensive nowadays, such as Braille-based input/output systems which enable reading and writing texts (e.g., personal notes, documents). As a consequence, assistive technology affordability is fundamental in supporting the visually impaired in communication, learning, and social inclusion. This, in turn, has serious consequences in terms of equal access to opportunities, freedom of expression, and actual and independent participation in a society designed for the sighted. Moreover, the visually impaired experience difficulties in recognizing objects and interacting with devices in all activities of daily living. It is no coincidence that Braille indications are commonly found only on medicine boxes and elevator keypads. Several software applications for the automatic translation of written text into speech (e.g., Text-To-Speech - TTS) enable reading pieces of documents. However, apart from simple tasks, in many circumstances TTS software is not suitable for understanding very complicated pieces of text that require dwelling more on specific portions (e.g., mathematical formulas or Greek text). In addition, the experience of reading/writing text is completely different both in terms of engagement and from an educational perspective. Statistics on the employment rate of blind people show that learning to read and write provides the visually impaired with up to 80% more opportunities of finding a job. Especially at higher educational levels, where the ability to digest very complex text is key, the accessibility and availability of Braille play a fundamental role in reducing the drop-out rate of the visually impaired, thus affecting the effectiveness of the constitutional right to get access to education. In this context, the Braille Lab project aims at addressing these social needs by including affordability in designing and developing assistive tools for visually impaired people. In detail, our awarded project focuses on a technological innovation of the operating principle of existing assistive tools for the visually impaired, leaving the Human-Machine Interface unchanged. This can result in a significant reduction of production costs and consequently of tool selling prices, thus representing an important opportunity for social entrepreneurship. The first two assistive tools designed within the Braille Lab project following the proposed approach aim to provide the possibility of personally printing documents and handouts and of reading texts written in Braille using a refreshable Braille display, respectively. The former, named ‘Braille Cartridge’, represents an alternative solution for printing in Braille and consists in the realization of an electronically controlled dispensing cartridge which can be integrated within traditional ink-jet printers, in order to leverage the efficiency and cost of a device mechanical structure which is already in widespread use. The latter, named ‘Braille Cursor’, is an innovative Braille display featuring a substantial technological innovation by means of a unique cursor virtualizing Braille cells, thus limiting the number of active pins needed for Braille characters.
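
As a rough illustration of what the display side of such a tool has to do, the sketch below maps characters to 6-dot cell pin states and slides a small "cursor" window over the text. The partial dot table, window width and function names are illustrative assumptions, not the Braille Cursor design.

```python
# Partial dot table for illustration only (a subset of standard 6-dot literary Braille).
DOTS = {"a": (1,), "b": (1, 2), "c": (1, 4), "l": (1, 2, 3), " ": ()}

def cell_pins(ch):
    """Six booleans (dots 1..6) telling which pins to raise for one Braille cell."""
    raised = DOTS.get(ch.lower(), ())
    return [d in raised for d in range(1, 7)]

def cursor_window(text, start, width=8):
    """Pin states for the `width` cells currently under the virtual cursor."""
    return [cell_pins(ch) for ch in text[start:start + width]]

# Example: render an 8-cell window starting at character 0 of a short text.
for ch, pins in zip("ball call", cursor_window("ball call", 0)):
    print(ch, pins)
```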

Keywords: human rights, social challenges and technology innovations, visually impaired, affordability, assistive tools

Procedia PDF Downloads 237
15 Utilization of Functionalized Biochar from Water Hyacinth (Eichhornia crassipes) as Green Nano-Fertilizers

Authors: Adewale Tolulope Irewale, Elias Emeka Elemike, Christian O. Dimkpa, Emeka Emmanuel Oguzie

Abstract:

As the global population steadily approaches the 10 billion mark, the world is currently faced with two major challenges among others – accessing sustainable and clean energy, and food security. Accessing cleaner and sustainable energy sources to drive the global economy and technological advancement, and feeding the teeming human population, require sustainable, innovative, and smart solutions. To solve the food production problem, producers have relied on fertilizers as a way of improving crop productivity. Commercial inorganic fertilizers, which are employed to boost agricultural food production, however, pose significant ecological sustainability and economic problems, including soil and water pollution, reduced input efficiency, development of highly resistant weeds, micronutrient deficiency, soil degradation, and increased soil toxicity. These ecological and sustainability concerns have raised uncertainties about the continued effectiveness of conventional fertilizers. With the application of nanotechnology, plant biomass upcycling offers several advantages in greener energy production and sustainable agriculture through the reduction of environmental pollution, increasing soil microbial activity, recycling carbon and thereby reducing GHG emissions, and so forth. This innovative technology has the potential to support a circular economy and create a sustainable agricultural practice. Nanomaterials have the potential to greatly enhance the quality and nutrient composition of organic biomass, which in turn allows for the conversion of biomass into nanofertilizers that are potentially more efficient. Water hyacinth plants harvested from an inland water body at Warri, Delta State, Nigeria, were air-dried and milled into powder form. The dry biomass was used to prepare biochar at a pre-determined temperature in an oxygen-deficient atmosphere. Physicochemical analysis of the resulting biochar was carried out to determine its porosity and general morphology using Scanning Transmission Electron Microscopy (STEM). The functional groups (-COOH, -OH, -NH2, -CN, -C=O) were assessed using Fourier Transform Infrared Spectroscopy (FTIR), while the metals (Cr, Cu, Fe, Pb, Mg, Mn) were analyzed using Inductively Coupled Plasma – Optical Emission Spectrometry (ICP-OES). Impregnation of the biochar with nanonutrients was achieved under varied conditions of pH, temperature, nanonutrient concentration and residence time to achieve optimum adsorption. Adsorption and desorption studies were carried out on the resulting nanofertilizer to determine the kinetics of potential nutrient bio-availability to plants when used as a green fertilizer. Water hyacinth (Eichhornia crassipes), an aggressively invasive aquatic plant known for its rapid growth and profusion, is being examined in this research to harness its biomass as a sustainable feedstock for formulating functionalized nano-biochar fertilizers, offering various benefits including water hyacinth biomass upcycling, improved nutrient delivery to crops and aquatic ecosystem remediation. Altogether, this work aims to create output value in the three dimensions of environmental, economic, and social benefits.

Keywords: biochar-based nanofertilizers, Eichhornia crassipes, greener agriculture, sustainable ecosystem, water hyacinth

Procedia PDF Downloads 26
14 Multiaxial Stress Based High Cycle Fatigue Model for Adhesive Joint Interfaces

Authors: Martin Alexander Eder, Sergei Semenov

Abstract:

Many glass-epoxy composite structures, such as large utility wind turbine rotor blades (WTBs), comprise adhesive joints with typically thick bond lines used to connect the different components during assembly. Performance optimization of rotor blades to increase power output while simultaneously maintaining high stiffness-to-mass ratios entails intricate geometries in conjunction with complex anisotropic material behavior. Consequently, adhesive joints in WTBs are subject to multiaxial stress states with significant stress gradients depending on the local joint geometry. Moreover, the dynamic aero-elastic interaction of the WTB with the airflow generates non-proportional, variable-amplitude stress histories in the material. Experience shows that a prominent failure type in WTBs is high cycle fatigue failure of adhesive bond line interfaces, which has in fact over time developed into a design driver as WTB sizes increase rapidly. Structural optimization employed at an early design stage therefore sets high demands on computationally efficient interface fatigue models capable of predicting the critical locations prone to interface failure. The numerical stress-based interface fatigue model presented in this work uses the Drucker-Prager criterion to compute three different damage indices corresponding to the two interface shear tractions and the outward normal traction. The two-parameter Drucker-Prager model was chosen because of its ability to consider shear strength enhancement under compression and shear strength reduction under tension. The governing interface damage index is taken as the maximum of the triple. The damage indices are computed through the well-known linear Palmgren-Miner rule after separate rainflow counting of the equivalent shear stress history and the equivalent pure normal stress history. The equivalent stress signals are obtained by self-similar scaling of the Drucker-Prager surface, whose shape is defined by the uniaxial tensile strength and the shear strength, such that it intersects with the stress point at every time step. This approach implicitly assumes that the damage caused by the prevailing multiaxial stress state is the same as the damage caused by an amplified equivalent uniaxial stress state in the three interface directions. The model was implemented as a Python plug-in for the commercially available finite element code Abaqus for use with solid elements. The model was used to predict the interface damage of an adhesively bonded, tapered glass-epoxy composite cantilever I-beam tested by LM Wind Power under constant-amplitude compression-compression tip load in the high cycle fatigue regime. Results show that the model was able to predict the location of debonding in the adhesive interface between the webfoot and the cap. Moreover, with a set of two different constant life diagrams, namely in shear and tension, it was possible to predict both the fatigue lifetime and the failure mode of the sub-component with reasonable accuracy. It can be concluded that the fidelity, robustness and computational efficiency of the proposed model make it especially suitable for rapid fatigue damage screening of large 3D finite element models subject to complex dynamic load histories.
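
To make the damage-accumulation step concrete, the sketch below rainflow-counts a synthetic equivalent stress history and sums the linear Palmgren-Miner damage against an illustrative Basquin-type S-N curve. The `rainflow` package, the S-N parameters and the stress history are assumptions for illustration, not the calibrated interface model.

```python
import rainflow   # rainflow-counting package (pip install rainflow), assumed available

def cycles_to_failure(stress_range, s_ref=30.0, n_ref=2e6, m=10.0):
    """Illustrative Basquin-type S-N curve: N = n_ref * (s_ref / stress_range)^m."""
    return n_ref * (s_ref / stress_range) ** m

def miner_damage(stress_history):
    """Linear Palmgren-Miner damage sum over rainflow-counted cycles."""
    damage = 0.0
    for stress_range, count in rainflow.count_cycles(stress_history):
        if stress_range > 0.0:
            damage += count / cycles_to_failure(stress_range)
    return damage

# Synthetic equivalent shear-stress history (MPa) standing in for one traction direction;
# the governing index would be the maximum of the three directional damage indices.
history = [0.0, 12.0, -3.0, 18.0, 1.0, 25.0, -5.0, 14.0, 0.0]
print(f"Miner damage index: {miner_damage(history):.3e}")
```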

Keywords: adhesive, fatigue, interface, multiaxial stress

Procedia PDF Downloads 140
13 Oxidation Behavior of Ferritic Stainless Steel Interconnects Modified Using Nanoparticles of Rare-Earth Elements under Operating Conditions Specific to Solid Oxide Electrolyzer Cells

Authors: Łukasz Mazur, Kamil Domaradzki, Bartosz Kamecki, Justyna Ignaczak, Sebastian Molin, Aleksander Gil, Tomasz Brylewski

Abstract:

The rising global power consumption necessitates the development of new energy storage solutions. Prospective technologies include solid oxide electrolyzer cells (SOECs), which convert surplus electrical energy into hydrogen. An electrolyzer cell consists of a porous anode and cathode and a dense electrolyte. Power output is increased by connecting cells into stacks using interconnects. Interconnects are currently made from high-chromium ferritic steels – for example, Crofer 22 APU – which exhibit high oxidation resistance and a thermal expansion coefficient similar to that of electrode materials. These materials have one disadvantage – their area-specific resistance (ASR) gradually increases due to the formation of a Cr₂O₃ scale on their surface as a result of oxidation. The chromia in the scale also reacts with the water vapor present in the reaction media, forming volatile chromium oxyhydroxides, which in turn react with electrode materials and cause their deterioration. The electrochemical efficiency of SOECs thus decreases. To mitigate this, the interconnect surface can be modified with protective-conducting coatings of spinel or other materials. The high prices of SOEC components – especially the Crofer 22 APU – have prevented their widespread adoption. More inexpensive counterparts therefore need to be found, and their properties need to be enhanced to make them viable. Candidates include the Nirosta 4016/1.4016 low-chromium ferritic steel with a chromium content of just 16.3 wt%. This steel's resistance to high-temperature oxidation was improved by depositing Gd₂O₃ nanoparticles on its surface via either dip coating or electrolysis. Modification with CeO₂ or Ce₀.₉Y₀.₁O₂ nanoparticles deposited by means of spray pyrolysis was also tested. These methods were selected because of their low cost and simplicity of application. The aim of this study was to investigate the oxidation kinetics of Nirosta 4016/1.4016 modified using the aforementioned methods and to subsequently measure the obtained samples' ASR. The samples were oxidized for 100 h in air as well as in air/H₂O and Ar/H₂/H₂O mixtures at 1073 K. Such conditions reflect those found in the anode and cathode operating space during real-life use of SOECs. Phase and chemical composition and the microstructure of oxidation products were determined using XRD and SEM-EDS. ASR was measured over the range of 623-1073 K using a four-point, two-probe DC technique. The results indicate that the applied nanoparticles improve the oxidation resistance and electrical properties of the studied layered systems. The properties of individual systems varied significantly depending on the applied reaction medium. Gd₂O₃ nanoparticles improved oxidation resistance to a greater degree than either CeO₂ or Ce₀.₉Y₀.₁O₂ nanoparticles. On the other hand, the cerium-containing nanoparticles improved electrical properties regardless of the reaction medium. The ASR values of all surface-modified steel samples were below the 0.1 Ω·cm² threshold set for interconnect materials, which was exceeded in the case of the unmodified reference sample. It can be concluded that the applied modifications increased the oxidation resistance of Nirosta 4016/1.4016 to a level that allows its use as an SOEC interconnect material. Acknowledgments: Funding from the research project supported by the program "Excellence Initiative – Research University" for the AGH University of Krakow is gratefully acknowledged (TB).
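
The 0.1 Ω·cm² acceptance criterion mentioned above follows directly from the four-point, two-probe DC measurement: the measured resistance is scaled by the contact area. A minimal sketch with hypothetical readings (not the study's data) might look as follows:

def area_specific_resistance(voltage_v, current_a, contact_area_cm2):
    # ASR = R * A, with R = V / I taken from the DC measurement
    return (voltage_v / current_a) * contact_area_cm2

# Hypothetical reading at 1073 K: 2.4 mV drop at 100 mA across a 0.5 cm2 contact
asr = area_specific_resistance(2.4e-3, 0.100, 0.5)
print(f"ASR = {asr * 1000:.1f} mOhm.cm2 ->",
      "within" if asr < 0.1 else "exceeds", "the 0.1 Ohm.cm2 interconnect threshold")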

Keywords: cerium oxide, ferritic stainless steel, gadolinium oxide, interconnect, SOEC

Procedia PDF Downloads 46
12 Circular Tool and Dynamic Approach to Grow the Entrepreneurship of Macroeconomic Metabolism

Authors: Maria Areias, Diogo Simões, Ana Figueiredo, Anishur Rahman, Filipa Figueiredo, João Nunes

Abstract:

It is expected that close to 7 billion people will live in urban areas by 2050. In order to improve the sustainability of territories and their transition towards a circular economy, it is necessary to understand their metabolism and to promote and guide the entrepreneurial response. The study of a macroeconomic metabolism involves the quantification of the inputs, outputs and storage of energy, water, materials and wastes for an urban region. This quantification and analysis represent an opportunity for the promotion of green entrepreneurship. There are several methods to assess the environmental impacts of an urban territory, such as human and environmental risk assessment (HERA), life cycle assessment (LCA), ecological footprint assessment (EF), material flow analysis (MFA), physical input-output tables (PIOT), ecological network analysis (ENA) and multicriteria decision analysis (MCDA), among others. However, no consensus exists about which of these assessment methods is best suited to analyzing the sustainability of such complex systems. Taking into account the weaknesses and needs identified, the CiiM – Circular Innovation Inter-Municipality – project aims to define a uniform and globally accepted methodology through the integration of various methodologies and dynamic approaches, increasing the efficiency of macroeconomic metabolisms and promoting entrepreneurship in a circular economy. The pilot territory considered in the CiiM project has a total area of 969,428 ha and a total of 897,256 inhabitants (about 41% of the population of the Center Region). The main economic activities in the pilot territory, which contribute to a gross domestic product of 14.4 billion euros, are: social support activities for the elderly; construction of buildings; road transport of goods; retailing in supermarkets and hypermarkets; mass production of other garments; inpatient health facilities; and the manufacture of other components and accessories for motor vehicles. The region's business network is mostly made up of micro and small companies (similar to the Central Region of Portugal), with a total of 53,708 companies identified in the CIM Region of Coimbra (39 large companies), 28,146 in the CIM Viseu Dão Lafões (22 large companies) and 24,953 in CIM Beiras and Serra da Estrela (13 large companies). The database was constructed from data available from the National Institute of Statistics (INE), the General Directorate of Energy and Geology (DGEG), Eurostat, Pordata, the Strategy and Planning Office (GEP), the Portuguese Environment Agency (APA), the Commission for Coordination and Regional Development (CCDR) and the Inter-municipal Communities (CIM), as well as dedicated databases. In addition to the collection of statistical data, it was necessary to identify and characterize the different stakeholder groups in the pilot territory that are relevant to the different metabolism components under analysis. The CiiM project also adds the potential of a Geographic Information System (GIS) so that it will be possible to obtain geospatial results for the territorial metabolisms (rural and urban) of the pilot region. This platform will be a powerful tool for visualizing the flows of products/services that occur within the region and will support stakeholders in improving their circular performance and identifying new business ideas and symbiotic partnerships.
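
At its core, the metabolism quantification described above is a flow balance per category: the net addition to stock is the difference between inputs and outputs. A minimal sketch with purely illustrative figures (not project data):

# Hypothetical annual flows for the pilot territory; all figures are illustrative only
flows = {
    "materials (kt/yr)": {"inputs": 5200.0, "outputs": 3100.0},
    "water (Mm3/yr)":    {"inputs": 680.0,  "outputs": 665.0},
    "energy (ktoe/yr)":  {"inputs": 2400.0, "outputs": 2380.0},
}

# Simple metabolism balance: net addition to stock = inputs - outputs
for category, f in flows.items():
    stock_change = f["inputs"] - f["outputs"]
    print(f"{category}: net addition to stock = {stock_change:.0f}")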

Keywords: circular economy tools, life cycle assessment, macroeconomic metabolism, multicriteria decision analysis, decision support tools, circular entrepreneurship, industrial and regional symbiosis

Procedia PDF Downloads 65
11 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the past decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA. Accordingly, identifying the tissue origin of these DNA fragments from the plasma can result in faster and more accurate disease diagnosis and more precise treatment protocols. Open chromatin regions are important epigenetic features of DNA that reflect cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to substantial cost and time. To overcome these limitations, the idea of predicting open chromatin regions (OCRs) from whole genome sequencing (WGS) is of particular importance. In this regard, we propose a computational approach for predicting open chromatin regions, as an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to the proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method applies signal processing to the sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering. To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions related to human blood samples in the ATAC-DB database. The percentage of overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As is evident, OCRs are mostly located at the transcription start sites (TSS) of genes. We therefore compared the predicted OCRs with the human TSS regions obtained from refTSS, which showed concordance of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to the existence of several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been one attempt to apply it, leading to a tool named OCRDetector, with some restrictions such as the need for high-depth cfDNA WGS data, prior information about OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised manner, resulting in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
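
As a rough illustration of the depth-to-clusters pipeline described above (normalization, Fourier features, similarity graph, two-way partition), the toy sketch below uses off-the-shelf spectral clustering as a stand-in for the authors' linear-programming graph cut; the simulated depth signal and all parameter choices are hypothetical:

import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)

# Hypothetical per-bin cfDNA sequencing depth along a chromosome (toy data)
depth = rng.poisson(lam=30, size=512).astype(float)
depth[100:140] *= 0.6   # simulated nucleosome-depleted, open chromatin-like dip

# 1) count normalization
norm = (depth - depth.mean()) / depth.std()

# 2) short-window Fourier features around each bin
win = 16
feats = np.array([np.abs(np.fft.rfft(norm[i:i + win])) for i in range(len(norm) - win)])

# 3) similarity graph between bins and 4) two-way partition (stand-in for the LP graph cut)
labels = SpectralClustering(n_clusters=2, affinity="nearest_neighbors",
                            n_neighbors=10, random_state=0).fit_predict(feats)
print("bins assigned to cluster 0:", int((labels == 0).sum()))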

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 110
10 Modelling Farmer’s Perception and Intention to Join Cashew Marketing Cooperatives: An Expanded Version of the Theory of Planned Behaviour

Authors: Gospel Iyioku, Jana Mazancova, Jiri Hejkrlik

Abstract:

The “Agricultural Promotion Policy (2016–2020)” represents a strategic initiative by the Nigerian government to address domestic food shortages and the challenges in exporting products at the required quality standards. Hindered by an inefficient system for setting and enforcing food quality standards, coupled with a lack of market knowledge, the Federal Ministry of Agriculture and Rural Development (FMARD) aims to enhance support for the production and activities of key crops like cashew. By collaborating with farmers, processors, investors, and stakeholders in the cashew sector, the policy seeks to define and uphold high-quality standards across the cashew value chain. Given the challenges and opportunities faced by Nigerian cashew farmers, active participation in cashew marketing groups becomes imperative. These groups serve as essential platforms for farmers to collectively navigate market intricacies, access resources, share knowledge, improve output quality, and bolster their overall bargaining power. Through engagement in these cooperative initiatives, farmers not only boost their economic prospects but can also contribute significantly to the sustainable growth of the cashew industry, fostering resilience and community development. This study explores the perceptions and intentions of farmers regarding their involvement in cashew marketing cooperatives, utilizing an expanded version of the Theory of Planned Behaviour. Drawing insights from a diverse sample of 321 cashew farmers in Southwest Nigeria, the research sheds light on the factors influencing decision-making in cooperative participation. The demographic analysis reveals a diverse landscape, with a substantial presence of middle-aged individuals contributing significantly to the agricultural sector and cashew-related activities emerging as a primary income source for a substantial proportion (23.99%). Employing Structural Equation Modelling (SEM) with Maximum Likelihood Robust (MLR) estimation in R, the research elucidates the associations among latent variables. Despite the model’s complexity, the goodness-of-fit indices attest to the validity of the structural model, explaining approximately 40% of the variance in the intention to join cooperatives. Moral norms emerge as a pivotal construct, highlighting the profound influence of ethical considerations in decision-making processes, while perceived behavioural control presents potential challenges in active participation. Attitudes toward joining cooperatives reveal nuanced perspectives, with strong beliefs in enhanced connections with other farmers but varying perceptions on improved access to essential information. The SEM analysis establishes positive and significant effects of moral norms, perceived behavioural control, subjective norms, and attitudes on farmers’ intention to join cooperatives. The knowledge construct positively affects key factors influencing intention, emphasizing the importance of informed decision-making. A supplementary analysis using partial least squares (PLS) SEM corroborates the robustness of our findings, aligning with covariance-based SEM results. This research unveils the determinants of cooperative participation and provides valuable insights for policymakers and practitioners aiming to empower and support this vital demographic in the cashew industry.

Keywords: marketing cooperatives, theory of planned behaviour, structural equation modelling, cashew farmers

Procedia PDF Downloads 34
9 Evaluation of Coal Quality and Geomechanical Moduli Using Core and Geophysical Logs: Study from Middle Permian Barakar Formation of Gondwana Coalfield

Authors: Joyjit Dey, Souvik Sen

Abstract:

The Middle Permian Barakar Formation is the major economic coal-bearing unit of the vast east-west trending Damodar Valley basin of the Gondwana coalfield. Primary sedimentary structures were studied from the core holes, which represent mainly four facies groups: sandstone-dominated facies, sandstone-shale heterolith facies, shale facies and coal facies. A total of eight major coal seams have been identified, with the bottommost seam being the thickest. Laterally continuous coal seams were deposited in the calm, quiet environment of extensive floodplain swamps. Channel sinuosity and lateral channel migration/avulsion result in lateral facies heterogeneity and coal splitting. Geophysical well logs (gamma-resistivity-density logs) have been used to establish the vertical and lateral correlation of the various litho-units field-wide, which reveals the predominance of repetitive fining-upward cycles. Well log data, being a permanent record, offer a strong foundation for log-based property evaluation and help characterize depositional units in terms of lateral and vertical heterogeneity. Low gamma, high resistivity and low density are the typical coal seam signatures in geophysical logs. Here, we have used a density cutoff of 1.6 g/cc as the primary discriminator of coal, and this has been employed to compute various coal assay parameters, namely ash, fixed carbon, moisture, volatile content, cleat porosity and vitrinite reflectance (VRo%), which were calibrated against laboratory-based measurements. The study shows that ash content and VRo% increase from west to east (towards the basin margin), while fixed carbon, moisture and volatile content increase towards the west, indicating improved coal quality westwards. Seam-wise cleat porosity decreases from east to west; this would be an effect of overburden, as overburden pressure increases westward with the deepening of the basin, placing a thicker sediment package on the western side of the study area. Coal is a porous, viscoelastic material in which velocity and strain both change nonlinearly with stress, especially for stress applied perpendicular to the bedding plane. Usually, the coal seam has a high velocity contrast relative to its neighboring layers. Despite extensive discussion of the maceral and chemical properties of coal, its elastic characteristics have received comparatively little attention. The measurement of the elastic constants of coal presents many difficulties: sample-to-sample inhomogeneity, fragility, and velocity dependence on stress, orientation, humidity, and chemical content. In this study, the empirical equation VS = 0.80VP − 0.86 has been used to model shear velocity from compressional velocity, and the resulting velocities have been used to compute various geomechanical moduli. Geomechanical analysis yields a Poisson's ratio of 0.348 for the coals. The average bulk modulus is 3.97 GPa, while the average shear modulus and Young's modulus are 1.34 and 3.59 GPa, respectively. These Middle Permian Barakar coals show an average uniaxial compressive strength (UCS) of 23.84 MPa, a cohesive strength of 4.97 MPa and a friction coefficient of 0.46. The output values of the log-based proximate parameters and geomechanical moduli suggest a medium-volatile bituminous grade for the studied coal seams, which is confirmed by the laboratory-based core study as well.
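
The moduli quoted above follow from standard isotropic elasticity relations once Vp, Vs and bulk density are known. A minimal sketch using the paper's Vs-Vp relation and hypothetical log values (not the study's data) is:

import numpy as np

def coal_moduli(vp_km_s, rho_g_cc):
    # Elastic moduli from log-derived Vp and density, using Vs = 0.80*Vp - 0.86 (km/s)
    vs_km_s = 0.80 * vp_km_s - 0.86
    vp, vs = vp_km_s * 1e3, vs_km_s * 1e3            # velocities in m/s
    rho = rho_g_cc * 1e3                             # density in kg/m^3
    g = rho * vs**2                                  # shear modulus, Pa
    k = rho * (vp**2 - (4.0 / 3.0) * vs**2)          # bulk modulus, Pa
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2)) # Poisson's ratio
    e = 2 * g * (1 + nu)                             # Young's modulus, Pa
    return nu, k / 1e9, g / 1e9, e / 1e9             # ratio, then GPa

# Hypothetical coal-seam log values: Vp = 2.4 km/s, bulk density = 1.5 g/cc
nu, k_gpa, g_gpa, e_gpa = coal_moduli(2.4, 1.5)
print(f"Poisson = {nu:.3f}, K = {k_gpa:.2f} GPa, G = {g_gpa:.2f} GPa, E = {e_gpa:.2f} GPa")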

Keywords: core analysis, coal characterization, geophysical log, geo-mechanical moduli

Procedia PDF Downloads 191
8 Clinically-Based Improvement Project Focused on Reducing Risks Associated with Diabetes Insipidus, Syndrome of Inappropriate ADH, and Cerebral Salt Wasting in Paediatric Post-Neurosurgical and Traumatic Brain Injury Patients

Authors: Shreya Saxena, Felix Miller-Molloy, Phillipa Bowen, Greg Fellows, Elizabeth Bowen

Abstract:

Background: Complex fluid balance abnormalities are well established post-neurosurgery and traumatic brain injury (TBI). The triple-phase response requires fluid management strategies reactive to urine output and sodium homeostasis as patients shift between Diabetes Insipidus (DI) and Syndrome of Inappropriate ADH (SIADH). A relatively high prevalence of the above complications was observed at a tertiary paediatric center within a cohort of paediatric post-neurosurgical and TBI patients. An audit of clinical practice against set institutional guidelines was undertaken and analyzed to understand why this was occurring. Based on those results, new guidelines were developed with structured educational packages for the specialist teams involved. Practice was then re-audited, and the findings were compared. Methods: Two independent audits were conducted across two time periods, pre- and post-guideline change. Primary data were collected retrospectively, including both qualitative and quantitative data sets from the CQUIN neurosurgical database and electronic medical records. All paediatric patients post posterior fossa (PFT) or supratentorial surgery, or with a TBI, were included. A literature review of evidence-based practice, initial audit data, and stakeholder feedback was used to develop new clinical guidelines and nursing standard operating procedures. Compliance with these newly developed guidelines was reassessed, and a thematic, trend-based analysis of the two sets of results was conducted. Results: Audit 1: January 2017–June 2018, n=80; Audit 2: January 2020–June 2021, n=30 (reduced operative capacity due to the COVID-19 pandemic). Overall, improvements in the monitoring of both fluid balance and electrolyte trends were demonstrated; 51% vs. 77% and 78% vs. 94%, respectively. The number of clear fluid management plans documented postoperatively also increased (odds ratio of 4), leading to earlier recognition and management of evolving fluid-balance abnormalities. The local paediatric endocrine team was involved in the care of all complex cases and notified sooner for those considered to be developing DI or SIADH (14% to 35%). However, significant Na fluctuations (>12 mmol/L in 24 hours) remained similar (five vs. six patients), found to be due to complex pituitary-hypothalamic pathology, and the recommended adaptive fluid management strategy was still not always used. Qualitative data regarding usability and understanding of fluid-balance abnormalities and the revised guidelines were obtained from health professionals via surveys and discussion within the specialist teams providing care. The feedback highlighted that the new guidelines provided a more consistent approach to the post-operative care of these patients and were a better platform for communication amongst the different specialist teams involved. A potential limitation of our study is the small sample size for formal analyses; however, this reflects the population that we were investigating, which we cannot control. Conclusion: The revised clinical guidelines, based on audited data, evidence-based literature review and stakeholder consultations, have demonstrated an improvement in understanding of the neuro-endocrine complications that are possible, as well as increased compliance with post-operative monitoring of fluid balance and electrolytes in this cohort of patients. Emphasis has been placed on prevention rather than treatment of DI and SIADH.
Consequently, this has positively impacted patient safety for the center and highlighted the importance of educational awareness and multi-disciplinary team working.

Keywords: post-operative, fluid-balance management, neuro-endocrine complications, paediatric

Procedia PDF Downloads 65
7 Tailoring Piezoelectricity of PVDF Fibers with Voltage Polarity and Humidity in Electrospinning

Authors: Piotr K. Szewczyk, Arkadiusz Gradys, Sungkyun Kim, Luana Persano, Mateusz M. Marzec, Oleksander Kryshtal, Andrzej Bernasik, Sohini Kar-Narayan, Pawel Sajkiewicz, Urszula Stachewicz

Abstract:

Piezoelectric polymers have received great attention in smart textiles, wearables, and flexible electronics. Their potential applications range from devices that can operate without traditional power sources, through self-powered sensors, to implantable biosensors. Semi-crystalline PVDF is often proposed as the main candidate for industrial-scale applications, as it exhibits exceptional energy harvesting efficiency compared to other polymers, combined with high mechanical strength and thermal stability. Many approaches have been proposed for obtaining PVDF rich in the desired β-phase, with electrical poling, thermal annealing, and mechanical stretching being the most prevalent. Electrospinning is a highly tunable technique that provides a one-step process for obtaining highly piezoelectric PVDF fibers without the need for post-treatment. In this study, the influence of voltage polarity and relative humidity on electrospun PVDF fibers was investigated, with the main focus on β-phase content and piezoelectric performance. The morphology and internal structure of the fibers were investigated using scanning (SEM) and transmission (TEM) electron microscopy. Fourier Transform Infrared Spectroscopy (FTIR), wide-angle X-ray scattering (WAXS) and differential scanning calorimetry (DSC) were used to characterize the phase composition of the electrospun PVDF. Additionally, surface chemistry was verified with X-ray photoelectron spectroscopy (XPS). The piezoelectric performance of individual electrospun PVDF fibers was measured using piezoresponse force microscopy (PFM), and the power output from meshes was analyzed via custom-built equipment. To prepare the solution for electrospinning, PVDF pellets were dissolved in a 1:1 dimethylacetamide/acetone mixture to achieve a 24% solution. Fibers were electrospun with a constant voltage of ±15 kV applied to a stainless steel nozzle with an inner diameter of 0.8 mm. The flow rate was kept constant at 6 ml h⁻¹. The electrospinning of PVDF was performed at T = 25°C and relative humidity of 30% and 60% for the PVDF30+/- and PVDF60+/- samples, respectively, in an environmental chamber. SEM and TEM analysis of fibers produced at the lower relative humidity of 30% (PVDF30+/-) showed a smooth surface, in contrast to fibers obtained at 60% relative humidity (PVDF60+/-), which had a wrinkled surface and, additionally, internal voids. XPS results confirmed a lower fluorine content at the surface of PVDF- fibers obtained by electrospinning with negative voltage polarity compared to the PVDF+ fibers obtained with positive voltage polarity. Changes in surface composition measured with XPS were found to influence the piezoelectric performance of the obtained fibers, which was further confirmed by PFM as well as by the custom-built fiber-based piezoelectric generator. For the PVDF60+/- samples, humidity led to an increase in β-phase content in the PVDF fibers, as confirmed by FTIR, WAXS, and DSC measurements, which showed almost two times higher concentrations of β-phase. A combination of negative voltage polarity with high relative humidity led to fibers with the highest β-phase content and the best piezoelectric performance of all investigated samples. This study outlines the possibility of producing electrospun PVDF fibers with tunable piezoelectric performance in a one-step electrospinning process by controlling relative humidity and voltage polarity.
Acknowledgment: This research was conducted within the funding from the Sonata Bis 5 project granted by the National Science Centre, No. 2015/18/E/ST5/00230, and supported by the infrastructure at the International Centre of Electron Microscopy for Materials Science (IC-EM) at AGH University of Science and Technology. The PFM measurements were supported by an STSM Grant from COST Action CA17107.
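
The β-phase contents compared above are typically quantified from the FTIR spectra. Assuming the common Gregorio-type relation between the α- and β-band absorbances (the abstract does not state the exact procedure used), a minimal sketch with made-up absorbances is:

def beta_phase_fraction(a_alpha, a_beta, k_ratio=1.26):
    # Gregorio-type estimate F(beta) = A_beta / (k*A_alpha + A_beta), from FTIR absorbances
    # at the alpha (~766 cm^-1) and beta (~840 cm^-1) bands; k_ratio is the absorption-coefficient ratio
    return a_beta / (k_ratio * a_alpha + a_beta)

# Hypothetical absorbances for a low-humidity and a high-humidity mesh
print(f"F(beta), 30% RH sample: {beta_phase_fraction(0.42, 0.55):.2f}")
print(f"F(beta), 60% RH sample: {beta_phase_fraction(0.28, 0.71):.2f}")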

Keywords: crystallinity, electrospinning, PVDF, voltage polarity

Procedia PDF Downloads 106
6 Critical Factors for Successful Adoption of Land Value Capture Mechanisms – An Exploratory Study Applied to Indian Metro Rail Context

Authors: Anjula Negi, Sanjay Gupta

Abstract:

The paradigms studied point to inadequacies of financial resources, be it to finance metro rail construction, to meet operational revenues or to derive profits in the long term. Funding sustainability remains elusive for much-needed public transport modes, like urban rail or metro rail, to be operated successfully. India embarks upon a sustainable transport journey and has proposed metro rail systems countrywide. As an emerging economic leader, its fiscal constraints are paramount, and the land value capture (LVC) mechanism provides necessary support and innovation toward development. India's metro rail policy promotes multiple methods of financing, including private-sector investment and public-private partnerships. The critical question that remains to be addressed is what factors can make such mechanisms work. Globally, urban rail is noted by many researchers as a revolution in future mobility. In this study, the researchers take a deep dive, by way of literature review and empirical assessment, into the factors that can lead to the adoption of LVC mechanisms. It is understood that the adoption of LVC methods is in the nascent stages in India. Research points to numerous challenges faced by metro rail agencies in raising funding and capturing incremental value. A few issues pertaining to land-based financing are, inter alia: long-term financing, inter-institutional coordination, economic/market suitability, dedicated metro funds, land ownership issues, a piecemeal approach to real estate development, property development legal frameworks, etc. The question under investigation is what parameters can lead to success in the adoption of land value capture (LVC) as a financing mechanism. This research provides insights into key parameters crucial to the adoption of LVC in the context of Indian metro rails. The researchers have studied current forms of LVC mechanisms at various metro rails of the country. This study is significant, as little research applicable to the Indian context is available on the adoption of LVC. Transit agencies, state governments, urban local bodies, policymakers and think tanks, academia, developers, funders, researchers and multilateral agencies may benefit from this research in taking LVC mechanisms forward in practice. The study deems it imperative to explore and understand the key parameters that impact the adoption of LVC. An extensive literature review and ratification by experts working in the metro rail arena were undertaken to arrive at the parameters for the study. Stakeholder consultations in the exploratory factor analysis (EFA) process were undertaken for principal component extraction. Forty-three seasoned and specialized experts, representing various types of stakeholders, rated the likelihood of each parameter in a semi-structured questionnaire. Empirical data were collected on the eighteen chosen parameters, and significant correlations were extracted for descriptive and inferential statistics. Study findings reveal the principal components to be institutional governance framework, spatial planning features, legal frameworks, funding sustainability features and fiscal policy measures. In particular, funding sustainability features highlight the sub-variables of beneficiaries paying and the use of multiple revenue options as drivers of success in LVC adoption. The researchers recommend incorporating these variables at an early stage in design and project structuring for successful adoption of LVC.
This in turn can lead to improvements in the revenue sustainability of a public transport asset and help in making informed transport policy decisions.
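
For readers wishing to reproduce the factor-extraction step on comparable survey data, a minimal sketch using Python's factor_analyzer package is given below; the file name, factor count and rotation are assumptions for illustration only and do not reproduce the study's actual EFA setup:

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

# Hypothetical survey matrix: 43 expert responses x 18 Likert-scaled parameters
responses = pd.read_csv("lvc_expert_survey.csv")   # placeholder file name

kmo_per_item, kmo_total = calculate_kmo(responses)  # sampling-adequacy check
fa = FactorAnalyzer(n_factors=5, rotation="varimax", method="principal")
fa.fit(responses)

loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(f"KMO = {kmo_total:.2f}")
print(loadings.round(2))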

Keywords: exploratory factor analysis, land value capture mechanism, financing metro rails, revenue sustainability, transport policy

Procedia PDF Downloads 56
5 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although a vast literature exists on time-series data processing using machine learning, the challenges faced by industry that lead to unplanned downtime are as follows: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN)-based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtime and consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT, Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian to optimise storage and query performance. The sampling may inadvertently discard values that might contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNNs, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal context, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the Deep Learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's 'process control system' in real time to perform forecasting and classification tasks, helping asset management engineers operate their machines more efficiently and reduce unplanned downtime. A series of trials is planned for this model in the future in other manufacturing industries.
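
One of the features the hybrid model is said to add to the GNN is an entropy estimate of spectral change in the sampled sensor signal. A minimal, self-contained sketch of such a spectral-entropy feature (illustrative only; the window length and signals below are made up) is:

import numpy as np

def spectral_entropy(window):
    # Shannon entropy of the normalized FFT power spectrum of one sensor window
    spectrum = np.abs(np.fft.rfft(window - np.mean(window))) ** 2
    p = spectrum / spectrum.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
steady = np.sin(np.linspace(0, 20 * np.pi, 256)) + 0.05 * rng.standard_normal(256)
erratic = rng.standard_normal(256)   # behaviour change: broadband, noisy signal
print(spectral_entropy(steady), spectral_entropy(erratic))  # the erratic window scores higher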

Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning

Procedia PDF Downloads 116
4 Flood Risk Management in the Semi-Arid Regions of Lebanon - Case Study “Semi Arid Catchments, Ras Baalbeck and Fekha”

Authors: Essam Gooda, Chadi Abdallah, Hamdi Seif, Safaa Baydoun, Rouya Hdeib, Hilal Obeid

Abstract:

Floods are a common natural disaster in the semi-arid regions of Lebanon. They result in damage to human life and deterioration of the environment. Despite their destructive nature and their immense impact on the socio-economy of the region, flash floods have not received adequate attention from policy and decision makers. This is mainly because of poor understanding of the processes involved and of the measures needed to manage the problem. The current understanding of flash floods remains at the level of general concepts; most policy makers have yet to recognize that flash floods are distinctly different from normal riverine floods in terms of causes, propagation, intensity, impacts, predictability, and management. Flash floods are generally not investigated as a separate class of event but are rather reported as part of the overall seasonal flood situation. As a result, Lebanon generally lacks policies, strategies, and plans relating specifically to flash floods. The main objective of this research is to improve flash flood prediction by providing new knowledge and a better understanding of the hydrological processes governing flash floods in the eastern catchments of the El Assi River. This includes developing rainstorm time distribution curves that are unique to this type of study region and analyzing, investigating, and developing a relationship between arid watershed characteristics (including urbanization) and the flood flow frequency of nearby villages in Ras Baalbeck and Fekha. This paper discusses different levels of integration approaches between GIS and hydrological models (HEC-HMS and HEC-RAS) and presents a case study in which all the tasks of creating model input, editing data, running the model, and displaying output results are carried out. The study area corresponds to the East Basin (Ras Baalbeck and Fekha), comprising nearly 350 km² and situated in the Bekaa Valley of Lebanon. The case study presented in this paper uses a database derived from Lebanese Army topographic maps of the region. ArcMap was used to digitize the contour lines, streams and other features from the topographic maps. A digital elevation model (DEM) grid was derived for the study area. The next steps in this research are to incorporate rainfall time series data from the Aarsal, Fekha and Deir El Ahmar stations to build a hydrologic data model within a GIS environment and to combine ArcGIS/ArcMap, HEC-HMS and HEC-RAS models in order to produce a spatial-temporal model for floodplain analysis at a regional scale. In this study, HEC-HMS and SCS methods were chosen to build the hydrologic model of the watershed. The model was then calibrated using the flood event that occurred between the 7th and 9th of May 2014, which is considered exceptionally extreme because of the length of time the flows lasted (15 hours) and the fact that it covered both the Aarsal and Ras Baalbeck watersheds. The strongest reported flood in recent times lasted for only 7 hours and covered only one watershed. The calibrated hydrologic model was then used to build the hydraulic model and to produce flood hazard maps for the region. The HEC-RAS model was used for this purpose, and field trips to the catchments were made in order to calibrate both the hydrologic and hydraulic models. The presented models constitute a flexible procedure for an ungauged watershed. For some storm events the procedure delivers good results, while for others no parameter vectors can be found.
In order to develop a general methodology based on these ideas, further calibration and reconciliation of results across many flood events and catchment properties are required.
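
For reference, the SCS curve-number relation used in the HEC-HMS loss model reduces to a short calculation. The sketch below applies the standard metric form with a hypothetical curve number and storm depths, not the calibrated values for the Aarsal and Ras Baalbeck watersheds:

def scs_runoff_mm(rainfall_mm, curve_number):
    # SCS curve-number runoff: S = 25400/CN - 254 (mm), Ia = 0.2*S, Q = (P - Ia)^2 / (P - Ia + S)
    s = 25400.0 / curve_number - 254.0
    ia = 0.2 * s
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Hypothetical storm totals over a semi-arid catchment with an assumed CN of 85
for p in (25.0, 60.0, 110.0):
    print(p, "mm rainfall ->", round(scs_runoff_mm(p, curve_number=85), 1), "mm runoff")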

Keywords: flood risk management, flash flood, semi-arid region, El Assi River, hazard maps

Procedia PDF Downloads 457
3 Open Science Philosophy, Research and Innovation

Authors: C.Ardil

Abstract:

Open Science translates the understanding and application of various theories and practices in open science philosophy, systems, paradigms and epistemology. Open Science originates with the premise that universal scientific knowledge is a product of a collective scholarly and social collaboration involving all stakeholders and knowledge belongs to the global society. Scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science has the potential to increase the quality, impact and benefits of science and to accelerate advancement of knowledge by making it more reliable, more efficient and accurate, better understandable by society and responsive to societal challenges, and has the potential to enable growth and innovation through reuse of scientific results by all stakeholders at all levels of society, and ultimately contribute to growth and competitiveness of global society. Open Science is a global movement to improve accessibility to and reusability of research practices and outputs. In its broadest definition, it encompasses open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. The implementation of open science provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole. Open Science is the practice of science in such a way that others can collaborate and contribute, where research data, lab notes and other research processes are freely available, under terms that enable reuse, redistribution and reproduction of the research and its underlying data and methods. Open Science represents a novel systematic approach to the scientific process, shifting from the standard practices of publishing research results in scientific publications towards sharing and using all available knowledge at an earlier stage in the research process, based on cooperative work and diffusing scholarly knowledge with no barriers and restrictions. Open Science refers to efforts to make the primary outputs of publicly funded research results (publications and the research data) publicly accessible in digital format with no limitations. Open Science is about extending the principles of openness to the whole research cycle, fostering, sharing and collaboration as early as possible, thus entailing a systemic change to the way science and research is done. Open Science is the ongoing transition in how open research is carried out, disseminated, deployed, and transformed to make scholarly research more open, global, collaborative, creative and closer to society. Open Science involves various movements aiming to remove the barriers for sharing any kind of output, resources, methods or tools, at any stage of the research process. Open Science embraces open access to publications, research data, source software, collaboration, peer review, notebooks, educational resources, monographs, citizen science, or research crowdfunding. The recognition and adoption of open science practices, including open science policies that increase open access to scientific literature and encourage data and code sharing, is increasing in the open science philosophy. 
Revolutionary open science policies are motivated by ethical, moral or utilitarian arguments, such as the right of access to the digital research literature, open source research, the accumulation of science data, research indicators, transparency in academic practice, and reproducibility. Open science philosophy is adopted primarily to demonstrate the benefits of open science practices. Researchers also use open science applications to their own advantage, in order to receive more offers, increase citations, and attract media attention, potential collaborators, career opportunities, donations and funding. In the open science philosophy, open data findings are evidence that open science practices provide significant benefits to researchers in research creation, collaboration, communication, and evaluation compared to more traditional closed science practices. Open science must also address concerns such as the rigor of peer review, practical research matters such as financing and career development, and the sacrifice of author rights. Therefore, researchers are advised to implement open science research within the framework of existing academic evaluation and incentives. As a result, open science research issues are addressed in the areas of publishing, financing, collaboration, resource management and sharing, career development, and the discussion of open science questions and conclusions.

Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data

Procedia PDF Downloads 103
2 Acute Severe Hyponatremia in Patient with Psychogenic Polydipsia, Learning Disability and Epilepsy

Authors: Anisa Suraya Ab Razak, Izza Hayat

Abstract:

Introduction: The diagnosis and management of severe hyponatremia in neuropsychiatric patients present a significant challenge to physicians. Several factors contribute, including diagnostic overshadowing and the attribution of abnormal behavior to intellectual disability or psychiatric conditions. Hyponatremia is the commonest electrolyte abnormality in the inpatient population, ranging from mild/asymptomatic and moderate to severe levels with life-threatening complications such as seizures, coma and death. There are several documented fatal case reports in the literature of severe hyponatremia secondary to psychogenic polydipsia, often diagnosed only at autopsy. This paper presents a case study of acute severe hyponatremia in a neuropsychiatric patient with early diagnosis and admission to intensive care. Case study: A 21-year-old Caucasian male with known epilepsy and learning disability was admitted from residential living with self-terminating generalized tonic-clonic seizures after refusing medications for several weeks. Evidence of superficial head injury was detected on physical examination. His laboratory data demonstrated mild hyponatremia (125 mmol/L). Computed tomography imaging of his brain demonstrated no acute bleed or space-occupying lesion. He exhibited abnormal behavior - restlessness, drinking water from bathroom taps, inability to engage, paranoia, and hypersexuality. No collateral history was available to establish his baseline behavior. He was loaded with intravenous sodium valproate and levetiracetam. Three hours later, he developed vomiting and a generalized tonic-clonic seizure lasting forty seconds. He remained drowsy for several hours with only minimal recovery of consciousness. A repeat set of blood tests demonstrated profound hyponatremia (117 mmol/L). Outcomes: He was referred to intensive care for peripheral intravenous infusion of 2.7% sodium chloride solution with two-hourly laboratory monitoring of sodium concentration. Laboratory monitoring identified dangerously rapid correction of serum sodium concentration, and the hypertonic saline was switched to a 5% dextrose solution to reduce the risk of acute large-volume fluid shifts from the cerebral intracellular compartment to the extracellular compartment. He underwent urethral catheterization and produced 8 liters of urine over 24 hours. Serum sodium concentration remained stable after 24 hours of correction fluids. His GCS recovered to baseline after 48 hours with improvement in behavior - he engaged with healthcare professionals, understood the importance of taking medications, and admitted to illicit drug use and drinking massive amounts of water. He was transferred from high-dependency care to ward level and was initiated on multiple trials of anti-epileptics before achieving seizure-free days two weeks after resolution of the acute hyponatremia. Conclusion: Psychogenic polydipsia is often found in young patients with intellectual disability or psychiatric disorders. Patients drink large volumes of water daily, ranging from ten to forty liters, resulting in acute severe hyponatremia with mortality rates as high as 20%. Poor outcomes are due to challenges faced by physicians in making an early diagnosis and treating acute hyponatremia safely. A low threshold for suspecting water intoxication is required in this population, including patients with known epilepsy. Monitoring urine output proved to be clinically effective in aiding diagnosis.
Early referral and admission to intensive care should be considered for safe correction of sodium concentration while minimizing the risk of fatal complications, e.g., central pontine myelinolysis.

Keywords: epilepsy, psychogenic polydipsia, seizure, severe hyponatremia

Procedia PDF Downloads 98
1 Tackling the Decontamination Challenge: Nanorecycling of Plastic Waste

Authors: Jocelyn Doucet, Jean-Philippe Laviolette, Ali Eslami

Abstract:

The end-of-life management and recycling of polymer wastes remains a key environment issue in on-going efforts to increase resource efficiency and attaining GHG emission reduction targets. Half of all the plastics ever produced were made in the last 13 years, and only about 16% of that plastic waste is collected for recycling, while 25% is incinerated, 40% is landfilled, and 19% is unmanaged and leaks in the environment and waterways. In addition to the plastic collection issue, the UN recently published a report on chemicals in plastics, which adds another layer of challenge when integrating recycled content containing toxic products into new products. To tackle these important issues, innovative solutions are required. Chemical recycling of plastics provides new complementary alternatives to the current recycled plastic market by converting waste material into a high value chemical commodity that can be reintegrated in a variety of applications, making the total market size of the output – virgin-like, high value products - larger than the market size of the input – plastic waste. Access to high-quality feedstock also remains a major obstacle, primarily due to material contamination issues. Pyrowave approaches this challenge with its innovative nano-recycling technology, which purifies polymers at the molecular level, removing undesirable contaminants and restoring the resin to its virgin state without having to depolymerise it. This breakthrough approach expands the range of plastics that can be effectively recycled, including mixed plastics with various contaminants such as lead, inorganic pigments, and flame retardants. The technology allows yields below 100ppm, and purity can be adjusted to an infinitesimal level depending on the customer's specifications. The separation of the polymer and contaminants in Pyrowave's nano-recycling process offers the unique ability to customize the solution on targeted additives and contaminants to be removed based on the difference in molecular size. This precise control enables the attainment of a final polymer purity equivalent to virgin resin. The patented process involves dissolving the contaminated material using a specially formulated solvent, purifying the mixture at the molecular level, and subsequently extracting the solvent to yield a purified polymer resin that can directly be reintegrated in new products without further treatment. Notably, this technology offers simplicity, effectiveness, and flexibility while minimizing environmental impact and preserving valuable resources in the manufacturing circuit. Pyrowave has successfully applied this nano-recycling technology to decontaminate polymers and supply purified, high-quality recycled plastics to critical industries, including food-contact compliance. The technology is low-carbon, electrified, and provides 100% traceable resins with properties identical to those of virgin resins. Additionally, the issue of low recycling rates and the limited market for traditionally hard-to-recycle plastic waste has fueled the need for new complementary alternatives. Chemical recycling, such as Pyrowave's microwave depolymerization, presents a sustainable and efficient solution by converting plastic waste into high-value commodities. By employing microwave catalytic depolymerization, Pyrowave enables a truly circular economy of plastics, particularly in treating polystyrene waste to produce virgin-like styrene monomers. 
This revolutionary approach boasts low energy consumption, high yields, and a reduced carbon footprint. Pyrowave offers a portfolio of sustainable, low-carbon, electric solutions to give plastic waste a second life and paves the way to the new circular economy of plastics. Here, particularly for polystyrene, we show that styrene monomer yields from Pyrowave's polystyrene microwave depolymerization reactor are 2.2 to 1.5 times higher than those of conventional thermal pyrolysis. In addition, we provide a detailed understanding of the microwave-assisted depolymerization by analyzing the effects of microwave power, pyrolysis time, microwave receptor and temperature on styrene product yields. Furthermore, we investigate the life cycle environmental impact of microwave-assisted pyrolysis of polystyrene in commercial-scale production. Finally, it is worth pointing out that Pyrowave is able to treat several tons of polystyrene to produce virgin styrene monomers and to manage waste/contaminated polymeric materials as well, in a truly circular economy.

Keywords: nanorecycling, nanomaterials, plastic recycling, depolymerization

Procedia PDF Downloads 38