Search results for: automatic weather station
492 The Practice of Low Flow Anesthesia to Reduce Carbon Footprints Sustainability Project
Authors: Ahmed Eid, Amita Gupta
Abstract:
Abstract: Background: Medical gases are estimated to contribute 5% of the carbon footprint produced by hospitals; desflurane has the largest impact, but the footprint of all agents increases significantly when they are used with an N₂O admixture. Under the Climate Change Act 2008, we must reduce our carbon emissions by 80% of the 1990 baseline by 2050; NHS carbon emissions have fallen by 18.5% (2007-2017). The NHS Long Term Plan has outlined measures to achieve this objective, including a 2% reduction by transforming anaesthetic practices. Fresh gas flow (FGF) is an important variable that determines the utilization of inhalational agents and can be tightly controlled by the anaesthetist. Aims and Objectives: environmental safety, and identification of areas of high N₂O and volatile agent use across the St Helier operating theatres, with a view to improving current practice. Methods: Data were collected from St Helier operating theatres and retrieved daily from Care Station 650 anaesthetic machines; 60 cases were included in the sample. Collected data comprised average flow rate, amount and type of agent used, duration and type of surgery, and the total amounts of air, O₂, and N₂O used. The AAGBI impact anaesthesia calculator was used to identify the amount of CO₂ produced and the cost per hour for every patient. Reminder emails to staff emphasized the significance of low-flow anaesthesia, departmental meeting presentations aimed at heightening awareness of LFA, and the distribution of AAGBI calculator QR codes in all theatres enabled calculation of volatile anaesthetic consumption and CO₂e after each case, facilitating informed environmental impact assessment. Results: A significant reduction in flow rates was observed in the second sample: 60% of cases used flow rates between 0-1 L/min, which means a great reduction in the consumption of volatile anaesthetics and in CO₂e.
By using LFA we can save money but, most importantly, we can make our practice much greener and help save the planet. Keywords: low flow anesthesia, sustainability project, N₂O, CO₂e
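The AAGBI calculator itself is not reproduced here, but the arithmetic behind a per-hour CO₂e figure can be sketched. The snippet below is an illustrative estimate only: the "3 × vol% × FGF" liquid-consumption rule of thumb, the liquid densities, and the 100-year global warming potentials are assumptions for illustration, not figures taken from the project.

```python
# Illustrative sketch (not the AAGBI calculator itself) of the arithmetic
# behind an hourly CO2e figure. The "3 x vol% x FGF" liquid-consumption
# rule of thumb, the densities, and the 100-year global warming
# potentials below are assumptions for illustration only.

AGENT_DATA = {
    # liquid density (g/mL), 100-year global warming potential (assumed)
    "sevoflurane": {"density": 1.52, "gwp": 130},
    "desflurane": {"density": 1.47, "gwp": 2540},
}

def co2e_kg_per_hour(agent, vol_percent, fgf_l_min):
    """Approximate kg CO2e emitted per hour of anaesthesia."""
    props = AGENT_DATA[agent]
    liquid_ml_per_h = 3.0 * vol_percent * fgf_l_min  # rule-of-thumb vaporizer output
    mass_kg_per_h = liquid_ml_per_h * props["density"] / 1000.0
    return mass_kg_per_h * props["gwp"]

# Halving FGF halves the hourly footprint at the same dialled
# concentration, which is the core rationale for low-flow anaesthesia.
high_flow = co2e_kg_per_hour("desflurane", vol_percent=6.0, fgf_l_min=2.0)
low_flow = co2e_kg_per_hour("desflurane", vol_percent=6.0, fgf_l_min=1.0)
```

The linear dependence on FGF is the point: at a fixed dialled concentration, every litre per minute of fresh gas flow cut translates directly into less liquid agent vaporized and less CO₂e emitted.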
Procedia PDF Downloads 68
491 Road Accident Blackspot Analysis: Development of Decision Criteria for Accident Blackspot Safety Strategies
Authors: Tania Viju, Bimal P., Naseer M. A.
Abstract:
This study aims to develop a conceptual framework for a decision support system (DSS) that helps decision-makers dynamically choose appropriate safety measures for each identified accident blackspot. An accident blackspot is a segment of road where the frequency of accident occurrence is disproportionately greater than on other sections of roadway. According to a report by the World Bank, India accounts for the highest share, eleven percent, of global road accident deaths, with just one percent of the world's vehicles. Hence, in 2015, the Ministry of Road Transport and Highways of India gave prime importance to the rectification of accident blackspots. To enhance road traffic safety and reduce the traffic accident rate, effectively identifying and rectifying accident blackspots is of great importance. This study helps to understand and evaluate the existing methods of accident blackspot identification and prediction used around the world and their application to Indian roadways. The decision support system, with the help of IoT, ICT, and smart systems, acts as a management and planning tool for the government for employing efficient and cost-effective rectification strategies. In order to develop decision criteria, several factors, in terms of quantitative as well as qualitative data, that influence the safety conditions of the road are analyzed. Factors include past accident severity data, occurrence time, light, weather and road conditions, visibility, driver conditions, junction type, land use, road markings and signs, road geometry, etc. The framework conceptualizes decision-making by classifying blackspot stretches based on factors like accident occurrence time and different climatic and road conditions, and by suggesting mitigation measures based on these identified factors.
The decision support system will help the public administration dynamically manage and plan the safety interventions required to enhance the safety of the road network. Keywords: decision support system, dynamic management, road accident blackspots, road safety
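One building block of such a DSS, identifying segments whose accident frequency is disproportionately high, can be sketched in a few lines. The "more than twice the network average" threshold and the segment names below are illustrative assumptions, not the study's actual criterion.

```python
# Minimal sketch of one decision-criterion step: flagging road segments
# whose accident frequency is disproportionately high. The threshold
# (more than twice the network average) and the segment names are
# illustrative assumptions, not the study's criterion.
from statistics import mean

def find_blackspots(accidents_per_segment, factor=2.0):
    """Return segments whose accident count exceeds factor x the network average."""
    avg = mean(accidents_per_segment.values())
    return [seg for seg, n in accidents_per_segment.items() if n > factor * avg]

segments = {"NH66-km12": 3, "NH66-km13": 2, "NH66-km14": 21,
            "NH66-km15": 4, "NH66-km16": 3}
blackspots = find_blackspots(segments)  # the km14 segment stands out
```

A full DSS would weight this frequency screen with the qualitative factors listed above (lighting, junction type, road geometry, and so on) before suggesting a mitigation measure.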
Procedia PDF Downloads 144
490 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru
Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar
Abstract:
Nowadays, heritage building information modeling (HBIM) is considered an efficient tool to represent and manage information of cultural heritage (CH). The basis of this tool relies on a 3D model generally obtained from a cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection among these methods depends on the desired level of development (LOD), level of information (LOI), and grade of generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit, and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural (e.g., regular walls, columns, floors, wall openings) and architectural (e.g., cornices, moldings, and other minor details) elements using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills, and domes. Finally, semantic information (e.g., materials, typology, state of conservation) and pathologies are added within the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple-to-apply process that ensures adequate LOD, LOI, and GOG levels. In addition, the method's easy implementation and its use of only one BIM software package, with its respective plugin for the scan-to-BIM modeling process, mean that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license. Keywords: cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit
Procedia PDF Downloads 142
489 An Exploratory Study to Understand the Economic Opportunities from Climate Change
Authors: Sharvari Parikh
Abstract:
Climate change has always been looked upon as a threat. Increased use of fossil fuels, depletion of biodiversity, certain human activities, and rising levels of greenhouse gas (GHG) emissions are the factors that have caused climate change. Climate change is creating new risks and aggravating existing ones. The paper focuses on breaking the stereotypical perception of climate change and draws attention to its constructive side. Studies around the world have concluded that climate change has provided us with many untapped opportunities. The next 15 years will be crucial, as it is in our hands whether we grab these opportunities or let the situation get worse. The world stands at a stage where we cannot think of choosing between averting climate change and promoting growth and development. In fact, the solution to climate change itself carries economic opportunities. The data evidence in the paper shows how we can create the opportunity to improve the lives of the world's population at large through structural change that promotes environmentally friendly investments. Rising investment in green energy and increased demand for climate-friendly products offer ample employment opportunities. The old technologies and machinery employed today lack efficiency and demand huge maintenance, because of which we face high production costs. These costs can be brought down drastically by adopting green technologies, which are more accessible and affordable. The overall GDP of the world has been heavily affected by the problems arising from increasingly severe weather. Shifting to a green economy can not only eliminate these costs but also build a sound economy. Accelerating the economy towards a low-carbon future can lessen burdens such as fossil fuel subsidies, public debt, unemployment, and poverty, and reduce healthcare expenses.
It is clear that the world will be dragged into a 'darker phase' if the current trends of fossil fuel and carbon consumption continue. Switching to a green economy is the only way we can lift the world out of this darker phase. Climate change has opened the gates for a 'green and clean economy'. It will also bring the countries of the world together in achieving the common goal of a green economy. Keywords: climate change, economic opportunities, green economy, green technology
Procedia PDF Downloads 243
488 Evaluation of IMERG Performance at Estimating the Rainfall Properties through Convective and Stratiform Rain Events in a Semi-Arid Region of Mexico
Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez
Abstract:
Rain varies greatly in its duration, intensity, and spatial coverage, so it is important to have sub-daily rainfall data for various applications, including risk prevention. However, ground measurements are limited by the low and irregular density of rain gauges. An alternative to this problem is Satellite Precipitation Products (SPPs) such as IMERG, which use passive microwave and infrared sensors to estimate rainfall; however, these SPPs have to be validated before their application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) final run V06B SPP in a semi-arid region of Mexico, using sub-daily data from 4 automatic rain gauges (pluviographs) for October 2019 and June to September 2021, and using the Minimum Inter-event Time (MIT) criterion to separate unique rain events with a dry period of 10 hrs, for the purpose of evaluating the rainfall properties (depth, duration, and intensity). Point-to-pixel analysis and continuous, categorical, and volumetric statistical metrics were used. Results show that IMERG is capable of estimating rainfall depth with a slight overestimation but is unable to identify the real duration and intensity of rain events, showing large overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events; the rest were stratiform rain events, classified by the depth magnitude variation between IMERG pixels and pluviographs. IMERG showed poorer performance at detecting the former but performed well at estimating stratiform rain events, which originate from cold fronts. Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation
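The MIT criterion described above can be sketched directly: consecutive rainfall records separated by a dry period of 10 hours or more start a new event. The timestamps and depths below are illustrative, not the study's data.

```python
# Sketch of the Minimum Inter-event Time (MIT) criterion: consecutive
# records separated by a dry period of 10 h or more start a new rain
# event. Timestamps and depths are illustrative, not the study's data.
from datetime import datetime, timedelta

MIT = timedelta(hours=10)

def split_events(records):
    """records: (timestamp, depth_mm) pairs sorted by time -> list of events."""
    events = []
    for ts, depth in records:
        if events and ts - events[-1][-1][0] < MIT:
            events[-1].append((ts, depth))  # gap shorter than MIT: same event
        else:
            events.append([(ts, depth)])    # dry spell of MIT or more: new event
    return events

obs = [(datetime(2021, 7, 1, 2), 4.0),
       (datetime(2021, 7, 1, 3), 2.5),    # 1 h gap -> same event
       (datetime(2021, 7, 1, 20), 6.0)]   # 17 h gap -> new event
events = split_events(obs)
depth_first = sum(d for _, d in events[0])  # event depth in mm
```

Once events are separated this way, the three properties evaluated in the study follow directly: depth is the sum over the event, duration is last minus first timestamp, and mean intensity is depth over duration.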
Procedia PDF Downloads 69
487 Disabilities in Railways: Proposed Changes to the Design of Railway Compartments for the Inclusion of Differently Abled Persons
Authors: Bathmajaa Muralisankar
Abstract:
As much as railway station infrastructure designs and ticket-booking norms have been changed to facilitate use by differently abled persons, the railway train compartments themselves have not been made user-friendly for differently abled persons. Owing to safety concerns, dependency on others for their travel, and fear of isolation, differently abled people prefer not to travel by train. Rather than providing a dedicated compartment open only to the differently abled, including them with others in the normal compartment (with the proposed modifications discussed here) will make them feel secure and make for an enhanced travel experience. This approach also represents the most practical way to include a particular category of people in mainstream society. Lowering the height of the compartment doors and providing a wider entrance with a ramp will provide easy entry for those using wheelchairs. Likewise, removing the first two alternate rows and the first two side seats will not only widen the passage and increase seating space but also improve the wheelchair turning radius. This will help them travel without having to depend on others. Seating arrangements may be made to accommodate their family members near them instead of isolating the differently abled in a separate compartment. According to the present ticket-booking regulations of the Indian Railways, three to four disabled persons may travel without their family, or one to two along with their family, and these numbers may be increased or reduced. To help visually challenged and hearing-impaired persons, in addition to the provision of special instruments, railings, and textured footpaths and flooring, the seat numbers above the seats may be set in metal or plastic as outward projections so that the visually impaired can touch and feel the numbers. Braille boards may be included at the entrance to the compartment, along with seat numbers in the aforementioned projected manner.
These seat numbers may also be designed as buttons which, when pressed, announce the seat number in the applicable local language as well as English. Emergency buttons, rather than emergency chains, within easy reach of disabled passengers will also help them. Keywords: dependency, differently abled, inclusion, mainstream society
Procedia PDF Downloads 254
486 Hedgerow Detection and Characterization Using Very High Spatial Resolution SAR Data
Authors: Saeid Gharechelou, Stuart Green, Fiona Cawkwell
Abstract:
Hedgerows play an important role in a wide range of ecological habitats, landscape and agricultural management, carbon sequestration, and wood production. Accurate hedgerow detection using satellite imagery is a challenging problem in remote sensing because, spatially, a hedge is very similar to a linear object such as a road and, from a spectral viewpoint, very similar to a forest. Remote sensors with very high spatial resolution (VHR) have recently enabled the automatic detection of hedges through the acquisition of images with sufficient spectral and spatial resolution. Indeed, recent VHR remote sensing data have provided the opportunity to detect hedgerows as line features, but difficulties remain in monitoring their characterization at the landscape scale. This research uses TerraSAR-X Spotlight and Staring mode data with 3-5 m resolution, acquired in the wet and dry seasons of 2014-2015, at the test site of Fermoy, Ireland, to detect hedgerows. Both dual polarizations (HH/VV) of the Spotlight data are used for hedgerow detection. Various SAR image-analysis techniques are applied in a trial-and-error manner, integrating classification algorithms such as texture analysis, support vector machines, k-means, and random forest, to detect hedgerows and characterize them. We apply Shannon entropy (ShE) and backscattering analysis of single and double bounce in a polarimetric analysis to carry out object-oriented classification and finally extract the hedgerow network. Results are still in progress, and other methods still need to be applied to find the best approach for the study area. This ongoing research presents only the preliminary finding that polarimetric TSX imagery can potentially detect hedgerows. Keywords: TerraSAR-X, hedgerow detection, high resolution SAR image, dual polarization, polarimetric analysis
Procedia PDF Downloads 230
485 Review on Implementation of Artificial Intelligence and Machine Learning for Controlling Traffic and Avoiding Accidents
Authors: Neha Singh, Shristi Singh
Abstract:
Accidents involving motor vehicles are likely to cause serious injuries and fatalities. They also bring a host of other persistent issues, such as the regular loss of life and goods. To solve these issues, appropriate measures must be implemented, such as establishing an autonomous incident detection system that makes use of machine learning and artificial intelligence. This article presents an overview of artificial intelligence and machine learning in autonomous incident detection systems aimed at reducing traffic accidents. The paper explores the major issues, prospective solutions, and uses of artificial intelligence and machine learning in road transportation systems for minimising traffic accidents. There is extensive discussion of additional, fresh, and developing approaches that make accidents in the transportation industry less frequent. The study is structured around the following subtopics: traffic management using machine learning and artificial intelligence, and incident detection with these two technologies. The Internet of Vehicles and vehicular ad hoc networks, the use of wireless communication technologies such as 5G networks, and the use of machine learning and artificial intelligence for planning road transportation systems are elaborated. In addition, safety is the primary concern of road transportation. Route optimization, cargo volume forecasting, predictive fleet maintenance, real-time vehicle tracking, and traffic management, according to the review's key conclusions, are essential for ensuring the safety of road transportation networks. In addition to highlighting research trends, unanswered problems, and key research conclusions, the study also discusses the difficulties in applying artificial intelligence to road transport systems.
The work may serve as a resource for planning and managing road transportation systems. Keywords: artificial intelligence, machine learning, incident detector, road transport systems, traffic management, automatic incident detection, deep learning
Procedia PDF Downloads 112
484 Estimation of Snow and Ice Melt Contributions to Discharge from the Glacierized Hunza River Basin, Karakoram, Pakistan
Authors: Syed Hammad Ali, Rijan Bhakta Kayastha, Danial Hashmi, Richard Armstrong, Ahuti Shrestha, Iram Bano, Javed Hassan
Abstract:
This paper presents the results of a semi-distributed modified positive degree-day model (MPDDM) for estimating snow and ice melt contributions to discharge from the glacierized Hunza River basin, Pakistan. The model uses daily temperature data, daily precipitation data, and positive degree-day factors for snow and ice melt. The model is calibrated for the period 1995-2001 and validated for 2002-2013, and demonstrates close agreement between observed and simulated discharge, with Nash-Sutcliffe efficiencies of 0.90 and 0.88, respectively. Furthermore, temperature and precipitation data projected by the Weather Research and Forecasting model for 2016-2050 are used for representative concentration pathways RCP4.5 and RCP8.5, with bias correction performed using a statistical approach for future discharge estimation. No drastic changes in future discharge are predicted for these emissions scenarios. The aggregate snow-ice melt contribution is 39% of total discharge over the period 1993-2013. The snow-ice melt contribution ranges from 35% to 63% during the high-flow period (May to October), which constitutes 89% of annual discharge; in the low-flow period (November to April) it ranges from 0.02% to 17%, constituting 11% of annual discharge. The snow-ice melt contribution to total discharge will increase gradually in the future, reaching up to 45% in 2041-2050. From a sensitivity analysis, it is found that the combination of a 2°C temperature rise and a 20% increase in precipitation yields a 10% increase in discharge.
The study allows us to evaluate the impact of climate change in such basins and is also useful for future prediction of discharge to define hydropower potential, to inform other water resource management in the area, to understand future changes in the snow-ice melt contribution to discharge, and to offer a possible evaluation of future water quantity and availability. Keywords: climate variability, future discharge projection, positive degree day, regional climate model, water resource management
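The positive degree-day relation at the core of models like the MPDDM can be sketched in a few lines: daily melt equals a degree-day factor times the positive part of the daily mean temperature. The factor values below are illustrative assumptions, not the calibrated ones from the study.

```python
# Minimal sketch of the positive degree-day relation underlying the
# MPDDM: daily melt is a degree-day factor times the positive part of
# the daily mean temperature. Factor values are illustrative, not the
# calibrated ones from the study.

DDF_SNOW = 5.0  # mm w.e. per positive degree-day (assumed)
DDF_ICE = 7.0   # mm w.e. per positive degree-day (assumed)

def daily_melt(t_mean_c, surface):
    """Melt (mm water equivalent) for one day with mean temperature t_mean_c."""
    ddf = DDF_SNOW if surface == "snow" else DDF_ICE
    return ddf * max(t_mean_c, 0.0)  # no melt on sub-zero days

# Summing over the days of a season gives that surface's melt contribution.
season_melt = sum(daily_melt(t, "ice") for t in [-2.0, 1.5, 4.0, 0.0])
```

In a semi-distributed model, this sum is evaluated per elevation band with separate factors for snow and ice, and the result is routed to the basin outlet alongside rain and baseflow.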
Procedia PDF Downloads 290
483 Spatial Distribution, Characteristics, and Pollution Risk Assessment of Microplastics in Sediments from Karnaphuli River Estuary, Bangladesh
Authors: Md. Refat Jahan Rakib, M. Belal Hossain, Rakesh Kumar, Md. Akram Ullah, Sultan Al Nahian, Nazmun Naher Rima, Tasrina Rabia Choudhury, Samia Islam Liba, Jimmy Yu, Mayeen Uddin Khandaker, Abdelmoneim Sulieman, Mohamed Mahmoud Sayed
Abstract:
Microplastics (MPs) have become an emerging global pollutant due to their widespread dispersion and potential threats to marine ecosystems. However, studies on MPs in the estuarine and coastal ecosystems of Bangladesh are very limited or not available. In this study, we conducted the first investigation of the abundance, distribution, characteristics, and potential risk of microplastics in the sediment of the Karnaphuli River estuary, Bangladesh. Microplastic particles were extracted from sediments of 30 stations along the estuary by density separation and then enumerated and characterized using a stereomicroscope and Fourier Transform Infrared (FT-IR) spectroscopy. In the collected sediment, the number of MPs varied from 22.29 - 59.5 items kg−1 of dry weight (DW), with an average of 1177 particles kg−1 DW. The mean abundance was higher in the downstream reaches and on the left bank of the estuary, where the predominant shape, colour, and size of MPs were films (35%), white (19%), and >5000 μm (19%), respectively. The main polymer types were polyethylene terephthalate, polystyrene, polyethylene, cellulose, and nylon. MPs were found to pose risks (low to high) in the sediment of the estuary, with the highest risk occurring at one station near a sewage outlet, according to the results of risk analyses using the pollution risk index (PRI), polymer risk index (H), contamination factors (CFs), and pollution load index (PLI). The single-value index PLI clearly demonstrated that all sampling sites were considerably polluted (PLI >1) with microplastics. H values showed that toxic polymers, even in lower proportions, possess higher polymeric hazard scores, and vice versa. This investigation uncovered new insights into the status of MPs in the sediments of the Karnaphuli River estuary, laying the groundwork for future research on, and the control and management of, microplastic pollution. Keywords: microplastics, polymers, pollution risk assessment, Karnaphuli estuary
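The PLI criterion applied above has a compact form: it is the geometric mean of the contamination factors, with values above 1 read as considerable pollution. The CF values in the sketch below are illustrative, not the study's measurements.

```python
# Sketch of the pollution load index (PLI): the geometric mean of
# contamination factors (CF), with PLI > 1 read as considerable
# pollution. CF values below are illustrative, not the study's data.
from math import prod

def pollution_load_index(cfs):
    """Geometric mean of a list of contamination factors."""
    return prod(cfs) ** (1.0 / len(cfs))

station_cfs = [1.8, 2.4, 1.2, 3.0]
pli = pollution_load_index(station_cfs)
polluted = pli > 1  # the criterion the abstract applies per site
```

Because the geometric mean damps single outliers, one extreme CF cannot dominate the index the way it would in an arithmetic mean, which is one reason PLI is popular as a single-value summary across stations.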
Procedia PDF Downloads 81
482 Development of Wave-Dissipating Block Installation Simulation for Inexperienced Worker Training
Authors: Hao Min Chuah, Tatsuya Yamazaki, Ryosui Iwasawa, Tatsumi Suto
Abstract:
In recent years, with the advancement of digital technology, the movement to introduce so-called ICT (Information and Communication Technology), such as computer and network technology, to civil engineering construction sites is accelerating. As part of this movement, attempts are being made in various situations to reproduce actual sites inside computers and use them for design and construction planning, as well as for training inexperienced engineers. The installation of wave-dissipating blocks on coasts is a type of work that has been carried out by skilled workers based on their years of experience, and it is one of the tasks that is difficult for inexperienced workers to carry out on site. Wave-dissipating blocks are structures designed to protect coasts, beaches, and so on from erosion by reducing the energy of ocean waves. Wave-dissipating blocks usually weigh more than 1 t and are installed suspended from a crane, so it would be time-consuming and costly for inexperienced workers to train on-site. In this paper, therefore, a block installation simulator is developed based on Unity 3D, a game development engine. The simulator computes porosity, defined here as the ratio of the total volume of the wave-dissipating blocks inside the structure to the volume of the ideal final structure. Using this porosity evaluation, the simulator can determine how well the user is able to install the blocks. The voxelization technique is used to calculate the porosity of the structure, simplifying the calculations. Other techniques, such as raycasting and box overlapping, are employed for accurate simulation.
In the near future, the simulator will incorporate an automatic block installation algorithm based on combinatorial optimization, and the user's block installation will be compared with the appropriate installation found by the algorithm. Keywords: 3D simulator, porosity, user interface, voxelization, wave-dissipating blocks
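The voxel-based ratio described above can be sketched as follows: discretize the design volume into a grid, mark the voxels covered by placed blocks, and divide occupied voxels by total voxels. A real simulator would mark voxels with box-overlap tests against each block mesh; here blocks are simplified to axis-aligned boxes of voxel indices, and all values are illustrative.

```python
# Sketch of the voxel-based ratio: discretize the design volume into a
# grid, mark voxels covered by placed blocks, and divide occupied by
# total voxels. Blocks are simplified to axis-aligned, half-open boxes
# of voxel indices; a real simulator would use mesh overlap tests.

def fill_ratio(grid_shape, blocks):
    """blocks: list of ((x0, y0, z0), (x1, y1, z1)) half-open voxel boxes."""
    nx, ny, nz = grid_shape
    occupied = set()
    for (x0, y0, z0), (x1, y1, z1) in blocks:
        for x in range(x0, x1):
            for y in range(y0, y1):
                for z in range(z0, z1):
                    occupied.add((x, y, z))  # overlapping blocks counted once
    return len(occupied) / (nx * ny * nz)

# Two overlapping slabs in a 10x10x10 grid fill 600 of 1000 voxels.
ratio = fill_ratio((10, 10, 10),
                   [((0, 0, 0), (5, 10, 10)), ((4, 0, 0), (6, 10, 10))])
```

Storing occupancy as a set (or a boolean grid) makes the union of overlapping blocks automatic, which is the main simplification voxelization buys over exact solid-geometry volume computations.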
Procedia PDF Downloads 103
481 Formation Flying Design Applied for an Aurora Borealis Monitoring Mission
Authors: Thais Cardoso Franco, Caio Nahuel Sousa Fagonde, Willer Gomes dos Santos
Abstract:
Aurora Borealis is an optical phenomenon composed of luminous events observed in the night skies of the polar regions. It results from disturbances in the magnetosphere caused by the impact of solar wind particles on the Earth's upper atmosphere, channeled by the Earth's magnetic field; the impacts excite atmospheric molecules, which emit across the electromagnetic spectrum, producing the display of lights in the sky. However, several implications of this phenomenon are still under study: high-intensity auroras are often accompanied by geomagnetic storms that cause blackouts on Earth and impair the transmission of signals from Global Navigation Satellite Systems (GNSS). Auroras are also known to occur on other planets and exoplanets, so the activity is an indication of active space weather conditions that can aid in learning about the planetary environment. In order to improve understanding of the phenomenon, this research aims to design a satellite formation-flying solution for collecting and transmitting data for monitoring the aurora borealis in the northern hemisphere, an approach that allows the event to be studied through multipoint data collection in a reduced time interval, enabling analysis from the onset of the phenomenon to its decline. To this end, the ideal number of satellites, the spacing between them, and the ideal topology will be analyzed. In an orbital study, approaches from different altitudes, eccentricities, and inclinations will also be considered. Given that controllers tend to fail at large relative distances between satellites in formation, a study of the efficiency of nonlinear adaptive control methods, in terms of position maintenance and propellant consumption, will be carried out.
The main orbital perturbations considered in the simulation are: non-homogeneity of the terrestrial gravitational field, atmospheric drag, gravitational action of the Sun and the Moon, accelerations due to solar radiation pressure, and relativistic effects. Keywords: formation flying, nonlinear adaptive control method, aurora borealis, adaptive SDRE method
Procedia PDF Downloads 38
480 Resurgence of Influenza A (H1N1) Pdm09 during November 2015 - February 2016, Pakistan
Authors: Nazish Badar
Abstract:
Background: To investigate the resurgent epidemic wave of influenza A(H1N1)pdm09 infections during the 2015-16 influenza season (November 2015 - February 2016), we analysed the epidemiological features of influenza A(H1N1)pdm09-associated hospitalizations and deaths during this period in Pakistan. Methods: Respiratory samples were tested using CDC real-time RT-PCR protocols. Demographic and epidemiological data were analyzed using SPSS. Risk ratios were calculated between age groups to compare patients who were hospitalized and who died due to influenza A(H1N1)pdm09 during this period. Results: A total of 1970 specimens were analyzed; influenza virus was detected in 494 (25%) samples, including 458 (93%) influenza type A and 36 (7%) influenza type B viruses. Among the influenza A viruses, 351 (77%) were A(H1N1)pdm09 and 107 (23%) were A/H3N2. Influenza A(H1N1)pdm09 peaked in January 2016, when 250 (54%) of tested patients were positive. The resurgent wave increased hospitalizations due to pdmH1N1 compared with the rest of the year. Overall, 267 (76%) A(H1N1)pdm09 cases were hospitalized. Adults ≥18 years showed the highest relative risk of hospitalization (1.2). The median interval between symptom onset and hospitalization was five days for all age groups. During this period, a total of 34 laboratory-confirmed deaths associated with pandemic influenza A(H1N1) were reported out of 1970 cases; the case fatality rate was 1.72%. The male-to-female ratio was 2:1 among reported deaths. The majority of the deaths during this period occurred in adults ≥18 years of age. The overall median age of the fatal cases was 42.8 years, with underlying medical conditions. The median number of days from symptom onset was two. The diagnosis upon admission in influenza-associated fatal cases was pneumonia in 53% and acute respiratory distress syndrome in 9 (26%), of whom eight (88%) required mechanical ventilation. Conclusions: The present resurgence of the pandemic virus cannot be attributed to a single factor.
The prolonged cold and dry weather, the possibility of antigenic drift in the virus, and the absence of annual flu vaccination may have played an integrated role in the resurfacing of the pandemic virus. Keywords: influenza A (H1N1)pdm09, resurgence, epidemiology, Pakistan
Procedia PDF Downloads 197
479 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies
Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan
Abstract:
The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, specifically if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor collecting spatial, descriptive, and multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges for tourists in developing countries in making the best decision on the time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist guide in which tourists can search for places of interest based on their requested time of travel. To design the service, a three-tier architecture, comprising data, logical processing, and presentation tiers, has been utilized. For implementing the service, open-source software, client- and server-side programming languages and libraries (such as OpenLayers2, AJAX, and PHP), GeoServer as a map server, and the Web Feature Service (WFS) standard have been used. The result is two distinct browser-based services: one for submitting spatial, descriptive, and multimedia volunteer data, and another for tourists and local officials. Local officials confirm the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine has been designed to enable tourists to find a tourist place based on province, city, and location at a specific time of interest.
Implementing the tourist-guide service with this methodology means that current tourists participate in a free data collection and sharing process for future tourists; data are shared and accessible to all in real time; a blind selection of travel destination is avoided; and, significantly, the cost of providing such services decreases. Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping
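Since the stack above combines GeoServer with the WFS standard, the spatiotemporal search can be pictured as a GetFeature request. The sketch below is illustrative only: the host, layer name, and attribute names are hypothetical placeholders, not the project's actual schema, and `cql_filter` is a GeoServer vendor parameter rather than part of the WFS standard.

```python
# Illustrative sketch of the kind of WFS GetFeature request the tourist
# interface could send to GeoServer. Host, layer name, and attribute
# names are hypothetical placeholders, not the project's schema;
# cql_filter is a GeoServer vendor parameter.
from urllib.parse import urlencode

def build_wfs_query(city, month):
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": "tourism:places",  # hypothetical layer name
        "outputFormat": "application/json",
        "cql_filter": f"city='{city}' AND best_month={month}",  # hypothetical attributes
    }
    return "https://example.org/geoserver/wfs?" + urlencode(params)

url = build_wfs_query("Tehran", 4)
```

In the described architecture, such a request would be issued by AJAX code in the presentation tier, with GeoServer answering from the volunteer-populated data tier.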
Procedia PDF Downloads 98
478 Integrating Insulated Concrete Form (ICF) with Solar-Driven Reverse Osmosis Desalination for Building Integrated Energy Storage in Cold Climates
Authors: Amirhossein Eisapour, Mohammad Emamjome Kashan, Alan S. Fung
Abstract:
This research addresses the pressing global challenges of clean energy and water supply, emphasizing the need for sustainable solutions in the building sector. The research centers on integrating Reverse Osmosis (RO) systems with building energy systems, incorporating Solar Thermal Collectors (STC)/Photovoltaic Thermal (PVT) collectors, water-to-water heat pumps, and Insulated Concrete Form (ICF) based building foundation wall thermal energy storage. The study explores an innovative configuration's effectiveness in addressing water and heating demands through clean energy sources while mitigating the tendency of ICF-based thermal storage to overheat in the cooling season. Analyzing four configurations (STC-ICF, STC-ICF-RO, PVT-ICF, and PVT-ICF-RO), the study conducts a sensitivity analysis on collector area (25% and 50% increases) and weather data (five Canadian cities: Winnipeg, Toronto, Edmonton, Halifax, and Vancouver). Key outcomes highlight the benefits of the integrated RO scenarios: reduced ICF wall temperature, less unwanted heat in the cooling season, lower RO pump consumption, and enhanced solar energy production. The STC-ICF-RO and PVT-ICF-RO systems achieved energy savings of 653 kWh and 131 kWh, respectively, compared to their counterparts without RO integration. Additionally, both systems contributed to lowering the CO2 emissions of the energy system. The calculated payback period of STC-ICF-RO (2 years) affirms the proposed systems' economic viability. Compared to the base system, which does not benefit from ICF and RO integration with the building energy system, the STC-ICF-RO and PVT-ICF-RO demonstrate dramatic energy consumption reductions of 20% and 32%, respectively.
The sensitivity analysis suggests potential system improvements under specific conditions, especially when the introduced energy system is implemented in communities of buildings.
Keywords: insulated concrete form, thermal energy storage, reverse osmosis, building energy systems, solar thermal collector, photovoltaic thermal, heat pump
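The 2-year payback quoted above follows from a simple (undiscounted) payback calculation. The sketch below uses hypothetical cost figures chosen only so the ratio reproduces a 2-year result; the study's actual capital and savings values are not given here:

```python
def simple_payback_years(extra_capital_cost, annual_savings):
    """Simple (undiscounted) payback period in years."""
    if annual_savings <= 0:
        raise ValueError("no payback without positive annual savings")
    return extra_capital_cost / annual_savings

# Hypothetical figures: a $10,000 incremental RO-integration cost recovered
# through $5,000/year in energy and water savings gives a 2-year payback.
print(simple_payback_years(10_000, 5_000))  # 2.0
```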
Procedia PDF Downloads 54
477 Adaptation Nature-Based Solutions: CBA of Woodlands for Flood Risk Management in the Aire Catchment, UK
Authors: Olivia R. Rendon
Abstract:
More than half of the world's population lives in cities; in the UK, for example, 82% of the population was urban by 2013. Cities concentrate valuable and numerous infrastructure and sectors of the national economy, and they are particularly vulnerable to climate change, which will lead to higher damage costs in the future. There is thus a need to develop and invest in adaptation measures for cities to reduce the impact of flooding and other extreme weather events. Recent flood episodes present a significant and growing challenge to the UK, and the estimated cost of urban flood damage is £270 million a year for England and Wales. This study aims to carry out a cost-benefit analysis (CBA) of a nature-based approach to flood risk management in cities, focusing on the city of Leeds and the wider Aire catchment as a case study. Leeds was chosen because it is one of the most flood-vulnerable cities in the UK: over 4,500 properties are currently vulnerable to flooding, and approximately £450 million of direct damage is estimated for a potential major flood from the River Aire. Leeds is also the second largest metropolitan district in England, with a projected population of 770,000 for 2014. So far, the city council has mainly focused its flood risk management efforts on hard infrastructure solutions for the city centre; however, the wider Leeds district is at significant flood risk and could benefit from greener adaptation measures. This study presents cost estimates of a nature-based adaptation approach to flood risk management in Leeds. The land use management estimate is based on costings generated from primary and secondary data. This research contributes findings on the costs of different adaptation measures for flood risk management in a UK city, including the trade-offs and challenges of utilising nature-based solutions.
Results also explore the potential implementation of the adaptation measures in the case study, as well as the challenges of data collection and analysis for adaptation in flood risk management.
Keywords: green infrastructure, ecosystem services, woodland, adaptation, flood risk
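At its core, a CBA like the one described compares discounted streams of costs and benefits. A minimal sketch, with all cash flows illustrative rather than taken from the Leeds study:

```python
def npv(rate, cashflows):
    """Net present value of yearly cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(rate, benefits, costs):
    """Ratio of discounted benefits to discounted costs; > 1 favours the scheme."""
    return npv(rate, benefits) / npv(rate, costs)

# Hypothetical woodland scheme: planting cost up front, avoided flood
# damages accruing in later years (all figures in £k, illustrative only).
costs = [100.0, 5.0, 5.0, 5.0]
benefits = [0.0, 40.0, 45.0, 50.0]
bcr = benefit_cost_ratio(0.035, benefits, costs)
print(f"BCR = {bcr:.2f}")
```

A discount rate of 3.5% is the UK Green Book convention for public appraisal; other rates are equally valid inputs here.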
Procedia PDF Downloads 285
476 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia
Authors: Nguyen-Thanh Son
Abstract:
Flood is among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels, because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs that reduce possible impacts on the country's economy and people's livelihoods. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time-series vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The flood mapping results were verified against ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by the close agreement between the mapped flood area and the ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributed to the El Niño weather phenomenon exacerbated by the impacts of climate change. Although several sources potentially lowered the mapping accuracy of flood-prone areas, including image cloud contamination, mixed-pixel issues, and the resolution mismatch between the mapping results and the ground reference data, our methods yielded satisfactory results for delineating the spatiotemporal evolution of floods.
The results, in the form of quantitative information on spatiotemporal flood distributions, could help policymakers evaluate their management strategies for mitigating the negative effects of floods on agriculture and people's livelihoods in the country.
Keywords: MODIS, flood, mapping, Cambodia
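For reference, the overall accuracy and Kappa coefficient reported above are standard statistics computed from a confusion matrix of mapped versus reference classes. A minimal sketch with made-up counts:

```python
def overall_accuracy(cm):
    """Fraction of correctly classified samples (diagonal over total)."""
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def kappa(cm):
    """Cohen's kappa from a square confusion matrix (rows: mapped, cols: reference)."""
    n = sum(sum(row) for row in cm)
    po = overall_accuracy(cm)
    # Chance agreement: product of row and column marginals per class.
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm) for i in range(len(cm))) / (n * n)
    return (po - pe) / (1 - pe)

# Illustrative 2-class matrix (flood / non-flood); counts are made up.
cm = [[85, 10],
      [5, 100]]
print(round(overall_accuracy(cm), 3), round(kappa(cm), 3))
```

Kappa discounts chance agreement, which is why it is lower than the overall accuracy for the same matrix.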
Procedia PDF Downloads 126
475 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry
Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine
Abstract:
The aerial photogrammetry of shallow water bottoms has the potential to be an efficient high-resolution survey technique for shallow-water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor, which is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. It also suffers from a large negative bias in the correction factor when the apparent water depth estimate is affected by noise, according to our numerical experiment. Overall, the accuracy of a refraction correction method depends on various factors, such as the location, image acquisition, and GPS measurement conditions. The most effective method can be selected by statistical model selection (e.g.,
leave-one-out cross-validation).
Keywords: bottom elevation, MVS, river, SfM
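The heart of the presented method is the empirical correction factor fitted from paired apparent and true depths. A minimal sketch of a through-the-origin least-squares fit, with hypothetical calibration pairs (the paper's actual fit may include an offset term and uses real survey data):

```python
def fit_correction_factor(apparent, true):
    """Least-squares slope through the origin relating apparent to true depth."""
    return sum(a * t for a, t in zip(apparent, true)) / sum(a * a for a in apparent)

def correct_depths(apparent, factor):
    """Convert apparent depths to refraction-corrected (real-scale) depths."""
    return [factor * a for a in apparent]

# Illustrative calibration pairs in metres; real values come from field survey.
apparent = [0.30, 0.45, 0.60, 0.75]
true = [0.40, 0.61, 0.80, 1.01]
f = fit_correction_factor(apparent, true)
print(round(f, 3))  # close to the refractive index of water (~1.34)
```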
Procedia PDF Downloads 299
474 A Comparative Study of the Techno-Economic Performance of the Linear Fresnel Reflector Using Direct and Indirect Steam Generation: A Case Study under High Direct Normal Irradiance
Authors: Ahmed Aljudaya, Derek Ingham, Lin Ma, Kevin Hughes, Mohammed Pourkashanian
Abstract:
Researchers, power companies, and state politicians have given concentrated solar power (CSP) much attention due to its capacity to generate large amounts of electricity while overcoming the intermittent nature of solar resources. The Linear Fresnel Reflector (LFR) is a well-known CSP technology, valued for being inexpensive and having a low land-use factor, but hampered by low optical efficiency. The LFR is considered a cost-effective alternative to the Parabolic Trough Collector (PTC) because of its simple design, which often outweighs its lower efficiency. The LFR has been found promising for directly producing steam for a thermal cycle to generate low-cost electricity, and it has also shown promise for indirect steam generation. The purpose of this analysis is to compare the annual performance of Direct Steam Generation (DSG) and Indirect Steam Generation (ISG) LFR power plants using molten salt and other Heat Transfer Fluids (HTF), and to investigate their technical and economic effects. A 50 MWe solar-only system is examined as a case study for both steam production methods under extreme weather conditions. In addition, a parametric analysis is carried out to determine the optimal solar field size that provides the lowest Levelized Cost of Electricity (LCOE) while achieving the highest technical performance. Optimizing the solar field size yields a solar multiple (SM) between 1.2 and 1.5, achieving an LCOE as low as 9 cents/kWh for direct steam generation with the linear Fresnel reflector. In addition, the power plant is capable of producing around 141 GWh annually with a capacity factor of up to 36%, whereas ISG produces less energy at a higher cost.
The optimization results show that DSG outperforms ISG, producing around 3% more annual energy at a 2% lower LCOE and 28% lower capital cost.
Keywords: concentrated solar power, levelized cost of electricity, linear Fresnel reflectors, steam generation
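The LCOE figures compared above follow the usual definition: discounted lifetime costs divided by discounted lifetime generation. A simplified sketch with hypothetical cost inputs (real techno-economic models add degradation, financing, and tax terms):

```python
def lcoe_per_kwh(capital_cost, annual_om_cost, annual_energy_kwh, discount_rate, lifetime_years):
    """Simplified LCOE: discounted lifetime costs over discounted lifetime generation."""
    years = range(1, lifetime_years + 1)
    disc_costs = capital_cost + sum(annual_om_cost / (1 + discount_rate) ** t for t in years)
    disc_energy = sum(annual_energy_kwh / (1 + discount_rate) ** t for t in years)
    return disc_costs / disc_energy

# Illustrative plant producing ~141 GWh/year (the abstract's generation figure);
# the capital and O&M costs below are hypothetical placeholders.
print(round(lcoe_per_kwh(2.0e8, 4.0e6, 1.41e8, 0.07, 25), 3), "$/kWh")
```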
Procedia PDF Downloads 111
473 Determining Components of Deflection of the Vertical in Owerri West Local Government, Imo State Nigeria Using Least Square Method
Authors: Chukwu Fidelis Ndubuisi, Madufor Michael Ozims, Asogwa Vivian Ndidiamaka, Egenamba Juliet Ngozi, Okonkwo Stephen C., Kamah Chukwudi David
Abstract:
Deflection of the vertical is a quantity used in reducing geodetic measurements related to geoidal networks to the ellipsoidal plane and is essential in geoid modeling. Computing the deflection-of-the-vertical components of a point in a given area is necessary for evaluating the standard errors along the north-south and east-west directions. A combined approach to determining the deflection components provides improved results but is labor-intensive without an appropriate method. The least squares method makes use of redundant observations in modeling a given set of problems that obey certain geometric conditions. This research work aims to compute the deflection-of-the-vertical components for Owerri West Local Government Area of Imo State using a geometric field technique. In this method, a combination of static-mode Global Positioning System (GPS) observations and precise leveling was utilized: geodetic coordinates of points established within the study area were determined by GPS observation, and orthometric heights by precise leveling. By least squares, using a MATLAB program, the estimated deflection components for the common station were -0.0286 and -0.0001 arc seconds for the north-south and east-west components, respectively. The associated standard errors of the processed vectors of the network were computed as 5.5911e-005 and 1.4965e-004 arc seconds for the north-south and east-west components, respectively. Therefore, including the derived deflection-of-the-vertical components in the ellipsoidal model will yield higher observational accuracy, since an ellipsoidal model alone is not tenable for high-quality work owing to its large observational error.
It is therefore important to include the determined deflection-of-the-vertical components for Owerri West Local Government in Imo State, Nigeria.
Keywords: deflection of vertical, ellipsoidal height, least square, orthometric height
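The least-squares step described above reduces redundant observations to estimates of the two deflection components (north-south and east-west) via the normal equations. A generic two-parameter sketch; the design matrix and observations below are illustrative, not the paper's actual network data:

```python
def lsq_2param(A, b):
    """Solve min ||Ax - b|| for two unknowns via the normal equations (A'A)x = A'b."""
    a11 = sum(r[0] * r[0] for r in A)
    a12 = sum(r[0] * r[1] for r in A)
    a22 = sum(r[1] * r[1] for r in A)
    b1 = sum(r[0] * y for r, y in zip(A, b))
    b2 = sum(r[1] * y for r, y in zip(A, b))
    det = a11 * a22 - a12 * a12  # 2x2 Cramer's rule on the normal matrix
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Three redundant observations of two unknowns (coefficients illustrative).
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
print(lsq_2param(A, b))  # (1.0, 2.0): this consistent system is fit exactly
```

The redundancy (three equations, two unknowns) is what allows the standard errors of the estimates to be computed, as reported in the abstract.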
Procedia PDF Downloads 209
472 Using the Weakest Precondition to Achieve Self-Stabilization in Critical Networks
Authors: Antonio Pizzarello, Oris Friesen
Abstract:
Networks, such as the electric power grid, must demonstrate exemplary performance and integrity. Integrity depends on the quality of both the system design model and the deployed software, for both the original versions and the many versions that arise through numerous maintenance activities. Current software engineering technology and practice do not produce adequate integrity. Distributed systems utilize networks in which each node is an independent computer system. The connections between nodes are realized via a network that is normally redundantly connected, to guarantee the presence of a path between two nodes if some branch fails. Furthermore, the software at each node may fail. Self-stabilizing protocols are usually present that recognize failure in the network and perform a repair action that brings the node back to a correct state. These protocols, first introduced by E. W. Dijkstra, are currently present in almost all Ethernets. Super-stabilization protocols, capable of reacting to a change in the network topology due to the removal or addition of a branch, are less common but are theoretically defined and available. This paper describes how to use the Software Integrity Assessment (SIA) methodology to analyze self-stabilizing software. SIA is based on the UNITY formalism for parallel and distributed programming, which allows code to be analyzed to verify the progress property p leads-to q, which states that every computation starting in a state satisfying p reaches a state satisfying q via the execution of one or more system modules. As opposed to demonstrably inadequate test-and-evaluation methods, SIA allows the analysis and verification of any network self-stabilizing software, as well as any other software designed to recover from failure without external intervention by maintenance personnel.
The model to be analyzed is obtained by automatic translation of the system code into a transition system based on the use of the weakest precondition.
Keywords: network, power grid, self-stabilization, software integrity assessment, UNITY, weakest precondition
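The progress property p leads-to q can be checked mechanically on a small transition system. The toy sketch below is only a stand-in for SIA's weakest-precondition analysis, using a deterministic successor map and a self-stabilizing counter that always returns to its legal state:

```python
def leads_to(states, transitions, p, q, max_steps=None):
    """Check p leads-to q: every execution from a p-state reaches a q-state.

    `transitions` maps each state to its successor under the (deterministic)
    program. In a finite system, q must be reached within |states| steps if
    it is reached at all.
    """
    if max_steps is None:
        max_steps = len(states)
    for s in states:
        if not p(s):
            continue
        cur, ok = s, False
        for _ in range(max_steps + 1):
            if q(cur):
                ok = True
                break
            cur = transitions[cur]
        if not ok:
            return False
    return True

# Toy self-stabilizing counter: from any state the repair action eventually
# reaches the single legal state 0, so (true) leads-to (state == 0) holds.
states = [0, 1, 2, 3]
step = {0: 0, 1: 0, 2: 1, 3: 2}
print(leads_to(states, step, p=lambda s: True, q=lambda s: s == 0))  # True
```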
Procedia PDF Downloads 223
471 The Concentration of Selected Cosmogenic and Anthropogenic Radionuclides in the Ground Layer of the Atmosphere (Polar and Mid-Latitudes Regions)
Authors: A. Burakowska, M. Piotrowski, M. Kubicki, H. Trzaskowska, R. Sosnowiec, B. Myslek-Laurikainen
Abstract:
The most important source of atmospheric radioactivity is the radionuclides generated by the impact of primary and secondary cosmic radiation on nuclei of nitrogen, oxygen, and carbon in the upper troposphere and lower stratosphere. This creates about thirty radioisotopes of more than twenty elements. For organisms, four of them are most important: ³H, ⁷Be, ²²Na, and ¹⁴C. Natural radionuclides present in the Earth's crust also settle on dust and particles of water vapor; by this means, derivatives of uranium and thorium, as well as long-lived ⁴⁰K, get into the air. ¹³⁷Cs is the most widespread isotope introduced into the environment by humans. To determine the concentrations of radionuclides in the atmosphere, high-volume air samplers were used, with aerosol collection on a special filter fabric (Petrianov filter tissue FPP-15-1.5). In 2002, the high-volume air sampler AZA-1000, designed to operate in all weather conditions of the cold polar region, was installed at the Polish Polar Observatory of the Polish Academy of Sciences in Hornsund, Spitsbergen (77°00'N, 15°33'E). Since 1991 (with short breaks), the ASS-500 air sampler has been operating at the Kalinowski Geophysical Observatory of the Institute of Geophysics of the Polish Academy of Sciences in Swider (52°07'N, 21°15'E). The following radionuclide concentrations were obtained from both stations using gamma spectroscopy: ⁷Be, ¹³⁷Cs, ¹³⁴Cs, ²¹⁰Pb, and ⁴⁰K. HPGe (High-Purity Germanium) detectors were used for the gamma spectroscopic analysis, and the data from the two stations were compared. The preliminary results gave evidence that the radioactivity measured in aerosols is not proportional to the amount of dust in either studied region. Furthermore, the results indicate annual (seasonal) variability, as well as a decrease in the average activity of ⁷Be with increasing latitude.
The ⁷Be content in surface air also shows a relationship with solar activity cycles.
Keywords: aerosols, air filters, atmospheric beryllium, environmental radionuclides, gamma spectroscopy, mid-latitude regions radionuclides, polar regions radionuclides, solar cycles
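One routine step in gamma-spectrometric aerosol work of this kind (standard practice, though not detailed in the abstract) is decay-correcting the measured ⁷Be activity back to the sampling time, using its roughly 53.2-day half-life:

```python
import math

BE7_HALF_LIFE_DAYS = 53.2  # approximate literature value for ⁷Be

def decay_correct(measured_activity, days_between_sampling_and_counting):
    """Correct a measured activity back to sampling time: A0 = A * e^(lambda*t)."""
    lam = math.log(2) / BE7_HALF_LIFE_DAYS
    return measured_activity * math.exp(lam * days_between_sampling_and_counting)

# A filter counted 10 days after collection: the activity at sampling time
# was about 14% higher than the measured value.
print(round(decay_correct(1.0, 10.0), 3))
```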
Procedia PDF Downloads 140
470 Predictive Spectral Lithological Mapping, Geomorphology and Geospatial Correlation of Structural Lineaments in Bornu Basin, Northeast Nigeria
Authors: Aminu Abdullahi Isyaku
Abstract:
The semi-arid Bornu basin in northeast Nigeria is characterised by flat topography, thick cover sediments, and a lack of continuous bedrock outcrops discernible for field geology. This paper presents a methodology for characterising neotectonic surface structures and surface lithology in the north-eastern Bornu basin as an alternative to field geological mapping, using free multispectral Landsat 7 ETM+, SRTM DEM, and ASAR Earth Observation datasets. The spectral lithological mapping developed here uses spectral discrimination of the surface features identified on Landsat 7 ETM+ images to infer lithology in four steps: computation of band-combination images; computation of band-ratio images; supervised image classification; and inference of the lithological compositions. Two complementary approaches to lineament mapping were carried out, involving manual digitization and automatic lineament extraction, to validate the structural lineaments extracted from the Landsat 7 ETM+ image mosaic covering the study area. A comparison between the mapped surface lineaments and lineament zones shows good geospatial correlation and identifies the predominant NE-SW and NW-SE structural trends in the basin. Topographic profiles across different parts of the Bama Beach Ridge palaeoshorelines show different elevations along the feature. Most of the drainage systems in the northeastern Bornu basin were determined to be structurally controlled, with drainage lines terminating against the palaeo-lake border and emptying into Lake Chad, mainly arising from the extensive topographic high-stand of the Bama Beach Ridge palaeoshoreline.
Keywords: Bornu Basin, lineaments, spectral lithology, tectonics
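The band-ratio step mentioned above divides one spectral band by another, pixel by pixel, to suppress illumination effects and highlight diagnostic absorption features. A minimal sketch with hypothetical reflectance values; the particular pairing (ETM+ band 5 over band 7) is a common choice for mapping hydroxyl-bearing lithologies, not necessarily the one used in this paper:

```python
def band_ratio(band_a, band_b, eps=1e-6):
    """Per-pixel ratio of two co-registered bands given as rows of reflectances.

    `eps` guards against division by zero in shadowed or no-data pixels.
    """
    return [[a / (b + eps) for a, b in zip(ra, rb)]
            for ra, rb in zip(band_a, band_b)]

# Hypothetical 2x2 reflectance grids for ETM+ bands 5 (SWIR1) and 7 (SWIR2).
swir1 = [[0.30, 0.24], [0.18, 0.36]]
swir2 = [[0.15, 0.12], [0.09, 0.12]]
print(band_ratio(swir1, swir2))  # high ratios flag clay/hydroxyl-rich surfaces
```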
Procedia PDF Downloads 139
469 Artificial Neural Network Based Model for Detecting Attacks in Smart Grid Cloud
Authors: Sandeep Mehmi, Harsh Verma, A. L. Sangal
Abstract:
Ever since the idea was floated of delivering computing services as a commodity, like other utilities such as electricity and telephony, the scientific fraternity has directed its research towards a new area called utility computing. New paradigms like cluster computing and grid computing came into existence while edging closer to utility computing. With the advent of the internet, the demand for anytime, anywhere access to resources that could be provisioned dynamically as a service gave rise to the next-generation computing paradigm known as cloud computing. Today, cloud computing has become one of the most aggressively growing computing paradigms, resulting in a growing rate of applications in the area of IT outsourcing. Besides catering to computational and storage demands, cloud computing has economically benefitted almost all fields: education, research, entertainment, medicine, banking, military operations, weather forecasting, business, and finance, to name a few. The smart grid is another discipline that direly needs to benefit from the advantages of cloud computing. The smart grid is a new technology that has revolutionized the power sector by automating the transmission and distribution system and integrating smart devices. A cloud-based smart grid can fulfill the storage requirements of the unstructured and uncorrelated data generated by smart sensors, as well as the computational needs of self-healing, load-balancing, and demand-response features. However, security issues such as confidentiality, integrity, availability, accountability, and privacy need to be resolved for the development of the smart grid cloud. In recent years, a number of intrusion prevention techniques have been proposed in the cloud, but hackers/intruders still manage to bypass cloud security. Therefore, precise intrusion detection systems need to be developed to secure critical information infrastructure like the smart grid cloud.
Considering the success of artificial neural networks in building robust intrusion detection systems, this research proposes an artificial neural network based model for detecting attacks in the smart grid cloud.
Keywords: artificial neural networks, cloud computing, intrusion detection systems, security issues, smart grid
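A minimal sketch of the kind of classifier the proposal envisions: a one-hidden-layer feedforward network scoring traffic features as attack or normal. All features and weights below are toy values; a real detector would learn its weights from labeled smart grid cloud traffic:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_predict(features, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a one-hidden-layer network; returns an attack probability."""
    hidden = [sigmoid(sum(w * f for w, f in zip(ws, features)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

# Toy traffic features (e.g. normalized packet rate, failed-login rate) and
# toy weights; in practice these come from training, not by hand.
features = [0.9, 0.8]
w_hidden = [[2.0, 1.0], [-1.0, 3.0]]
b_hidden = [-0.5, 0.0]
w_out = [1.5, 1.5]
b_out = -1.0
score = mlp_predict(features, w_hidden, b_hidden, w_out, b_out)
print("attack" if score > 0.5 else "normal")
```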
Procedia PDF Downloads 318
468 An Early Attempt of Artificial Intelligence-Assisted Language Oral Practice and Assessment
Authors: Paul Lam, Kevin Wong, Chi Him Chan
Abstract:
Constant practice and accurate, immediate feedback are the keys to improving students' speaking skills. However, traditional oral examination often fails to provide such opportunities. Traditional face-to-face oral assessment is time-consuming: attending to the oral needs of one student often leads to the neglect of others, so teachers can provide only limited opportunities and feedback. Moreover, students' incentive to practice is reduced by their anxiety and shyness about speaking the new language. A mobile app was developed that uses artificial intelligence (AI) to provide immediate feedback on students' speaking performance, as an attempt to solve the above-mentioned problems. Firstly, online exercises were expected to greatly increase students' learning opportunities, as students can now practice more without a teacher needing to be present. Secondly, the automatic feedback provided by the AI should enhance students' motivation to practice, as their performance is evaluated instantly. Lastly, students should feel less anxious and shy than when practicing orally in front of teachers. Technically, the program uses speech-to-text functions to generate feedback: the software analyzes students' oral input through a speech-to-text AI engine and then cleans up the results to the point that they can be compared with the target text. English teachers were invited to pilot the mobile app and asked for their feedback. Preliminary trials indicated that the approach has limitations: much of the users' pronunciation was automatically corrected by the speech recognition function, since wise guessing is already integrated into many such systems. Nevertheless, teachers are confident that the app can be further improved for accuracy. It has the potential to significantly improve oral drilling by giving students more chances to practice.
Moreover, they believe that the success of this mobile app confirms the potential to extend AI-assisted assessment to other language skills, such as writing, reading, and listening.
Keywords: artificial intelligence, mobile learning, oral assessment, oral practice, speech-to-text function
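The comparison step described above (cleaned-up ASR output versus target text) can be approximated with a simple string-similarity score. A minimal sketch using Python's standard library; the app's actual scoring method is not specified in the abstract:

```python
import difflib

def pronunciation_score(target, recognized):
    """Crude similarity score (0..1) between target text and the ASR transcript."""
    norm = lambda s: " ".join(s.lower().split())  # case- and whitespace-insensitive
    return difflib.SequenceMatcher(None, norm(target), norm(recognized)).ratio()

print(round(pronunciation_score("the weather is nice today",
                                "the weather is nice today"), 2))  # 1.0
```

Note that such a text-level score inherits the "wise guessing" limitation the trials observed: if the ASR engine silently corrects a mispronunciation, the transcript matches the target and the error goes unscored.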
Procedia PDF Downloads 103
467 Evaluating Radiative Feedback Mechanisms in Coastal West Africa Using Regional Climate Models
Authors: Akinnubi Rufus Temidayo
Abstract:
Coastal West Africa is highly sensitive to climate variability, driven by complex ocean-atmosphere interactions that shape temperature, precipitation, and extreme weather. Radiative feedback mechanisms, such as water vapor feedback, cloud-radiation interactions, and surface albedo, play a critical role in modulating these patterns. Yet limited research addresses these feedbacks in climate models specific to West Africa's coastal zones, creating challenges for accurate climate projections and adaptive planning. This study aims to evaluate the influence of radiative feedbacks on the coastal climate of West Africa by quantifying the effects of water vapor, cloud cover, and sea surface temperature (SST) on the region's radiative balance. The study uses a regional climate model (RCM) to simulate feedbacks over a 20-year period (2005-2025) with high-resolution data from CORDEX and satellite observations. The key mechanisms investigated are (1) water vapor feedback, the amplifying effect of humidity on warming; (2) cloud-radiation interactions, the impact of cloud cover on the radiation balance, especially during the West African Monsoon; and (3) surface albedo and land-use changes, the effects of urbanization and vegetation on the radiation budget. Preliminary results indicate that radiative feedbacks strongly influence seasonal climate variability in coastal West Africa: water vapor feedback amplifies dry-season warming, cloud-radiation interactions moderate surface temperatures during monsoon seasons, and SST variations in the Atlantic affect the frequency and intensity of extreme rainfall events. The findings suggest that incorporating these feedbacks into climate planning can strengthen resilience to climate impacts in West African coastal communities.
Further research should refine regional models to capture anthropogenic influences like greenhouse gas emissions, guiding sustainable urban and resource planning to mitigate climate risks.
Keywords: West Africa, radiative, climate, resilience, anthropogenic
Procedia PDF Downloads 9
466 A Literature Study on IoT Based Monitoring System for Smart Agriculture
Authors: Sonu Rana, Jyoti Verma, A. K. Gautam
Abstract:
In most developing countries, like India, the majority of the population heavily relies on agriculture for their livelihood. The yield of agriculture is heavily dependent on uncertain weather conditions like the monsoon, soil fertility, the availability of irrigation facilities and fertilizers, as well as support from the government. The agricultural yield is quite low compared to the effort put in, due to inefficient agricultural facilities and obsolete farming practices on the one hand and lack of knowledge on the other, and ultimately the agricultural community does not prosper. It is therefore essential for farmers to improve their harvest yield by acquiring related data, such as soil condition, temperature, humidity, availability of irrigation facilities, availability of manure, etc., and by adopting smart farming techniques using modern agricultural equipment. Nowadays, using IoT technology in agriculture is the best solution to improve yield with less effort and economic cost. The primary focus of this work is IoT technology in the agricultural field. Using IoT, all the parameters would be monitored by sensors mounted at different places in an agricultural field; the sensors will collect real-time data, which could be transmitted by a transmitting device such as an antenna. To improve the system, IoT will interact with other useful systems like wireless sensor networks. IoT is expanding into every aspect of life, so the radio-frequency spectrum is getting crowded due to the increasing demand for wireless applications; therefore, the Federal Communications Commission is reallocating spectrum for various wireless applications. An antenna is also an integral part of newly designed IoT devices. The main aim is to propose a new antenna structure for IoT agricultural applications, compatible with this new unlicensed frequency band. The main focus of this paper is to present work related to these technologies in the agricultural field.
It also presents their challenges and benefits, and can help in understanding the role of data, IoT, and communication advancements in the agriculture sector. This will help to motivate and educate unskilled farmers to comprehend the insights offered by big-data analytics using smart technology.
Keywords: smart agriculture, IoT, agriculture technology, data analytics, smart technology
Procedia PDF Downloads 116
465 Synthesis of Liposomal Vesicles by a Novel Supercritical Fluid Process
Authors: Wen-Chyan Tsai, Syed S. H. Rizvi
Abstract:
Organic solvent residues are always associated with liposomes produced by traditional techniques such as thin-film hydration and reverse-phase evaporation, which limits the applications of these vesicles in the pharmaceutical, food, and cosmetic industries. Our objective was to develop a novel and benign process of liposomal microencapsulation using supercritical carbon dioxide (SC-CO2) as the sole phospholipid-dissolving medium and a green substitute for organic solvents. This process consists of supercritical fluid extraction followed by rapid expansion through a nozzle and automatic cargo suction. Lecithin and cholesterol, mixed in a 10:1 mass ratio, were dissolved in SC-CO2 at 20 ± 0.5 MPa and 60 °C. After at least two hours of equilibration, the lecithin/cholesterol-laden SC-CO2 was passed through a 1000-micron nozzle and immediately mixed with the cargo solution to form liposomes. Liposomal microencapsulation was conducted at three pressures (8.27, 12.41, and 16.55 MPa), three temperatures (75, 83, and 90 °C), and two flow rates (0.25 ml/s and 0.5 ml/s). Liposome size, zeta potential, and encapsulation efficiency were characterized as functions of the operating parameters. The average liposome size varied from 400-500 nm to 1000-1200 nm as the pressure was increased from 8.27 to 16.55 MPa. At 12.41 MPa, 90 °C, and a 0.2 M glucose cargo loading rate of 0.25 ml/s, the highest encapsulation efficiency, 31.65%, was achieved. Under a confocal laser scanning microscope, large unilamellar vesicles and multivesicular vesicles were observed to make up the majority of the liposomal emulsion. This new approach is a rapid and continuous process for the bulk production of liposomes using a green solvent.
Based on the results to date, it is feasible to apply this technique to encapsulate hydrophilic compounds inside the aqueous core, as well as lipophilic compounds in the phospholipid bilayers of the liposomes, for controlled release, solubility improvement and targeted therapy of bioactive compounds.
Keywords: liposome, microencapsulation, supercritical carbon dioxide, non-toxic process
Procedia PDF Downloads 431
464 Topographic Characteristics Derived from UAV Images to Detect Ephemeral Gully Channels
Authors: Recep Gundogan, Turgay Dindaroglu, Hikmet Gunal, Mustafa Ulukavak, Ron Bingner
Abstract:
A majority of total soil losses in agricultural areas can be attributed to ephemeral gullies caused by heavy rains in conventionally tilled fields; however, ephemeral gully erosion is often ignored in conventional soil erosion assessments. Ephemeral gullies are easily filled in by normal soil tillage operations, which makes capturing existing ephemeral gullies in croplands difficult. This study was carried out to determine topographic features, including slope, aspect and the compound topographic index (CTI), as well as the initiation points of gully channels, using images obtained from an unmanned aerial vehicle (UAV). The study area was located in the Topçu stream watershed in the eastern Mediterranean Region, where intense rainfall events occur over very short periods. The slope varied between 0.7 and 99.5%, with an average of 24.7%. A multi-propeller hexacopter UAV was used as the carrier platform, and images were obtained with an RGB camera mounted on the UAV. The digital terrain models (DTMs) of the Topçu stream micro-catchment produced from UAV images were compared with manual field Global Positioning System (GPS) measurements to assess the accuracy of the UAV-based measurements. Eighty-one gully channels were detected in the study area. The mean slope and CTI values in the micro-catchment obtained from the UAV-derived DTMs were 19.2% and 3.64, respectively; both values were lower than those obtained from GPS measurements. The total length and volume of the gully channels were 868.2 m and 5.52 m³, respectively. Topographic characteristics and information on ephemeral gully channels (location of initiation points, volume, and length) were estimated with high accuracy using the UAV images.
The results reveal that UAV-based measuring techniques can be used in lieu of existing GPS and total station techniques by using images obtained with high-resolution UAVs.
Keywords: aspect, compound topographic index, digital terrain model, initial gully point, slope, unmanned aerial vehicle
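The compound topographic index used in the abstract above is conventionally computed as CTI = ln(As / tan β), where As is the specific catchment area and β the local slope. A minimal sketch of that formula, assuming slope is reported in percent as in the abstract (the function name and units are illustrative, not the authors' implementation):

```python
import math

def compound_topographic_index(specific_catchment_area, slope_percent):
    """CTI = ln(As / tan(beta)).

    specific_catchment_area: upslope contributing area per unit contour
        width (As), e.g. in m^2 per m, taken from a DTM flow-accumulation grid.
    slope_percent: local slope in percent (tan(beta) * 100), e.g. 24.7.
    """
    tan_beta = slope_percent / 100.0  # percent slope is 100 * tan(beta)
    if tan_beta <= 0:
        raise ValueError("slope must be positive for a finite CTI")
    return math.log(specific_catchment_area / tan_beta)
```

Flatter cells with larger contributing areas yield higher CTI, which is why the index flags likely gully-initiation points.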
Procedia PDF Downloads 114
463 An Attentional Bi-Stream Sequence Learner (AttBiSeL) for Credit Card Fraud Detection
Authors: Amir Shahab Shahabi, Mohsen Hasirian
Abstract:
Modern societies, marked by expansive Internet connectivity and the rise of e-commerce, are integrated with digital platforms at an unprecedented level. The efficiency, speed, and accessibility of e-commerce have garnered a substantial consumer base. Against this backdrop, electronic banking has proliferated rapidly within the realm of online activities. However, this growth has inadvertently created an environment conducive to illicit activities, notably electronic payment fraud, posing a formidable challenge to electronic banking. Electronic fraud detection plays a pivotal role in upholding the integrity of electronic commerce and business transactions, particularly in the context of credit cards, which underscores the imperative of comprehensive research in this field. To this end, our study introduces an Attentional Bi-Stream Sequence Learner (AttBiSeL) framework that leverages attention mechanisms and recurrent networks. By incorporating bidirectional recurrent layers, specifically bidirectional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) layers, the proposed model extracts past and future transaction sequences while accounting for the temporal flow of information in both directions. Moreover, the integration of an attention mechanism weights specific transactions to varying degrees in the output of the recurrent networks. The effectiveness of the proposed approach in automatic credit card fraud classification is evaluated on the European Cardholders' Fraud Dataset. Empirical results validate that the hybrid architecture presented in this study yields enhanced accuracy compared to previous studies.
Keywords: credit card fraud, deep learning, attention mechanism, recurrent neural networks
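The attention step described above (weighting the per-transaction outputs of the bidirectional recurrent layers before classification) can be sketched in pure Python. This is an illustrative dot-product attention pool, not the authors' AttBiSeL implementation; the fixed `score_weights` vector stands in for a learned parameter:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, score_weights):
    """Pool per-timestep hidden states into one context vector.

    hidden_states: list of T vectors, e.g. the concatenated forward/backward
        outputs of a bidirectional LSTM/GRU, one per transaction.
    score_weights: vector used to score each timestep (learned in practice).
    Returns (context_vector, attention_weights).
    """
    # One scalar score per timestep via a dot product with the score vector.
    scores = [sum(w * h for w, h in zip(score_weights, h_t)) for h_t in hidden_states]
    alphas = softmax(scores)  # attention weights, sum to 1
    dim = len(hidden_states[0])
    # Context vector: attention-weighted sum of the hidden states.
    context = [sum(a * h_t[d] for a, h_t in zip(alphas, hidden_states))
               for d in range(dim)]
    return context, alphas
```

The context vector would then feed a dense classification head; timesteps whose hidden states align with the score vector dominate the pooled representation.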
Procedia PDF Downloads 13