Search results for: David Harper
200 Sediment Transport Monitoring in the Port of Veracruz Expansion Project
Authors: Francisco Liaño-Carrera, José Isaac Ramírez-Macías, David Salas-Monreal, Mayra Lorena Riveron-Enzastiga, Marcos Rangel-Avalos, Adriana Andrea Roldán-Ubando
Abstract:
The construction of most coastal infrastructure developments around the world is usually planned considering wave height, current velocities and river discharges; however, little attention has been paid to surveying sediment transport during dredging, or to the modification of currents outside ports and marinas during and after construction. This study shows a complete survey during the construction of one of the largest ports of the Gulf of Mexico. An anchored Acoustic Doppler Current Profiler (ADCP), a towed ADCP and a combination of model outputs were used at the Veracruz port construction site in order to describe the hourly sediment transport and current modifications in and out of the new port. Owing to the stability of the system, the new port was constructed inside Vergara Bay, a low wave energy system with a tidal range of up to 0.40 m. The results show a two-current system pattern within the bay: the north side of the bay has an anticyclonic gyre, while the southern part shows a cyclonic gyre. Sediment transport trajectories were computed every hour using the anchored ADCP, a numerical model and the weekly data obtained from the towed ADCP within the entire bay. The sediment transport trajectories were carefully tracked since the bay is surrounded by coral reef structures, which are sensitive to sedimentation rate and water turbidity. The survey shows that during dredging and the input of rock used to build the breakwater, sediments were locally added (< 2500 m²) and local currents dispersed them in less than 4 h, whereas the river input located in the middle of the bay and the sewage treatment plant may add more than 10 times this amount during a rainy day or during the tourist season.
Finally, the coastline obtained seasonally with a drone suggests that the southern part of the bay has not been modified by the construction of the new port located in the northern part of the bay, owing to the two-subsystem division of the bay.
Keywords: Acoustic Doppler Current Profiler, construction around coral reefs, dredging, port construction, sediment transport monitoring
Procedia PDF Downloads 227
199 Characterizing Nasal Microbiota in COVID-19 Patients: Insights from Nanopore Technology and Comparative Analysis
Authors: David Pinzauti, Simon De Jaegher, Maria D'Aguano, Manuele Biazzo
Abstract:
The COVID-19 pandemic has left an indelible mark on global health, leading to a pressing need to understand the intricate interactions between the virus and the human microbiome. This study focuses on characterizing the nasal microbiota of patients affected by COVID-19, with a specific emphasis on the comparison with unaffected individuals, to shed light on the crucial role of the microbiome in the development of this viral disease. To achieve this objective, Nanopore technology was employed to analyze the full-length bacterial 16S rRNA gene present in nasal swabs collected in Malta between January 2021 and August 2022. A comprehensive dataset consisting of 268 samples (126 SARS-negative samples and 142 SARS-positive samples) was subjected to a comparative analysis using an in-house, custom pipeline. The findings from this study revealed that individuals affected by COVID-19 possess a nasal microbiota that is significantly less diverse, as evidenced by lower α diversity, and is characterized by distinct microbial communities compared to unaffected individuals. The beta diversity analyses were carried out at different taxonomic resolutions. At the phylum level, Bacteroidota was found to be more prevalent in SARS-negative samples, suggesting a potential decrease during the course of viral infection. At the species level, the identification of several specific biomarkers further underscores the critical role of the nasal microbiota in COVID-19 pathogenesis. Notably, species such as Finegoldia magna, Moraxella catarrhalis, and others exhibited higher relative abundance in SARS-positive samples, potentially serving as significant indicators of the disease. This study presents valuable insights into the relationship between COVID-19 and the nasal microbiota.
The identification of distinct microbial communities and potential biomarkers associated with the disease offers promising avenues for further research and therapeutic interventions aimed at enhancing public health outcomes in the context of COVID-19.
Keywords: COVID-19, nasal microbiota, nanopore technology, 16S rRNA gene, biomarkers
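The lower α diversity reported for SARS-positive samples is typically quantified with an index such as Shannon's H'. A minimal sketch of that calculation; the choice of index and the toy taxon counts are illustrative assumptions, not data from the study:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

# A more even community yields a higher H' (higher alpha diversity).
even_community = [30, 25, 20, 15, 10]       # hypothetical, relatively even
dominated_community = [80, 10, 5, 3, 2]     # hypothetical, one dominant taxon
print(shannon_index(even_community) > shannon_index(dominated_community))  # True
```

A drop in H' between cohorts, as reported here, indicates a community increasingly dominated by a few taxa.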
Procedia PDF Downloads 68
198 Global and Domestic Response to Boko Haram Terrorism on Cameroon 2014-2018
Authors: David Nchinda Keming
Abstract:
The present study focuses on both the national and international collective fight against Boko Haram terrorism in Cameroon and the role played by the Lake Chad Basin Countries (LCBCs) and the global community in stifling the sect’s activities in the region. Although the countries of the Lake Chad Basin are Cameroon, Chad, Nigeria and Niger, others like Benin also joined the cause. The justification for the internationalisation of the fight against Boko Haram could be explained by the ecological and international climatic importance of the Lake Chad and the danger posed by the sect not only to the Lake Chad member countries but to armed forces and civil servants worldwide and to the international political economy. The study therefore begins with Cameroon’s reaction to Boko Haram’s terrorist attacks on its territory. It further expounds on Cameroon’s bilateral diplomatic requests to members of the UN Security Council for international collective support to stem the advance of the sect. The study relies on the hypothesis that Boko Haram’s advanced terrorism against Cameroon overwhelmed domestic military intelligence, forcing the government to seek bilateral and multilateral international collective support to secure its territory from the powerful sect. This premise is tested internationally (multilateral cooperation, bilateral response, regional cooperation) and domestically (solidarity parades, religious discourse, political demonstrations, war efforts, the vigilantes and the way forward). To accomplish the study, we made use of mixed research methodologies to interpret the primary, secondary and tertiary sources consulted. Our results reveal that the collective response was effectively positive, as evidenced by the drastic drop in the sect’s operations in Cameroon and the whole of the LCBCs.
Although the sect was incapacitated, terrorism remains an international malaise, and Cameroon remains fertile ground for terrorist activism. Boko Haram was merely weakened, not completely defeated, and could reappear someday, even under a different appellation. Therefore, to eradicate terrorism in general and Boko Haram in particular, the LCBCs must improve their military intelligence on terrorism and continue to collaborate with countries experienced in fighting terrorism.
Keywords: Boko Haram, terrorism, domestic, international, response
Procedia PDF Downloads 153
197 Prediction of Finned Projectile Aerodynamics Using a Lattice-Boltzmann Method CFD Solution
Authors: Zaki Abiza, Miguel Chavez, David M. Holman, Ruddy Brionnaud
Abstract:
In this paper, the prediction of the aerodynamic behavior of the flow around a finned projectile will be validated using a Computational Fluid Dynamics (CFD) solution, XFlow, based on the Lattice-Boltzmann Method (LBM). XFlow is an innovative CFD software package developed by Next Limit Dynamics. It is based on a state-of-the-art Lattice-Boltzmann Method which uses a proprietary particle-based kinetic solver and an LES turbulence model coupled with the generalized law of the wall (WMLES). The Lattice-Boltzmann method discretizes the continuous Boltzmann equation, a transport equation for the particle probability distribution function. From the Boltzmann transport equation, and by means of the Chapman-Enskog expansion, the compressible Navier-Stokes equations can be recovered. However, to simulate compressible flows, this method has a Mach number limitation because of the lattice discretization. Thanks to this flexible particle-based approach, the traditional meshing process is avoided, the discretization stage is strongly accelerated, reducing engineering costs, and computations on complex geometries become affordable in a straightforward way. The projectile used in this work is the Army-Navy Basic Finned Missile (ANF) with a caliber of 0.03 m. The analysis will consist of varying the Mach number upward from M=0.5, comparing the axial force coefficient, the normal force slope coefficient and the pitch moment slope coefficient of the finned projectile obtained by XFlow with the experimental data. The slope coefficients will be obtained using finite difference techniques in the linear range of the polar curve. The aim of such an analysis is to find out the limiting Mach number value starting from which the effects of high fluid compressibility (related to the transonic flow regime) lead the XFlow simulations to differ from the experimental results.
This will allow identifying the critical Mach number which limits the validity of the isothermal formulation of XFlow and beyond which a fully compressible solver implementing coupled momentum-energy equations would be required.
Keywords: CFD, computational fluid dynamics, drag, finned projectile, lattice-Boltzmann method, LBM, lift, Mach, pitch
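The slope coefficients mentioned above can be extracted from the linear range of the polar curve by finite differences, as the abstract describes. A minimal sketch; the sample points are hypothetical linear-range data, not the ANF measurements:

```python
import numpy as np

def slope_coefficient(alpha_deg, coeff):
    """Estimate a slope coefficient (per radian) by central differences
    within the linear range of the polar curve, then average."""
    alpha = np.radians(np.asarray(alpha_deg, dtype=float))
    c = np.asarray(coeff, dtype=float)
    slopes = np.gradient(c, alpha)   # central differences, one-sided at the ends
    return float(np.mean(slopes))

# Hypothetical linear-range data: C_N = 8.0 * alpha (alpha in radians)
alphas = [-2.0, -1.0, 0.0, 1.0, 2.0]            # angle of attack, degrees
cn = [8.0 * np.radians(a) for a in alphas]      # normal force coefficient
print(slope_coefficient(alphas, cn))            # ≈ 8.0 per radian
```

For truly linear data the central-difference estimate is exact; on real polar curves the averaging window must stay inside the linear range.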
Procedia PDF Downloads 421
196 Blue Finance: A Systematic Review of the Academic Literature on Investment Streams for Marine Conservation
Authors: David Broussard
Abstract:
This review article delves into the realm of marine conservation finance, addressing the inadequacies in current financial streams from the private sector and the underutilization of existing financing mechanisms. The study emphasizes the emerging field of “blue finance”, which contributes to economic growth, improved livelihoods, and marine ecosystem health. The financial burden of marine conservation projects typically falls on philanthropists and governments, contrary to the polluter-pays principle. However, the private sector’s increasing commitment to net zero and its growing environmental and social responsibility goals prompt the need for alternative funding sources for marine conservation initiatives like marine protected areas. The article explores the potential of utilizing several financing mechanisms, such as carbon credits and other forms of payment for ecosystem services in the marine context, providing a solution to the lack of private funding for marine conservation. The methodology employed involves a systematic and quantitative approach, combining traditional review methods with elements of meta-analysis. A comprehensive search of the years 2000-2023, using relevant keywords on the Scopus platform, resulted in a review of 252 articles. The temporal evolution of blue finance studies reveals a significant increase in annual articles from 2010 to 2022, with notable peaks in 2011 and 2022. Marine Policy, Ecosystem Services, and Frontiers in Marine Science are prominent journals in this field. While the majority of articles focus on payment for ecosystem services, there is a growing awareness of the need for holistic approaches in conservation finance. Utilizing bibliometric techniques, the article showcases the dominant share of payment for ecosystem services in the literature, with a focus on blue carbon.
The classification of articles based on various criteria, including financing mechanisms and conservation types, aids in categorizing and understanding the diversity of research objectives and perspectives in this complex field of marine conservation finance.
Keywords: biodiversity offsets, carbon credits, ecosystem services, impact investment, payment for ecosystem services
Procedia PDF Downloads 83
195 Fast Estimation of Fractional Process Parameters in Rough Financial Models Using Artificial Intelligence
Authors: Dávid Kovács, Bálint Csanády, Dániel Boros, Iván Ivkovic, Lóránt Nagy, Dalma Tóth-Lakits, László Márkus, András Lukács
Abstract:
The modeling practice of financial instruments has seen significant change over the last decade due to the recognition of time-dependent and stochastically changing correlations among the market prices or the prices and market characteristics. To represent this phenomenon, the Stochastic Correlation Process (SCP) has come to the fore in the joint modeling of prices, offering a more nuanced description of their interdependence. This approach has allowed for the attainment of realistic tail dependencies, highlighting that prices tend to synchronize more during intense or volatile trading periods, resulting in stronger correlations. Evidence in statistical literature suggests that, similarly to the volatility, the SCP of certain stock prices follows rough paths, which can be described using fractional differential equations. However, estimating parameters for these equations often involves complex and computation-intensive algorithms, creating a necessity for alternative solutions. In this regard, the Fractional Ornstein-Uhlenbeck (fOU) process from the family of fractional processes offers a promising path. We can effectively describe the rough SCP by utilizing certain transformations of the fOU. We employed neural networks to understand the behavior of these processes. We had to develop a fast algorithm to generate a valid and suitably large sample from the appropriate process to train the network. With an extensive training set, the neural network can estimate the process parameters accurately and efficiently. Although the initial focus was the fOU, the resulting model displayed broader applicability, thus paving the way for further investigation of other processes in the realm of financial mathematics. The utility of SCP extends beyond its immediate application. It also serves as a springboard for a deeper exploration of fractional processes and for extending existing models that use ordinary Wiener processes to fractional scenarios. 
In essence, deploying both SCP and fractional processes in financial models provides new, more accurate ways to depict market dynamics.
Keywords: fractional Ornstein-Uhlenbeck process, fractional stochastic processes, Heston model, neural networks, stochastic correlation, stochastic differential equations, stochastic volatility
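The rough-path setting described above can be made concrete by simulating a fractional Ornstein-Uhlenbeck path. A sketch using an O(n²) Cholesky generator for the fractional Gaussian noise; the fast sampler the authors developed is not described in the abstract, so this is only illustrative, and the parameter values are arbitrary:

```python
import numpy as np

def fgn(n, hurst, dt, rng):
    """Fractional Gaussian noise increments via Cholesky factorization of the
    exact fGn covariance (O(n^2) memory; fine for short paths, not the fast
    generator the abstract alludes to)."""
    k = np.abs(np.subtract.outer(np.arange(n), np.arange(n))).astype(float)
    cov = 0.5 * ((k + 1.0) ** (2 * hurst) + np.abs(k - 1.0) ** (2 * hurst)
                 - 2.0 * k ** (2 * hurst)) * dt ** (2 * hurst)
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def fou_path(n, hurst, theta, sigma, dt, x0=0.0, seed=0):
    """Euler-Maruyama scheme for the fOU SDE dX = -theta * X dt + sigma dB^H."""
    rng = np.random.default_rng(seed)
    db = sigma * fgn(n, hurst, dt, rng)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = x[i] - theta * x[i] * dt + db[i]
    return x

# Hurst exponent H < 0.5 gives the rough paths used to model stochastic correlation
path = fou_path(n=500, hurst=0.1, theta=1.0, sigma=0.5, dt=0.01)
```

A generator of this kind can produce the large labelled training sets on which a parameter-estimating network is trained, though the quadratic cost is exactly why a faster sampler is needed in practice.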
Procedia PDF Downloads 118
194 Evaluation and Proposal for Improvement of the Flow Measurement Equipment in the Bellavista Drinking Water System of the City of Azogues
Authors: David Quevedo, Diana Coronel
Abstract:
The present article carries out an evaluation of the drinking water system in the Bellavista sector of the city of Azogues, with the purpose of determining the appropriate equipment to record the actual consumption flows of the inhabitants of said sector. Taking into account that the study area is located in a rural and economically disadvantaged area, there is an urgent need to establish a control system for the consumption of drinking water in order to conserve and manage the vital resource in the best possible way, considering that the water source supplying this sector is approximately 9 km away. The research began with the collection of cartographic, demographic, and statistical data of the sector, determining the coverage area, the population projection, and a provision that guarantees the supply of drinking water to meet the water needs of the sector’s inhabitants. By using hydraulic modeling in EPANET 2.0, the United States Environmental Protection Agency’s software for modeling drinking water distribution systems, theoretical hydraulic data were obtained, which were used to design and justify the most suitable measuring equipment for the Bellavista drinking water system. Taking into account a minimum service life of the drinking water system of 30 years, future flow rates were calculated for the design of the macro-measuring device. After analyzing the network, it was evident that the Bellavista sector has an average consumption of 102.87 liters per person per day; however, considering that Ecuadorian regulations recommend a provision of 180 liters per person per day for the geographical conditions of the sector, this value was used for the analysis. With all the collected and calculated information, the conclusion was reached that the Bellavista drinking water system needs a 125 mm electromagnetic macro-measuring device for the first three five-year periods of its service life and a 150 mm diameter device for the following three five-year periods.
The importance of having equipment that provides real and reliable data lies in allowing control of water consumption by the population of the sector, measured through micro-measuring devices installed at the entrance of each household, whose readings should match those of the macro-measuring device placed after the water storage tank outlet, in order to control losses that may occur due to leaks in the drinking water system or illegal connections.
Keywords: macro-meter, hydraulics, provision, water
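The sizing logic described above (projected population times the 180 L/person/day provision) reduces to simple arithmetic. A sketch; the population figure and the peak factors are illustrative assumptions, not values from the study:

```python
def average_demand_lps(population, provision_l_per_person_day=180.0):
    """Mean daily demand in L/s: population x per-capita provision / 86400 s."""
    return population * provision_l_per_person_day / 86400.0

def design_flows(population, k_daily=1.3, k_hourly=2.0):
    """Average, maximum-day and maximum-hour flows used to size a macro-meter.
    The peak factors k_daily and k_hourly are illustrative assumptions,
    not values taken from the study."""
    q_avg = average_demand_lps(population)
    q_max_day = k_daily * q_avg
    q_max_hour = k_hourly * q_avg
    return q_avg, q_max_day, q_max_hour

# e.g. a hypothetical projected population of 5000 served at 180 L/person/day
q_avg, q_max_day, q_max_hour = design_flows(5000)
print(round(q_avg, 2))  # 10.42
```

Repeating the calculation with the projected population at each five-year period is what drives the stepped meter sizing (125 mm first, 150 mm later) reported in the abstract.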
Procedia PDF Downloads 73
193 Delineation of Green Infrastructure Buffer Areas with Simulated Annealing: Consideration of Ecosystem Services Trade-Offs in the Objective Function
Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos Boullón Magan, David Miranda Barros
Abstract:
The biodiversity strategy of the European Union for 2030 mentions climate change as one of the key factors in biodiversity loss and considers green infrastructure as one of the solutions to this problem. In this line, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at guaranteeing the provision of a wide number of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet few tools are available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services. However, these methods usually aggregate several maps of ecosystem service potential without considering possible trade-offs. This can lead to excluding areas with a high potential for providing ecosystem services which have many trade-offs with other ecosystem services. In order to tackle this problem, a methodology is proposed to consider ecosystem service trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting multifunctional green infrastructure buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered to delimit the multifunctional buffer areas are clustered in groups, so that ecosystem services that create trade-offs with one another are kept in separate groups. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. The potential maps for each group are then combined into a raster map that records the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm. The algorithm is run both using the proposed methodology and considering the ecosystem services individually.
The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas produce, as well as their regularity and compactness. It has been observed that the proposed methodology increases the number of ecosystem services produced by delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change
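The map-combination step described above (normalize each service map, sum within each trade-off-free group, re-normalize per group, then take the cell-wise maximum across groups) can be sketched with NumPy. The toy rasters are hypothetical, and the grouping is assumed to have been done beforehand:

```python
import numpy as np

def normalize(a):
    """Rescale a raster to [0, 1]; a constant raster maps to zeros."""
    span = a.max() - a.min()
    return (a - a.min()) / span if span > 0 else np.zeros_like(a)

def combine_groups(groups):
    """groups: list of groups, each a list of 2D ecosystem-service provision
    potential rasters that are free of mutual trade-offs. Returns the raster
    fed to the simulated-annealing objective function."""
    group_maps = [normalize(sum(normalize(m) for m in g)) for g in groups]
    # cell-wise maximum across the per-group potential maps
    return np.maximum.reduce(group_maps)

# Two toy 2x2 provision rasters in group 1, one in group 2 (hypothetical values)
g1 = [np.array([[0., 2.], [4., 8.]]), np.array([[0., 1.], [2., 3.]])]
g2 = [np.array([[0., 0.], [0., 4.]])]
combined = combine_groups([g1, g2])
```

Because each group is normalized before the maximum is taken, a cell scoring highly for one trade-off-free bundle of services is not penalized for scoring poorly on a conflicting bundle.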
Procedia PDF Downloads 174
192 Controlled Digital Lending, Equitable Access to Knowledge and Future Library Services
Authors: Xuan Pang, Alvin L. Lee, Peggy Glatthaar
Abstract:
Libraries across the world have been an innovation engine of creativity and opportunity for many decades. The ongoing global epidemic and health crisis illuminates potential reforms, rethinking beyond traditional library operations and services. Controlled Digital Lending (CDL) is one of the emerging technologies libraries have used to deliver information digitally in support of online learning and teaching, and to make educational materials more affordable and more accessible. CDL became a popular term in the United States of America (USA) as a result of a white paper authored by Kyle K. Courtney (Harvard University) and David Hansen (Duke University). The paper laid the legal groundwork for CDL: Fair Use, the First Sale Doctrine, and Supreme Court rulings. Library professionals implemented this new technology to fulfill their users’ needs. Three libraries in the state of Florida (University of Florida, Florida Gulf Coast University, and Florida A&M University) started a conversation about how to develop strategies to make CDL work at each institution. This paper shares the stories of piloting and initiating a CDL program to ensure students have reliable, affordable access to the course materials they need to be successful. Additionally, this paper offers an overview of the emerging trends of Controlled Digital Lending in the USA and demonstrates the development of CDL platforms, policies, and implementation plans. The paper further discusses challenges and lessons learned and how each institution plans to sustain the program in future library services. The fundamental mission of the library is to provide users unrestricted access to library resources regardless of their physical location, disability, health status, or other circumstances.
The professional due diligence of librarians, as information professionals, is to make educational resources more affordable and accessible. CDL opens a new frontier of library services as a mechanism for library practice to enhance users’ experience of library services. Libraries should consider exploring this tool to distribute library resources in an effective and equitable way. This new methodology has potential benefits for libraries and end users.
Keywords: controlled digital lending, emerging technologies, equitable access, collaborations
Procedia PDF Downloads 135
191 Microscale Observations of a Gas Cell Wall Rupture in Bread Dough during Baking and Comparison with 2D/3D Finite Element Simulations of Stress Concentration
Authors: Kossigan Bernard Dedey, David Grenier, Tiphaine Lucas
Abstract:
Bread dough is often described as a dispersion of gas cells in a continuous gluten/starch matrix. The final bread crumb structure is strongly related to gas cell wall (GCW) rupture during baking. At the end of proofing and during baking, part of the thinnest GCWs between expanding gas cells is reduced to a gluten film of about the size of a starch granule. When such a size is reached, gluten and starch granules must be considered as interacting phases in order to account for heterogeneities and appropriately describe GCW rupture. Among the experimental investigations carried out to assess GCW rupture, none has observed GCW rupture under baking conditions at the GCW scale. In addition, attempts to numerically understand GCW rupture are usually not performed at the GCW scale and often consider GCWs as continuous. The most relevant paper that accounted for heterogeneities dealt with gluten/starch interactions and their impact on the mechanical behavior of dough films; however, stress concentration in the GCW was not discussed. In this study, both experimental and numerical approaches were used to better understand GCW rupture in bread dough during baking. Experimentally, a macroscope placed in front of a two-chamber device was used to observe the rupture of a real GCW of 200 micrometers in thickness. Special attention was paid to mimicking baking conditions as far as possible (temperature, gas pressure and moisture). Various differences in pressure between the two sides of the GCW were applied, and different modes of fracture initiation and propagation in GCWs were observed. Numerically, the impact of gluten/starch interactions (cohesion or non-cohesion) and of the rheological moduli ratio on the mechanical behavior of a GCW under unidirectional extension was assessed in 2D/3D. A non-linear viscoelastic and hyperelastic formulation was used to capture the finite strains involved in GCWs during baking. Stress concentration within the GCW was identified.
The simulated stress concentrations were discussed in light of the GCW failure observed in the device. The gluten/starch granule interactions and the rheological modulus ratio were found to have a great effect on the stress levels reached in the GCW.
Keywords: dough, experimental, numerical, rupture
Procedia PDF Downloads 122
190 Saving the Decolonized Subject from Neglected Tropical Diseases: Public Health Campaign and Household-Centred Sanitation in Colonial West Africa, 1900-1960
Authors: Adebisi David Alade
Abstract:
In pre-colonial West Africa, the deadliness of the climate vis-à-vis malaria and other tropical diseases to Europeans turned the region into the “white man’s grave.” Thus, immediately after the partition of Africa in 1885, the mission civilisatrice (civilizing mission) and mise en valeur (economic development) not only became a pretext for the establishment of colonial rule; from a medical point of view, the control and possible eradication of disease in the continent emerged as one of the first concerns of the European colonizers. Though geared toward making Africa exploitable, historical evidence suggests that some colonial Water, Sanitation and Hygiene (WASH) policies and projects reduced certain tropical diseases in some West African communities. Exploring some of these disease control interventions by way of historical revisionism, this paper challenges the orthodox interpretation of colonial sanitation and public health measures in West Africa. This paper critiques the deployment of race and class as analytical tools for the study of colonial WASH projects, an exercise which often reduces the complexity and ambiguity of colonialism to the binary of colonizer and colonized. Since West Africa presently ranks high among regions with Neglected Tropical Diseases (NTDs), it is imperative to decentre colonial racism and economic exploitation in African history in order to give room for Africans to see themselves in other ways. Far from resolving the problem of NTDs by fiat in the region, this study seeks to highlight important blind spots in African colonial history in an attempt to prevent post-colonial African leaders from throwing the baby out with the bath water.
As scholars researching colonial sanitation and public health in the continent rarely examine their complex meaning and content, this paper submits that the outright demonization of colonial rule across space and time continues to build an ideological wall between the present and the past, one which not only inhibits fruitful borrowing from the colonial administration of West Africa but also prevents a wider understanding of the challenges of WASH policies and projects in most West African states.
Keywords: colonial rule, disease control, neglected tropical diseases, WASH
Procedia PDF Downloads 187
189 Re-Examining the Distinction between Odour Nuisance and Health Impact: A Community’s Campaign against Landfill Gas Exposure in Shongweni, South Africa
Authors: Colin David La Grange, Lisa Frost Ramsay
Abstract:
Hydrogen sulphide (H2S) is a minor component of landfill gas, but significant in its distinct odorous quality and its association with landfill-related community complaints. The World Health Organisation (WHO) provides two guidelines for H2S: a health guideline of 150 µg/m³ on a 24-hour average, and a nuisance guideline of 7 µg/m³ on a 30-minute average. Although this is a practical distinction for impact assessment, this paper highlights the danger of the apparent dualism between nuisance and health impact, particularly when it is used to dismiss community concerns about perceived health impacts at low concentrations of H2S, as in the case of a community battle against the impacts of a landfill in Shongweni, KwaZulu-Natal, South Africa. Here, community members reported, using a community-developed mobile phone application, a range of health symptoms that coincided with, or occurred subsequent to, odour events and localised H2S peaks. Local doctors also documented increased visits for symptoms of respiratory distress, eye and skin irritation, and stress after such odour events. Objectively measured H2S and other pollutant concentrations during these events, however, remained below the WHO health guideline. This case study highlights the importance of the physiological link between the experience of environmental nuisance and overall health and wellbeing, showing these to be less distinct than the WHO guidelines would suggest. The potential mechanisms of impact of an odorous plume, with key constituents at concentrations below traditional health thresholds, on psychologically and/or physiologically sensitised individuals are described. In the case of psychological sensitisation, previously documented mechanisms such as aversive conditioning and odour-triggered panic are relevant.
Physiological sensitisation to environmental pollutants, evident as a seemingly disproportionate physical (allergy-type) response to either low concentrations or short-duration exposures of a toxin or toxins, has been extensively examined but remains poorly understood. The links between a heightened sensitivity to toxic compounds, the accumulation of some compounds in the body, and a pre-existing or associated immunological stress disorder are presented as a possible explanation.
Keywords: immunological stress disorder, landfill odour, odour nuisance, odour sensitisation, toxin accumulation
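The two WHO guidelines cited above differ in both threshold and averaging time, so a concentration series can breach one without the other, which is the pattern reported in Shongweni. A sketch of such a screening; the 10-minute sampling interval and the toy series are assumptions for illustration:

```python
import numpy as np

H2S_NUISANCE = 7.0    # µg/m³, 30-minute average (WHO nuisance guideline)
H2S_HEALTH = 150.0    # µg/m³, 24-hour average (WHO health guideline)

def rolling_mean(x, w):
    """Trailing moving average over w consecutive samples."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

def screen_h2s(conc_ug_m3, sample_minutes=10):
    """Flag exceedances of each guideline on an evenly sampled series."""
    x = np.asarray(conc_ug_m3, dtype=float)
    w30 = max(1, 30 // sample_minutes)          # samples per 30 minutes
    w24h = max(1, (24 * 60) // sample_minutes)  # samples per 24 hours
    nuisance = len(x) >= w30 and bool((rolling_mean(x, w30) > H2S_NUISANCE).any())
    health = len(x) >= w24h and bool((rolling_mean(x, w24h) > H2S_HEALTH).any())
    return nuisance, health

# A short odour event: 30-minute peaks breach the nuisance guideline
# while the 24-hour average stays far below the health guideline.
series = [1.0] * 140 + [20.0, 25.0, 22.0] + [1.0] * 10
print(screen_h2s(series))  # (True, False)
```

The (True, False) outcome is exactly the objectively measured situation the paper argues should not be read as the absence of any health effect.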
Procedia PDF Downloads 120
188 The Routine Use of a Negative Pressure Incision Management System in Vascular Surgery: A Case Series
Authors: Hansraj Bookun, Angela Tan, Rachel Xuan, Linheng Zhao, Kejia Wang, Animesh Singla, David Kim, Christopher Loupos
Abstract:
Introduction: Incisional wound complications in vascular surgery patients represent a significant clinical and economic burden of morbidity and mortality. The objective of this study was to trial the feasibility of applying the Prevena negative pressure incision management system as a routine dressing in patients who had undergone arterial surgery. Conventionally, Prevena has been applied to groin incisions, but this study features applications on multiple wound sites, such as the thigh or major amputation stumps. Method: This was a cross-sectional, observational, single-centre case series of 12 patients who had undergone major vascular surgery. Their wounds were managed with the Prevena system applied either intra-operatively or on the first post-operative day. Demographic and operative details were collated, as well as length of stay and complication rates. Results: There were 9 males (75%), with a mean age of 66 years, and the comorbid burden was as follows: ischaemic heart disease (92%), diabetes (42%), hypertension (100%), stage 4 or greater kidney impairment (17%) and current or ex-smoking (83%). The main indications were acute ischaemia (33%), claudication (25%), and gangrene (17%). There were single instances of an occluded popliteal artery aneurysm, diabetic foot infection, and rest pain. Half of the patients (50%) had hybrid operations with iliofemoral endarterectomies, patch arterioplasties, and further peripheral endovascular treatment. There were 4 complex arterial bypass operations and 2 major amputations. The mean length of stay was 17 ± 10 days, with a range of 4 to 35 days. A single complication, in the form of a lymphocoele, was encountered in the context of an iliofemoral endarterectomy and patch arterioplasty; this was managed conservatively. There were no deaths.
Discussion: This case series suggests that, in conjunction with safe vascular surgery, the Prevena wound management system is associated with low absolute wound complication rates and remains a valuable adjunct in the treatment of vascular patients.
Keywords: wound care, negative pressure, vascular surgery, closed incision
Procedia PDF Downloads 136
187 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels
Authors: Joshua Buli, David Pietrowski, Samuel Britton
Abstract:
Processing SAR data usually requires constraints in extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, so that SAR data can be viewed in an image domain comparable to what a human would see, easing interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values of each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel, since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene.
Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow for SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization
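The per-point accumulation described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation: the array shapes, the linear range interpolation, and the phase-correction term are assumptions chosen to show the structure of the algorithm.

```python
import numpy as np

def backproject(pulses, sensor_pos, range_bins, points, wavelength):
    """Accumulate each pulse's range-compressed return into every 3D point.

    pulses:     (n_pulses, n_bins) complex range-compressed data
    sensor_pos: (n_pulses, 3) platform position at each pulse
    range_bins: (n_bins,) range-bin centres in metres (monotonic)
    points:     (n_points, 3) reference voxels or point-cloud coordinates
    """
    image = np.zeros(len(points), dtype=complex)
    for pulse, pos in zip(pulses, sensor_pos):
        # Distance from this pulse's sensor position to every 3D point
        r = np.linalg.norm(points - pos, axis=1)
        # Look up the complex return at that range (linear interpolation)
        sample = np.interp(r, range_bins, pulse.real) \
               + 1j * np.interp(r, range_bins, pulse.imag)
        # Undo the two-way propagation phase and accumulate
        image += sample * np.exp(4j * np.pi * r / wavelength)
    return np.abs(image)  # per-3D-point reflectivity magnitude
```

The loop body is identical and independent for every 3D point, which is what makes the algorithm embarrassingly parallel: on a GPU the same calculation runs with one thread per point, each summing over all pulses.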
Procedia PDF Downloads 84
186 Developing Social Responsibility Values in Nascent Entrepreneurs through Role-Play: An Explorative Study of University Students in the United Kingdom
Authors: David W. Taylor, Fernando Lourenço, Carolyn Branston, Paul Tucker
Abstract:
There are an increasing number of students at Universities in the United Kingdom engaging in entrepreneurship role-play to explore business start-up as a career alternative to employment. These role-play activities have been shown to have a positive influence on students’ entrepreneurial intentions. Universities also play a role in developing graduates’ awareness of social responsibility. However, social responsibility is often missing from these entrepreneurship role-plays. It is important that these role-play activities include the development of values that support social responsibility, in line with those running hybrid, humane and sustainable enterprises, and not simply focus on profit. The Young Enterprise (YE) Start-Up programme is an example of a role-play activity that is gaining in popularity amongst United Kingdom Universities seeking ways to give students insight into a business start-up. A Post-92 University in the North-West of England has adapted the traditional YE Directorship roles (e.g., Marketing Director, Sales Director) by including a Corporate Social Responsibility (CSR) Director in all of the team-based YE Start-Up businesses. The aim of introducing this Directorship was to observe whether such a role would help create a more socially responsible value system within each company and, in turn, shape business decisions. This paper investigates role-play as a tool to help enterprise educators develop socially responsible attitudes and values in nascent entrepreneurs. A mixed qualitative methodology, including interviews, role-play, and reflection, has been used to help students develop positive value characteristics through the exploration of unethical and selfish behaviors.
The initial findings indicate that role-play helped CSR Directors learn and gain insights into the importance of corporate social responsibility, influenced the values and actions of their YE Start-Ups, and increased the likelihood that, if participants were to launch a business post-graduation, it would be socially responsible. These findings help inform educators on how to develop socially responsible nascent entrepreneurs within a traditionally profit-oriented business model.
Keywords: student entrepreneurship, young enterprise, social responsibility, role-play, values
Procedia PDF Downloads 151
185 Pakistan’s Counterinsurgency Operations: A Case Study of Swat
Authors: Arshad Ali
Abstract:
The Taliban insurgency in Swat, which started apparently as a social movement in 2004, transformed into an anti-Pakistan Islamist insurgency by joining hands with the Tehrik-e-Taliban Pakistan (TTP) upon its formation in 2007. It quickly spread beyond Swat by 2009, making Swat the second stronghold of the TTP after FATA. It prompted the Pakistan military to launch a full-scale counterinsurgency operation, code-named Rah-i-Rast, to regain control of Swat. Operation Rah-i-Rast was successful not only in restoring the writ of the State but, more importantly, in creating a consensus against the spread of the Taliban insurgency in Pakistan at political, social and military levels. This operation became a test case for the civilian government and military seeking a sustainable solution to the TTP insurgency in the north-west of Pakistan. This study analyzes why the counterinsurgency operation Rah-i-Rast was successful and why previous ones failed. The study also explores factors which created consensus against the Taliban insurgency at the political and social level, as well as reasons which hindered such a consensual approach in the past. The study argues that the previous initiatives failed due to various factors, including the Pakistan army’s lack of a comprehensive counterinsurgency model, weak political will and public support, and state negligence. Also, the initial counterinsurgency policies were ad hoc in nature, fluctuating between military operations and peace deals. After continuous failure, the military revisited its approach to counterinsurgency in operation Rah-i-Rast. The security forces learnt from their past experiences and developed a pragmatic counterinsurgency model: ‘clear, hold, build, and transfer.’ The military also adopted a population-centric approach to provide security to the local people. This case study of Swat evaluates the strengths and weaknesses of Pakistan's counterinsurgency operations as well as its peace agreements.
It analyzes operation Rah-i-Rast in the light of David Galula’s model of counterinsurgency. Unlike the existing literature, the study underscores the bottom-up approach adopted by Pakistan’s military and government in engaging the local population to sustain post-operation stability in Swat. More specifically, the study emphasizes the hybrid counterinsurgency model ‘clear, hold, build, and transfer’ in Swat.
Keywords: insurgency, counterinsurgency, clear, hold, build, transfer
Procedia PDF Downloads 364
184 Explosion Mechanics of Aluminum Plates Subjected to the Combined Effect of Blast Wave and Fragment Impact Loading: A Multicase Computational Modeling Study
Authors: Atoui Oussama, Maazoun Azer, Belkassem Bachir, Pyl Lincy, Lecompte David
Abstract:
For many decades, researchers have focused on understanding the dynamic behavior of different structures and materials subjected to fragment impact or blast loads separately. Explosion mechanics and impact physics studies dealing with the numerical modeling of the response of protective structures under the synergistic effect of a blast wave and the impact of fragments are quite limited in the literature. This article numerically evaluates the nonlinear dynamic behavior and damage mechanisms of aluminum plates (EN AW-1050A-H24) under different combined loading scenarios, varied by the sequence of the applied loads, using the commercial software LS-DYNA. On the one hand, with respect to the terminal ballistics investigations, a Lagrangian (LAG) formulation is used to evaluate the different failure modes of the target material in the case of a fragment impact. On the other hand, with respect to the blast analysis, an Arbitrary Lagrangian-Eulerian (ALE) formulation is considered to study the fluid-structure interaction (FSI) of the shock wave and the plate in the case of blast loading. Four different loading scenarios are considered: (1) blast loading only, (2) fragment impact only, (3) blast loading followed by a fragment impact, and (4) a fragment impact followed by blast loading. From the numerical results, it was observed that when the impact load is applied to the plate prior to the blast load, the plate suffers more severe damage due to the hole enlargement phenomenon and the effects of crack propagation on the circumference of the damaged zone. Moreover, it was found that the hole from the fragment impact was enlarged to about three times the diameter of the projectile. The validation of the proposed computational model is based in part on previous experimental data obtained by the authors and in part on experimental data obtained from the literature.
A good correspondence between the numerical and experimental results is found.
Keywords: computational analysis, combined loading, explosion mechanics, hole enlargement phenomenon, impact physics, synergistic effect, terminal ballistic
Procedia PDF Downloads 183
183 Biophysical Assessment of the Ecological Condition of Wetlands in the Parkland and Grassland Natural Regions of Alberta, Canada
Authors: Marie-Claude Roy, David Locky, Ermias Azeria, Jim Schieck
Abstract:
It is estimated that up to 70% of the wetlands in the Parkland and Grassland natural regions of Alberta have been lost due to various land-use activities. These losses include the ecosystem functions and services they once provided. Those wetlands remaining are often embedded in a matrix of human-modified habitats, and despite efforts taken to protect them, the effects of land use on wetland condition and function remain largely unknown. We used biophysical field data and remotely-sensed human footprint data collected at 322 open-water wetlands by the Alberta Biodiversity Monitoring Institute (ABMI) to evaluate the impact of surrounding land use on the physico-chemical characteristics and plant functional traits of wetlands. Eight physico-chemical parameters were assessed: wetland water depth, water temperature, pH, salinity, dissolved oxygen, total phosphorus, total nitrogen, and dissolved organic carbon. Three plant functional traits were evaluated: 1) origin (native and non-native), 2) life history (annual, biennial, and perennial), and 3) habitat requirements (obligate-wetland and obligate-upland). Land-use intensity was quantified within a 250-meter buffer around each wetland. Ninety-nine percent of wetlands in the Grassland and Parkland regions of Alberta have land-use activities in their surroundings, most of them agriculture-related. Total phosphorus in wetlands increased with the cover of surrounding agriculture, while salinity, total nitrogen, and dissolved organic carbon were positively associated with the degree of soft-linear (e.g., pipelines, trails) land uses. The abundance of non-native and annual/biennial plants increased with the amount of agriculture, while urban-industrial land use lowered the abundance of native, perennial, and obligate-wetland plants. Our study suggests that land-use types surrounding wetlands affect the physico-chemical and biological conditions of wetlands.
This research suggests that reducing human disturbances through reclamation of wetland buffers may enhance the condition and function of wetlands in agricultural landscapes.
Keywords: wetlands, biophysical assessment, land use, grassland and parkland natural regions
Procedia PDF Downloads 333
182 Occupational Challenges and Adjustment Strategies of Internally Displaced Persons in Abuja, Nigeria
Authors: David Obafemi Adebayo
Abstract:
An occupational challenge has been identified as one of the factors that could cripple the set goals and life ambitions of an Internally Displaced Person (IDP). The main thrust of this study is, therefore, to explore the use of life support/adjustment strategies with a view to repositioning internally displaced persons in Nigeria to revamp their goals and achieve their life-long ambitions. The study investigates whether there exists, on the basis of gender, religion, years of working experience and educational qualification, any significant difference in the occupational challenges and adjustment strategies of IDPs. The study, a descriptive survey, adopted a multi-stage sampling technique to select a minimum of 400 internally displaced persons from IDP camps in Yimitu Village, Waru District, in the Federal Capital Territory (FCT), Abuja. The research instrument used for the study was a researcher-designed questionnaire entitled “Questionnaire on Occupational Challenges and Adjustment Strategy of Internally Displaced Persons (QOCASIDPs)”. Eight null hypotheses were tested at the 0.05 alpha level of significance. Frequency counts and percentages, means and rank order, t-test, Analysis of Variance (ANOVA) and Duncan Multiple Range Test (DMRT) (where applicable) were employed to analyze the data. The study found that the occupational challenges of internally displaced persons included loss of employment, vocational discrimination, marginalization by employers of labour, isolation due to joblessness, and lack of occupational freedom. The results were discussed in line with the findings. The study established the notable adjustment strategies adopted by internally displaced persons, such as engaging in petty trading, sourcing soft loans from NGOs, setting up small-scale businesses in groups, acquiring new skills, and engaging in further education, among others.
The study established that there was no significant difference in the occupational challenges of IDPs on the basis of years of working experience and highest educational qualification, though there was a significant difference on the basis of gender as well as religion. Based on the findings of the study, recommendations were made.
Keywords: internally displaced persons, occupational challenges, adjustment strategies, Abuja-Nigeria
Procedia PDF Downloads 358
181 MAOD Is Estimated by Sum of Contributions
Authors: David W. Hill, Linda W. Glass, Jakob L. Vingren
Abstract:
Maximal accumulated oxygen deficit (MAOD), the gold standard measure of anaerobic capacity, is the difference between the oxygen cost of exhaustive severe-intensity exercise and the accumulated oxygen consumption (O2; mL·kg–1). In theory, MAOD can be estimated as the sum of independent estimates of the phosphocreatine and glycolysis contributions, which we refer to as PCr+glycolysis. Purpose: The purpose was to test the hypothesis that PCr+glycolysis provides a valid measure of anaerobic capacity in cycling and running. Methods: The participants were 27 women (mean ± SD, age 22 ± 1 y, height 165 ± 7 cm, weight 63.4 ± 9.7 kg) and 25 men (age 22 ± 1 y, height 179 ± 6 cm, weight 80.8 ± 14.8 kg). They performed two exhaustive tests, one cycling and one running, at work rates and speeds that were tolerable for ~5 min. The rate of oxygen consumption (VO2; mL·kg–1·min–1) was measured in warmups, in the tests, and during 7 min of recovery. Fingerprick blood samples obtained after exercise were analysed to determine peak blood lactate concentration (PeakLac). The VO2 response in exercise was fitted to a model with a fast ‘primary’ phase followed by a delayed ‘slow’ component, from which the accumulated O2 and the excess O2 attributable to the slow component were calculated. The VO2 response in recovery was fitted to a model with a fast phase and a slow component sharing a common time delay. Oxygen demand (in mL·kg–1·min–1) was determined by extrapolation from steady-state VO2 in warmups; the total oxygen cost (in mL·kg–1) was determined by multiplying this demand by time to exhaustion and adding the excess O2; then, MAOD was calculated as total oxygen cost minus accumulated O2. The phosphocreatine contribution (area under the fast phase of the post-exercise VO2) and the glycolytic contribution (converted from PeakLac) were summed to give PCr+glycolysis.
There was no interaction effect involving sex, so values for anaerobic capacity were examined using a two-way ANOVA, with repeated measures across method (PCr+glycolysis vs MAOD) and mode (cycling vs running). Results: There was a significant effect only for exercise mode. There was no difference between MAOD and PCr+glycolysis: values were 59 ± 6 mL·kg–1 and 61 ± 8 mL·kg–1 in cycling and 78 ± 7 mL·kg–1 and 75 ± 8 mL·kg–1 in running. Discussion: PCr+glycolysis is a valid measure of anaerobic capacity in cycling and running, and it is as valid for women as for men.
Keywords: alactic, anaerobic, cycling, ergometer, glycolysis, lactic, lactate, oxygen deficit, phosphocreatine, running, treadmill
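The arithmetic described above is simple enough to sketch directly: the MAOD side multiplies the extrapolated demand by time to exhaustion, adds the slow-component excess, and subtracts the accumulated O2; the PCr+glycolysis side sums the fast-phase recovery area with a lactate-derived glycolytic estimate. The lactate-to-oxygen conversion factor and resting lactate value below are common assumptions in the literature, not values reported in this abstract, and the example numbers are hypothetical.

```python
def maod(demand_ml_kg_min, time_min, accumulated_o2_ml_kg, excess_slow_o2_ml_kg):
    """MAOD = total oxygen cost (demand x time + slow-component excess)
    minus the oxygen actually consumed; all O2 quantities in mL·kg-1."""
    total_cost = demand_ml_kg_min * time_min + excess_slow_o2_ml_kg
    return total_cost - accumulated_o2_ml_kg

def pcr_plus_glycolysis(fast_phase_epoc_ml_kg, peak_lactate_mm,
                        rest_lactate_mm=1.0, o2_eq_per_mm=3.0):
    """Phosphocreatine contribution (area under the fast phase of
    post-exercise VO2) plus a glycolytic contribution converted from the
    rise in blood lactate (o2_eq_per_mm is an assumed conversion factor)."""
    glycolytic = (peak_lactate_mm - rest_lactate_mm) * o2_eq_per_mm
    return fast_phase_epoc_ml_kg + glycolytic

# Hypothetical inputs in the range of the cycling results (~59-61 mL·kg-1)
print(maod(55.0, 5.0, 221.0, 6.0))      # 60.0
print(pcr_plus_glycolysis(25.0, 13.0))  # 61.0
```

The two functions are independent estimates of the same quantity, which is exactly the equivalence the study's ANOVA tests.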
Procedia PDF Downloads 136
180 Rhizoremediation of Contaminated Soils in Sub-Saharan Africa: Experimental Insights of Microbe Growth and Effects of Paspalum Spp. for Degrading Hydrocarbons in Soils
Authors: David Adade-Boateng, Benard Fei Baffoe, Colin A. Booth, Michael A. Fullen
Abstract:
Remediation of diesel fuel, oil and grease in contaminated soils obtained from a mine site in Ghana is explored using rhizoremediation technology with different levels of nutrient amendments (i.e., N (nitrogen) in compost (0.2, 0.5 and 0.8%), urea (0.2, 0.5 and 0.8%) and topsoil (0.2, 0.5 and 0.8%)) for a native species. A Ghanaian native grass species, Paspalum spp. from the Poaceae family, representative across Sub-Saharan Africa, was selected following the development of essential and desirable growth criteria. Vegetative parts of the species were subjected to ten treatments in a Randomized Complete Block Design (RCBD) with three replicates. The plant-associated microbial community was examined in Paspalum spp. An assessment of the influence of Paspalum spp. on the abundance and activity of micro-organisms in the rhizosphere revealed a build-up of microbial communities over a three-month period. This was assessed using the MPN method, which showed rhizospheric samples from the treatments were significantly different (p < 0.05). Multiple comparisons showed how microbial populations built up in the rhizosphere for the different treatments. Treatments G (0.2% compost), H (0.5% compost) and I (0.8% compost) performed significantly better than the other treatments, while treatments D (0.2% topsoil) and F (0.8% topsoil) showed no significant effect. Treatments A (0.2% urea), B (0.5% urea), C (0.8% urea) and E (0.5% topsoil) performed similarly. Residual diesel and oil concentrations (as total petroleum hydrocarbons, TPH, and oil and grease) were measured using infra-red spectroscopy and gravimetric methods, respectively. The presence of a single species successfully enhanced the removal of hydrocarbons from soil. Paspalum spp. subjected to compost levels (0.5% and 0.8%) and topsoil levels (0.5% and 0.8%) showed significantly lower residual hydrocarbon concentrations compared to those treated with urea.
A strong relationship between the abundance of hydrocarbon-degrading micro-organisms in the rhizosphere and hydrocarbon biodegradation was demonstrated for rhizospheric samples with treatments G (0.2% compost), H (0.5% compost) and I (0.8% compost) (p < 0.001). Amendment at the 0.8% compost (N) level can improve application effectiveness. These findings have wide-reaching implications for the environmental management of soils contaminated by hydrocarbons in Sub-Saharan Africa. However, it is necessary to further investigate the in situ rhizoremediation potential of Paspalum spp. at the field scale.
Keywords: rhizoremediation, microbial population, rhizospheric sample, treatments
Procedia PDF Downloads 325
179 Micelles Made of Pseudo-Proteins for Solubilization of Hydrophobic Biologicals
Authors: Sophio Kobauri, David Tugushi, Vladimir P. Torchilin, Ramaz Katsarava
Abstract:
Hydrophobically/hydrophilically modified functional polymers are of high interest in modern biomedicine due to their ability to solubilize water-insoluble or poorly soluble (hydrophobic) drugs. Among the many approaches being developed in this direction, one of the most effective is the use of polymeric micelles (PMs) (micelles formed by amphiphilic block-copolymers) for the solubilization of hydrophobic biologicals. For therapeutic purposes, PMs are required to be stable and biodegradable, and quite a few amphiphilic block-copolymers capable of forming stable micelles with good solubilization properties have been described. For obtaining micelle-forming block-copolymers, polyethylene glycol (PEG) derivatives are desirable as the hydrophilic shell, because PEG represents the most popular biocompatible hydrophilic block and various hydrophobic blocks (polymers) can be attached to it. The construction of the hydrophobic core, however, remains the main problem for nanobioengineers owing to the complex requirements on micelle structure. Considering the above, our research goal was to obtain biodegradable micelles for the solubilization of hydrophobic drugs and biologicals. For this purpose, we used biodegradable polymers, pseudo-proteins (PPs) (synthesized from naturally occurring amino acids and other non-toxic building blocks, such as fatty diols and dicarboxylic acids), as the hydrophobic core, since these polymers show reasonable biodegradation rates and excellent biocompatibility. In the present study, we used the hydrophobic amino acid L-phenylalanine (MW 4000-8000 Da) instead of L-leucine. Amino-PEG (MW 2000 Da) was used as the hydrophilic fragment for constructing suitable micelles. The molecular weight of the PP (the hydrophobic core of the micelle) was regulated by varying the monomer ratios. Micelles were obtained by dissolving the synthesized amphiphilic polymer in water.
The micelle-forming property was tested using dynamic light scattering (Malvern Zetasizer Nano ZS ZEN3600). The study showed that the obtained amphiphilic block-copolymer forms stable neutral micelles 100 ± 7 nm in size at 10 mg/mL concentration, which is considered an optimal range for pharmaceutical micelles. These preliminary data allow us to conclude that the obtained micelles are suitable for the delivery of poorly water-soluble drugs and biologicals.
Keywords: amino acid L-phenylalanine, pseudo-proteins, amphiphilic block-copolymers, biodegradable micelles
Procedia PDF Downloads 134
178 Improving Rural Access to Specialist Emergency Mental Health Care: Using a Time and Motion Study in the Evaluation of a Telepsychiatry Program
Authors: Emily Saurman, David Lyle
Abstract:
In Australia, a well-serviced rural town might have a psychiatrist visit once a month, with more frequent visits from a psychiatric nurse, but many towns have no resident access to mental health specialists. Access to specialist care would not only reduce patient distress and benefit outcomes but also facilitate the effective use of limited resources. The Mental Health Emergency Care-Rural Access Program (MHEC-RAP) was developed to improve access to specialist emergency mental health care in rural and remote communities using telehealth technologies. However, there has been no current benchmark to gauge program efficiency or capacity, or to determine whether program activity is justifiably sufficient. The evaluation of MHEC-RAP used multiple methods and applied a modified theory of access to assess the program and its aim of improved access to emergency mental health care. This was the first evaluation of a telepsychiatry service to include a time and motion study design examining program time expenditure, efficiency, and capacity. The time and motion study analysis was combined with an observational study of the program structure and function to assess the balance between program responsiveness and efficiency. Previous program studies have demonstrated that MHEC-RAP has improved access and is used and effective. The findings from the time and motion study suggest that MHEC-RAP has the capacity to manage increased activity within the current model structure without loss of responsiveness or efficiency in the provision of care. Enhancing program responsiveness and efficiency will also support a claim of the program’s value for money. MHEC-RAP is a practical telehealth solution for improving access to specialist emergency mental health care.
The findings from this evaluation have already attracted the attention of other regions in Australia interested in implementing emergency telepsychiatry programs and are now informing the progressive establishment of mental health resource centres in rural New South Wales. Like MHEC-RAP, these centres will provide rapid, safe, and contextually relevant assessments and advice to support local health professionals to manage mental health emergencies in the smaller rural emergency departments. Sharing the application of this methodology and research activity may help to improve access to and future evaluations of telehealth and telepsychiatry services for others around the globe.
Keywords: access, emergency, mental health, rural, time and motion
Procedia PDF Downloads 234
177 Roboweeder: A Robotic Weeds Killer Using Electromagnetic Waves
Authors: Yahoel Van Essen, Gordon Ho, Brett Russell, Hans-Georg Worms, Xiao Lin Long, Edward David Cooper, Avner Bachar
Abstract:
Weeds reduce farm and forest productivity, invade crops, smother pastures, and some can harm livestock. Farmers need to spend a significant amount of money to control weeds by means of biological, chemical, cultural, and physical methods. To address the global agricultural labor shortage and remove poisonous chemicals, a fully autonomous, eco-friendly, and sustainable weeding technology has been developed. This takes the form of a weeding robot, ‘Roboweeder’. Roboweeder includes a four-wheel-drive self-driving vehicle, a 4-DOF robotic arm mounted on top of the vehicle, an electromagnetic wave generator (magnetron) mounted on the “wrist” of the robotic arm, 48 V battery packs, and a control/communication system. Cameras are mounted on the front and two sides of the vehicle. Using image processing and recognition, distinct types of weeds are detected before being eliminated. Electromagnetic wave technology is applied to dielectrically heat individual weeds and clusters, causing them to wilt and die. The 4-DOF robotic arm was modeled mathematically based on its structure/mechanics, each joint’s load, brushless DC motor and worm gear characteristics, forward kinematics, and inverse kinematics. A Proportional-Integral-Derivative (PID) control algorithm is used to control the robotic arm’s motion to ensure the waveguide aperture points at the detected weeds. GPS and machine vision are used to traverse the farm and avoid obstacles without the need for supervision. A Roboweeder prototype has been built. Multiple test trials show that Roboweeder is able to detect, point at, and kill the pre-defined weeds successfully, although further improvements are needed, such as reducing the “weed-killing” time and developing a new waveguide with a smaller aperture to avoid killing surrounding crops.
This technology replaces tedious, time-consuming and expensive weeding processes, and allows farmers to grow more, go organic, and eliminate operational headaches. A patent on this technology is pending.
Keywords: autonomous navigation, machine vision, precision heating, sustainable and eco-friendly
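The PID joint-control loop described above can be illustrated with a minimal sketch. The gains, time step, and single-integrator joint model below are hypothetical, chosen only to show the error/integral/derivative structure of the controller, and are not taken from the Roboweeder design.

```python
class PID:
    """Minimal Proportional-Integral-Derivative controller."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt                       # accumulate error
        derivative = (error - self.prev_error) / dt       # rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a joint (modelled as a pure integrator) to a 1.0 rad target
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=1.0)
angle, dt = 0.0, 0.01
for _ in range(5000):
    angle += pid.update(angle, dt) * dt  # motor command moves the joint
print(round(angle, 2))  # 1.0
```

In practice one such loop would run per joint, with the setpoints supplied by the inverse-kinematics solution that points the waveguide aperture at the detected weed.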
Procedia PDF Downloads 252
176 Using The Flight Heritage From >150 Electric Propulsion Systems To Design The Next Generation Field Emission Electric Propulsion Thrusters
Authors: David Krejci, Tony Schönherr, Quirin Koch, Valentin Hugonnaud, Lou Grimaud, Alexander Reissner, Bernhard Seifert
Abstract:
In 2018, the NANO thruster became the first Field Emission Electric Propulsion (FEEP) system ever to be verified in space, in an In-Orbit Demonstration mission conducted together with Fotec. Since then, 160 additional ENPULSION NANO propulsion systems have been deployed in orbit on 73 different spacecraft across multiple customers and missions. These missions included a variety of satellite bus sizes, ranging from 3U CubeSats to >100 kg buses, and different orbits in Low Earth Orbit and Geostationary Earth Orbit, providing an abundance of on-orbit data for statistical analysis. This large-scale industrialization and flight heritage allows for a holistic way of gathering data from testing, integration and operational phases, deriving lessons learnt over a variety of different mission types, operator approaches, use cases and environments. Based on these lessons learnt, a new generation of propulsion systems has been developed, addressing key findings from the large NANO heritage and adding new capabilities, including increased resilience, thrust vector steering and increased power and thrust level. Some of these successor products have already been validated in orbit, including the MICRO R3 and the NANO AR3. While the MICRO R3 features increased power and thrust level, the NANO AR3 is a successor of the heritage NANO thruster with added thrust-vectoring capability. Five NANO AR3 units have been launched to date on two different spacecraft. This work presents flight telemetry data of ENPULSION NANO systems and on-orbit statistical data of the ENPULSION NANO, as well as lessons learnt during on-orbit operations, customer assembly, integration and testing support, and ground test campaigns conducted at different facilities. We discuss how the transfer of lessons learnt and operational improvements across independent missions and customers has been accomplished.
Building on these lessons and this extensive heritage, we present the design of the new generation of propulsion systems that increases the power and thrust level of FEEP systems to address larger spacecraft buses.
Keywords: FEEP, field emission electric propulsion, electric propulsion, flight heritage
Procedia PDF Downloads 90
175 Experimental Quantification of the Intra-Tow Resin Storage Evolution during RTM Injection
Authors: Mathieu Imbert, Sebastien Comas-Cardona, Emmanuelle Abisset-Chavanne, David Prono
Abstract:
Short-cycle-time Resin Transfer Molding (RTM) applications appear to be of great interest for the mass production of automotive or aeronautical lightweight structural parts. During the RTM process, the two components of a resin are mixed on-line and injected into the cavity of a mold where a fibrous preform has been placed. Injection and polymerization occur simultaneously in the preform, inducing evolutions of temperature, degree of cure and viscosity that furthermore affect flow and curing. In order to adjust the processing conditions to reduce the cycle time, it is therefore essential to understand and quantify the physical mechanisms occurring in the part during injection. In a previous study, a dual-scale simulation tool was developed to help determine the optimum injection parameters. This tool allows finely tracking the repartition of the resin and the evolution of its properties during reactive injections with on-line mixing. Tows and channels of the fibrous material are considered separately to deal with the consequences of the dual-scale morphology of continuous-fiber textiles. The simulation tool reproduces the unsaturated area at the flow front, generated by the tow/channel difference in permeability. Resin “storage” in the tows after saturation is also taken into account, as it may significantly affect the repartition and evolution of the temperature, degree of cure and viscosity in the part during reactive injections. The aim of the current study is to use experiments to understand and quantify the “storage” evolution in the tows in order to adjust and validate the numerical tool. The presented study is based on four experimental repeats conducted on three different types of textiles: a unidirectional Non-Crimp Fabric (NCF), a triaxial NCF and a satin weave. Model fluids, dyes and image analysis are used to study, quantitatively, the resin flow in the saturated area of the samples.
Also, textile characteristics affecting the resin “storage” evolution in the tows are analyzed. Finally, fully coupled on-line mixing reactive injections are conducted to validate the numerical model.
Keywords: experimental, on-line mixing, high-speed RTM process, dual-scale flow
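As a minimal illustration of the single-scale flow physics that the dual-scale tool extends, the classic one-dimensional Darcy fill-time estimate for constant-pressure injection can be sketched as follows. The function names and all parameter values are illustrative assumptions, not data from the study:

```python
import math


def darcy_velocity(permeability, viscosity, pressure_gradient):
    """Darcy's law: superficial resin velocity (m/s) through a porous
    preform, from permeability K (m^2), viscosity mu (Pa.s) and the
    pressure gradient dp/dx (Pa/m)."""
    return -(permeability / viscosity) * pressure_gradient


def fill_time_1d(porosity, viscosity, length, permeability, delta_p):
    """Classic 1D rectilinear mold fill time under constant injection
    pressure: t = phi * mu * L^2 / (2 * K * dP)."""
    return porosity * viscosity * length ** 2 / (2.0 * permeability * delta_p)


# Illustrative (assumed) values: phi = 0.5, mu = 0.1 Pa.s, L = 0.5 m,
# K = 1e-10 m^2, dP = 1e5 Pa.
t_fill = fill_time_1d(0.5, 0.1, 0.5, 1e-10, 1e5)
```

In a dual-scale textile, the unsaturated flow front and the later resin "storage" in the tows make the real filling and saturation behavior deviate from this single-scale estimate, which is precisely what the study's simulation tool resolves.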
Procedia PDF Downloads 165
174 Optimization of Maintenance of PV Module Arrays Based on Asset Management Strategies: Case of Study
Authors: L. Alejandro Cárdenas, Fernando Herrera, David Nova, Juan Ballesteros
Abstract:
This paper presents a methodology to optimize the maintenance of grid-connected photovoltaic systems, considering the cleaning and module replacement periods based on an asset management strategy. The methodology is based on the analysis of the energy production of the PV plant, the energy feed-in tariff, and the cost of cleaning and replacement of the PV modules, with the overall revenue received being the optimization variable. The methodology is evaluated as a case study of a 5.6 kWp solar PV plant located on the Bogotá campus of the Universidad Nacional de Colombia. The asset management strategy implemented consists of assessing the PV modules through visual inspection, energy performance analysis, pollution, and degradation. Within the visual inspection of the plant, the general condition of the modules and the structure is assessed, identifying dust deposition, visible fractures, and water accumulation on the bottom. The energy performance analysis is performed with the energy production reported by the monitoring systems and compared with the values estimated in the simulation. The pollution analysis is performed using the soiling rate due to dust accumulation, which can be modelled as a black-box exponential function of historical soiling values. The soiling rate is calculated from two years of generation data collected at the same plant. Additionally, the alternative of assessing the temperature degradation of the PV modules is evaluated by estimating the cell temperature from parameters such as ambient temperature and wind speed. The medium-term energy decrease of the PV modules is assessed with the asset management strategy by calculating a health index to determine the replacement period of the modules due to degradation. This study proposes a tool for decision making related to the maintenance of photovoltaic systems. 
This is timely given the projected growth of solar photovoltaic installations in power systems, associated with the commitments made in the Paris Agreement for the reduction of CO2 emissions. In the Colombian context, it is estimated that by 2030, 12% of the installed power capacity will be solar PV.
Keywords: asset management, PV module, optimization, maintenance
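The black-box exponential soiling model mentioned above can be sketched as follows. The decay constant and the revenue parameters are hypothetical placeholders to be fitted from the two years of generation data, not values reported in the study:

```python
import math


def soiling_ratio(days_since_cleaning, k=0.005):
    """Soiling ratio: fraction of clean-module energy output remaining
    after a given number of days without cleaning, modelled as an
    exponential decay. The rate constant k (1/day) is a hypothetical
    placeholder to be fitted from historical generation data."""
    return math.exp(-k * days_since_cleaning)


def daily_revenue_lost(days_since_cleaning, daily_energy_kwh, tariff, k=0.005):
    """Revenue lost per day to soiling: the quantity an asset-management
    optimization would weigh against the cost of a cleaning visit."""
    return daily_energy_kwh * tariff * (1.0 - soiling_ratio(days_since_cleaning, k))
```

Under this sketch, the optimal cleaning period is the point where the accumulated revenue lost to soiling matches the cost of one cleaning visit; the replacement decision uses the same logic with the degradation-driven health index in place of the soiling ratio.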
Procedia PDF Downloads 52
173 Fostering Creativity in Education: Exploring Leadership Perspectives on Systemic Barriers to Innovative Pedagogy
Authors: David Crighton, Kelly Smith
Abstract:
The ability to adopt creative pedagogical approaches is increasingly vital in today’s educational landscape. This study examines the institutional barriers that hinder educators in the UK from embracing such innovation, focusing specifically on the experiences and perspectives of educational leaders. Current literature primarily focuses on the challenges that academics and teachers encounter, particularly highlighting how management culture and audit processes negatively affect their ability to be creative in classrooms and lecture theatres. However, this focus leaves a gap in understanding management perspectives, which is crucial for providing a more holistic insight into the challenges encountered in educational settings. To explore this gap, we are conducting semi-structured interviews with senior leaders across various educational contexts, including universities, schools, and further education colleges. This qualitative methodology, combined with thematic analysis, aims to uncover the managerial, financial, and administrative pressures these leaders face in fostering creativity in teaching and supporting professional learning opportunities. Preliminary insights indicate that educational leaders face significant barriers, such as institutional policies, resource limitations, and external performance indicators. These challenges create a restrictive environment that stifles educators' creativity and innovation. Addressing these barriers is essential for empowering staff to adopt more creative pedagogical approaches, ultimately enhancing student engagement and learning outcomes. By alleviating these constraints, educational leaders can cultivate a culture that fosters creativity and flexibility in the classroom. These insights will inform practical recommendations to support institutional change and enhance professional learning opportunities, contributing to a more dynamic educational environment. 
In conclusion, this study offers a timely exploration of how leadership can influence the pedagogical landscape in a rapidly evolving educational context. The research seeks to highlight the crucial role that educational leaders play in shaping a culture of creativity and adaptability, ensuring that institutions are better equipped to respond to the challenges of contemporary education.
Keywords: educational leadership, professional learning, creative pedagogy, marketisation
Procedia PDF Downloads 12
172 Detection and Classification of Strabismus Using Convolutional Neural Network and Spatial Image Processing
Authors: Anoop T. R., Otman Basir, Robert F. Hess, Eileen E. Birch, Brooke A. Koritala, Reed M. Jost, Becky Luu, David Stager, Ben Thompson
Abstract:
Strabismus refers to a misalignment of the eyes. Early detection and treatment of strabismus in childhood can prevent the development of permanent vision loss due to abnormal development of visual brain areas. We developed a two-stage method for strabismus detection and classification based on photographs of the face. The first stage detects the presence or absence of strabismus, and the second stage classifies the type of strabismus. The first stage comprises face detection using a Haar cascade, facial landmark estimation, face alignment, aligned face landmark detection, segmentation of the eye region, and detection of strabismus using a VGG-16 convolutional neural network. Face alignment transforms the face to a canonical pose to ensure consistency in subsequent analysis. Using facial landmarks, the eye region is segmented from the aligned face and fed into the VGG-16 CNN model, which has been trained to classify strabismus. The CNN determines whether strabismus is present and classifies the type of strabismus (exotropia, esotropia, and vertical deviation). If stage 1 detects strabismus, the eye region image is fed into stage 2, which starts with the estimation of pupil center coordinates using a Mask R-CNN deep neural network. Then, the distance between the pupil coordinates and eye landmarks is calculated, along with the angles that the pupil coordinates make with the horizontal and vertical axes. The distance and angle information is used to characterize the degree and direction of the strabismic eye misalignment. This model was tested on 100 clinically labeled images of children with (n = 50) and without (n = 50) strabismus. The True Positive Rate (TPR) and False Positive Rate (FPR) of the first stage were 94% and 6%, respectively. The classification stage produced a TPR of 94.73%, 94.44%, and 100% for esotropia, exotropia, and vertical deviations, respectively. 
This method also had an FPR of 5.26%, 5.55%, and 0% for esotropia, exotropia, and vertical deviation, respectively. The addition of one more feature related to the location of corneal light reflections may reduce the FPR, which was primarily due to children with pseudo-strabismus (the appearance of strabismus due to a wide nasal bridge or skin folds on the nasal side of the eyes).
Keywords: strabismus, deep neural networks, face detection, facial landmarks, face alignment, segmentation, VGG 16, mask R-CNN, pupil coordinates, angle deviation, horizontal and vertical deviation
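The stage-2 distance and angle computation described above can be sketched as follows. Points are assumed to be (x, y) pixel coordinates in the aligned face image; the point format and function name are illustrative, not taken from the study's implementation:

```python
import math


def misalignment_features(pupil_center, eye_landmark):
    """Distance between the estimated pupil centre and an eye landmark,
    plus the angles (degrees) the connecting vector makes with the
    horizontal and vertical image axes. These features characterize the
    degree and direction of the strabismic misalignment."""
    dx = pupil_center[0] - eye_landmark[0]
    dy = pupil_center[1] - eye_landmark[1]
    distance = math.hypot(dx, dy)
    # Angle with the horizontal axis, measured from the landmark.
    angle_horizontal = math.degrees(math.atan2(dy, dx))
    # Complementary angle with the vertical axis (for vectors in the
    # first quadrant; a full implementation would handle all quadrants).
    angle_vertical = 90.0 - angle_horizontal
    return distance, angle_horizontal, angle_vertical
```

A large horizontal angle component would suggest exotropia or esotropia, while a dominant vertical component would suggest a vertical deviation, matching the three classes the CNN stage distinguishes.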
Procedia PDF Downloads 93
171 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed
Authors: Marion G. Ben-Jacob, David Wang
Abstract:
There is a mathematics requirement for entry level college and university students, especially those who plan to study STEM (Science, Technology, Engineering and Mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote the success of our students. There is, of course, the Traditional Method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two pedagogical methods and report the results of a study comparing them. Math is the foundation for science, technology, and engineering. In STEM, it is generally used to find patterns in data. These patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we are using math to find the statistical correlation between cause and effect. Professionals who rely on math include data scientists, biologists, and geologists. Without math, most technology would not be possible. Math is the basis of binary code, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are also used in almost every program written. Mathematical algorithms are inherent in software as well. 
Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment. They also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab; advanced computer software aids their research and production processes, modelling theoretical synthesis techniques and the properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines, and pedagogical research on formative strategies and the topics that must be covered is essential.
Keywords: emporium model, mathematics, pedagogy, STEM
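The cause-and-effect correlation analysis mentioned above is a natural first worked example for STEM students; a minimal Pearson correlation coefficient can be sketched as follows (the sample data below is invented for illustration):

```python
import math


def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples:
    covariance of the samples divided by the product of their standard
    deviations, giving a value in [-1, 1]."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)


# Invented data: r = 1.0 for a perfect linear relationship.
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```

The same few lines exercise arithmetic, algebra, and statistics at once, which is one way a College Algebra course can make the math-as-foundation argument concrete for students.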
Procedia PDF Downloads 75