Search results for: software comparison
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9378

18 Designing and Simulation of the Rotor and Hub of the Unmanned Helicopter

Authors: Zbigniew Czyz, Ksenia Siadkowska, Krzysztof Skiba, Karol Scislowski

Abstract:

Today’s progress in rotorcraft is mostly associated with optimizing aircraft performance through active and passive modifications of the main rotor assembly and the tail propeller. The key task is to improve their performance and the hover quality factor of the rotor without worsening specific fuel consumption. One way to improve the helicopter is active optimization of the main rotor across flight stages, i.e., ascent, cruise, and descent. Active interference with the airflow around the rotor blade section can significantly change the characteristics of the aerodynamic airfoil. In current solutions, the efficiency of actuator systems modifying aerodynamic coefficients is relatively high, but they significantly affect structural strength requirements. The proposed approach to actively changing aerodynamic characteristics assumes a periodic change of the geometric features of the blades depending on the flight stage. Changing the geometric parameters of blade twist enables optimization of main rotor performance for each flight stage. Structurally, adapting shape memory alloys does not significantly affect rotor blade fatigue strength, which helps reduce the costs of adapting the system to existing blades, and the performance gains can easily amortize such a modification and improve the profitability of the structure. To obtain the quantitative and qualitative data needed to solve this research problem, a number of numerical analyses were necessary. The main problem is the selection of the design parameters of the main rotor and a preliminary optimization of its performance to improve the hover quality factor. The design concept assumes a three-bladed main rotor with a chord of 0.07 m and a radius R = 1 m. The rotor speed is a calculated parameter of the optimization function.
To specify the initial distribution of geometric twist, dedicated software was created that uses the blade element numerical method and accounts for dynamic design features such as blade flapping in its joints. A number of performance analyses as a function of rotor speed, forward speed, and altitude were performed. The calculations were carried out for the full model assembly. This approach makes it possible to observe the behavior of components and their mutual interaction resulting from the applied forces. The key elements of each rotor are the shaft, the hub, and the pins holding the joints and blade yokes; these components are exposed to the highest loads. As a result of the analysis, the safety factor was determined to be k > 1.5, which gives grounds to obtain certification for the strength of the structure. The articulated rotor has numerous moving elements in its structure. Despite the high safety factor, the locations with the highest stresses, where signs of wear may appear, have been identified. The numerical analysis showed that the most heavily loaded element is the pin connecting the modular bearing of the blade yoke with the element of the horizontal oscillation joint; the stresses in this element correspond to a safety factor of k = 1.7. The other analysed rotor components have a safety factor of more than 2, and in the case of the shaft this factor is more than 3. However, it must be remembered that a structure is only as strong as its weakest element. The rotor designed for unmanned aerial vehicles, adapted to work with blades incorporating smart materials, meets the requirements for certification testing. Acknowledgement: This work has been financed by the Polish National Centre for Research and Development under the LIDER program, Grant Agreement No. LIDER/45/0177/L-9/17/NCBR/2018.
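The blade element method mentioned above can be illustrated with a minimal hover-thrust sketch. It assumes a linear lift-curve slope, a uniform collective pitch, and zero induced inflow (simplifications the authors' software does not make, since it also models joint dynamics); the rotor geometry matches the stated design (three blades, chord 0.07 m, R = 1 m), while the pitch, speed, and lift slope values are illustrative assumptions:

```python
import math

def rotor_thrust(n_blades=3, chord=0.07, radius=1.0, omega=180.0,
                 theta_deg=8.0, rho=1.225, lift_slope=5.7, n_elem=50):
    """Integrate blade-element lift over the span for a hovering rotor.

    Sketch only: linear lift-curve slope, uniform pitch, induced inflow
    and blade flapping neglected.
    """
    theta = math.radians(theta_deg)   # assumed uniform collective pitch
    dr = radius / n_elem
    thrust = 0.0
    for i in range(n_elem):
        r = (i + 0.5) * dr            # element midpoint radius
        v = omega * r                 # local tangential speed (m/s)
        cl = lift_slope * theta       # local lift coefficient (inflow angle ~ 0)
        thrust += n_blades * 0.5 * rho * v**2 * chord * cl * dr
    return thrust                     # newtons

hover_thrust = rotor_thrust()         # thrust for the assumed pitch and rotor speed
```

In the real code, each element's inflow angle and joint-induced motion would modify the local angle of attack instead of the fixed pitch used here.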

Keywords: main rotor, rotorcraft aerodynamics, shape memory alloy, materials, unmanned helicopter

Procedia PDF Downloads 149
17 Machine Learning Based Digitalization of Validated Traditional Cognitive Tests and Their Integration to Multi-User Digital Support System for Alzheimer’s Patients

Authors: Ramazan Bakir, Gizem Kayar

Abstract:

It is known that Alzheimer's disease and dementia are the two most common neurodegenerative conditions, and their prevalence has been accelerating over the last couple of years. As populations age worldwide, researchers expect this acceleration to grow. Unfortunately, there is no known pharmacological cure for either, although some treatments help reduce the rate of cognitive decline. This is why non-pharmacological treatment and tracking methods have received more attention over the last five years. Many researchers, including well-known associations and hospitals, lean towards using non-pharmacological methods to support cognitive function and improve the patient's quality of life. As dementia symptoms related to mind, learning, memory, speech, problem-solving, social abilities, and daily activities gradually worsen over the years, many researchers agree that cognitive support should start at the very first symptoms in order to slow the decline. At this point, the lives of patients and caregivers can be improved with daily activities and applications. These activities include, but are not limited to, basic word puzzles, daily cleaning activities, and taking notes. These activities and their results should then be observed carefully, which is currently only possible during in-person meetings between the patient/caregiver and the M.D. in hospitals. Such meetings can be quite time-consuming, exhausting, and financially ineffective for hospitals, medical doctors, caregivers, and especially patients. On the other hand, digital support systems are showing positive results for all stakeholders of healthcare systems, as can be observed in countries that have adopted telemedicine. The biggest potential of our system lies in setting up inter-user communication in the best possible way. In our project, we propose machine learning based digitalization of validated traditional cognitive tests (e.g. MoCA, Afazi, left-right hemisphere), their analysis for high-quality follow-up, and communication systems for all stakeholders. R. Bakir and G. Kayar are with Gefeasoft, Inc., an R&D software development and health technologies company. Emails: ramazan, gizem @ gefeasoft.com. This platform has high potential not only for patient tracking but also for making all stakeholders feel safe through all stages. As registered hospitals assign the corresponding medical doctors to the system, these MDs are able to register their own patients and assign special tasks to each patient. With our integrated machine learning support, MDs can track the failure and success rates of each patient and also see averages among similarly progressed patients. In addition, our platform supports multi-player technology, which helps patients play with their caregivers so that they feel safer whenever they are uncomfortable. By also gamifying daily household activities, patients will be able to repeat their social tasks, and we will provide non-pharmacological reminiscence therapy (RT, life review therapy). All collected data will be mined by our data scientists and analyzed meaningfully. In addition, we will add gamification modules for caregivers based on Naomi Feil's Validation Therapy: both behaving positively toward the patient and keeping oneself mentally healthy are important for caregivers, and we aim to provide a gamification-based therapy system for them, too. When this project accomplishes all the tasks described above, patients will be able to do many tasks at home remotely, and MDs will be able to follow them up very effectively. We propose a complete platform, and the whole project is both time- and cost-effective for all stakeholders.

Keywords: alzheimer’s, dementia, cognitive functionality, cognitive tests, serious games, machine learning, artificial intelligence, digitalization, non-pharmacological, data analysis, telemedicine, e-health, health-tech, gamification

Procedia PDF Downloads 127
16 Marketing and Business Intelligence and Their Impact on Products and Services through Understanding Based on Experiential Knowledge of Customers in Telecommunications Companies

Authors: Ali R. Alshawawreh, Francisco Liébana-Cabanillas, Francisco J. Blanco-Encomienda

Abstract:

Collaboration between marketing and business intelligence (BI) is crucial in today's ever-evolving business landscape. These two domains play pivotal roles in molding customers' experiential knowledge. Marketing insights offer valuable information regarding customer needs, preferences, and behaviors, thus refining marketing strategies and enhancing overall customer experiences. Conversely, BI facilitates data-driven decision-making, leading to heightened operational efficiency, product quality, and customer satisfaction. The analysis of customer data through BI unveils patterns and trends, informing product development, marketing campaigns, and customer service initiatives aimed at enriching experiences and knowledge. Customer experiential knowledge (CEK) encompasses customers' implicit comprehension of consumption experiences influenced by diverse factors, including social and cultural influences. This study primarily focuses on telecommunications companies in Jordan, scrutinizing how experiential customer knowledge mediates the relationship between marketing intelligence, business intelligence, and innovation in product and service offerings. Drawing on theoretical frameworks such as the resource-based view (RBV) and service-dominant logic (SDL), the research aims to comprehend how organizations utilize their resources, particularly knowledge, to foster innovation. Employing a quantitative research approach, the study collected and analyzed primary data to explore hypotheses. The chosen method was justified for its efficacy in handling large sample sizes. Structural equation modeling (SEM) facilitated by Smart PLS software evaluated the relationships between the constructs, followed by mediation analysis to assess the indirect associations in the model. 
The study findings offer insights into the intricate dynamics of organizational innovation, uncovering the interconnected relationships between business intelligence (BI), customer experiential knowledge-driven innovation (CEK-DI), marketing intelligence (MI), and product and service innovation (PSI). They underscore the pivotal role of advanced intelligence capabilities in developing innovative practices rooted in a profound understanding of customer experiences. Organizations equipped with cutting-edge BI tools are better positioned to devise strategies informed by precise insights into customer needs and behaviors. Furthermore, the positive impact of BI on PSI reaffirms the significance of data-driven decision-making in shaping the innovation landscape: companies leveraging BI demonstrate adeptness in identifying market opportunities that guide the development of novel products and services. The substantial impact of CEK-DI on PSI highlights the crucial role of customer experiences in driving organizational innovation; firms actively integrating customer insights into their innovation processes are more likely to create offerings aligned with customer expectations, fostering higher levels of product and service innovation. Additionally, the positive and significant effect of MI on CEK-DI underscores the critical role of market insights in shaping innovative strategies. While the relationship between MI and PSI is positive, its slightly weaker significance level indicates a nuanced association, suggesting that while MI contributes to innovation, other factors may also influence the innovation landscape, warranting further exploration. In conclusion, the study underscores the essential role of intelligence capabilities, particularly artificial intelligence, in driving innovation, emphasizing the necessity for organizations to leverage market and customer intelligence for effective and competitive innovation practices.
Collaborative efforts between marketing and business intelligence serve as pivotal drivers of innovation, influencing experiential customer knowledge and shaping organizational strategies and practices, ultimately enhancing overall customer experiences and organizational performance.
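The mediation logic the study tests with SEM can be sketched in miniature. The toy example below uses synthetic data with hypothetical path strengths (not the study's coefficients) and estimates the indirect MI → CEK → PSI effect as the product of the a and b paths, with the b path partialled for MI via residual-on-residual regression in place of Smart PLS:

```python
import random
import statistics

def slope(x, y):
    """OLS slope of y on x (single predictor, with centering)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

random.seed(0)
n = 500
mi = [random.gauss(0, 1) for _ in range(n)]                    # marketing intelligence score
cek = [0.6 * x + random.gauss(0, 1) for x in mi]               # mediator: CEK
psi = [0.5 * m + 0.2 * x + random.gauss(0, 1)                  # outcome: PSI
       for m, x in zip(cek, mi)]

a = slope(mi, cek)                                             # path a: MI -> CEK
# path b: CEK -> PSI controlling for MI (regress residuals on residuals)
cek_res = [c - slope(mi, cek) * x for c, x in zip(cek, mi)]
psi_res = [p - slope(mi, psi) * x for p, x in zip(psi, mi)]
b = slope(cek_res, psi_res)
indirect = a * b                                               # mediated MI -> CEK -> PSI effect
```

In practice the significance of `indirect` would be assessed with bootstrap confidence intervals, as PLS-SEM tools do.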

Keywords: marketing intelligence, business intelligence, product, customer experiential knowledge-driven innovation

Procedia PDF Downloads 55
15 Extracellular Polymeric Substances Study in an MBR System for Fouling Control

Authors: Dimitra C. Banti, Gesthimani Liona, Petros Samaras, Manasis Mitrakas

Abstract:

Municipal and industrial wastewaters are often treated biologically by the activated sludge process (ASP). The ASP not only requires large aeration and sedimentation tanks but also generates large quantities of excess sludge. An alternative technology is the membrane bioreactor (MBR), which replaces two stages of the conventional ASP, clarification and settlement, with a single, integrated biotreatment and clarification step. The advantages offered by the MBR over conventional treatment include a reduced footprint and lower sludge production, achieved by maintaining a high biomass concentration in the bioreactor. Notwithstanding these advantages, the widespread application of the MBR process is constrained by membrane fouling. Fouling leads to permeate flux decline, making more frequent membrane cleaning and replacement necessary and resulting in increased operating costs. In general, membrane fouling results from the interaction between the membrane material and the components of the activated sludge liquor. The latter include substrate components, cells, cell debris, and microbial metabolites such as Extracellular Polymeric Substances (EPS) and Soluble Microbial Products (SMPs). The challenge for effective MBR operation is to minimize the rate of Transmembrane Pressure (TMP) increase. This can be achieved in several ways, one of which is the addition of specific additives that enhance the coagulation and flocculation of the compounds responsible for fouling, hence reducing biofilm formation on the membrane surface and limiting the fouling rate. In this project, the effectiveness of a non-commercial composite coagulant was studied as an agent for fouling control in a lab-scale MBR system consisting of two aerated tanks. A flat sheet membrane module with a 0.40 μm pore size was submerged into the second tank. The system was fed with 50 L/d of municipal wastewater collected from the effluent of the primary sedimentation basin.
The TMP increase rate, which is directly related to fouling growth, was monitored by a PLC system. EPS, MLSS, and MLVSS measurements were performed on samples of mixed liquor; in addition, influent and effluent samples were collected for the determination of physicochemical characteristics (COD, BOD5, NO3-N, NH4-N, Total N, and PO4-P). The coagulant was added at concentrations of 2, 5, and 10 mg/L during a period of 2 weeks, and the results were compared with the control system (without coagulant addition). EPS fractions were extracted by a three-stage physical-thermal treatment allowing the identification of Soluble EPS (SEPS, or SMP), Loosely Bound EPS (LBEPS), and Tightly Bound EPS (TBEPS). Protein and carbohydrate concentrations were measured in the EPS fractions by the modified Lowry method and the Dubois method, respectively. Addition of coagulant at 2 mg/L did not affect SEPS proteins in comparison with the control process, their values varying between 32 and 38 mg/g VSS. However, a coagulant dosage of 5 mg/L resulted in a slight increase of SEPS proteins to 35-40 mg/g VSS, while 10 mg/L of coagulant further increased SEPS to 44-48 mg/g VSS. Similar results were obtained for SEPS carbohydrates. Carbohydrate values without coagulant addition were similar to the corresponding values measured at 2 mg/L of coagulant; the addition of 5 mg/L of coagulant resulted in a slight increase of SEPS carbohydrates to 6-7 mg/g VSS, while a dose of 10 mg/L further increased the carbohydrate content to 9-10 mg/g VSS. Total LBEPS and TBEPS, consisting of the proteins and carbohydrates of LBEPS and TBEPS respectively, presented similar variations with the addition of the coagulant. Total LBEPS at a 2 mg/L dose were almost equal to 17 mg/g VSS, and their values increased to 22 and 29 mg/g VSS with the addition of 5 mg/L and 10 mg/L of coagulant, respectively. Total TBEPS were almost 37 mg/g VSS at a coagulant dose of 2 mg/L and increased to 42 and 51 mg/g VSS at 5 mg/L and 10 mg/L doses, respectively.
Therefore, it can be concluded that coagulant addition could potentially affect microorganism activity, causing EPS to be excreted in greater amounts. Nevertheless, the EPS increase, mainly the SEPS increase, resulted in a higher membrane fouling rate, as shown by the corresponding TMP increase rate. However, the addition of the coagulant, although it affected the EPS content of the reactor mixed liquor, did not change the filtration outcome: an effluent of high quality was produced, with COD values as low as 20-30 mg/L.
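EPS results of this kind are reported per gram of biomass; a one-line sketch of the normalization, using hypothetical measured values rather than the study's raw data:

```python
def eps_specific(eps_mg_per_l, mlvss_g_per_l):
    """Normalize an EPS fraction (mg/L in the mixed liquor) to biomass (mg per g VSS)."""
    return eps_mg_per_l / mlvss_g_per_l

# hypothetical: 210 mg/L of SEPS protein in a liquor with 6 g/L MLVSS
seps_protein = eps_specific(210.0, 6.0)   # 35 mg/g VSS, inside the 32-38 range reported
```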

Keywords: extracellular polymeric substances, MBR, membrane fouling, EPS

Procedia PDF Downloads 256
14 A Low-Cost Disposable PDMS Microfluidic Cartridge with Reagent Storage Silicone Blisters for Isothermal DNA Amplification

Authors: L. Ereku, R. E. Mackay, A. Naveenathayalan, K. Ajayi, W. Balachandran

Abstract:

Over the past decade, the increase in sexually transmitted infections (STIs), especially in the developing world due to high cost and a lack of sufficient medical testing, has given rise to the need for a rapid, low-cost, point-of-care medical diagnostic that is disposable and, most significantly, reproduces results equivalent to those achieved within centralised laboratories. This paper presents the development of a disposable PDMS microfluidic cartridge incorporating blisters filled with the reagents required for isothermal DNA amplification in clinical diagnostics and point-of-care testing. To circumvent the need for complex external microfluidic pumps, on-chip pressurised fluid reservoirs are designed using finger actuation and blister storage. The fabrication of the blisters takes into consideration three factors: material characteristics, fluid volume, and structural design. Silicone rubber is the chosen material due to its good chemical stability, considerable tear resistance, and moderate tension/compression strength. Fluid capacity and structural form go hand in hand, as the reagent volume needed for the assay determines the volume of the blisters, whereas the structural form has to be designed to provide low compression stress when the blister is deformed for fluid expulsion. Furthermore, the top and bottom sections of the blisters are embedded with miniature magnets of opposite polarity at a defined parallel distance. These magnets are needed to lock, or restrain, the blisters when fully compressed so as to prevent unwanted backflow resulting from elasticity. The integrated chip is bonded onto a large microscope glass slide (50 mm x 75 mm). Each part is manufactured using a 3D-printed mould designed in Solidworks software. Die-casting with the 3D-printed moulds is employed to form the deformable blisters by forcing a proprietary liquid silicone rubber through the positive mould cavity.
The set silicone rubber is removed from the cast, prefilled with liquid reagent, and then sealed with a thin (0.3 mm) burstable layer of recast silicone rubber. The main microfluidic cartridge is fabricated using classical soft-lithographic techniques. The cartridge incorporates the microchannel circuitry, a mixing chamber, an inlet port, an outlet port, a reaction chamber, and a waste chamber. Polydimethylsiloxane (PDMS, QSil 216) is mixed (10:1 ratio) and degassed using a centrifuge, then poured after the prefilled blisters are correctly positioned on the negative mould. Heat treatment at about 50°C to 60°C in an oven for about 3 hours is needed to achieve curing. The final chip production stage involves bonding the cured PDMS to the glass slide. A plasma corona treater, BD20-AC (Electro-Technic Products Inc., US), is used to activate the PDMS and glass slide before they are joined, adequately compressed together, and left in the oven overnight to ensure bonding. Two blisters in total are needed for the experiments: the first is used as a wash buffer to remove any remaining cell debris and unbound DNA, while the second contains 100 µL of amplification reagents. This paper will present results of chemical cell lysis, extraction using a biopolymer paper membrane, and isothermal amplification on a low-cost platform using the finger-actuated blisters for reagent storage. The platform has been shown to detect 1×10^5 copies of Chlamydia trachomatis using Recombinase Polymerase Amplification (RPA).

Keywords: finger actuation, point of care, reagent storage, silicone blisters

Procedia PDF Downloads 362
13 Exploring Factors That May Contribute to the Underdiagnosis of Hereditary Transthyretin Amyloidosis in African American Patients

Authors: Kelsi Hagerty, Ami Rosen, Aaliyah Heyward, Nadia Ali, Emily Brown, Erin Demo, Yue Guan, Modele Ogunniyi, Brianna McDaniels, Alanna Morris, Kunal Bhatt

Abstract:

Hereditary transthyretin amyloidosis (hATTR) is a progressive, multi-systemic, life-threatening disease caused by a disruption of the TTR protein, which is produced by the liver and transports thyroxine and retinol. This disruption causes the protein to misfold into amyloid fibrils, which accumulate in the heart, nerves, and GI tract. Over 130 variants in the TTR gene are known to cause hATTR. The Val122Ile variant is the most common in the United States and is seen almost exclusively in people of African descent. TTR variants are inherited in an autosomal dominant fashion and show incomplete penetrance and variable expressivity. Individuals with hATTR may exhibit symptoms from as early as 30 years to as late as 80 years of age. hATTR is characterized by a wide range of clinical symptoms, such as cardiomyopathy, neuropathy, carpal tunnel syndrome, and GI complications. Without treatment, hATTR is progressive and can ultimately lead to heart failure. hATTR disproportionately affects individuals of African descent; the estimated prevalence of hATTR among Black individuals in the US is 3.4%. Unfortunately, hATTR is often underdiagnosed and misdiagnosed because many symptoms of the disease overlap with those of other cardiac conditions. Given the progressive nature of the disease, its multi-systemic manifestations that can shorten lifespan, and the availability of free genetic testing and promising FDA-approved therapies, early identification of individuals with a pathogenic hATTR variant is important, as it can significantly impact medical management for patients and their relatives. Furthermore, recent literature suggests that TTR genetic testing should be performed in all patients with suspected TTR-related cardiomyopathy, regardless of age, and that follow-up with genetic counseling services is recommended.
Relatives of patients with hATTR benefit from genetic testing because testing can identify carriers early and allow relatives to receive regular screening and management. Despite the striking prevalence of hATTR among Black individuals, hATTR remains underdiagnosed in this patient population, and germline genetic testing for hATTR in Black individuals appears to be underutilized, though the reasons for this have not yet been brought to light. Historically, Black patients have experienced a number of barriers to seeking healthcare that have been hypothesized to perpetuate the underdiagnosis of hATTR, such as lack of access and mistrust of healthcare professionals. Prior research has described a myriad of factors that shape an individual's decision about whether to pursue presymptomatic genetic testing for a familial pathogenic variant, such as family closeness and communication, family dynamics, and a desire to inform other family members about potential health risks. This study explores, through 10 in-depth interviews with patients with hATTR, what factors may be contributing to the underdiagnosis of hATTR in the Black population. Participants were selected from the Emory University Amyloidosis clinic based on having a molecular diagnosis of hATTR. Interviews were recorded and transcribed verbatim, then coded using MAXQDA software. Thematic analysis was completed to identify commonalities between participants. Upon preliminary analysis, several themes emerged. Barriers identified include i) misdiagnosis and a prolonged diagnostic odyssey, ii) family communication and dynamics surrounding health issues, iii) perceptions of healthcare and one's own health risks, and iv) the need for closer provider-patient relationships and communication. Overall, this study gleaned valuable insight from members of the Black community about possible factors contributing to the underdiagnosis of hATTR, as well as potential solutions for resolving this issue.

Keywords: cardiac amyloidosis, heart failure, TTR, genetic testing

Procedia PDF Downloads 91
12 Developing a Cloud Intelligence-Based Energy Management Architecture Facilitated with Embedded Edge Analytics for Energy Conservation in Demand-Side Management

Authors: Yu-Hsiu Lin, Wen-Chun Lin, Yen-Chang Cheng, Chia-Ju Yeh, Yu-Chuan Chen, Tai-You Li

Abstract:

Demand-Side Management (DSM) has the potential to reduce the electricity costs and carbon emissions associated with electricity use in modern society. A home Energy Management System (EMS), commonly used by residential consumers in the downstream sector of a smart grid to monitor, control, and optimize the energy efficiency of domestic appliances, is a system of computer-aided functionalities serving as an energy audit for residential DSM. Implementing fault detection and classification for the domestic appliances monitored, controlled, and optimized is one of the most important steps toward preventive maintenance, such as residential air conditioning and heating preventive maintenance in residential/industrial DSM. In this study, a cloud intelligence-based green EMS built on an Internet of Things (IoT) technology stack for residential DSM is developed. In the EMS, Arduino MEGA Ethernet communication-based smart sockets, which include a Real-Time Clock chip to keep track of the current time for timestamps via the Network Time Protocol, are designed and implemented to take readings of load phenomena reflected in the sensed voltage and current signals. Also, a Network-Attached Storage device providing data access to a heterogeneous group of IoT clients via Hypertext Transfer Protocol (HTTP) methods is configured as the data store for parsed sensor readings. Lastly, a desktop computer with a WAMP software bundle (the Microsoft® Windows operating system, Apache HTTP Server, MySQL relational database management system, and PHP programming language) serves as the data science analytics engine for a dynamic web app/REpresentational State Transfer-ful web service of the residential DSM, providing Artificial Intelligence (AI)/Computational Intelligence. An abstract computing machine, the Java Virtual Machine, enables the desktop computer to run Java programs, and a mash-up of Java, the R language, and Python is configured and well suited for the AI in this study.
With the ability to send real-time push notifications to IoT clients, the desktop computer implements Google-maintained Firebase Cloud Messaging to engage IoT clients across Android/iOS devices and provide a mobile notification service for residential/industrial DSM. In this study, in order to realize edge intelligence, whereby edge devices avoid the network latency and always-on Internet connectivity required for Internet of Services, support secure access to data stores, and provide immediate analytical and real-time actionable insights at the edge of the network, we upgrade the designed and implemented smart sockets to embedded-AI Arduino ones (called embedded AIduino). To realize edge analytics with the proposed embedded AIduino, an Arduino Ethernet shield (WIZnet W5100) with a micro SD card connector is used. The SD library is included for reading parsed data from and writing parsed data to an SD card, and an Artificial Neural Network library for the Arduino MEGA, ArduinoANN, is imported and used for the locally embedded AI implementation. The embedded AIduino in this study can be developed further for applications in manufacturing-industry energy management and sustainable energy management, where, for example, rotating machinery diagnostics identifies energy loss from gross misalignment and unbalance of rotating machines in power plants.
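The embedded classifier idea can be illustrated with a minimal sketch. For readability it is written in Python rather than Arduino C++, and it is not the ArduinoANN library's API: a single perceptron is trained on hypothetical (voltage deviation, current RMS) feature pairs to flag an appliance state, the kind of lightweight model an embedded AIduino could evaluate locally:

```python
def train_perceptron(data, epochs=50, lr=0.1):
    """data: list of ((x1, x2), label) pairs with labels 0/1 (e.g. appliance off/on)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred                 # perceptron update rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# hypothetical sensed features: (voltage RMS deviation, current RMS), label = on/off
samples = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.9, 1.1), 1), ((1.0, 0.9), 1)]
w, b = train_perceptron(samples)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

On the actual socket, the learned weights would be stored on the SD card and the same dot-product-plus-threshold evaluated per reading.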

Keywords: demand-side management, edge intelligence, energy management system, fault detection and classification

Procedia PDF Downloads 245
11 Settlement Prediction in Cape Flats Sands Using Shear Wave Velocity – Penetration Resistance Correlations

Authors: Nanine Fouche

Abstract:

The Cape Flats is a low-lying, sand-covered expanse of approximately 460 square kilometres situated to the southeast of the central business district of Cape Town in the Western Cape of South Africa. The aeolian sands masking this area are often loose and compressible in the upper 1 m to 1.5 m of the profile, and the maximum allowable settlement is generally exceeded in these sands. The settlement of shallow foundations on Cape Flats sands is commonly predicted using the results of in-situ tests such as the SPT or DPSH, owing to the difficulty of retrieving undisturbed samples for laboratory testing. Varying degrees of accuracy and reliability are associated with these methods. More recently, shear wave velocity (Vs) profiles obtained from seismic testing, such as continuous surface wave (CSW) tests, have been used for settlement prediction. Such predictions have the advantage of considering the non-linear stress-strain behaviour of soil and the degradation of stiffness with increasing strain. CSW tests are rarely executed in the Cape Flats, whereas SPTs are commonly performed. For this reason, and to facilitate better settlement predictions in Cape Flats sand, equations expressing shear wave velocity (Vs) as a function of SPT blow count (N60) and vertical effective stress (σv') were generated by statistical regression of site investigation data. To identify the most appropriate method of overburden correction, analyses were performed with a separate overburden term (Pa/σv') as well as with stress-corrected shear wave velocity and SPT blow counts (correcting Vs and N60 to Vs1 and (N1)60, respectively). Shear wave velocity profiles and SPT blow count data from three sites masked by Cape Flats sands were used to generate 80 Vs-SPT N data pairs for analysis. The investigated sites, in the suburbs of Athlone, Muizenberg, and Atlantis, are all underlain by windblown deposits comprising fine and medium sand with varying fines contents.
Elastic settlement analysis was also undertaken for the Cape Flats sands using a non-linear stepwise method based on small-strain stiffness estimates obtained from the best Vs-N60 model, and compared to settlement estimates using the general elastic solution with stiffness profiles determined using Stroud's (1989) and Webb's (1969) SPT N60-E transformation models. Stroud's method considers strain level indirectly, whereas Webb's method does not take account of the variation in elastic modulus with strain. The expression of Vs in terms of N60 and Pa/σv' derived from the Atlantis data set showed the best fit, with R2 = 0.83 and a standard error of 83.5 m/s. The less accurate Vs-SPT N relations associated with the combined data set are presumably the result of the inversion routines used in the analysis of the CSW results, which showed significant variation in relative density and stiffness with depth. The regression analyses revealed that the inclusion of a separate overburden term in the regression of Vs and N60 produces improved fits, as opposed to the stress-corrected equations, for which the R2 of the regression is notably lower. It is the correction of Vs and N60 to Vs1 and (N1)60 with the empirical constants 'n' and 'm' prior to regression that introduces bias with respect to overburden pressure. When comparing settlement prediction methods, both Stroud's method (considering strain level indirectly) and the small-strain stiffness method predict higher stiffnesses for medium dense and dense profiles than Webb's method, which takes no account of strain level in the determination of soil stiffness; Webb's method appears suitable for loose sands only. The Versak software appears to underestimate differences in settlement between square and strip footings of similar width.
In conclusion, settlement analysis using small-strain stiffness data from the proposed Vs-N60 model for Cape Flats sands provides a way to take account of the non-linear stress-strain behaviour of the sands when calculating settlement.
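The regression form described above can be sketched numerically. The coefficients below are fitted from synthetic data, not the study's values; the model form Vs = a·N60^b·(Pa/σv')^c with a separate overburden term is the assumption being illustrated, solved by ordinary least squares after taking logarithms:

```python
import numpy as np

# Illustrative form of the regression (a, b, c are fitted, NOT the values
# reported in the study): Vs = a * N60^b * (Pa/sigma_v')^c.
# Taking logs gives a linear model solvable by ordinary least squares.
def fit_vs_model(n60, sigma_v, vs, pa=100.0):
    """Fit log(Vs) = log(a) + b*log(N60) + c*log(Pa/sigma_v)."""
    X = np.column_stack([
        np.ones_like(n60),     # intercept -> log(a)
        np.log(n60),           # exponent b on blow count
        np.log(pa / sigma_v),  # exponent c on the separate overburden term
    ])
    coef, *_ = np.linalg.lstsq(X, np.log(vs), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2]

def predict_vs(n60, sigma_v, a, b, c, pa=100.0):
    """Predicted shear wave velocity for a given blow count and stress."""
    return a * n60**b * (pa / sigma_v)**c
```

Because the model is log-linear, a plain least-squares solve recovers the exponents directly; carrying Pa/σv' as its own regressor is what distinguishes this fit from regressing stress-corrected Vs1 against (N1)60 with fixed constants 'n' and 'm'.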

Keywords: sands, settlement prediction, continuous surface wave test, small-strain stiffness, shear wave velocity, penetration resistance

10 Climate Change Threats to UNESCO-Designated World Heritage Sites: Empirical Evidence from Konso Cultural Landscape, Ethiopia

Authors: Yimer Mohammed Assen, Abiyot Legesse Kura, Engida Esyas Dube, Asebe Regassa Debelo, Girma Kelboro Mensuro, Lete Bekele Gure

Abstract:

Climate change has recently posed severe threats to many cultural landscapes of UNESCO World Heritage sites. The UNESCO State of Conservation (SOC) reports categorized flooding, temperature increases, and drought as threats to cultural landscapes. This study aimed to examine variations and trends in rainfall and temperature extreme events and their threats to the UNESCO-designated Konso Cultural Landscape in southern Ethiopia. The study used dense merged satellite-gauge station rainfall data (1981-2020) with a spatial resolution of 4 km by 4 km and observed maximum and minimum temperature data (1987-2020). Qualitative data were also gathered from cultural leaders, local administrators, and religious leaders using structured interview checklists. The spatial patterns, coefficients of variation, standardized anomalies, trends, and magnitudes of change of rainfall and temperature extreme events at both annual and seasonal levels were computed using the Mann-Kendall trend test and Sen's slope estimator in the CDT package. The standardized precipitation index (SPI) was also used to calculate drought severity, frequency, and trend maps. The data gathered from key informant interviews and focus group discussions were coded and analyzed thematically to complement the statistical findings. Thematic areas that explain the impacts of extreme events on the cultural landscape were chosen for coding. The thematic analysis was conducted using NVivo software. The findings revealed that rainfall was highly variable and unpredictable, resulting in extreme droughts and floods. There were significant (P<0.05) increasing trends in heavy rainfall (R10mm and R20mm) and the total amount of rain on wet days (PRCPTOT), which might have resulted in flooding.
The study also confirmed that the absolute temperature extreme indices (TXx, TXn, and TNx) and the percentile-based temperature extreme indices (TX90p, TN90p, TX10p, and TN10p) showed significant (P<0.05) increasing trends, which are signals of warming in the study area. The results revealed that both the frequency and the severity of drought at the 3-month timescale (katana/hageya seasons) were more pronounced than at the 12-month (annual) timescale. The highest number of droughts in 100 years is projected at the 3-month timescale across the study area. The findings also showed that frequent drought has led to the loss of grasses used for making traditional individual houses and multipurpose communal houses (pafta), food insecurity, migration, loss of biodiversity, and the commodification of stones from terraces. On the other hand, the increasing trends in rainfall extreme indices resulted in the destruction of terraces, soil erosion, loss of life, and damage to property. The study shows that a persistent decline in farmland productivity, due to erratic and extreme rainfall and frequent drought occurrences, has forced the local people to take up non-farm activities and retreat from the daily preservation and management of their landscape. Overall, the increasing rainfall and temperature extremes, coupled with the prevalence of drought, are thought to affect the sustainability of the cultural landscape by disrupting ecosystem services and the livelihood of the community. Therefore, more localized adaptation and mitigation strategies to the changing climate are needed to maintain the sustainability of the Konso cultural landscape as a global cultural treasure and to strengthen the resilience of smallholder farmers.
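The trend statistics named above can be illustrated with a minimal sketch. The study used the Mann-Kendall test and Sen's slope estimator from the CDT package; the functions below are a simplified stand-alone version (no tie correction or significance computation) showing what the two statistics measure:

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences.
    S > 0 suggests an increasing trend, S < 0 a decreasing one."""
    x = np.asarray(x, dtype=float)
    s = 0.0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()  # compare x[i] with all later values
    return int(s)

def sens_slope(x):
    """Sen's slope: median of slopes over all index pairs, a robust
    (outlier-resistant) estimate of the trend magnitude per time step."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return float(np.median(slopes))
```

Applied to an annual extreme-index series, a large positive S with a positive Sen's slope is the "significant increasing trend" pattern the abstract reports.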

Keywords: adaptation, cultural landscape, drought, extremes indices

9 Improving Patient Journey in the Obstetrics and Gynecology Emergency Department: A Comprehensive Analysis of Patient Experience

Authors: Lolwa Alansari, Abdelhamid Azhaghdani, Sufia Athar, Hanen Mrabet, Annaliza Cruz, Tamara Alshadafat, Almunzer Zakaria

Abstract:

Introduction: Improving the patient experience is a fundamental pillar of healthcare's Quadruple Aim. Recognizing the importance of patient experiences and perceptions in healthcare interactions is pivotal for driving quality improvement. This abstract centers on the Patient Experience Program, an endeavor designed to understand and elevate the experiences of patients in the Obstetrics & Gynecology Emergency Department (OB/GYN ED). Methodology: This comprehensive endeavor unfolded through a structured sequence of phases following the Plan-Do-Study-Act (PDSA) model, spanning 12 months, focused on enhancing patient experiences in the OB/GYN ED. The study examined the journeys of patients with acute obstetric and gynecological conditions, collecting data from over 100 participants monthly. The inclusive approach covered patients of all priority levels (1-5) admitted for acute conditions, with no exclusions. Historical data from March and April 2022 serve as a benchmark for comparison, strengthening causality claims by providing a baseline understanding of OB/GYN ED performance before the interventions. Additionally, the methodology includes staff engagement surveys to comprehensively understand the experiences of healthcare professionals with the implemented improvements. Data collection involved administering open-ended questions and comment sections to gather rich qualitative insights. The survey covered various aspects of the patient journey, including communication, emotional support, timely access to care, care coordination, and patient-centered decision-making.
The project's data analysis utilized a mixed-methods approach, combining qualitative techniques to identify recurring themes and extract actionable insights and quantitative methods to assess patient satisfaction scores and relevant metrics over time, facilitating the measurement of intervention impact and longitudinal tracking of changes. From the themes we discovered in both the online and in-person patient experience surveys, several key findings emerged that guided us in initiating improvements, including effective communication and information sharing, providing emotional support and empathy, ensuring timely access to care, fostering care coordination and continuity, and promoting patient-centered decision-making. Results: The project yielded substantial positive outcomes, significantly improving patient experiences in the OB/GYN ED. Patient satisfaction levels rose from 62% to a consistent 98%, with notable improvements in satisfaction with care plan information and physician care. Waiting time satisfaction increased from 68% to a steady 97%. The project positively impacted nurses' and midwives' job satisfaction, increasing from 64% to an impressive 94%. Operational metrics displayed positive trends, including a decrease in the "left without being seen" rate from 3% to 1%, the discharge against medical advice rate dropping from 8% to 1%, and the absconded rate reducing from 3% to 0%. These outcomes underscore the project's effectiveness in enhancing both patient and staff experiences in the healthcare setting. Conclusion: The use of a patient experience questionnaire has been substantiated by evidence-based research as an effective tool for improving the patient experience, guiding interventions, and enhancing overall healthcare quality in the OB/GYN ED. The project's interventions have resulted in a more efficient allocation of resources, reduced hospital stays, and minimized unnecessary resource utilization. 
This, in turn, contributes to cost savings for the healthcare facility.

Keywords: patient experience, patient survey, person centered care, quality initiatives

8 Full Characterization of Heterogeneous Antibody Samples under Denaturing and Native Conditions on a Hybrid Quadrupole-Orbitrap Mass Spectrometer

Authors: Rowan Moore, Kai Scheffler, Eugen Damoc, Jennifer Sutton, Aaron Bailey, Stephane Houel, Simon Cubbon, Jonathan Josephs

Abstract:

Purpose: MS analysis of monoclonal antibodies (mAbs) at the protein and peptide levels is critical during the development and production of biopharmaceuticals. The compositions of current-generation therapeutic proteins are often complex due to various modifications which may affect efficacy. Intact proteins analyzed by MS are detected in higher charge states, which also adds complexity to the mass spectra. Protein analysis in native or native-like conditions, with zero or minimal organic solvent and neutral or weakly acidic pH, decreases the charge states, resulting in mAb detection at higher m/z ranges with more spatial resolution. Methods: Three commercially available mAbs were used for all experiments. Intact proteins were desalted online using size exclusion chromatography (SEC) or reversed-phase chromatography coupled on-line with a mass spectrometer. For streamlined use of the LC-MS platform, we used a single SEC column and alternately selected specific mobile phases to perform separations in either denaturing or native-like conditions: buffer A (20% ACN, 0.1% FA) with buffer B (100 mM ammonium acetate). For peptide analysis, mAbs were proteolytically digested with and without prior reduction and alkylation. The mass spectrometer used for all experiments was a commercially available Thermo Scientific™ hybrid Quadrupole-Orbitrap™ mass spectrometer equipped with the new BioPharma option, which includes a new High Mass Range (HMR) mode that allows for improved high-mass transmission and mass detection up to 8000 m/z. Results: We have analyzed the profiles of three mAbs under denaturing and native conditions by direct infusion with offline desalting and with on-line desalting via size exclusion and reversed-phase columns. The presence of high salt under denaturing conditions was found to influence the observed charge state envelope and impact mass accuracy after spectral deconvolution.
The significantly lower charge states observed under native conditions improve the spatial resolution of protein signals and have significant benefits for the analysis of antibody mixtures, e.g., lysine variants, degradants, or sequence variants. This type of analysis requires the detection of masses up to 6000 m/z, beyond the standard mass range, and thus the extended capabilities available in the new HMR mode. We have compared each antibody sample analyzed individually with mixtures in various relative concentrations. For this type of analysis, we observed that apparent native structures persist and that ESI benefits from the addition of low amounts of acetonitrile and formic acid in combination with the ammonium acetate-buffered mobile phase. For analyses at the peptide level, we analyzed reduced/alkylated and non-reduced proteolytic digests of the individual antibodies separated via reversed-phase chromatography, aiming to retrieve as much information as possible regarding sequence coverage, disulfide bridges, post-translational modifications such as various glycans, sequence variants, and their relative quantification. All data acquired were submitted to a single software package for analysis, aiming to obtain a complete picture of the molecules analyzed. Here we demonstrate the capabilities of the mass spectrometer to fully characterize homogeneous and heterogeneous therapeutic proteins on one single platform. Conclusion: Full characterization of heterogeneous intact protein mixtures by improved mass separation on a quadrupole-Orbitrap™ mass spectrometer with extended capabilities has been demonstrated.
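The link between charge state and observed m/z, which drives the need for the extended mass range, can be made concrete with a small calculation. The ~148 kDa mass and the example charge states below are hypothetical, typical-of-IgG values, not measurements from this work:

```python
PROTON_MASS = 1.007276  # Da, mass of a proton

def mz(neutral_mass_da, charge):
    """m/z of a positive ion [M + zH]^z+ formed by protonation in ESI."""
    return (neutral_mass_da + charge * PROTON_MASS) / charge

# Hypothetical intact IgG of ~148 kDa:
mab_mass = 148_000.0
denatured = mz(mab_mass, 50)  # denaturing ESI: high charge -> low m/z
native = mz(mab_mass, 25)     # native ESI: low charge -> high m/z
```

At z = 50 the ion appears near 2960 m/z, inside a standard detection range; at z = 25 it moves near 5920 m/z, which is why native-mode analysis of mAbs and their mixtures needs transmission and detection up to 6000-8000 m/z.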

Keywords: disulfide bond analysis, intact analysis, native analysis, mass spectrometry, monoclonal antibodies, peptide mapping, post-translational modifications, sequence variants, size exclusion chromatography, therapeutic protein analysis, UHPLC

7 Development Programmes Requirements for Managing and Supporting the Ever-Dynamic Job Roles of Middle Managers in Higher Education Institutions: The Espousal Demanded from Human Resources Department; Case Studies of a New University in United Kingdom

Authors: Mohamed Sameer Mughal, Andrew D. Ross, Damian J. Fearon

Abstract:

Background: The fast-changing landscape of UK Higher Education Institutions (HEIs) presents changes and challenges that affect Middle Managers (MM) in their job roles. MM contribute to the success of HEIs by maintaining the organizational equilibrium, translating strategies from senior staff into operational directives for junior staff. However, the data analyzed from the semi-structured interviews in this study show that the MM job role is becoming more complex due to changes and challenges that create colossal pressures and workloads in day-to-day working. Current development programme provision by Human Resources (HR) departments in such HEIs is neither feasible nor applicable and does not match the true requirements of MM, who suggest that programmes offered by HR are too generic to suit their precise needs and that tailor-made support is required for them to work effectively in their pertinent job roles. Methodology: This study aims to capture the Development Needs (DN) of MM by means of a conceptual model as the conclusive part of the research, which is divided into two phases. Phase 1 was initiated by carrying out two pilot interviews with a retired professor of Emeritus status and an HR programmes development coordinator. Key themes from the pilots and the literature review fed into the formulation of a set of 22 questions (Kvale and Brinkmann) in the form of an interview questionnaire for qualitative data collection. The data collection strategy consisted of purposeful sampling with 12 semi-structured interviews (n=12), each lasting approximately an hour. The MM interviewed were at faculty and departmental levels and included deans (n=2), heads of departments (n=4), subject leaders (n=2), and programme leaders (n=4). Participant recruitment was carried out via emails and a snowballing technique. The interview data were transcribed (verbatim) and managed using Computer Assisted Qualitative Data Analysis with NVivo ver. 11 software.
The data were meticulously analyzed using Miles and Huberman's inductive approach of positivistic-style grounded theory, whereby key themes and categories emerged from the rich data collected. The data were precisely coded and classified into case studies (Robert Yin), with a main case study, sub-cases (four classes of MM), and embedded cases (12 individual MM). Major Findings: An interim conceptual model emerged from the data analysis, with main concepts that included key performance indicators (KPIs), HEI effectiveness and outlook, practices, processes and procedures, support mechanisms, student events, rules, regulations and policies, career progression, reporting/accountability, changes and challenges, and skills and attributes. Conclusion: The dynamic elements affecting MM include increasing government pressures and student numbers, irrelevant development programmes, bureaucratic structures, transparency and accountability, organization policies, and skill sets. These can only be confronted through structured development programmes devised by HR rather than provided generically. Future Work: Phase 2 (quantitative method) of the study plans to validate the interim conceptual model externally through a fully completed online survey questionnaire (Bram Oppenheim) distributed to external HEIs (n=150). The total sample targeted is 1500 MM. The author's contribution focuses on enhancing management theory and narrowing the gap between HR development programme provision and MM needs.

Keywords: development needs (DN), higher education institutions (HEIs), human resources (HR), middle managers (MM)

6 A Spatial Repetitive Controller Applied to an Aeroelastic Model for Wind Turbines

Authors: Riccardo Fratini, Riccardo Santini, Jacopo Serafini, Massimo Gennaretti, Stefano Panzieri

Abstract:

This paper presents a nonlinear differential model for a three-bladed horizontal axis wind turbine (HAWT) suited for control applications. It is based on an 8-DOF, lumped-parameter structural dynamics coupled with a quasi-steady sectional aerodynamics. In particular, using the Euler-Lagrange equation (energetic variation approach), the authors derive, and successively validate, such a model. For the derivation of the aerodynamic model, Greenberg's theory, an extension of the theory proposed by Theodorsen to the case of thin airfoils undergoing pulsating flows, is used. Specifically, in this work, the authors restricted that theory under the hypothesis of low perturbation reduced frequency k, which causes the lift deficiency function C(k) to be real and equal to 1. Furthermore, the expressions of the aerodynamic loads are obtained using the quasi-steady strip theory (Hodges and Ormiston), as a function of the chordwise and normal components of the relative velocity between flow and airfoil, Ut and Up, their derivatives, and the section angular velocity ε˙. For the validation of the proposed model, the authors carried out open- and closed-loop simulations of a 5 MW HAWT, characterized by radius R = 61.5 m and mean chord c = 3 m, with a nominal angular velocity Ωn = 1.266 rad/s. The first analysis performed is the steady-state solution, where a uniform wind Vw = 11.4 m/s is considered and a collective pitch angle θ = 0.88° is imposed. During this step, the authors noticed that the proposed model is intrinsically periodic due to the effect of the wind and of the gravitational force. In order to reject this periodic trend in the model dynamics, the authors propose a collective repetitive control algorithm coupled with a PD controller.
In particular, when the reference command to be tracked and/or the disturbance to be rejected are periodic signals with a fixed period, repetitive control strategies can be applied due to their high precision, simple implementation, and little performance dependency on system parameters. The functional scheme of a repetitive controller is quite simple: given a periodic reference command, it is composed of a control block Crc(s), usually added to an existing feedback control system. The control block contains a free time-delay system e^(−τs) in a positive feedback loop and a low-pass filter q(s). It should be noticed that, while the time-delay term reduces the stability margin, the low-pass filter is added to ensure stability. It is worth noting that, in this work, the authors propose a phase shifting for the controller, and the delay system has been modified as e^(−(T−γk)s), where T is the period of the signal and γk is a phase shift of k samples of the same periodic signal. The phase-shifting technique is particularly useful in non-minimum phase systems, such as flexible structures, since it allows the iterative algorithm to reach convergence also at high frequencies. Notice that, in our case study, the shifting of k samples depends both on the rotor angular velocity Ω and on the rotor azimuth angle Ψ: we refer to this controller as a spatial repetitive controller. The collective repetitive controller has also been coupled with a PD controller, C(s) = PD(s), in order to dampen oscillations of the blades. The performance of the spatial repetitive controller is compared with that of an industrial PI controller. In particular, starting from a wind speed Vw = 11.4 m/s, the controller is asked to maintain the nominal angular velocity Ωn = 1.266 rad/s after an instantaneous increase of wind speed (Vw = 15 m/s).
Then, a purely periodic external disturbance is introduced in order to stress the capabilities of the repetitive controller. The results of the simulations show that, unlike a simple PI controller, the spatial repetitive-PD controller has the capability to reject both external disturbances and the periodic trend in the model dynamics. Finally, the nominal value of the angular velocity is reached, in accordance with results obtained with commercial software for a turbine of the same type.
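A discrete-time sketch of the scheme described above (periodic memory in positive feedback, a low-pass filter q for stability, a phase shift of k samples, plus a PD term) might look as follows. The gains, the first-order form of q, and the sample-based delay line are illustrative choices, not the authors' implementation:

```python
import numpy as np

class SpatialRepetitiveController:
    """Minimal sketch: a period-N delay line in positive feedback (the
    repetitive term), a first-order low-pass q for stability, and a PD term.
    N plays the role of the signal period T; 'shift' mimics the gamma_k
    phase shift described in the text. All gains are illustrative."""

    def __init__(self, period_samples, shift=0, kp=1.0, kd=0.1, krc=0.5, q=0.8):
        self.n = period_samples
        self.shift = shift
        self.kp, self.kd, self.krc, self.q = kp, kd, krc, q
        self.buf = np.zeros(period_samples)  # one period of memory
        self.idx = 0
        self.prev_err = 0.0
        self.lp = 0.0

    def update(self, err, dt):
        # low-pass filtered error fed into the periodic memory
        self.lp = self.q * self.lp + (1.0 - self.q) * err
        j = (self.idx - self.n + self.shift) % self.n   # shifted read-back
        rc = self.buf[j] + self.lp                      # positive-feedback accumulation
        self.buf[self.idx % self.n] = rc
        self.idx += 1
        d_err = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.kd * d_err + self.krc * rc
```

Fed the same periodic error over several periods, the delay-line memory accumulates it, so the corrective action grows until the periodic component is cancelled in closed loop; shifting the read-back index mimics the γk phase advance that helps convergence on non-minimum phase plants.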

Keywords: wind turbines, aeroelasticity, repetitive control, periodic systems

5 Translation of Self-Inject Contraception Training Objectives Into Service Performance Outcomes

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Simeon Christian Chukwu, Fidelis Edet, Anthony Nwala, Mopelola Raji

Abstract:

Background: Health service providers are periodically offered in-service training to strengthen their ability to deliver services that are ethical, quality, timely, and safe. Not all capacity-building courses have successfully resulted in the intended service delivery outcomes, because of poor training content, design, approach, and ambiance. The Delivering Innovations in Selfcare (DISC) project developed the Moment of Truth innovation, a proven training model focused on improving consumer/provider interaction that leads to an increase in the voluntary uptake of subcutaneous depot medroxyprogesterone acetate (DMPA-SC) self-injection among women who opt for injectable contraception. Methodology: Six months after training on the Moment of Truth (MoT) training manual, the project conducted two intensive rounds of qualitative data collection and triangulation that included provider, client, and community mobilizer interviews, facility observations, and routine program data collection. Respondents were sampled using a convenience sampling approach, and the data collected were analyzed using a codebook and ATLAS.ti. Providers and clients were interviewed to understand their experience, perspective, attitude, and awareness of DMPA-SC self-injection. Data were collected from 12 health facilities in three states – eight directly trained and four cascade-trained. The research team members came together for a participatory analysis workshop to explore and interpret emergent themes. Findings: Quality of service delivery and performance outcomes were observed to be significantly better in facilities whose providers were trained directly by the DISC project than in sites that received indirect training through master trainers. Directly trained facilities recorded self-injection proportions more than twice those of cascade-trained sites.
Direct training comprised full-day, standalone didactic and interactive sessions constructed to evoke commitment, passion, and conviction, as well as to eliminate provider bias and misconceptions, utilizing human interest stories and values clarification exercises. Sessions also built compelling arguments using evidence and national guidelines. The training prioritized demonstration sessions, utilized job aids (particularly videos), strengthened empathetic counseling – allaying client fears and concerns about self-injection – and trained providers to position self-injection first and to manage side effects. Role plays and practicums were particularly useful in enabling providers to retain and internalize new knowledge. These sessions provided experiential learning and the opportunity to apply one's expertise in a supervised environment where supportive feedback is provided in real time. Cascade training was often a shorter, abridged form of the MoT training that leveraged existing training already planned by master trainers. This training was held over a four-hour period and was less emotive, focusing more on foundational DMPA-SC knowledge, such as a reorientation to DMPA-SC, comparison of DMPA-SC variants, the counseling framework and skills, and data reporting and commodity tracking/requisition – with no facility practicums. Training on self-injection was not as robust, presumably because it was not directed at methods in the contraceptive mix that align with state/organizational objectives – in this instance, fostering LARC services. Conclusion: To achieve better performance outcomes, consideration should be given to providing training that prioritizes practice-based and emotive content. Furthermore, a firm understanding of, and conviction about, the value training offers improves motivation and commitment to accomplish and surpass service-related performance outcomes.

Keywords: training, performance outcomes, innovation, family planning, contraception, DMPA-SC, self-care, self-injection

4 Open Science Philosophy, Research and Innovation

Authors: C. Ardil

Abstract:

Open Science translates the understanding and application of various theories and practices in open science philosophy, systems, paradigms and epistemology. Open Science originates with the premise that universal scientific knowledge is a product of a collective scholarly and social collaboration involving all stakeholders and knowledge belongs to the global society. Scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science has the potential to increase the quality, impact and benefits of science and to accelerate advancement of knowledge by making it more reliable, more efficient and accurate, better understandable by society and responsive to societal challenges, and has the potential to enable growth and innovation through reuse of scientific results by all stakeholders at all levels of society, and ultimately contribute to growth and competitiveness of global society. Open Science is a global movement to improve accessibility to and reusability of research practices and outputs. In its broadest definition, it encompasses open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. The implementation of open science provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole. Open Science is the practice of science in such a way that others can collaborate and contribute, where research data, lab notes and other research processes are freely available, under terms that enable reuse, redistribution and reproduction of the research and its underlying data and methods. 
Open Science represents a novel systematic approach to the scientific process, shifting from the standard practices of publishing research results in scientific publications towards sharing and using all available knowledge at an earlier stage in the research process, based on cooperative work and diffusing scholarly knowledge with no barriers and restrictions. Open Science refers to efforts to make the primary outputs of publicly funded research results (publications and the research data) publicly accessible in digital format with no limitations. Open Science is about extending the principles of openness to the whole research cycle, fostering, sharing and collaboration as early as possible, thus entailing a systemic change to the way science and research is done. Open Science is the ongoing transition in how open research is carried out, disseminated, deployed, and transformed to make scholarly research more open, global, collaborative, creative and closer to society. Open Science involves various movements aiming to remove the barriers for sharing any kind of output, resources, methods or tools, at any stage of the research process. Open Science embraces open access to publications, research data, source software, collaboration, peer review, notebooks, educational resources, monographs, citizen science, or research crowdfunding. The recognition and adoption of open science practices, including open science policies that increase open access to scientific literature and encourage data and code sharing, is increasing in the open science philosophy. Revolutionary open science policies are motivated by ethical, moral or utilitarian arguments, such as the right to access digital research literature for open source research or science data accumulation, research indicators, transparency in the field of academic practice, and reproducibility. Open science philosophy is adopted primarily to demonstrate the benefits of open science practices. 
Researchers use open science applications to their own advantage in order to obtain more job offers, increase citations, and attract media attention, potential collaborators, career opportunities, donations, and funding. In the open science philosophy, open data findings are evidence that open science practices provide significant benefits to researchers in scientific research creation, collaboration, communication, and evaluation compared with more traditional closed science practices. Open science also raises concerns, such as the rigor of peer review, practical matters such as financing and career development, and the sacrifice of author rights. Therefore, researchers are recommended to implement open science research within the framework of existing academic evaluation and incentives. As a result, open science research issues are addressed in the areas of publishing, financing, collaboration, resource management and sharing, career development, and the discussion of open science questions and conclusions.

Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data

3 Improvement in the Photocatalytic Activity of Nanostructured Manganese Ferrite – Type of Materials by Mechanochemical Activation

Authors: Katerina Zaharieva, Katya Milenova, Zara Cherkezova-Zheleva, Alexander Eliyas, Boris Kunev, Ivan Mitov

Abstract:

The synthesized nanosized manganese ferrite-type samples have been tested as photocatalysts in the reaction of oxidative degradation of the model contaminant Reactive Black 5 (RB5) dye in aqueous solutions under UV irradiation. As is known, this azo dye is applied in the textile-coloring industry and is discharged into waterways, causing pollution. The co-precipitation procedure was used for the synthesis of the manganese ferrite-type materials: Sample 1 - Mn0.25Fe2.75O4, Sample 2 - Mn0.5Fe2.5O4, and Sample 3 - MnFe2O4, from 0.03 M aqueous solutions of MnCl2•4H2O, FeCl2•4H2O and/or FeCl3•6H2O and 0.3 M NaOH in appropriate amounts. The mechanochemical activation of the co-precipitated ferrite-type samples was performed in an argon (Samples 1 and 2) or air atmosphere (Sample 3) for 2 hours at a milling speed of 500 rpm. The mechanochemical treatment was carried out in a high-energy planetary ball mill (type PM 100, Retsch, Germany). The mass ratio between balls and powder was 30:1. As a result, the mechanochemically activated Sample 4 - Mn0.25Fe2.75O4, Sample 5 - Mn0.5Fe2.5O4, and Sample 6 - MnFe2O4 were obtained. The synthesized manganese ferrite-type photocatalysts were characterized by the X-ray diffraction method and Moessbauer spectroscopy. The registered X-ray diffraction patterns and Moessbauer spectra of the co-precipitated ferrite-type materials show the presence of manganese ferrite and an additional akaganeite phase. The presence of manganese ferrite and small amounts of iron phases is established in the mechanochemically treated samples. The calculated average crystallite size of the manganese ferrites varies within the range 7–13 nm. This result is confirmed by the Moessbauer study. The registered spectra show superparamagnetic behavior of the prepared materials at room temperature. The photocatalytic investigations were made using a polychromatic UV-A lamp (Sylvania BLB, 18 W) with an emission maximum at 365 nm.
The intensity of the light irradiation upon the manganese ferrite photocatalysts was 0.66 mW·cm-2. The photocatalytic oxidative degradation of RB5 dye was carried out in a semi-batch slurry photocatalytic reactor containing 0.15 g of ferrite powder and 150 ml of 20 ppm dye aqueous solution, under magnetic stirring at 400 rpm and a continuous air flow. The suspensions were kept in the dark for 30 min to reach adsorption-desorption equilibrium, after which the UV light was turned on. At regular time intervals, aliquots of the suspension were taken out and centrifuged to separate the powder from the solution. The residual dye concentrations were determined with a single-beam UV-Vis spectrophotometer (CamSpec M501, UK) measuring in the wavelength region from 190 to 800 nm. The apparent pseudo-first-order rate constants, calculated from the linear slopes of fits to the first-order kinetic equation, increase in the following order: Sample 3 (1.1×10-3 min-1) < Sample 1 (2.2×10-3 min-1) < Sample 2 (3.3×10-3 min-1) < Sample 4 (3.8×10-3 min-1) < Sample 6 (11×10-3 min-1) < Sample 5 (15.2×10-3 min-1). The mechanochemically activated samples show a significantly higher degree of oxidative degradation of RB5 dye after 120 minutes of UV illumination than the co-precipitated samples: Sample 5 (92%) > Sample 6 (91%) > Sample 4 (63%) > Sample 2 (53%) > Sample 1 (42%) > Sample 3 (15%). Summarizing these results, we conclude that under our experimental conditions mechanochemical activation leads to a significant enhancement of both the degree of oxidative degradation of the RB5 dye and the photocatalytic activity of the tested manganese ferrite catalysts.
The mechanochemically activated Mn0.5Fe2.5O4 material displays the highest photocatalytic activity (15.2×10-3 min-1) and degree of oxidative degradation of the RB5 dye (92%) among the synthesized samples. A particularly large improvement is observed for the mechanochemically treated MnFe2O4 sample, which has the highest extent of substitution of iron ions by manganese ions: its degradation degree (91%) far exceeds that of the co-precipitated MnFe2O4 sample (15%). The mechanochemically activated manganese ferrite samples thus show good photocatalytic properties in the oxidative degradation of the RB5 azo dye in aqueous solution and could find application in dye removal from textile-industry wastewaters.
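As a rough illustration of how such apparent pseudo-first-order rate constants can be extracted from the measured residual concentrations, the sketch below fits ln(C0/C) versus time with a through-origin least-squares slope. The data points, function name and numbers are illustrative stand-ins, not the authors' actual measurements:

```python
import math

def pseudo_first_order_k(times_min, concs, c0):
    """Through-origin least-squares slope of ln(c0/C) versus t,
    i.e. the apparent pseudo-first-order rate constant in min^-1."""
    ys = [math.log(c0 / c) for c in concs]
    num = sum(t * y for t, y in zip(times_min, ys))
    den = sum(t * t for t in times_min)
    return num / den

# Illustrative decay data generated for k = 3.3e-3 min^-1 (the order of
# magnitude reported for Sample 2) and a 20 ppm initial dye concentration.
times = [0, 15, 30, 60, 90, 120]
concs = [20.0 * math.exp(-3.3e-3 * t) for t in times]
k_fit = pseudo_first_order_k(times, concs, 20.0)
```

With real spectrophotometer data, the quality of the linear fit itself indicates how well the first-order kinetic assumption holds.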

Keywords: nanostructured manganese ferrite-type materials, photocatalytic activity, Reactive Black 5, water treatment

Procedia PDF Downloads 340
2 A Comprehensive Study of Spread Models of Wildland Fires

Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran

Abstract:

These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions for ecosystems, communities, and the environment. They lead to habitat destruction and biodiversity loss and cause soil erosion, and they contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses; the cost of firefighting efforts, combined with the direct effects on forestry and agriculture, creates significant financial difficulties for the areas affected. This research explores different forest fire spread models and presents a comprehensive review of the techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of fire spread, taking into consideration factors such as weather patterns, topography, fuel types, and environmental conditions, and they assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research examines the approaches, assumptions, and findings of the various models. Using a comparative approach, it provides a critical analysis that identifies patterns, strengths, and weaknesses among the models. The purpose of the survey is to advance wildfire research and management techniques: by synthesizing established knowledge, it offers useful insights to decision-makers, researchers, and practitioners.
Fire spread models provide insights into potential fire behavior, enabling authorities to make informed decisions about evacuation activities, resource allocation for firefighting, and preventive actions. They are also useful in post-wildfire mitigation, helping to assess a fire's severity, identify high-risk regions for post-fire hazards, and forecast soil erosion trends. The analysis highlights the importance of modeling approaches tailored to different circumstances and advances our understanding of how forest fires spread. Well-known models in this field include Rothermel's wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, and cellular automata models, among others. The key characteristics these models consider include weather (e.g., wind speed and direction), topography (e.g., landscape elevation), and fuel availability (e.g., vegetation types), among other factors. The models discussed are physics-based, data-driven, or hybrid, some employing machine learning techniques such as attention-based neural networks to enhance performance. By promoting the development of more precise prediction tools and effective management techniques, this work aims to lessen the destructive effects of forest fires. The survey also addresses the practical needs of numerous stakeholders: access to enhanced early warning systems enables decision-makers to take prompt action, and emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
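Of the model families listed above, the cellular automata approach is the simplest to illustrate. The toy sketch below is an assumption-laden minimal version, not any specific published model: each cell is fuel, burning, or burnt, and a burning cell ignites each 4-neighbour fuel cell with a tunable probability (wind and topography effects are omitted for brevity):

```python
import random

EMPTY, FUEL, BURNING, BURNT = 0, 1, 2, 3

def step(grid, p_spread, rng):
    """Advance the automaton one time step: every burning cell tries to
    ignite its 4-neighbour fuel cells with probability p_spread, then
    burns out."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                new[r][c] = BURNT
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < rows and 0 <= cc < cols
                            and grid[rr][cc] == FUEL
                            and rng.random() < p_spread):
                        new[rr][cc] = BURNING
    return new

# Ignite the centre of a uniform 20x20 fuel bed and run 60 steps.
rng = random.Random(0)
n = 20
grid = [[FUEL] * n for _ in range(n)]
grid[n // 2][n // 2] = BURNING
for _ in range(60):
    grid = step(grid, 1.0, rng)
burnt = sum(row.count(BURNT) for row in grid)
```

Real cellular automata fire models replace the single `p_spread` with per-cell probabilities derived from fuel type, slope, and wind, which is where the weather and topography factors discussed above enter.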

Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling

Procedia PDF Downloads 72
1 Numerical Simulation of Von Karman Swirling Bioconvection Nanofluid Flow from a Deformable Rotating Disk

Authors: Ali Kadir, S. R. Mishra, M. Shamshuddin, O. Anwar Beg

Abstract:

Motivation: Rotating disk bioreactors are fundamental to numerous medical/biochemical engineering processes, including oxygen transfer, chromatography, purification, and swirl-assisted pumping. The modern upsurge in biologically enhanced engineering devices has embraced new phenomena, including bioconvection of micro-organisms (phototactic, oxytactic, gyrotactic, etc.). The proven thermal superiority of nanofluids, i.e., base fluids doped with engineered nanoparticles, has also stimulated wide implementation in biomedical designs. Motivated by these emerging applications, we present a numerical thermofluid-dynamic simulation of the transport phenomena in bioconvection nanofluid rotating disk bioreactor flow. Methodology: We study analytically and computationally the time-dependent, three-dimensional, viscous gyrotactic bioconvection in swirling nanofluid flow from a rotating disk configuration. The disk is also deformable, i.e., able to extend (stretch) in the radial direction, and Stefan blowing is included. The Buongiorno dilute nanofluid model is adopted, wherein Brownian motion and thermophoresis are the dominant nanoscale effects. The primitive conservation equations for mass, radial, tangential and axial momentum, heat (energy), nanoparticle concentration, and micro-organism density function are formulated in a cylindrical polar coordinate system with appropriate wall and free-stream boundary conditions; a mass convective condition is also incorporated at the disk surface. Forced convection is considered, i.e., buoyancy forces are neglected. This highly nonlinear, strongly coupled system of unsteady partial differential equations is normalized with the classical Von Karman and other transformations to render the boundary value problem (BVP) into an ordinary differential system, which is solved with the efficient Adomian decomposition method (ADM). Validation against earlier Runge-Kutta shooting computations in the literature is also conducted.
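For reference, the classical Von Karman similarity transformation (in its standard steady form, without the additional nanoparticle, bioconvection, stretching and Stefan-blowing terms treated in this paper) writes the velocity field as u = rΩF(η), v = rΩG(η), w = √(νΩ)H(η) with η = z√(Ω/ν), reducing the Navier-Stokes equations to the ordinary differential system

```latex
\begin{aligned}
& 2F + H' = 0, \\
& F'' - H F' - F^{2} + G^{2} = 0, \\
& G'' - H G' - 2 F G = 0, \\
& F(0) = 0, \quad G(0) = 1, \quad H(0) = 0, \quad F(\infty) = 0, \quad G(\infty) = 0 .
\end{aligned}
```

The paper's formulation augments this system with energy, nanoparticle concentration and micro-organism density equations in similarity form.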
Extensive computations are presented (with the aid of MATLAB symbolic software) for the radial and circumferential velocity components, temperature, nanoparticle concentration, micro-organism density number, and the gradients of these functions at the disk surface (local radial skin friction, local circumferential skin friction, local Nusselt number, local Sherwood number, and motile micro-organism mass transfer rate). Main Findings: Increasing the radial stretching parameter decreases the radial velocity and radial skin friction, reduces the azimuthal velocity and skin friction, and decreases the local Nusselt number and motile micro-organism wall mass flux, whereas it increases the nanoparticle local Sherwood number. Disk deceleration accelerates the radial flow, damps the azimuthal flow, decreases temperatures and thermal boundary layer thickness, depletes the nanoparticle concentration magnitudes (and the associated nanoparticle species boundary layer thickness), and furthermore decreases the micro-organism density number and the gyrotactic micro-organism species boundary layer thickness. Increasing Stefan blowing accelerates the radial and azimuthal (circumferential) flow, elevates nanofluid temperatures, and boosts nanoparticle concentration (volume fraction) and gyrotactic micro-organism density number magnitudes, whereas suction generates the reverse effects. Increasing suction reduces the radial and azimuthal skin friction, the local Nusselt number, and the motile micro-organism wall mass flux, whereas it enhances the nanoparticle local Sherwood number. Conclusions: Important transport characteristics, of relevance to real bioreactor nanotechnological systems and not discussed in previous works, are identified. ADM achieves very rapid convergence and highly accurate solutions and shows excellent promise for simulating swirling multi-physical nano-bioconvection fluid dynamics problems.
Furthermore, it provides an excellent complement to more general commercial computational fluid dynamics simulations.
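The rapid convergence of ADM noted above can be illustrated on the simplest possible case. For a linear ODE such as y' = y, y(0) = 1, each Adomian component is the integral of the previous one, so the decomposition reproduces the Taylor series of e^t term by term. The sketch below is purely illustrative (far simpler than the coupled Von Karman system of the paper) and shows the partial sums reaching near machine precision within 20 components:

```python
import math

def adm_series(n_terms, t):
    """Adomian decomposition for y' = y, y(0) = 1.
    Each component u_k is the integral of u_{k-1} from 0 to t,
    so u_k(t) = t**k / k! and the partial sums converge to e**t."""
    u = 1.0            # u_0 comes from the initial condition y(0) = 1
    total = u
    for k in range(1, n_terms):
        u = u * t / k  # integrate the previous component from 0 to t
        total += u
    return total

approx = adm_series(20, 1.0)   # partial sum approximating y(1) = e
```

For nonlinear terms, ADM replaces this plain integration with Adomian polynomials, but the same pattern of rapidly shrinking components is what underlies the convergence claim.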

Keywords: bio-nanofluids, rotating disk bioreactors, Von Karman swirling flow, numerical solutions

Procedia PDF Downloads 151