Search results for: conventional oven
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3622

52 Enhancing Photocatalytic Activity of Oxygen Vacancies-Rich Tungsten Trioxide (WO₃) for Sustainable Energy Conversion and Water Purification

Authors: Satam Alotibi, Osama A. Hussein, Aziz H. Al-Shaibani, Nawaf A. Al-Aqeel, Abdellah Kaiba, Fatehia S. Alhakami, Mohammed Alyami, Talal F. Qahtan

Abstract:

The demand for sustainable and efficient energy conversion using solar energy has grown rapidly in recent years. In this pursuit, solar-to-chemical conversion has emerged as a promising approach, with oxygen vacancies-rich tungsten trioxide (WO₃) playing a crucial role. This study presents a method for synthesizing oxygen vacancies-rich WO₃, resulting in a significant enhancement of its photocatalytic activity and representing an important step towards sustainable energy solutions. Experimental results underscore the importance of oxygen vacancies in modifying the properties of WO₃. These vacancies introduce additional energy states within the material, narrowing the bandgap, increasing light absorption, and acting as electron traps that suppress charge recombination and the associated emission. Our focus lies in developing oxygen vacancies-rich WO₃, which demonstrates strong potential for improved photocatalytic applications. The effectiveness of oxygen vacancies-rich WO₃ in solar-to-chemical conversion was demonstrated through rigorous assessments of its photocatalytic degradation performance. Sunlight irradiation was employed to evaluate the material's effectiveness in degrading organic pollutants in wastewater. The results clearly demonstrate the superior photocatalytic performance of oxygen vacancies-rich WO₃ compared to conventional WO₃ nanomaterials, establishing its efficacy in sustainable and efficient energy conversion. Furthermore, the synthesized material is used to fabricate films, which are subsequently employed in immobilized WO₃ and oxygen vacancies-rich WO₃ reactors for water purification under natural sunlight irradiation. This application offers a sustainable and efficient solution for water treatment, harnessing solar energy for effective decontamination. In addition to investigating the photocatalytic capabilities, we extensively analyze the structural and chemical properties of the synthesized material. The synthesis process involves in situ thermal reduction of WO₃ nano-powder in a nitrogen environment, monitored using thermogravimetric analysis (TGA) to ensure precise control over the synthesis of oxygen vacancies-rich WO₃. Comprehensive characterization techniques such as UV-Vis spectroscopy, X-ray photoelectron spectroscopy (XPS), FTIR, Raman spectroscopy, scanning electron microscopy (SEM), transmission electron microscopy (TEM), and selected area electron diffraction (SAED) provide deep insights into the material's optical properties, chemical composition, elemental states, structure, surface properties, and crystallinity. This study represents a significant advancement in sustainable energy conversion through solar-to-chemical processes and water purification. By harnessing the unique properties of oxygen vacancies-rich WO₃, we not only enhance our understanding of energy conversion mechanisms but also pave the way for the development of highly efficient and environmentally friendly photocatalytic materials. The application of this material in water purification demonstrates its versatility and potential to address critical environmental challenges. These findings bring us closer to a sustainable energy future and cleaner water resources, laying a solid foundation for a more sustainable planet.
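
As background on how photocatalytic degradation performance of the kind reported above is commonly quantified, the sketch below fits an apparent pseudo-first-order rate constant to concentration data; the time points and C/C0 values are hypothetical placeholders, not measurements from this study.

```python
# Illustrative sketch (not the study's data): photocatalytic degradation results are
# commonly analysed with a pseudo-first-order model, ln(C0/C) = k_app * t, where
# k_app is the apparent rate constant. Hypothetical C/C0 values are used below.
import numpy as np

t = np.array([0, 30, 60, 90, 120, 150])                   # irradiation time, min (hypothetical)
c_ratio = np.array([1.0, 0.72, 0.53, 0.38, 0.27, 0.20])   # C/C0 (hypothetical)

# Linear fit of ln(C0/C) versus t gives the apparent rate constant k_app.
y = -np.log(c_ratio)
k_app, intercept = np.polyfit(t, y, 1)
half_life = np.log(2) / k_app

print(f"k_app ~ {k_app:.4f} per min, half-life ~ {half_life:.1f} min")
```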

Keywords: sustainable energy conversion, solar-to-chemical conversion, oxygen vacancies-rich tungsten trioxide (WO₃), photocatalytic activity enhancement, water purification

Procedia PDF Downloads 32
51 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data

Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms could support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore the ability to distinguish between controls and patients using mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the revised McDonald criteria (Polman et al.) and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired on a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsfMRI), 200 volumes. Estimated total lesion load (ml) and number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR images. All rsfMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance and the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox using the Infomax approach with number of components = 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), with the R language. The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (RFE) for the SVM, to obtain a ranking of the most predictive variables. We then built two new classifiers using only the most important features and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and RFE-SVM was performed, the most important variable was the sensorimotor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensorimotor network I. Similar importance values were obtained for the sensorimotor II, cerebellum and working memory networks. These findings, in accordance with the early manifestation of motor/sensory deficits in MS, represent an encouraging step toward translation to clinical diagnosis and prognosis.
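
The described workflow (Gini-based RF importance and recursive feature elimination for the SVM, followed by retraining on the top-ranked network) can be sketched as follows; the original analysis was performed in R, so this scikit-learn version is only an illustrative stand-in, with random placeholder data in place of the 37x15 connectivity matrix.

```python
# Minimal sketch of the described pipeline, assuming a 37x15 matrix of mean network
# signals (X) and binary group labels (y); the study's own analysis was done in R.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(37, 15))            # placeholder for the 15 mean network signals
y = np.array([0] * 19 + [1] * 18)        # 19 controls, 18 early-MS patients

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, stratify=y, random_state=0)

# Random Forest with its intrinsic (Gini-based) feature importance.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
rf_rank = np.argsort(rf.feature_importances_)[::-1]

# Recursive feature elimination for the SVM (a linear kernel is used here because
# RFE needs coefficients; the study used an RBF-SVM for classification).
rfe = RFE(SVC(kernel="linear"), n_features_to_select=1).fit(X_tr, y_tr)
svm_rank = np.argsort(rfe.ranking_)

# Retrain each classifier on its single top-ranked feature and evaluate on the test set.
top_rf, top_svm = [rf_rank[0]], [svm_rank[0]]
rf_top = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr[:, top_rf], y_tr)
svm_top = SVC(kernel="rbf").fit(X_tr[:, top_svm], y_tr)
print(rf_top.score(X_te[:, top_rf], y_te), svm_top.score(X_te[:, top_svm], y_te))
```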

Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine

Procedia PDF Downloads 215
50 Improving the Accuracy of Stress Intensity Factors Obtained by Scaled Boundary Finite Element Method on Hybrid Quadtree Meshes

Authors: Adrian W. Egger, Savvas P. Triantafyllou, Eleni N. Chatzi

Abstract:

The scaled boundary finite element method (SBFEM) is a semi-analytical numerical method, which introduces a scaling center in each element’s domain, thus transitioning from a Cartesian reference frame to one resembling polar coordinates. Consequently, an analytical solution is achieved in the radial direction, implying that only the boundary need be discretized. The only limitation imposed on the resulting polygonal elements is that they remain star-convex. Further arbitrary p- or h-refinement may be applied locally in a mesh. The polygonal nature of SBFEM elements has been exploited in quadtree meshes to alleviate all issues conventionally associated with hanging nodes. Furthermore, since in 2D this results in only 16 possible cell configurations, these are precomputed in order to accelerate the forward analysis significantly. Any cells that are clipped to accommodate the domain geometry must be computed conventionally. However, since SBFEM permits polygonal elements, significantly coarser meshes at comparable accuracy levels are obtained when compared with conventional quadtree analysis, further increasing the computational efficiency of this scheme. The generalized stress intensity factors (gSIFs) are computed by exploiting the semi-analytical solution in the radial direction. This is initiated by placing the scaling center of the element containing the crack at the crack tip. Taking an analytical limit of this element’s stress field as it approaches the crack tip delivers an expression for the singular stress field. By applying the problem-specific boundary conditions, the geometry correction factor is obtained, and the gSIFs are then evaluated based on their formal definition. Since the SBFEM solution is constructed as a power series, not unlike mode superposition in FEM, the two modes contributing to the singular response of the element can easily be identified in post-processing. Compared to the extended finite element method (XFEM), this approach is highly convenient, since neither enrichment terms nor a priori knowledge of the singularity is required. Computation of the gSIFs by SBFEM permits exceptional accuracy; however, when combined with hybrid quadtrees employing linear elements, this does not always hold. Nevertheless, it has been shown that crack propagation schemes are highly effective even for very coarse discretizations, since they rely only on the ratio of mode one to mode two gSIFs. The absolute values of the gSIFs may still be subject to large errors. Hence, we propose a post-processing scheme which minimizes the error resulting from the approximation space of the cracked element, thus limiting the error in the gSIFs to the discretization error of the quadtree mesh. This is achieved by h- and/or p-refinement of the cracked element, which increases the number of modes present in the solution. The resulting numerical description of the element is highly accurate, with the main error source now stemming from its boundary displacement solution. Numerical examples show that this post-processing procedure can significantly improve the accuracy of the computed gSIFs with negligible computational cost, even on coarse meshes resulting from hybrid quadtrees.
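
For context, the "formal definition" referred to above is, in its classical linear elastic fracture mechanics form, the near-tip limit quoted below; the generalized SIFs of the paper extend this notion within the SBFEM power-series solution, so the expression is given here only as the standard reference definition.

```latex
% Classical LEFM definition of the mode I and mode II stress intensity factors,
% taken as the limit of the near-tip stress field in crack-tip polar coordinates
% (r, \theta), evaluated ahead of the crack front (\theta = 0):
K_I    = \lim_{r \to 0} \sqrt{2\pi r}\,\sigma_{\theta\theta}(r,\theta=0), \qquad
K_{II} = \lim_{r \to 0} \sqrt{2\pi r}\,\sigma_{r\theta}(r,\theta=0)
```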

Keywords: linear elastic fracture mechanics, generalized stress intensity factors, scaled boundary finite element method, hybrid quadtrees

Procedia PDF Downloads 113
49 Will My Home Remain My Castle? Tenants’ Interview Topics regarding an Eco-Friendly Refurbishment Strategy in a Neighborhood in Germany

Authors: Karin Schakib-Ekbatan, Annette Roser

Abstract:

According to the Federal Government’s plans, the German building stock should be virtually climate neutral by 2050. Thus, the “EnEff.Gebäude.2050” funding initiative was launched, complementing the projects of the Energy Transition Construction research initiative. Beyond the construction and renovation of individual buildings, solutions must be found at the neighborhood level. The subject of the presented pilot project is a building ensemble from the Wilhelminian period in Munich, which is planned to be refurbished based on a socially compatible, energy-saving, technically innovative modernization concept. The building ensemble, with about 200 apartments, is part of a building cooperative. To create an optimized network and possible synergies between researchers and projects of the funding initiative, a Scientific Accompanying Research group was established for cross-project analyses of findings and results in order to identify further research needs and trends. Thus, the project is characterized by an interdisciplinary approach that combines constructional, technical, and socio-scientific expertise based on a participatory understanding of research, involving the tenants at an early stage. The research focus is on gaining insights into the tenants’ comfort requirements, attitudes, and energy-related behaviour. Both qualitative and quantitative methods are applied, based on the Technology Acceptance Model (TAM). The core of the refurbishment strategy is a wall heating system intended to replace conventional radiators. Wall heating provides comfortable and consistent radiant heat instead of convection heat, which often causes drafts and dust turbulence. Besides comfort and health, a further advantage of wall heating systems is their energy-saving operation. All apartments would be supplied by a uniform basic temperature control system (a perceived room temperature of around 18 °C, or 64.4 °F), which could be adapted to individual preferences via individual heating options (e.g., infrared heating). The new heating system would affect how the walls can be furnished, since the wall surface must not be covered too extensively with cupboards or pictures. Measurements and simulations of the energy consumption of an installed wall heating system are currently being carried out in a show apartment in this neighborhood to investigate energy-related and economic aspects as well as thermal comfort. In March, interviews were conducted with a total of 12 people in 10 households. The interviews were analyzed with MAXQDA. The main issue raised in the interviews was the fear of reduced self-efficacy within their own walls (not having sufficient individual control over the room temperature or being very limited in furnishing). Other issues concerned the impact that the construction works might have on daily life, such as noise or dirt. Despite their basically positive attitude towards a climate-friendly refurbishment concept, tenants were very concerned about the further development of the project, and they expressed a great need for information events. The results of the interviews will be used for project-internal discussions on technical and psychological aspects of the refurbishment strategy in order to design accompanying workshops with the tenants as well as to prepare a written survey involving all households of the neighborhood.

Keywords: energy efficiency, interviews, participation, refurbishment, residential buildings

Procedia PDF Downloads 96
48 Teaching about Justice With Justice: How Using Experiential, Learner Centered Literacy Methodology Enhances Learning of Justice Related Competencies for Young Children

Authors: Bruna Azzari Puga, Richard Roe, Andre Pagani de Souza

Abstract:

This abstract outlines a proposed study to examine how and to what extent interactive, experiential, learner-centered methodology develops learning of basic civic and democratic competencies among young children. It stems from the Literacy and Law course taught at Georgetown University Law Center in Washington, DC, since 1998. Law students, trained in best literacy practices and legal cases affecting literacy development, read “law related” children’s books and engage in interactive and extension activities with emerging readers. The law students write a monthly journal describing their experiences and a final paper: either a conventional paper or a children’s book illuminating some aspect of literacy and law. This proposal is based on the recent adaptation of Literacy and Law to Brazil at Mackenzie Presbyterian University in São Paulo in three forms: first, a course similar to the US model, often conducted jointly online with Brazilian and US law students; second, a similar course that combines readings of children’s literature with activity-based learning, taught by law students from a satellite Mackenzie campus for young children from a vulnerable community near the city; and third, a course taught by law students at the main Mackenzie campus for 4th grade students at the Mackenzie elementary school that is wholly activity and discourse based. The workings and outcomes of these courses are well documented by photographs, reports, lesson plans, and law student journals. The authors, faculty who teach the above courses at Mackenzie and Georgetown, observe that literacy, broadly defined as cognitive and expressive development through reading and discourse-based activities, can be influential in developing democratic civic skills, identifiable by explicit civic competencies. For example, children experience justice in the classroom through cooperation, creativity, diversity, fairness, systemic thinking, and appreciation for rules and their purposes. Moreover, the learning of civic skills, as well as of literacy skills, is enhanced through interactive, learner-centered practices in which the learners experience literacy and civic development. This study will develop rubrics for individual and classroom teaching and supervision by examining 1) the children’s books and journals of participating law students, 2) the collection of photos and videos of classroom activities, and 3) faculty and supervisor observations and reports. These rubrics, and the lesson plans and activities which are employed to advance the higher levels of performance outcomes, will be useful in training and supervision and in further replication and promotion of this form of teaching and learning. Examples of outcomes include helping, cooperating and participating; appreciation of viewpoint diversity; knowledge and utilization of democratic processes, including due process, advocacy, individual and shared decision making, consensus building, and voting; and establishing and valuing appropriate rules and a reasoned approach to conflict resolution. In conclusion, further development and replication of the learner-centered literacy and law practices outlined here can lead to improved qualities of democratic teaching and learning supporting mutual respect, positivity, deep learning, and the common good – foundational qualities of a sustainable world.

Keywords: democracy, law, learner-centered, literacy

Procedia PDF Downloads 86
47 Monte Carlo Risk Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emanuele

Abstract:

Climate change represents one of the most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric temperature has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emissions to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less-researched advanced zero emission power plant (AZEP). Advanced zero emission power plants make use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process. The membrane separation process was first introduced in 1899, when Walther Hermann Nernst investigated electric currents between metals and solutions and found that when a dense ceramic is heated, a current of oxygen ions moves through it. In a bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the AZEP cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway, and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP has drawn a lot of attention because of its ability to capture ~100% of CO2; it also offers an estimated 30-50% cost reduction compared to other carbon abatement technologies, its efficiency penalty is not as large as that of its counterparts, and it achieves almost zero NOx emissions due to the very low nitrogen concentration in the working fluid. The advanced zero emission power plant differs from a conventional gas turbine in that its combustor is substituted with the mixed conductive membrane reactor (MCM reactor). The MCM reactor is made up of the combustor, the low temperature heat exchanger (LTHX, referred to by some authors as the air pre-heater), the mixed conductive membrane responsible for oxygen transfer, the high temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the temperature is also increased to facilitate oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to the inlet of the LTHX. The temperature of the air stream is then increased to about 1150 K, depending on the design point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air going through the membrane. The AZEP cycle was modelled in Fortran, and the economic analysis was conducted using Excel and MATLAB, followed by an optimization case study. This paper discusses the techno-economic and Monte Carlo risk analysis of four possible layouts of the AZEP cycle: the simple bleed gas heat exchange layout (100% CO2 capture), the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture), the pre-expansion reheating (sequential burning) layout – AZEP 85% (85% CO2 capture), and the pre-expansion reheating (sequential burning) layout with flue gas turbine – AZEP 85% (85% CO2 capture).
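
As a generic illustration of how a Monte Carlo risk analysis of such a cycle can be set up (the study itself used Fortran, Excel and MATLAB models), the sketch below samples uncertain techno-economic inputs and propagates them to a simplified cost-of-electricity metric; every distribution and parameter value is a hypothetical assumption, not a figure from the paper.

```python
# Hypothetical Monte Carlo sketch: sample uncertain techno-economic inputs and
# propagate them to a simple levelised-cost-style metric for an AZEP-type layout.
# All parameter ranges are illustrative assumptions, not values from the study.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

capex = rng.triangular(1500, 1800, 2400, n)      # $/kW installed (hypothetical)
fuel_price = rng.normal(7.0, 1.5, n)             # $/GJ (hypothetical)
efficiency = rng.uniform(0.44, 0.50, n)          # net cycle efficiency (hypothetical)
availability = rng.uniform(0.85, 0.95, n)        # capacity factor (hypothetical)

hours = 8760 * availability                      # operating hours per year
crf = 0.10                                       # capital recovery factor (hypothetical)
fuel_cost = fuel_price * 3.6 / efficiency        # $/MWh of electricity (3.6 GJ per MWh)
capital_cost = capex * 1000 * crf / hours        # $/MWh ($/kW converted to $/MW)

coe = capital_cost + fuel_cost                   # simplified cost of electricity, $/MWh
p10, p50, p90 = np.percentile(coe, [10, 50, 90])
print(f"COE p10/p50/p90: {p10:.1f} / {p50:.1f} / {p90:.1f} $/MWh")
```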

Keywords: gas turbine, global warming, greenhouse gases, power plants

Procedia PDF Downloads 447
46 Rebuilding Beyond Bricks: The Environmental Psychological Foundations of Community Healing After the Lytton Creek Fire

Authors: Tugba Altin

Abstract:

In a time characterized by escalating climate change impacts, communities globally face extreme events with deep-reaching tangible and intangible consequences. At the intersection of these phenomena lies the profound impact on the cultural and emotional connections that individuals forge with their environments. This study casts a spotlight on the Lytton Creek Fire of 2021, showcasing it as an exemplar of both the visible destruction brought by such events and the more covert yet deeply impactful disturbances to place attachment (PA). Defined as the emotional and cognitive bond individuals form with their surroundings, PA is critical in comprehending how such catastrophic events reshape cultural identity and the bond with the land. Against the stark backdrop of the Lytton Creek Fire's devastation, the research seeks to unpack the multilayered dynamics of PA amidst the tangible wreckage and the intangible repercussions such as emotional distress and disrupted cultural landscapes. Delving deeper, it examines how affected populations renegotiate their affiliations with these drastically altered environments, grappling with both the tangible loss of their homes and the intangible challenges to solace, identity, and community cohesion. This exploration is instrumental in the broader climate change narrative, as it offers crucial insights into how these personal-place relationships can influence and shape climate adaptation and recovery strategies. Departing from traditional data collection methodologies, this study adopts an interpretive phenomenological approach enriched by hermeneutic insights and places the experiences of the Lytton community and its co-researchers at its core. Instead of conventional interviews, innovative methods like walking audio sessions and photo elicitation are employed. These techniques allow participants to immerse themselves back into the environment, reviving and voicing their memories and emotions in real-time. Walking audio captures reflections on spatial narratives after the trauma, whereas photo voices encapsulate the intangible emotions, presenting a visual representation of place-based experiences. Key findings emphasize the indispensability of addressing both the tangible and intangible traumas in community recovery efforts post-disaster. The profound changes to the cultural landscape and the subsequent shifts in PA underscore the need for holistic, culturally attuned, and emotionally insightful adaptation strategies. These strategies, rooted in the lived experiences and testimonies of the affected individuals, promise more resonant and effective recovery efforts. The research further contributes to climate change discourse, highlighting the intertwined pathways of tangible reconstruction and the essentiality of emotional and cultural rejuvenation. Furthermore, the use of participatory methodologies in this inquiry challenges traditional research paradigms, pointing to potential evolutionary shifts in qualitative research norms. Ultimately, this study underscores the need for a more integrative approach in addressing the aftermath of environmental disasters, ensuring that both physical and emotional rebuilding are given equal emphasis.

Keywords: place attachment, community recovery, disaster response, sensory responses, intangible traumas, visual methodologies

Procedia PDF Downloads 28
45 Effects of Combined Lewis Acid and Ultrasonic Pretreatment on the Physicochemical Properties of Heat-Treated Moso Bamboo

Authors: Tianfang Zhang, Luxi He, Zhengbin He, Songlin Yi

Abstract:

Moso bamboo is a common non-wood forest resource in Asia that is widely used in construction, furniture, and other fields. Owing to the heterogeneous structure and numerous hygroscopic groups of bamboo, deformation occurs through moisture absorption and desorption when ambient temperature and humidity change. Thermal modification is a well-established commercial technology for improving the dimensional stability of bamboo. However, its higher energy consumption and carbon emissions limit its further development. Previous studies have indicated that inorganic salt-assisted thermal modification can lead to significant reductions in moisture absorption and energy consumption. Inorganic salts, typified by metal chlorides, show Lewis acid behaviour when dissolved in water, generating metal-ion ligand complexes. In addition, ultrasonic treatment, as an efficient and environmentally friendly physical treatment method, improves the accessibility of chemical impregnation agents and intensifies mass and heat transfer during the reactions. To save energy and reduce deformation, this study elucidates the influence of combined zinc chloride and ultrasonic treatment on the physicochemical properties of heat-treated bamboo, and the bamboo deformation mechanism under Lewis acid treatment is explained in detail. Three sets of parameters (inorganic salt concentration, ultrasonic frequency and heat treatment temperature) were investigated, and an optimized process was proposed: 5% (w/w) zinc chloride solution, 40 kHz ultrasonic waves and heat treatment at 160 °C. The samples were characterized by different means to analyze changes in their macroscopic features, pore structure, chemical structure and chemical composition. The results suggested that the maximum weight loss rate was reduced by at least 19.75%. The maximum thermal degradation peak of hemicellulose was shifted significantly forward. Under the optimal condition, the hygroscopicity was reduced by 10.15%, the relative crystallinity was increased by 4.4%, the surface contact angle was increased by 25.2%, and the color change was increased by 23.60. Electron microscope observation showed that the treated surface became rougher and cracks appeared in some weaker areas, accelerating starch loss and removing granular attachments around the pits. Through ion diffusion, zinc ions penetrated into the hemicellulose and partially into the amorphous regions of cellulose. Parts of the cell wall structure were subjected to swelling and degradation, leaving the parenchyma cells in a broken state. The Raman spectra indicated that, compared to conventional thermal modification, hemicellulose thermal degradation and lignin migration are promoted by the Lewis acid under dilute acid-thermal conditions. As shown in this work, the combined Lewis acid and ultrasonic pretreatment, as an environmentally friendly, safe, and efficient physico-chemical pretreatment method, improved the dimensional stability of Moso bamboo and lowered the conditions required for thermal degradation. This method has great potential for development in the field of bamboo heat treatment and might provide some guidance for making dark bamboo flooring.

Keywords: Moso bamboo, Lewis acid, ultrasound, heat treatment

Procedia PDF Downloads 33
44 Sensorless Machine Parameter-Free Control of Doubly Fed Reluctance Wind Turbine Generator

Authors: Mohammad R. Aghakashkooli, Milutin G. Jovanovic

Abstract:

The brushless doubly-fed reluctance generator (BDFRG) is an emerging, medium-speed alternative to the conventional wound rotor slip-ring doubly-fed induction generator (DFIG) in wind energy conversion systems (WECS). It can provide competitive overall performance and similarly low failure rates of a typically 30%-rated back-to-back power electronics converter over 2:1 speed ranges, but with the following important reliability and cost advantages over the DFIG: maintenance-free operation afforded by its brushless structure; 50% synchronous speed with the same number of rotor poles (allowing the use of a more compact and more efficient two-stage gearbox instead of a vulnerable three-stage one); and superior grid integration properties, including simpler protection for low voltage ride-through compliance of the fractional converter due to the comparatively higher leakage inductances and lower fault currents. Vector-controlled pulse-width-modulated converters generally feature a much lower total harmonic distortion than hysteresis counterparts with variable switching rates and as such have been a predominant choice for BDFRG (and DFIG) wind turbines. Eliminating the shaft position sensor, which is often required for control implementation in this case, would be desirable to address the associated reliability issues. This fact has largely motivated the recent growing research into sensorless methods and the development of various rotor position and/or speed estimation techniques for this purpose. The main limitation of all the observer-based control approaches for grid-connected wind power applications of the BDFRG reported in the open literature is the requirement for pre-commissioning procedures and prior knowledge of the machine inductances, which are usually difficult to identify accurately by off-line testing. The model reference adaptive system (MRAS) based sensorless vector control scheme to be presented will overcome this shortcoming. The true machine parameter independence of the proposed field-oriented algorithm, offering robust, inherently decoupled real and reactive power control of the grid-connected winding, is achieved by on-line estimation of the inductance ratio on which the underlying MRAS observer of rotor angular velocity and position relies. Such an observer configuration is more practical to implement and clearly preferable to the existing machine parameter dependent solutions, especially bearing in mind that with very few modifications it can be adapted for commercial DFIGs, with immediately obvious further industrial benefits and prospects for this work. The excellent encoderless controller performance with maximum power point tracking in the base speed region will be demonstrated by realistic simulation studies using large-scale BDFRG design data and verified by experimental results on a small laboratory prototype of the WECS emulation facility.
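
The MRAS adaptation principle underlying such observers can be illustrated with a deliberately simplified toy loop: a reference quantity computed from "measurements" is compared with an adaptive model driven by the estimated rotor position, and the error feeds a PI law that updates the speed estimate. The sketch below shows only this generic mechanism, not the machine parameter-free BDFRG observer itself; the constant-speed plant and all gains are illustrative assumptions.

```python
# Toy illustration of the MRAS adaptation principle (generic, not the machine
# parameter-free BDFRG observer itself): a PI law drives the estimated angle to
# track the "measured" reference vector.
import numpy as np

dt, t_end = 1e-4, 0.5
omega_true = 2 * np.pi * 25            # true electrical speed, rad/s (assumed constant)
kp, ki = 200.0, 8000.0                 # PI adaptation gains (illustrative)

theta, theta_hat, omega_hat, integ = 0.0, 0.0, 0.0, 0.0
for _ in range(int(t_end / dt)):
    theta += omega_true * dt           # plant: true rotor angle
    ref = np.exp(1j * theta)           # reference model output (from "measurements")
    adp = np.exp(1j * theta_hat)       # adaptive model output (uses estimated angle)
    err = np.imag(ref * np.conj(adp))  # cross-product error ~ sin(theta - theta_hat)
    integ += err * dt
    omega_hat = kp * err + ki * integ  # PI adaptation law gives the speed estimate
    theta_hat += omega_hat * dt        # integrate to get the estimated position

print(f"estimated speed: {omega_hat:.1f} rad/s (true: {omega_true:.1f} rad/s)")
```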

Keywords: brushless doubly fed reluctance generator, model reference adaptive system, sensorless vector control, wind energy conversion

Procedia PDF Downloads 34
43 Multibody Constrained Dynamics of a Y-Method Installation System for Large-Scale Subsea Equipment

Authors: Naeem Ullah, Menglan Duan, Mac Darlington Uche Onuoha

Abstract:

The lowering of subsea equipment into deep waters is a challenging job due to the harsh offshore environment. Many researchers have introduced various installation systems to deploy payloads safely into the deep oceans. In general practice, dual floating vessels are not employed owing to the safety risks and hazards caused by the increased dynamical effects arising from the mutual interaction between the bodies. However, on practical grounds such as economy, the Y-method, in which two conventional tugboats support the equipment by two independent strands connected to a tri-plate above the equipment, has been employed to study the multibody dynamics of dual barge lifting operations. In this study, the two tugboats and the suspended payload (the Y-method) are treated as a multibody dynamic system for the lowering of subsea equipment into deep waters. Two wire ropes are used for the lifting and installation operation in this Y-method installation system. Six degrees of freedom (dof) are considered for each body to establish a coupled 18-dof multibody model by the embedding (velocity transformation) technique. The fundamental advantage of this technique is that the constraint forces are eliminated directly, and no extra computational effort is required for their elimination. The inertial frame of reference is taken at the surface of the water as the time-independent frame of reference, and floating frames of reference are introduced in each body as time-dependent frames of reference in order to formulate the velocity transformation matrix. The local transformation of the generalized coordinates to the inertial frame of reference is executed by applying the Euler angle approach. Spherical joints are used as the kinematic joints connecting the bodies. The hydrodynamic force, the two strand forces, the hydrostatic force, and the mooring forces are taken into consideration as the external forces. The radiation part of the hydrodynamic force is obtained by employing the Cummins equation. The wave-exciting part of the hydrodynamic force is obtained by using force response amplitude operators (RAOs) computed with the solver ‘OpenFOAM’. The strand force is obtained by treating the wire rope as an elastic spring. The nonlinear hydrostatic force is obtained by the pressure integration technique at each time step of the wave motion. The mooring forces are evaluated using Faltinsen’s analytical approach. The fourth-order Runge-Kutta method is employed to solve the coupled equations of motion obtained for the 18-dof multibody model. The results are correlated with a simulated OrcaFlex model. Moreover, the results from the OrcaFlex model are compared with the MOSES model from previous studies. The multibody dynamic simulation of the single barge lifting operation from former studies is compared with that of the dual barge lifting operation established here. The dynamics of the dual barge lifting operation are found to be larger in magnitude than those of the single barge lifting operation. It is noticed that the tension at the top connection point of the cable decreases with increasing length and becomes almost constant after passing through the splash zone.
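
The fourth-order Runge-Kutta integration mentioned above can be sketched generically as follows; the toy two-degree-of-freedom mass-spring-damper stands in for the coupled 18-dof equations of motion, whose actual mass matrix and force terms come from the embedding formulation described in the abstract.

```python
# Generic fourth-order Runge-Kutta integrator, illustrated on a toy 2-dof
# mass-spring-damper in first-order form (a stand-in for the coupled 18-dof
# equations of motion; the real mass matrix and force vector come from the
# embedding/velocity-transformation formulation described above).
import numpy as np

M = np.diag([2.0, 1.0])                               # mass matrix (toy values)
K = np.array([[30.0, -10.0], [-10.0, 10.0]])          # stiffness matrix (toy values)
C = 0.05 * K                                          # damping matrix (toy values)

def f(t, y):
    q, qd = y[:2], y[2:]                              # positions and velocities
    qdd = np.linalg.solve(M, -C @ qd - K @ q)         # M qdd + C qd + K q = 0
    return np.concatenate([qd, qdd])

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

h, y = 0.01, np.array([0.1, 0.0, 0.0, 0.0])           # step size and initial state
for i in range(1000):
    y = rk4_step(f, i * h, y, h)
print("state at t = 10 s:", y)
```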

Keywords: dual barge lifting operation, Y-method, multibody dynamics, installation of subsea equipment, shipbuilding

Procedia PDF Downloads 171
42 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan

Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad

Abstract:

Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50 – 100 nm. This is a critical advantage, as growing larger crystals, as required by X-ray crystallography methods, is often very difficult, time-consuming, and expensive. In most cases, specimens studied via the 3DED method are electron beam sensitive, which means there is a limit on the maximum electron dose one can use to collect the data required for a high-resolution structure determination. Therefore, collecting data using a conventional scintillator-based, fiber-coupled camera brings additional challenges. This is because of the inherent noise introduced during the electron-to-photon conversion in the scintillator and the transfer of light via the fibers to the sensor, which results in a poor signal-to-noise ratio and requires relatively high, and commonly specimen-damaging, electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated if a direct detection camera is used, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem. The K3 is an electron counting detector optimized for low-dose applications (like structural biology cryo-EM), and Stela is also a counting electron detector but optimized for diffraction applications with high speed and high dynamic range. Lastly, the data collection workflow, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, can be another challenge of the 3DED technique. Traditionally, this has all been done manually or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables 3DED data acquisition in an automated and optimized fashion. Higher quality 3DED data enable structure determination with higher confidence, while automated workflows allow these to be completed considerably faster than before. Using multiple examples, this work will demonstrate how direct detection electron counting cameras enhance 3DED results (from 3 Å to better than 1 Å) for protein and small molecule structure determination. We will also show how the Latitude D software facilitates collecting such data in an integrated and fully automated user interface.

Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, DigitalMicrograph, proteins, small molecules

Procedia PDF Downloads 55
41 Deciphering Information Quality: Unraveling the Impact of Information Distortion in the UK Aerospace Supply Chains

Authors: Jing Jin

Abstract:

The incorporation of artificial intelligence (AI) and machine learning (ML) in aircraft manufacturing and aerospace supply chains leads to the generation of a substantial amount of data among various tiers of suppliers and OEMs. Identifying high-quality information challenges decision-makers. The application of AI/ML models necessitates access to 'high-quality' information to yield the desired outputs. However, the process of information sharing introduces complexities, including distortion through various communication channels and biases introduced by both human and AI entities. This phenomenon significantly influences the quality of information, affecting decision-makers engaged in configuring supply chain systems. Traditionally, distorted information is categorized as 'low-quality'; however, this study challenges this perception, positing that distorted information that contributes to stakeholder goals can be deemed high-quality within supply chains. The main aim of this study is to identify and evaluate the dimensions of information quality crucial to the UK aerospace supply chain. Guided by a central research question, "What information quality dimensions are considered when defining information quality in the UK aerospace supply chain?", the study delves into the intricate dynamics of information quality in the aerospace industry. Additionally, the research explores the nuanced impact of information distortion on stakeholders' decision-making processes, addressing the question, "How does the information distortion phenomenon influence stakeholders’ decisions regarding information quality in the UK aerospace supply chain system?" This study employs a deductive methodology rooted in positivism, utilizing a cross-sectional approach and a mono-method quantitative design (a questionnaire survey). Data are systematically collected from diverse tiers of supply chain stakeholders, encompassing end-customers, OEMs, Tier 0.5, Tier 1, and Tier 2 suppliers. Employing robust statistical data analysis methods, including mean values, mode values, standard deviation, one-way analysis of variance (ANOVA), and Pearson’s correlation analysis, the study interprets and extracts meaningful insights from the gathered data. Initial analyses challenge conventional notions, revealing that information distortion positively influences the definition of information quality, disrupting the established perception of distorted information as inherently low-quality. Further exploration through correlation analysis unveils the varied perspectives of different stakeholder tiers on the impact of information distortion on specific information quality dimensions. For instance, Tier 2 suppliers demonstrate strong positive correlations between information distortion and dimensions like access security, accuracy, interpretability, and timeliness. Conversely, Tier 1 suppliers emphasise strong negative influences on the security of accessing information and a negligible impact on information timeliness. Tier 0.5 suppliers showcase very strong positive correlations with dimensions like conciseness and completeness, while OEMs exhibit limited interest in considering information distortion within the supply chain. Introducing social network analysis (SNA) provides a structural understanding of the relationships between information distortion and the quality dimensions. The moderately high density of the ‘information distortion-by-information quality’ network underscores the interconnected nature of these factors.
In conclusion, this study offers a nuanced exploration of information quality dimensions in the UK aerospace supply chain, highlighting the significance of individual perspectives across different tiers. The positive influence of information distortion challenges prevailing assumptions, fostering a more nuanced understanding of information's role in the Industry 4.0 landscape.
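
As an illustration of the statistical treatment described above (Pearson correlation within a tier and one-way ANOVA across tiers), the sketch below runs both tests on hypothetical Likert-scale responses; the variable names and values are assumptions standing in for the survey data.

```python
# Illustrative sketch of the analysis described above, on hypothetical Likert-scale
# survey data: Pearson correlation between perceived information distortion and one
# quality dimension within a tier, and one-way ANOVA of that dimension across tiers.
import numpy as np
from scipy.stats import pearsonr, f_oneway

rng = np.random.default_rng(1)
tier2_distortion = rng.integers(1, 6, 30)                          # hypothetical ratings (1-5)
tier2_accuracy = np.clip(tier2_distortion + rng.integers(-1, 2, 30), 1, 5)

r, p = pearsonr(tier2_distortion, tier2_accuracy)
print(f"Tier 2 distortion vs. accuracy: r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA: does the accuracy rating differ across supplier tiers?
tier1_accuracy = rng.integers(1, 6, 30)
tier05_accuracy = rng.integers(1, 6, 30)
F, p_anova = f_oneway(tier2_accuracy, tier1_accuracy, tier05_accuracy)
print(f"ANOVA across tiers: F = {F:.2f}, p = {p_anova:.3f}")
```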

Keywords: information distortion, information quality, supply chain configuration, UK aerospace industry

Procedia PDF Downloads 20
40 Ecotoxicological Test-Battery for Efficiency Assessment of TiO2 Assisted Photodegradation of Emerging Micropollutants

Authors: Ildiko Fekete-Kertesz, Jade Chaker, Sylvain Berthelot, Viktoria Feigl, Monika Molnar, Lidia Favier

Abstract:

There has been growing concern about emerging micropollutants in recent years because of the possible environmental and health risks posed by these substances, which are released into the environment as a consequence of anthropogenic activities. Among them, pharmaceuticals are currently not considered under water quality regulations; however, their potential effects on the environment have received increasing attention in recent years. Since these compounds can be detected in natural water matrices, it can be concluded that the currently applied water treatment processes are not efficient enough for their effective elimination. To date, advanced oxidation processes (AOPs) are considered highly competitive water treatment technologies for the removal of those organic micropollutants not treatable by conventional techniques due to their high chemical stability and/or low biodegradability. AOPs such as (photo)chemical oxidation and heterogeneous photocatalysis have proven their potential in degrading harmful organic compounds from aqueous matrices. However, some of these technologies generate reaction by-products, which can even be more toxic to aquatic organisms than the parent compounds. Thus, target compound removal does not necessarily result in the removal of toxicity. Therefore, to evaluate process efficiency, the determination of the toxicity and ecotoxicity of the reaction intermediates is crucial to estimate the environmental risk of such techniques. In this context, the present study investigates the effectiveness of TiO2 assisted photodegradation for the removal of emerging water contaminants. Two drugs, losartan (used in high blood pressure medication) and levetiracetam (used to treat epilepsy), were considered in this work. The photocatalytic reactions were carried out with a commercial catalyst usually employed in photocatalysis. Moreover, the toxicity of the by-products generated during the process was assessed with various ecotoxicological methods applying aquatic test organisms from different trophic levels. A series of experiments was performed to evaluate the toxicity of untreated and treated solutions applying the Aliivibrio fischeri bioluminescence inhibition test, the Tetrahymena pyriformis proliferation inhibition test, the Daphnia magna lethality and immobilization tests and the Lemna minor growth inhibition test. The applied ecotoxicological methodology sensitively indicated the toxic effects of the treated and untreated water samples; hence the applied test battery is suitable for the ecotoxicological characterization of TiO2 based photocatalytic water treatment technologies and for indicating the formation of toxic by-products from the parent chemical compounds. The obtained results clearly showed that TiO2 assisted photodegradation was more efficient in the elimination of losartan than of levetiracetam. It was also observed that the treated levetiracetam solutions had a more severe effect on the applied test organisms. A possible explanation would be the production of levetiracetam by-products that are more toxic than the parent compound. The increased toxicity and the risk of formation of toxic metabolites represent one possible limitation to the implementation of photocatalytic treatment using TiO2 for the removal of losartan and levetiracetam. Our results proved that the battery of ecotoxicity tests used in this work can be a promising investigation tool for the environmental risk assessment of photocatalytic processes.

Keywords: aquatic micropollutants, ecotoxicology, nano titanium dioxide, photocatalysis, water treatment

Procedia PDF Downloads 163
39 Expanded Polyurethane Foams and Waterborne-Polyurethanes from Vegetable Oils

Authors: A. Cifarelli, L. Boggioni, F. Bertini, L. Magon, M. Pitalieri, S. Losio

Abstract:

Nowadays, growing environmental awareness and the dwindling of fossil resources are pushing the polyurethane (PU) industry towards renewable polymers with a low carbon footprint to replace feedstocks from petroleum sources. The main challenge in this field consists in replacing high-performance products derived from fossil fuels with novel synthetic polymers derived from 'green monomers'. Bio-polyols from plant oils have attracted significant industrial interest and major attention in scientific research due to their availability and biodegradability. Triglycerides rich in unsaturated fatty acids, such as soybean oil (SBO) and linseed oil (ELO), are particularly interesting because their structures and functionalities can be tuned by chemical modification in order to obtain polymeric materials with the expected final properties. Unfortunately, their use is still limited by processing or performance problems, because a high functionality as well as a high OH number of the polyols results in an increase in the cross-linking density of the resulting PUs. The main aim of this study is to evaluate soy- and linseed-based polyols as precursors to prepare prepolymers for the production of polyurethane foams (PUFs) or waterborne polyurethanes (WPUs) used as coatings. An effective reaction route is employed for its simplicity and economic impact. Indeed, the bio-polyols were synthesized by a two-step method: epoxidation of the double bonds in the vegetable oils and solvent-free ring-opening reaction of the oxirane with organic acids. No organic solvents have been used. Acids with different moieties (aliphatic or aromatic) and different lengths of hydrocarbon backbone can be used to customize polyols with different functionalities. The ring-opening reaction requires a fine tuning of the experimental conditions (time, temperature, molar ratio of carboxylic acid to epoxy groups) to control the acidity value of the end-product as well as the amount of residual starting materials. In addition, a Lewis base catalyst is used to favor the ring-opening reaction of the internal epoxy groups of the epoxidized oil and to minimize the formation of cross-linked structures, in order to achieve less viscous and more processable polyols with narrower polydispersity indices (molecular weights lower than 2000 g mol⁻¹). The functionality of the optimized polyols is tuned from 2 to 4 per molecule. The obtained polyols are characterized by means of GPC, NMR (¹H, ¹³C) and FT-IR spectroscopy to evaluate molecular masses, molecular mass distributions, microstructures and linkage pathways. Several polyurethane foams have been prepared by the prepolymer method, blending conventional synthetic polyols with the new bio-polyols from soybean and linseed oils without using organic solvents. The compatibility of such bio-polyols with commercial polyols and diisocyanates is demonstrated. The influence of the bio-polyols on the foam morphology (cellular structure, interconnectivity), density, mechanical and thermal properties has been studied. Moreover, bio-based WPUs have been synthesized by well-established processing technology. In this synthesis, a portion of the commercial polyols is substituted by the new bio-polyols, and the properties of the coatings on leather substrates have been evaluated to determine coating hardness, abrasion resistance, impact resistance, gloss, chemical resistance, flammability, durability, and adhesive strength.

Keywords: bio-polyols, polyurethane foams, solvent free synthesis, waterborne-polyurethanes

Procedia PDF Downloads 97
38 Thermally Conductive Polymer Nanocomposites Based on Graphene-Related Materials

Authors: Alberto Fina, Samuele Colonna, Maria del Mar Bernal, Orietta Monticelli, Mauro Tortello, Renato Gonnelli, Julio Gomez, Chiara Novara, Guido Saracco

Abstract:

Thermally conductive polymer nanocomposites are of high interest for several applications, including low-temperature heat recovery, heat exchangers in corrosive environments, and heat management in electronics and flexible electronics. In this paper, the preparation of thermally conductive nanocomposites exploiting graphene-related materials is addressed, along with their thermal characterization. In particular, the correlations of (1) the chemical and physical features of the nanoflakes and (2) the processing conditions with the heat conduction properties of the nanocomposites are studied. Polymers are heat insulators; therefore, the inclusion of conductive particles is the typical solution to obtain sufficient thermal conductivity. In addition to traditional microparticles such as graphite and ceramics, several nanoparticles have been proposed for use in polymer nanocomposites, including carbon nanotubes and graphene. Indeed, thermal conductivities for both carbon nanotubes and graphenes have been reported in the wide range of about 1500 to 6000 W/mK, although this property may decrease dramatically as a function of the size, number of layers, density of topological defects and re-hybridization defects, as well as the presence of impurities. Different synthetic techniques have been developed, including mechanical cleavage of graphite, epitaxial growth on SiC, chemical vapor deposition, and liquid phase exfoliation. However, the industrial scale-up of graphene, defined as an individual, single-atom-thick sheet of hexagonally arranged sp2-bonded carbons, still remains very challenging. For large-scale bulk applications in polymer nanocomposites, graphene-related materials such as multilayer graphenes (MLG), reduced graphene oxide (rGO) or graphite nanoplatelets (GNP) are currently the most interesting graphene-based materials. In this paper, different types of graphene-related materials were characterized for their chemical/physical features as well as for the thermal properties of individual flakes. Two selected rGOs were annealed at 1700 °C in vacuum for 1 h to reduce the defectiveness of the carbon structure. The thermal conductivity increase of individual GNP flakes with annealing was assessed via scanning thermal microscopy. Graphene nanopapers were prepared from both conventional rGO and annealed rGO flakes. Characterization of the nanopapers evidenced a five-fold increase in the in-plane thermal diffusivity for annealed nanoflakes compared to pristine ones, demonstrating the importance of reducing structural defectiveness to maximize the heat dissipation performance. Both pristine and annealed rGO were used to prepare polymer nanocomposites by reactive melt extrusion. A two- to three-fold increase in the thermal conductivity of the nanocomposite was observed for high-temperature-treated rGO compared to untreated rGO, evidencing the importance of using low-defectivity nanoflakes. Furthermore, the study of different processing parameters (time, temperature, shear rate) during the preparation of poly(butylene terephthalate) nanocomposites evidenced a clear correlation with the dispersion and fragmentation of the GNP nanoflakes, which in turn affected the thermal conductivity performance. A thermal conductivity of about 1.7 W/mK, i.e. one order of magnitude higher than for the pristine polymer, was obtained with 10 wt% of annealed GNPs, which is in line with state-of-the-art nanocomposites prepared by more complex and less upscalable in situ polymerization processes.

Keywords: graphene, graphene-related materials, scanning thermal microscopy, thermally conductive polymer nanocomposites

Procedia PDF Downloads 244
37 Italian Speech Vowels Landmark Detection through the Legacy Tool 'xkl' with Integration of Combined CNNs and RNNs

Authors: Kaleem Kashif, Tayyaba Anam, Yizhi Wu

Abstract:

This paper introduces a methodology for advancing Italian speech vowel landmark detection within the distinctive feature-based speech recognition domain. Leveraging the legacy tool 'xkl' and integrating combined convolutional neural networks (CNNs) and recurrent neural networks (RNNs), the study presents a comprehensive enhancement to the 'xkl' legacy software. This integration incorporates re-assigned spectrogram methodologies, enabling meticulous acoustic analysis. At the same time, the proposed model, integrating combined CNNs and RNNs, demonstrates high precision and robustness in landmark detection. The addition of re-assigned spectrogram fusion within the 'xkl' software is a meticulous advancement, particularly enhancing the precision of vowel formant estimation. This augmentation yields markedly improved accuracy in landmark detection, resulting in a substantial performance leap compared to conventional methods. The proposed model emerges as a state-of-the-art solution in the domain of distinctive feature-based speech recognition systems. On the deep learning side, a synergistic integration of combined CNNs and RNNs is introduced, endowed with specialized temporal embeddings, self-attention mechanisms, and positional embeddings. These components allow the model to excel in capturing intricate dependencies within Italian speech vowels, rendering it highly adaptable and sophisticated in the distinctive feature domain. Furthermore, our advanced temporal modeling approach employs Bayesian temporal encoding, refining the measurement of inter-landmark intervals. Comparative analysis against state-of-the-art models reveals a substantial improvement in accuracy, highlighting the robustness and efficacy of the proposed methodology. Upon rigorous testing on a database (LaMIT) of speech recorded in a silent room by four native Italian speakers, the landmark detector demonstrates excellent performance, achieving a 95% true detection rate and a 10% false detection rate. A majority of missed landmarks were observed in proximity to reduced vowels. These promising results underscore the robust identifiability of landmarks within the speech waveform, establishing the feasibility of employing a landmark detector as a front end in a speech recognition system. The synergistic integration of re-assigned spectrogram fusion, CNNs, RNNs, and Bayesian temporal encoding not only represents a significant advance in Italian speech vowel landmark detection but also positions the proposed model as a leader in the field. The model offers distinct advantages, including high accuracy, adaptability, and sophistication, marking a milestone at the intersection of deep learning and distinctive feature-based speech recognition. This work contributes to the broader scientific community by presenting a methodologically rigorous framework for enhancing landmark detection accuracy in Italian speech vowels. The integration of these techniques establishes a foundation for future advancements in speech signal processing, emphasizing the potential of the proposed model in practical applications across various domains requiring robust speech recognition systems.
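
A combined CNN-RNN for framewise landmark detection can be sketched as follows; the layer sizes, the bidirectional LSTM, and the 40-dimensional spectral input are illustrative assumptions rather than the authors' exact architecture, which additionally uses self-attention, positional embeddings and Bayesian temporal encoding.

```python
# Illustrative combined CNN-RNN for framewise landmark detection (layer sizes,
# bidirectional LSTM, and the 40-dim spectral input are assumptions, not the
# authors' exact architecture).
import torch
import torch.nn as nn

class CnnRnnLandmarkDetector(nn.Module):
    def __init__(self, n_features=40, n_classes=2):
        super().__init__()
        # 1-D convolutions extract local spectro-temporal patterns per frame.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        # A bidirectional LSTM models longer-range temporal dependencies.
        self.rnn = nn.LSTM(64, 64, batch_first=True, bidirectional=True)
        # Per-frame classification head: landmark vs. no landmark.
        self.head = nn.Linear(128, n_classes)

    def forward(self, x):                   # x: (batch, time, n_features)
        z = self.cnn(x.transpose(1, 2))     # -> (batch, 64, time)
        z, _ = self.rnn(z.transpose(1, 2))  # -> (batch, time, 128)
        return self.head(z)                 # per-frame logits

model = CnnRnnLandmarkDetector()
dummy = torch.randn(8, 200, 40)             # batch of 8 utterances, 200 frames each
print(model(dummy).shape)                    # torch.Size([8, 200, 2])
```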

Keywords: landmark detection, acoustic analysis, convolutional neural network, recurrent neural network

Procedia PDF Downloads 12
36 Well Inventory Data Entry: Utilization of Developed Technologies to Progress the Integrated Asset Plan

Authors: Danah Al-Selahi, Sulaiman Al-Ghunaim, Bashayer Sadiq, Fatma Al-Otaibi, Ali Ameen

Abstract:

In light of recent changes affecting the Oil & Gas Industry, optimization measures have become imperative for all companies globally, including Kuwait Oil Company (KOC). To keep abreast of the dynamic market, a detailed Integrated Asset Plan (IAP) was developed to drive optimization across the organization, facilitated through the in-house developed software “Well Inventory Data Entry” (WIDE). This comprehensive and integrated approach enabled centralization of all planned asset components for better well planning, enhanced performance, and continuous improvement through performance tracking and midterm forecasting. Traditionally, this was hard to achieve, as various legacy methods were used in the past. This paper briefly describes the methods successfully adopted to meet the company’s objective. IAPs were initially designed using computerized spreadsheets. However, as the data captured became more complex and the number of stakeholders requiring and updating this information grew, the need to automate the conventional spreadsheets became apparent. WIDE, already in use in other areas of the company (namely, the Workover Optimization project), was utilized to meet the dynamic requirements of the IAP cycle. With the growth of extensive features to enhance the planning process, the tool evolved into a centralized data hub for all asset groups and technical support functions to analyze and infer from, leading WIDE to become the reference two-year operational plan for the entire company. To achieve WIDE’s goal of operational efficiency, asset groups continuously add their parameters in a series of predefined workflows that enable the creation of a structured process, which allows risk factors to be flagged and helps their mitigation. The tool assigns responsibilities to all stakeholders in a manner that enables continuous updates of daily performance measures and supports operational use. The reliable availability of WIDE, combined with its user-friendliness and easy accessibility, created a platform of cross-functionality amongst all asset groups and technical support groups to update the contents of their respective planning parameters. The home-grown tool was implemented across the entire company and tailored to feed into the internal processes of several stakeholders. Furthermore, the implementation of change management and root cause analysis techniques captured the dysfunctionality of previous plans, which in turn resulted in the improvement of existing planning mechanisms within the IAP. The detailed elucidation of the two-year plan flagged upcoming risks and shortfalls foreseen in the plan. All results were translated into a series of developments that propelled the tool’s capabilities beyond planning and into operations (such as asset production forecasts, setting KPIs, and estimating operational needs). This process exemplifies the ability and reach of applying advanced development techniques to seamlessly integrate the planning parameters of various assets and technical support groups. These techniques enable integrated planning data workflows that ultimately lay the foundation for greater accuracy and reliability. As such, benchmarks establishing a set of standard goals are created to ensure continuous improvement in the efficiency of the entire planning and operational structure.

Keywords: automation, integration, value, communication

Procedia PDF Downloads 113
35 Pulmonary Disease Identification Using Machine Learning and Deep Learning Techniques

Authors: Chandu Rathnayake, Isuri Anuradha

Abstract:

Early detection and accurate diagnosis of lung diseases play a crucial role in improving patient prognosis. However, conventional diagnostic methods heavily rely on subjective symptom assessments and medical imaging, often causing delays in diagnosis and treatment. To overcome this challenge, we propose a novel lung disease prediction system that integrates patient symptoms and X-ray images to provide a comprehensive and reliable diagnosis. In this project, we develop a mobile application specifically designed for detecting lung diseases. Our application leverages both patient symptoms and X-ray images to facilitate diagnosis. By combining these two sources of information, our application delivers a more accurate and comprehensive assessment of the patient's condition, minimizing the risk of misdiagnosis. Our primary aim is to create a user-friendly and accessible tool, particularly important given the current circumstances where many patients face limitations in visiting healthcare facilities. To achieve this, we employ several state-of-the-art algorithms. Firstly, the decision tree algorithm is utilized for efficient symptom-based classification. It analyzes patient symptoms and creates a tree-like model to predict the presence of specific lung diseases. Secondly, we employ the random forest algorithm, which enhances predictive power by aggregating multiple decision trees. This ensemble technique improves the accuracy and robustness of the diagnosis. Furthermore, we incorporate a deep learning model using a convolutional neural network (CNN) with the pre-trained ResNet50 model. CNNs are well-suited for image analysis and feature extraction. By training the CNN on a large dataset of X-ray images, it learns to identify patterns and features indicative of lung diseases. The ResNet50 architecture, known for its excellent performance in image recognition tasks, enhances the efficiency and accuracy of our deep learning model. By combining the outputs of the decision tree-based algorithms and the deep learning model, our mobile application generates a comprehensive lung disease prediction. The application provides users with an intuitive interface to input their symptoms and upload X-ray images for analysis. The prediction generated by the system offers valuable insights into the likelihood of various lung diseases, enabling individuals to take appropriate actions and seek timely medical attention. Our proposed mobile application has significant potential to address the rising prevalence of lung diseases, particularly among young individuals with smoking addictions. By providing a quick and user-friendly approach to assessing lung health, our application empowers individuals to monitor their well-being conveniently. This solution also offers immense value in the context of limited access to healthcare facilities, enabling timely detection and intervention. In conclusion, our research presents a comprehensive lung disease prediction system that combines patient symptoms and X-ray images using advanced algorithms. By developing a mobile application, we provide an accessible tool for individuals to assess their lung health conveniently. This solution has the potential to make a significant impact on the early detection and management of lung diseases, benefiting both patients and healthcare providers.
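
A minimal late-fusion sketch of the kind of symptom-plus-image pipeline described above is given below. It is not the authors' application: the symptom feature layout, the untrained ResNet50 head, the dummy data, and the fusion weight are illustrative assumptions, and a recent torchvision (with the weights= argument) is assumed.

    import numpy as np
    import torch
    import torch.nn as nn
    from torchvision import models
    from sklearn.ensemble import RandomForestClassifier

    # Minimal sketch: late fusion of a symptom-based random forest and a ResNet50 image branch.
    # --- symptom branch: tabular features such as cough, fever, dyspnea (dummy data) ---
    rng = np.random.default_rng(0)
    X_sym = rng.integers(0, 2, size=(200, 10))          # 10 binary symptom indicators
    y = rng.integers(0, 2, size=200)                    # 0 = healthy, 1 = disease (toy labels)
    forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_sym, y)

    # --- image branch: ResNet50 backbone with a 2-class head (untrained here) ---
    cnn = models.resnet50(weights=None)
    cnn.fc = nn.Linear(cnn.fc.in_features, 2)
    cnn.eval()

    def predict(symptoms: np.ndarray, xray: torch.Tensor, w_img: float = 0.5) -> float:
        """Fused probability of disease from symptoms (1, 10) and an X-ray tensor (1, 3, 224, 224)."""
        p_sym = forest.predict_proba(symptoms)[0, 1]
        with torch.no_grad():
            p_img = torch.softmax(cnn(xray), dim=1)[0, 1].item()
        return (1 - w_img) * p_sym + w_img * p_img

    print(predict(rng.integers(0, 2, size=(1, 10)), torch.randn(1, 3, 224, 224)))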

Keywords: CNN, random forest, decision tree, machine learning, deep learning

Procedia PDF Downloads 49
34 Bicycle Tourism and Sharing Economy (C2C-Tourism): Analysis of the Reciprocity Behavior in the Case of Warmshowers

Authors: Jana Heimel, Franziska Drescher, Lauren Ugur, Graciela Kuchle

Abstract:

Sharing platforms are a widely investigated field. However, there is a research gap, with a lack of focus on ‘real’ (non-profit-oriented) sharing platforms. The research project addresses this gap by conducting an empirical study on a private peer-to-peer (P2P) network to investigate cooperative behavior from a socio-psychological perspective. In recent years, the shift from ownership to access has increasingly influenced different sectors, particularly the travel industry. The number of people participating in hospitality exchange platforms like Airbnb, Couchsurfing, and Warmshowers (WS) is rapidly growing. WS is an increasingly popular online community linking cycling tourists and locals. It builds on the idea of the “sharing economy” as a not-for-profit hospitality network for bicycle tourists. Hosts not only provide a sleeping berth and warm shower free of charge but also offer additional services to their guests, such as cooking and washing clothes for them. According to previous studies, they are motivated by the idea of promoting cultural exchange and forming new friendships. Trust and reciprocity are supposed to play major roles in the success of such platforms. The objective of this research project is to analyze reciprocity behavior within the WS community. Reciprocity is the act of giving and taking among each other: individuals feel obligated to return a favor and often expect to increase their own chances of receiving future benefits. Consequently, the drivers that incite giving and taking, as well as the motivations of hosts and guests, are examined. Thus, the project investigates a particular tourism offer that contributes to sustainable tourism by analyzing P2P (respectively cyclist-to-cyclist, C2C) tourism. C2C tourism is characterized by special hospitality and generosity. To find out what motivations drive the hosts and which determinants drive the sharing cycling economy, an empirical study was conducted globally through an online survey. The data were gathered through the WS community and comprised responses from more than 10,000 cyclists around the globe. In addition to general information, mostly quantitative data on bicycle tourism (distance per year/tour, duration, and budget), qualitative information on traveling with WS as well as on hosting was collected. The most important motivations for a traveler are to explore the local culture, to save money, and to make friends. The main reasons to host a guest are to promote the use of bicycles and to make friends, but also to give back and pay it forward. WS members prefer to stay with and host fellow cyclists. The results indicate that C2C tourists share homogeneous characteristics and a similar philosophy, which is crucial for building mutual trust; members of WS are generally extremely trustful. The study promotes an ecological form of tourism by combining sustainability, regionality, health, experience, and the local communities' cultures. The empirical evidence found and analyzed, despite evident limitations, sheds light especially on the issues of motivation and social capital and on the functioning of ‘sharing’ platforms. The final research results are intended to promote C2C tourism around the globe and to further replace conventional tourism with sustainable tourism.

Keywords: bicycle tourism, homogeneity, reciprocity, sharing economy, trust

Procedia PDF Downloads 89
33 Shakespeare's Hamlet in Ballet: Transformation of an Archival Recording of a Neoclassical Ballet Performance into a Contemporary Transmodern Dance Video Applying Postmodern Concepts and Techniques

Authors: Svebor Secak

Abstract:

This four-year artistic research project hosted by the University of New England, Australia has set the goal to experiment with non-conventional ways of presenting a language-based narrative in dance using insights of recent theoretical writing on performance, addressing the research question: How to transform an archival recording of a neoclassical ballet performance into a new artistic dance video by implementing postmodern philosophical concepts? The Creative Practice component takes the form of a dance video Hamlet Revisited which is a reworking of the archival recording of the neoclassical ballet Hamlet, augmented by new material, produced using resources, technicians and dancers of the Croatian National Theatre in Zagreb. The methodology for the creation of Hamlet Revisited consisted of extensive field and desk research after which three dancers were shown the recording of original Hamlet and then created their artistic response to it based on their reception and appreciation of it. The dancers responded differently, based upon their diverse dancing backgrounds and life experiences. They began in the role of the audience observing video of the original ballet and transformed into the role of the choreographer-performer. Their newly recorded material was edited and juxtaposed with the archival recording of Hamlet and other relevant footage, allowing for postmodern features such as aleatoric content, synchronicity, eclecticism and serendipity, that way establishing communication on a receptive reader-response basis, thus blending the roles of the choreographer, performer and spectator, creating an original work of art whose significance lies in the relationship and communication between styles, old and new choreographic approaches, artists and audiences and the transformation of their traditional roles and relationships. In editing and collating, the following techniques were used with the intention to avoid the singular narrative: fragmentation, repetition, reverse-motion, multiplication of images, split screen, overlaying X-rays, image scratching, slow-motion, freeze-frame and simultaneity. Key postmodern concepts considered were: deconstruction, diffuse authorship, supplementation, simulacrum, self-reflexivity, questioning the role of the author, intertextuality and incredulity toward grand narratives - departing from the original story, thus personalising its ontological themes. From a broad brush of diverse concepts and techniques applied in an almost prescriptive manner, the project focuses on intertextuality that proves to be valid on at least two levels. The first is the possibility of a more objective analysis in combination with a semiotic structuralist approach moving from strict relationships between signs to a multiplication of signifiers, considering the dance text as an open construction, containing the elusive and enigmatic quality of art that leaves the interpretive position open. The second one is the creation of the new work where the author functions as the editor, aware and conscious of the interplay of disparate texts and their sources which co-act in the mind during the creative process. It is argued here that the eclectic combination of the old and new material through constant oscillations of different discourses upon the same topic resulted in a transmodern integrationist recent work of art that might be applied as a model for reconsidering existing choreographic creations.

Keywords: Ballet Hamlet, intertextuality, transformation, transmodern dance video

Procedia PDF Downloads 221
32 Simulation of Multistage Extraction Process of Co-Ni Separation Using Ionic Liquids

Authors: Hongyan Chen, Megan Jobson, Andrew J. Masters, Maria Gonzalez-Miquel, Simon Halstead, Mayri Diaz de Rienzo

Abstract:

Ionic liquids offer excellent advantages over conventional solvents for the industrial extraction of metals from aqueous solutions, where such extraction processes bring opportunities for recovery, reuse, and recycling of valuable resources and more sustainable production pathways. Recent research on the use of ionic liquids for extraction confirms their high selectivity and low volatility, but there is relatively little focus on how their properties can be best exploited in practice. This work addresses gaps in research on process modelling and simulation, to support the development, design, and optimisation of these processes, focusing on the separation of the highly similar transition metals cobalt and nickel. The study exploits published experimental results, as well as new experimental results, relating to the separation of Co and Ni using trihexyl(tetradecyl)phosphonium chloride. This extraction agent is attractive because it is cheaper, more stable, and less toxic than fluorinated hydrophobic ionic liquids. The process modelling work concerns the selection and/or development of suitable models for the physical properties, distribution coefficients, and mass transfer phenomena, as well as for the extractor unit and the multi-stage extraction flowsheet. The distribution coefficient model for cobalt and HCl represents an anion exchange mechanism, supported by the literature and COSMO-RS calculations. Parameters of the distribution coefficient models are estimated by fitting the model to published experimental extraction equilibrium results. The mass transfer model applies Newman’s hard sphere model. Diffusion coefficients in the aqueous phase are obtained from the literature, while diffusion coefficients in the ionic liquid phase are fitted to dynamic experimental results. The mass transfer area is calculated from the surface-mean (Sauter) diameter of liquid droplets of the dispersed phase, estimated from the Weber number inside the extractor. New experiments measure the interfacial tension between the aqueous and ionic liquid phases. Empirical models for predicting the density and viscosity of solutions under different metal loadings are also fitted to new experimental data. The extractor is modelled as a continuous stirred tank reactor with mass transfer between the two phases and perfect phase separation of the outlet flows. A multistage separation flowsheet simulation is set up to replicate a published experiment and compare model predictions with the experimental results. This simulation model is implemented in gPROMS software for dynamic process simulation. The results of single-stage and multi-stage flowsheet simulations are shown to be in good agreement with the published experimental results. The estimated diffusion coefficient of cobalt in the ionic liquid phase is in reasonable agreement with published diffusion coefficients of various metals in this ionic liquid. A sensitivity study with this simulation model demonstrates the usefulness of the models for process design. The simulation approach has the potential to be extended to account for other metals, acids, and solvents for the development, design, and optimisation of extraction processes applying ionic liquids for metal separations, although a lack of experimental data currently limits the accuracy of models within the whole framework. Future work will focus on process development more generally and on the extractive separation of rare earths using ionic liquids.
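
As a simplified illustration of the multistage extraction calculation (not the authors' gPROMS model, which couples anion-exchange equilibria with Newman-type mass transfer in each stage), the sketch below applies the classical Kremser relation for an ideal counter-current cascade with a constant distribution coefficient. All numerical values are hypothetical assumptions.

    # Minimal sketch: counter-current multistage extraction at equilibrium with a
    # constant distribution coefficient D = y/x and solute-free entering solvent.

    def countercurrent_extraction(x_feed: float, D: float, S_over_F: float, n_stages: int) -> float:
        """Return the raffinate concentration leaving stage n for an ideal cascade."""
        E = D * S_over_F                          # extraction factor
        if abs(E - 1.0) < 1e-12:
            return x_feed / (n_stages + 1)
        # Kremser equation: fraction of solute remaining in the aqueous raffinate
        fraction_remaining = (E - 1.0) / (E ** (n_stages + 1) - 1.0)
        return x_feed * fraction_remaining

    # Hypothetical numbers: 5 g/L Co feed, D = 20, solvent-to-feed ratio 0.2, 3 stages
    x_out = countercurrent_extraction(x_feed=5.0, D=20.0, S_over_F=0.2, n_stages=3)
    print(f"raffinate Co: {x_out:.3f} g/L  (recovery {100 * (1 - x_out / 5.0):.1f}%)")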

Keywords: distribution coefficient, mass transfer, COSMO-RS, flowsheet simulation, phosphonium

Procedia PDF Downloads 152
31 Enabling Wire Arc Additive Manufacturing in Aircraft Landing Gear Production and Its Benefits

Authors: Jun Wang, Chenglei Diao, Emanuele Pagone, Jialuo Ding, Stewart Williams

Abstract:

As a crucial component in aircraft, landing gear systems are responsible for supporting the plane during parking, taxiing, takeoff, and landing. Given the need for high load-bearing capacity over extended periods, 300M ultra-high strength steel (UHSS) is often the material of choice for crafting these systems due to its exceptional strength, toughness, and fatigue resistance. In the quest for cost-effective and sustainable manufacturing solutions, Wire Arc Additive Manufacturing (WAAM) emerges as a promising alternative for fabricating 300M UHSS landing gears. This is due to its advantages in near-net-shape forming of large components, cost-efficiency, and reduced lead times. Cranfield University has conducted an extensive preliminary study on WAAM 300M UHSS, covering feature deposition, interface analysis, and post-heat treatment. Both Gas Metal Arc (GMA) and Plasma Transferred Arc (PTA)-based WAAM methods were explored, revealing their feasibility for defect-free manufacturing. However, as-deposited 300M features showed lower strength but higher ductility compared to their forged counterparts. Subsequent post-heat treatments were effective in normalising the microstructure and mechanical properties, meeting qualification standards. A 300M UHSS landing gear demonstrator was successfully created using PTA-based WAAM, showcasing the method's precision and cost-effectiveness. The demonstrator, measuring Ø200 mm x 700 mm, was completed in 16 hours, using 7 kg of material at a deposition rate of 1.3 kg/hr. This resulted in a significant reduction in the Buy-to-Fly (BTF) ratio compared to traditional manufacturing methods, further validating WAAM's potential for this application. A "cradle-to-gate" environmental impact assessment, which considers the cumulative effects from raw material extraction to customer shipment, has revealed promising outcomes. Utilising WAAM for landing gear components significantly reduces the need for raw material extraction and refinement compared to traditional subtractive methods. This, in turn, lessens the burden on subsequent manufacturing processes, including heat treatment, machining, and transportation. Our estimates indicate that the carbon footprint of the component could be halved when switching from traditional machining to WAAM. Similar reductions are observed in embodied energy consumption and other environmental impact indicators, such as emissions to air, water, and land. Additionally, WAAM offers the unique advantage of part repair by redepositing only the necessary material, a capability not available through conventional methods. Our research shows that WAAM-based repairs can drastically reduce environmental impact, even when accounting for additional transportation for repairs. Consequently, WAAM emerges as a pivotal technology for reducing environmental impact in manufacturing, aiding the industry in its crucial and ambitious journey towards Net Zero. This study paves the way for transformative benefits across the aerospace industry as WAAM is integrated into a hybrid manufacturing solution that offers substantial savings and access to more sustainable technologies for critical component production.
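
The arithmetic behind the quoted demonstrator figures can be sketched as follows. The final machined part mass and the forged billet mass are hypothetical values used only to illustrate how a Buy-to-Fly ratio would be computed; only the 7 kg deposited mass and 1.3 kg/hr deposition rate come from the abstract.

    # Minimal sketch of the Buy-to-Fly (BTF) arithmetic, with assumed part/billet masses.
    deposited_mass_kg = 7.0          # wire consumed for the WAAM preform (from the abstract)
    deposition_rate_kg_h = 1.3       # PTA-WAAM deposition rate (from the abstract)
    final_part_mass_kg = 4.0         # hypothetical machined part mass (assumption)
    forged_billet_mass_kg = 45.0     # hypothetical billet mass for the conventional route (assumption)

    arc_on_time_h = deposited_mass_kg / deposition_rate_kg_h   # ~5.4 h of pure deposition;
    # the 16 h quoted in the abstract is the total build time (assumption: includes interpass time)
    btf_waam = deposited_mass_kg / final_part_mass_kg
    btf_forged = forged_billet_mass_kg / final_part_mass_kg

    print(f"arc-on deposition time: {arc_on_time_h:.1f} h")
    print(f"BTF (WAAM):   {btf_waam:.1f}")
    print(f"BTF (forged): {btf_forged:.1f}")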

Keywords: WAAM, aircraft landing gear, microstructure, mechanical performance, life cycle assessment

Procedia PDF Downloads 120
30 Evaluation of Antimicrobial Properties of Lactic Acid Bacteria of Enterococcus Genus

Authors: Kristina Karapetyan, Flora Tkhruni, Tsovinar Balabekyan, Arevik Israyelyan, Tatyana Khachatryan

Abstract:

The ability of lactic acid bacteria (LAB) to prevent and cure a variety of diseases, and their protective role against infections and colonization by pathogenic microorganisms in the digestive tract, has led to the coining of the term probiotics, or "pro-life". LAB inhibit the growth of pathogenic and food-spoilage microorganisms, maintaining the nutritive quality and improving the shelf life of foods. They have also been used as flavor and texture producers. Enterococcus strains have been used for the treatment of conditions such as diarrhea, including antibiotic-associated diarrhea, inflammatory pathologies that affect the colon such as irritable bowel syndrome, and for immune regulation. The preparation and investigation of the biological properties of proteinaceous antibiotics derived from probiotic LAB have shown that bacteriocins, metabiotics, and peptides of LAB are bactericides with a broad range of activity and are excellent candidates for the development of new prophylactic and therapeutic substances to complement or replace conventional antibiotics. Genotyping of the LAB was performed by 16S rRNA sequencing. Cell-free culture (CFC) broth was purified by gel filtration on Sephadex G-25 Superfine resin. Antimicrobial activity was determined by the spot-on-lawn method and expressed in arbitrary units (AU/ml). The diversity of multidrug resistance (MDR) of pathogenic strains to the antibiotics most widely used for the treatment of human diseases in the Republics of Armenia and Nagorno-Karabakh was examined. It was shown that the resistance of pathogens to antibiotics differs depending on their isolation sources. The influence of partially purified antimicrobial preparations (AMP) obtained from different strains of the Enterococcus genus on the growth of MDR pathogenic bacteria was investigated. It was shown that bacteriocin-containing partially purified preparations obtained from different strains of Enterococcus faecium and Enterococcus durans possess bactericidal or bacteriostatic activity against antibiotic-resistant intestinal, spoilage, and food-borne pathogens such as Listeria monocytogenes, Staphylococcus aureus, E. coli, and Salmonella. Endemic strains of LAB, isolated from matsoni made from donkey, buffalo, and goat milk, showed a broad spectrum of activity against food-spoiling microorganisms, moulds, and fungi, such as Salmonella sp., Escherichia coli, and Aspergillus and Penicillium species. The highest activity against MDR pathogens was shown by bacteria isolated from goat milk products. High stability to antibiotics was shown by the investigated Enterococcus strains isolated from samples of matsun from different regions of Nagorno-Karabakh (NKR). The high genetic diversity within the Enterococcus group suggests adaptation through specific mutations in different environments. Thus, endemic strains of LAB are able to produce bacteriocins with high and varied inhibitory activity against a broad spectrum of microorganisms isolated from different sources and belonging to different taxonomic groups. The prospect of using certain antimicrobial preparations against pathogenic strains is evident. These AMPs can be applied long-term against antibiotic-resistant pathogens of different etiology for the prevention or treatment of infectious diseases as an alternative to antibiotics.

Keywords: antimicrobial biopreparation, endemic lactic acid bacteria, intra-species diversity, multidrug resistance of pathogens

Procedia PDF Downloads 286
29 Discovering Causal Structure from Observations: The Relationships between Technophile Attitude, Users Value and Use Intention of Mobility Management Travel App

Authors: Aliasghar Mehdizadeh Dastjerdi, Francisco Camara Pereira

Abstract:

The increasing complexity and demand of transport services strain transportation systems, especially in urban areas with limited possibilities for building new infrastructure. The solution to this challenge requires changes in travel behavior. One of the proposed means to induce such change is the multimodal travel app. This paper describes a study of the intention to use a real-time multimodal travel app aimed at motivating travel behavior change in the Greater Copenhagen Region (Denmark) toward promoting sustainable transport options. The proposed app is a multi-faceted smartphone app including both travel information and persuasive strategies such as health and environmental feedback, tailored travel options, self-monitoring, tunneling users toward green behavior, social networking, nudging, and gamification elements. The prospect for mobility management travel apps to stimulate sustainable mobility rests not only on the original and proper employment of the behavior change strategies, but also on explicitly anchoring them in established constructs from behavioral theories. The theoretical foundation is important because it positively and significantly influences the effectiveness of the system. However, there is a gap in current knowledge regarding the study of mobility management travel apps grounded in behavioral theories, which should be explored further. This study addresses this gap by a social cognitive theory-based examination. However, compared to conventional methods in technology adoption research, this study adopts a reverse approach in which the associations between theoretical constructs are explored by the Max-Min Hill-Climbing (MMHC) algorithm, a hybrid causal discovery method. A technology-use preference survey was designed to collect data. The survey elicited different groups of variables, including (1) three groups of user motives for using the app, namely gain motives (e.g., saving travel time and cost), hedonic motives (e.g., enjoyment), and normative motives (e.g., less travel-related CO2 production), (2) technology-related self-concepts (i.e., technophile attitude), and (3) use intention of the travel app. The questionnaire items provided the observational data from which the causal structure was learned. Discovering causal relationships from observational data is a critical challenge with applications in different research fields. The estimated causal structure shows that the two constructs of gain motives and technophilia have a causal effect on adoption intention. Likewise, there is a causal relationship from technophilia to both gain and hedonic motives. In line with the findings of prior studies, this highlights the importance of the functional value of the travel app as well as technology self-concept as two important variables for adoption intention. Furthermore, the results indicate the effect of technophile attitude on developing gain and hedonic motives. The causal structure shows hierarchical associations between the three groups of user motives. These can be explained by the "frustration-regression" principle of Alderfer's ERG (Existence, Relatedness and Growth) theory of needs, meaning that when a higher-level need remains unfulfilled, a person may regress to lower-level needs that appear easier to satisfy. To conclude, this study shows the capability of causal discovery methods to learn the causal structure of a theoretical model and, accordingly, to interpret the established associations.
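
A minimal sketch of MMHC-style structure learning of the kind described above is given below, run on synthetic Likert-type data rather than the study's survey. The variable names, the data-generating process, and the use of pgmpy's MmhcEstimator interface are assumptions for illustration only.

    import numpy as np
    import pandas as pd
    from pgmpy.estimators import MmhcEstimator  # assumed pgmpy interface

    # Minimal sketch (not the study's dataset): learning a causal structure over
    # survey-style constructs with the Max-Min Hill-Climbing (MMHC) hybrid algorithm.
    rng = np.random.default_rng(1)
    n = 500
    technophilia = rng.integers(1, 6, n)                                  # 1-5 Likert score
    gain = np.clip(technophilia + rng.integers(-1, 2, n), 1, 5)           # driven by technophilia
    hedonic = np.clip(technophilia + rng.integers(-1, 2, n), 1, 5)
    intention = np.clip((gain + technophilia) // 2 + rng.integers(-1, 2, n), 1, 5)

    data = pd.DataFrame({
        "technophilia": technophilia,
        "gain_motive": gain,
        "hedonic_motive": hedonic,
        "use_intention": intention,
    })

    # MMHC: a constraint-based phase (Max-Min Parents-and-Children) restricts the skeleton,
    # then a score-based hill climb orients edges within that skeleton.
    dag = MmhcEstimator(data).estimate()
    print(sorted(dag.edges()))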

Keywords: travel app, behavior change, persuasive technology, travel information, causality

Procedia PDF Downloads 111
28 Creating a Critical Digital Pedagogy Context: Challenges and Potential of Designing and Implementing a Blended Learning Intervention for Adult Refugees in Greece

Authors: Roula Kitsiou, Sofia Tsioli, Eleni Gana

Abstract:

The current sociopolitical realities (displacement, encampment, and resettlement) that refugees experience in Greece constitute a quite complex issue. Their educational and social ‘integration’ is characterized by transition, insecurity, and constantly changing needs. Based on current research data, technology, and more specifically mobile phones, is one of the most important resources for refugees, regardless of their levels of conventional literacy. The proposed paper discusses the challenges encountered during the design and implementation of the educational Action 16, ‘Language Education for Adult Refugees’. Action 16 is one of the 24 Actions of the Project PRESS (Provision of Refugee Education and Support Scheme), funded by the Hellenic Open University (2016-2017). Project PRESS had two main objectives: a) to address the educational and integration needs of refugees in transit who currently reside in Greece, and b) to implement research-based educational interventions in online and offline sites. In the present paper, the focus is on reflection and discussion about the challenges and the potential of integrating technology in language learning for a target group with many specific needs, which have been recorded in field notes among other research tools (ethnographic data) used in the context of PRESS. Action 16 explores if and how technology-enhanced language activities in real time and place, mediated through teachers, as well as an autonomous computer-mediated learning space (Moodle platform and application), build on and expand the linguistic, cultural, and digital resources and repertoires of the students by creating collaborative face-to-face and digital learning spaces. A broader view of language as a dynamic puzzle of semiotic resources and processes, based on the concept of translanguaging, is adopted. Specifically, in designing the blended learning environment, we draw on the construct of translanguaging a) as a symbolic means to valorize students’ repertoires and practices, b) as a method to reach specific applications of the target language that the context brings forward (Greek that is useful to them), and c) as a means to expand refugees’ repertoires. This has led to the creation of a learning space where students' linguistic and cultural resources can find paths to expression. In this context, communication and learning are realized by mutually investing multiple aspects of the team members' identities as educational material designers, teachers, and students in the teaching and learning processes. Therefore, creativity, humour, code-switching, translation, transference, etc. are all possible means that can be employed in order to promote multilingual communication and language learning, towards raising intercultural awareness in a critical digital pedagogy context. The qualitative analysis includes critical reflection on the developed educational material, team-based reflexive discussions, teachers’ report data, and photographs from the interventions. The endeavor to involve women and men with a refugee background in a blended learning experience was quite innovative, especially for the Greek context. It reflects a pragmatist ethos in the choices made in order to respond to the here-and-now needs of the refugees, and it was a very challenging task that led all actors involved in Action 16 to (re)negotiate subjectivities and products in a creative and hopeful way.

Keywords: blended learning, integration, language education, refugees

Procedia PDF Downloads 103
27 Continuity Through Best Practice: A Case Series of Complex Wounds Managed by a Dedicated Orthopedic Nursing Team

Authors: Siti Rahayu, Khairulniza Mohd Puat, Kesavan R., Mohammad Harris A., Jalila, Kunalan G., Fazir Mohamad

Abstract:

The greatest challenge has been in establishing and maintaining the dedicated nursing team. Continuity is served when nurses are assigned exclusively to managing wounds, where they can continue to build expertise and skills. In addition, there is a growing incidence of chronic wounds and recognition of the complexity involved in caring for these patients. We would like to share four cases with different techniques of wound management. In the first case, a 39-year-old gentleman with underlying rheumatoid arthritis and a chronic periprosthetic joint infection of a right total knee replacement presented with persistent drainage over the right knee. The patient was counselled for a two-stage revision total knee replacement; however, he agreed only to debridement and retention of the implant. After debridement, the large medial and lateral wounds were treated with instillation negative pressure wound therapy dressings. After several cycles, the wound size reduced, and conventional dressings were applied. In the second case, a 58-year-old gentleman with underlying diabetes presented with right foot necrotizing fasciitis and gangrene of the 5th toe. He underwent extensive debridement of the foot with ray amputation of the 5th toe. Post debridement, the patient was started on instillation negative pressure wound therapy dressings. After several cycles of VAC, the wound bed was prepared, and he underwent split skin grafting over the right foot. In the third case, a 60-year-old gentleman with underlying diabetes mellitus presented with a right foot necrotizing soft tissue infection. He underwent ray amputation and extensive wound debridement. Upon stabilization of his general condition, the patient was discharged, with regular wound dressings performed by the same nurse and doctor at each clinic follow-up visit. After 6 months of follow-up, the wound had healed well. In the fourth case, a 38-year-old gentleman was involved in a motor vehicle accident and sustained a closed fracture of the right tibial plateau. Open reduction and fixation with a proximal tibial locking plate were performed. At 2 weeks post-surgery, the patient presented with a warm, erythematous leg and pus discharge from the surgical site. Empirical antibiotics were started, and wound debridement was done. Intraoperatively, 50 cc of pus was evacuated, and unhealthy muscle and tissue were debrided; there was no loosening of the implant. The patient underwent multiple wound debridements. At 2 weeks post debridement, the wound was healing well, but the proximal aspect could not be closed immediately, leaving the proximal part of the implant exposed. The patient was then put on VAC dressing for 3 weeks until healthy granulation tissue covered the implant. Meanwhile, antibiotics were changed according to culture and sensitivity. At 6 weeks after the first debridement, the wound was completely closed, and the patient was discharged home well. At 3 months postoperatively, the patient's wound and fracture had healed uneventfully, and he was able to ambulate independently. Complex wounds are too serious to be dealt with in an ad hoc manner. Teams managing complex wounds need continuous support through the provision of educational tools for their professional development, engagement with local and international experts, as well as high-quality products that increase efficiencies in services.

Keywords: VAC (vacuum-assisted closure), empirical (initial) antibiotics, NPWT (negative pressure wound therapy), NF (necrotizing fasciitis), gangrene (blackish discoloration due to poor blood supply)

Procedia PDF Downloads 80
26 A Comparative Evaluation of Cognitive Load Management: Case Study of Postgraduate Business Students

Authors: Kavita Goel, Donald Winchester

Abstract:

In a world of information overload and work complexities, academics often struggle to create an online instructional environment enabling efficient and effective student learning. Research has established that students’ learning styles are different; some learn faster when taught using audio and visual methods. Attributes like prior knowledge and mental effort affect their learning. Cognitive load theory posits that learners have limited processing capacity. Cognitive load depends on the learner’s prior knowledge, the complexity of content and tasks, and the instructional environment. Hence, the proper allocation of cognitive resources is critical for students’ learning. Consequently, a lecturer needs to understand the limits and strengths of human learning processes and the various learning styles of students, and accommodate these requirements when designing online assessments. As acknowledged in the cognitive load theory literature, visual and auditory explanations of worked examples potentially lead to a reduction of cognitive load (effort) and increased facilitation of learning when compared to conventional sequential text problem solving. This helps learners utilize both subcomponents of their working memory. Instructional design changes were introduced at the case site for the delivery of the postgraduate business subjects. To make effective use of auditory and visual modalities, video-recorded lectures and key-concept webinars were delivered to students. Videos were prepared to free students' limited working memory from irrelevant mental effort, as all elements on a visual screen can be viewed simultaneously and processed quickly, facilitating greater processing efficiency. Most case study students in the postgraduate programs are adults, working full-time at higher management levels and studying part-time. Their learning styles and needs are different from those of other tertiary students. The purpose of the audio and visual interventions was to lower the students' cognitive load and provide an online environment supportive of their efficient learning. These changes were expected to impact the students' learning experience, academic performance, and retention favourably. This paper posits that these changes to instructional design help students integrate new knowledge into their long-term memory. A mixed-methods case study methodology was used in this investigation. Primary data were collected from interviews and surveys of students and academics. Secondary data were collected from the organisation’s databases and reports. Some evidence was found that the academic performance of students does improve when the new instructional design changes are introduced, although the effect was not statistically significant. However, the overall grade distribution shifted higher, suggesting a deeper understanding of the content. Feedback received from students identified that recorded webinars served as better learning aids than material with text alone, especially for more complex content. The recorded webinars on the subject content and assessments provide students with the flexibility to access this material from repositories at any time and as often as needed, which suits their learning styles. Visual and audio information enters students' working memory more effectively. Also, as each assessment included the application of the concepts, conceptual knowledge interacted with pre-existing schemas in long-term memory and lowered students' cognitive load.

Keywords: cognitive load theory, learning style, instructional environment, working memory

Procedia PDF Downloads 117
25 Utilization of Functionalized Biochar from Water Hyacinth (Eichhornia crassipes) as Green Nano-Fertilizers

Authors: Adewale Tolulope Irewale, Elias Emeka Elemike, Christian O. Dimkpa, Emeka Emmanuel Oguzie

Abstract:

As the global population steadily approaches the 10 billion mark, the world is currently faced with two major challenges among others: accessing sustainable and clean energy, and ensuring food security. Accessing cleaner and sustainable energy sources to drive the global economy and technological advancement, and feeding the teeming human population, require sustainable, innovative, and smart solutions. To solve the food production problem, producers have relied on fertilizers as a way of improving crop productivity. Commercial inorganic fertilizers, which are employed to boost agricultural food production, however, pose significant ecological and economic problems, including soil and water pollution, reduced input efficiency, the development of highly resistant weeds, micronutrient deficiency, soil degradation, and increased soil toxicity. These ecological and sustainability concerns have raised uncertainties about the continued effectiveness of conventional fertilizers. With the application of nanotechnology, plant biomass upcycling offers several advantages in greener energy production and sustainable agriculture through reduced environmental pollution, increased soil microbial activity, and carbon recycling that reduces GHG emissions, among others. This innovative technology has the potential to support a circular economy and create sustainable agricultural practice. Nanomaterials have the potential to greatly enhance the quality and nutrient composition of organic biomass, which in turn allows for the conversion of biomass into nanofertilizers that are potentially more efficient. Water hyacinth plants harvested from an inland water body at Warri, Delta State, Nigeria, were air-dried and milled into powder. The dry biomass was used to prepare biochar at a pre-determined temperature in an oxygen-deficient atmosphere. Physicochemical analysis of the resulting biochar was carried out to determine its porosity and general morphology using scanning transmission electron microscopy (STEM). The functional groups (-COOH, -OH, -NH2, -CN, -C=O) were assessed using Fourier transform infrared spectroscopy (FTIR), while metals (Cr, Cu, Fe, Pb, Mg, Mn) were analyzed using inductively coupled plasma-optical emission spectrometry (ICP-OES). Impregnation of the biochar with nanonutrients was carried out under varied conditions of pH, temperature, nanonutrient concentration, and residence time to achieve optimum adsorption. Adsorption and desorption studies were carried out on the resulting nanofertilizer to determine the kinetics governing potential nutrient bio-availability to plants when used as a green fertilizer. Water hyacinth (Eichhornia crassipes), an aggressively invasive aquatic plant known for its rapid growth and profusion, is examined in this research as a sustainable feedstock for formulating functionalized nano-biochar fertilizers, offering various benefits including biomass upcycling, improved nutrient delivery to crops, and aquatic ecosystem remediation. Altogether, this work aims to create output values in the three dimensions of environmental, economic, and social benefits.
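
A minimal sketch of the kind of isotherm fitting commonly used to interpret adsorption/desorption experiments is shown below. The data points are hypothetical and the Langmuir model is an assumed functional form; this is not the study's measured dataset.

    import numpy as np
    from scipy.optimize import curve_fit

    # Minimal sketch (hypothetical data): fitting a Langmuir isotherm to nutrient adsorption
    # on the biochar. q_max (mg/g) and K_L (L/mg) are the fitted capacity and affinity parameters.

    def langmuir(Ce, q_max, K_L):
        """Equilibrium uptake q_e as a function of equilibrium concentration Ce."""
        return q_max * K_L * Ce / (1.0 + K_L * Ce)

    Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])        # mg/L, illustrative
    qe = np.array([8.5, 17.0, 26.0, 34.0, 40.0, 43.5])       # mg/g, illustrative

    (q_max, K_L), _ = curve_fit(langmuir, Ce, qe, p0=[50.0, 0.05])
    print(f"q_max = {q_max:.1f} mg/g, K_L = {K_L:.3f} L/mg")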

Keywords: biochar-based nanofertilizers, eichhornia crassipes, greener agriculture, sustainable ecosystem, water hyacinth

Procedia PDF Downloads 22
24 Tangible Losses, Intangible Traumas: Re-envisioning Recovery Following the Lytton Creek Fire 2021 through Place Attachment Lens

Authors: Tugba Altin

Abstract:

In an era marked by pronounced climate change consequences, communities are observed to confront traumatic events that yield both tangible and intangible repercussions. Such events not only cause discernible damage to the landscape but also deeply affect intangible aspects, including emotional distress and disruptions to cultural landscapes. The Lytton Creek Fire of 2021 serves as a case in point. Beyond the visible destruction, the less overt but profoundly impactful disturbance to place attachment (PA) is scrutinized. PA, representing the emotional and cognitive bonds individuals establish with their environments, is crucial for understanding how such events impact cultural identity and connection to the land. The study underscores the significance of addressing both tangible and intangible traumas for holistic community recovery. As communities renegotiate their affiliations with altered environments, the cultural landscape emerges as instrumental in shaping place-based identities. This renewed understanding is pivotal for reshaping adaptation planning. The research advocates for adaptation strategies rooted in the lived experiences and testimonies of the affected populations. By incorporating both the tangible and intangible facets of trauma, planning efforts are suggested to be more culturally attuned and emotionally insightful, fostering true resonance with the affected communities. Through such a comprehensive lens, this study contributes to enriching the climate change discourse, emphasizing the intertwined nature of tangible recovery and the imperative of emotional and cultural healing after environmental disasters. Following the pronounced aftermath of the Lytton Creek Fire in 2021, this research aims to understand in depth its impact on place attachment (PA), encompassing the emotional and cognitive bonds individuals form with their environments. The interpretive phenomenological approach, enriched by a hermeneutic framework, is adopted, emphasizing the experiences of the Lytton community and co-researchers. Phenomenology informed the understanding of 'place' as the focal point of attachment, providing insights into its formation and evolution after traumatic events. Data collection departs from conventional methods. Instead of traditional interviews, walking audio sessions and photo elicitation methods are utilized. These allow co-researchers to immerse themselves in the environment, re-experience, and articulate memories and feelings in real time. Walking audio facilitated reflections on spatial narratives post-trauma, while photo voices captured intangible emotions, enabling the visualization of place-based experiences. The analysis is collaborative, ensuring the co-researchers' experiences and interpretations are central. Emphasizing their agency in knowledge production, the process is rigorous, facilitated by the harmonious blend of interpretive phenomenology and hermeneutic insights. The findings underscore the need for adaptation and recovery efforts to address emotional traumas alongside tangible damages. By exploring PA post-disaster, the research not only fills a significant gap but advocates for an inclusive approach to community recovery. Furthermore, the participatory methodologies employed challenge traditional research paradigms, heralding potential shifts in qualitative research norms.

Keywords: wildfire recovery, place attachment, trauma recovery, cultural landscape, visual methodologies

Procedia PDF Downloads 37
23 Identification of a Panel of Epigenetic Biomarkers for Early Detection of Hepatocellular Carcinoma in Blood of Individuals with Liver Cirrhosis

Authors: Katarzyna Lubecka, Kirsty Flower, Megan Beetch, Lucinda Kurzava, Hannah Buvala, Samer Gawrieh, Suthat Liangpunsakul, Tracy Gonzalez, George McCabe, Naga Chalasani, James M. Flanagan, Barbara Stefanska

Abstract:

Hepatocellular carcinoma (HCC), the most prevalent type of primary liver cancer, is the second leading cause of cancer death worldwide. Late onset of clinical symptoms in HCC results in late diagnosis and poor disease outcome. Approximately 85% of individuals with HCC have underlying liver cirrhosis. However, not all cirrhotic patients develop cancer. Reliable early detection biomarkers that can distinguish cirrhotic patients who will develop cancer from those who will not are urgently needed and could increase the cure rate from 5% to 80%. We used the Illumina-450K microarray to test whether blood DNA, an easily accessible source of DNA, bears site-specific changes in DNA methylation in response to HCC before diagnosis with conventional tools (pre-diagnostic). The top 11 differentially methylated sites were selected for validation by pyrosequencing. The diagnostic potential of the 11 pyrosequenced probes was tested in blood samples from a prospective cohort of cirrhotic patients. We identified 971 differentially methylated CpG sites in pre-diagnostic HCC cases as compared with healthy controls (P < 0.05, paired Wilcoxon test, ICC ≥ 0.5). Nearly 76% of differentially methylated CpG sites showed lower levels of methylation in cases vs. controls (P = 2.973E-11, Wilcoxon test). Classification of the CpG sites according to their location relative to CpG islands and transcription start sites revealed that the hypomethylated loci are located in regulatory regions important for gene transcription, such as CpG island shores, promoters, and 5'UTRs, at a higher frequency than the hypermethylated sites. Among the 735 CpG sites hypomethylated in cases vs. controls, 482 sites were assigned to gene coding regions, whereas the 236 hypermethylated sites corresponded to 160 genes. Bioinformatics analysis using the GO, KEGG, and DAVID knowledgebases indicates that the differentially methylated CpG sites are located in genes associated with functions essential for gene transcription, cell adhesion, cell migration, and regulation of signal transduction pathways. Taking into account the magnitude of the difference, statistical significance, location, and consistency across the majority of matched case-control pairs, we selected 11 CpG loci corresponding to 10 genes for further validation by pyrosequencing. We established that methylation of CpG sites within 5 out of those 10 genes distinguishes cirrhotic patients who subsequently developed HCC from those who stayed cancer free (cirrhotic controls), demonstrating potential as biomarkers of early detection in populations at risk. The best predictive value was detected for CpGs located within BARD1 (AUC = 0.70, asymptotic significance < 0.01). Using an additive logistic regression model, we further showed that 9 CpG loci within those 5 genes that were covered by the pyrosequenced probes constitute a panel with high diagnostic accuracy (AUC = 0.887; 95% CI: 0.80-0.98). The panel was able to distinguish pre-diagnostic cases from cirrhotic controls free of cancer with 88% sensitivity at 70% specificity. Using blood as a minimally invasive material and pyrosequencing as a straightforward quantitative method, the established biomarker panel has high potential to be developed into a routine clinical test after validation in larger cohorts. This study was supported by the Showalter Trust, the American Cancer Society (IRG#14-190-56), and the Purdue Center for Cancer Research (P30 CA023168) granted to BS.
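
A minimal sketch of an additive logistic regression panel evaluation of the type reported above is shown below. The methylation values are synthetic, the cohort size and thresholds are assumptions, and the in-sample AUC printed is for illustration only; the study's AUC of 0.887 comes from its own cohort.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, roc_curve

    # Minimal sketch (synthetic data, not the study cohort): an additive logistic regression
    # over methylation values at a 9-CpG panel, evaluated by AUC and by sensitivity at a
    # fixed specificity.
    rng = np.random.default_rng(7)
    n_cases, n_controls, n_cpg = 60, 60, 9
    # cases simulated as modestly hypomethylated at the panel CpGs (assumption)
    X = np.vstack([rng.normal(0.45, 0.1, (n_cases, n_cpg)),
                   rng.normal(0.55, 0.1, (n_controls, n_cpg))]).clip(0, 1)
    y = np.array([1] * n_cases + [0] * n_controls)          # 1 = pre-diagnostic HCC case

    model = LogisticRegression(max_iter=1000).fit(X, y)
    scores = model.predict_proba(X)[:, 1]

    auc = roc_auc_score(y, scores)
    fpr, tpr, _ = roc_curve(y, scores)
    sens_at_70_spec = tpr[fpr <= 0.30].max()                 # specificity 70% -> FPR 30%
    print(f"AUC = {auc:.2f}, sensitivity at 70% specificity = {sens_at_70_spec:.2f}")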

Keywords: biomarker, DNA methylation, early detection, hepatocellular carcinoma

Procedia PDF Downloads 266