Search results for: acceleration of plantation development
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16677

27 A Low-Cost Disposable PDMS Microfluidic Cartridge with Reagent Storage Silicone Blisters for Isothermal DNA Amplification

Authors: L. Ereku, R. E. Mackay, A. Naveenathayalan, K. Ajayi, W. Balachandran

Abstract:

Over the past decade, the increase of sexually transmitted infections (STIs), especially in the developing world, due to high cost and a lack of sufficient medical testing, has given rise to the need for a rapid, low-cost point-of-care medical diagnostic that is disposable and, most significantly, reproduces results equivalent to those achieved within centralised laboratories. This paper presents the development of a disposable PDMS microfluidic cartridge incorporating blisters filled with the reagents required for isothermal DNA amplification in clinical diagnostics and point-of-care testing. To circumvent the need for complex external microfluidic pumps, on-chip pressurised fluid reservoirs are adopted, using finger actuation and blister storage. The fabrication of the blisters takes three factors into consideration: material characteristics, fluid volume and structural design. Silicone rubber is the chosen material due to its good chemical stability, considerable tear resistance and moderate tension/compression strength. Fluid capacity and structural form go hand in hand, as the reagent volume needed for the experimental analysis determines the size of the blisters, whereas the structure has to be designed to offer low compression stress when deformed for fluid expulsion. Furthermore, the top and bottom sections of the blisters are embedded with miniature polar-opposite magnets at a defined parallel distance. These magnets lock or restrain the blisters when fully compressed, preventing unwanted backflow caused by the material's elasticity. The integrated chip is bonded onto a large microscope glass slide (50 mm × 75 mm). Each part is manufactured using a 3D-printed mould designed in SolidWorks. Die-casting is employed, using the 3D-printed moulds, to form the deformable blisters by forcing a proprietary liquid silicone rubber through the positive mould cavity.
The set silicone rubber is removed from the cast, prefilled with liquid reagent, and then sealed with a thin (0.3 mm) burstable layer of recast silicone rubber. The main microfluidic cartridge is fabricated using classical soft lithographic techniques. The cartridge incorporates microchannel circuitry, a mixing chamber, an inlet port, an outlet port, a reaction chamber and a waste chamber. Polydimethylsiloxane (PDMS, QSil 216) is mixed at a 10:1 ratio, degassed using a centrifuge, and then poured after the prefilled blisters are correctly positioned on the negative mould. Heat treatment at about 50 °C to 60 °C in an oven for about 3 hours is needed to achieve curing. The final chip production stage involves bonding the cured PDMS to the glass slide. A corona treater (BD-20AC, Electro-Technic Products Inc., US) is used to activate the PDMS and glass slide before they are joined, adequately compressed together, and left in the oven overnight to ensure bonding. Two blisters in total are needed for experimentation: the first is used as a wash buffer to remove any remaining cell debris and unbound DNA, while the second contains 100 µL of amplification reagents. This paper presents results of chemical cell lysis, extraction using a biopolymer paper membrane, and isothermal amplification on a low-cost platform using the finger-actuated blisters for reagent storage. The platform has been shown to detect 1×10⁵ copies of Chlamydia trachomatis using Recombinase Polymerase Amplification (RPA).

Keywords: finger actuation, point of care, reagent storage, silicone blisters

Procedia PDF Downloads 367
26 Turn Organic Waste to Green Fuels with Zero Landfill

Authors: Xu Fei (Philip) WU

Abstract:

As the concept of waste recycling becomes more widely accepted in modern societies, the organic portion of municipal waste has become a serious issue. Depending on location and season, organic waste can be anywhere between 40-65% of total municipal solid waste. Composting and anaerobic digestion technologies have been applied in this field for years; however, both processes face economic and environmental obstacles. Besides environmental pollution and the risk of pathogen spread, compost is not a widely welcomed product, even when waste managers give it away at no cost. An anaerobic digester must operate at 70% water content and at 35 °C or above; even under these conditions, the retention time can be up to two weeks, the remaining solids must be dewatered and composted again, and additional wastewater treatment is required afterwards. For these reasons, voices have been raised suggesting the cancellation of recycling programmes and the diversion of all waste to mass-burn incineration, a process already proven to have the lowest energy efficiency and the worst associated air pollution. A newly developed WXF Bio-energy process employs recently developed and patented pre-designed separation, multi-layer and multi-cavity successive bioreactor landfill technology. It features an improved leachate recycling technology, technologies to maximize the biogas generation rate, and a reduced overall turnaround period on the land. A single properly designed and operated site can be used indefinitely. In this process, all collected biogas is processed to eliminate H2S and other hazardous gases. The methane, carbon dioxide and hydrogen are utilized in a proprietary process to manufacture methanol, which can be sold to mitigate the operating costs of the landfill. This integration of new processes offers a more advanced alternative to current sanitary landfill, incineration and compost technology.
Xu Fei (Philip) Wu is founder and Chief Scientist of W&Y Environmental International Inc. (W&Y), a Canadian environmental and sustainable energy technology company with patented landfill processes and proprietary waste-to-energy technologies. He has worked in environmental and sustainable energy fields over the last 25 years. Before W&Y, he worked for Conestoga-Rovers & Associates Limited, Microbe Environmental Science and Technology Inc. of Canada, and the Ministry of Nuclear Industry and Ministry of Space Flight Industry of China. He holds a Master of Engineering Science degree from The University of Western Ontario. Selected conference presentations: • “Removal of Phenolic Compounds with Algae”, 25th Canadian Symposium on Water Pollution Research (CAWPRC Conference), Burlington, Ontario, Canada, February 1990 • “Removal of Phenolic Compounds with Algae”, Annual Conference of the Pollution Control Association of Ontario, London, Ontario, Canada, April 1990 • “Removal of Organochlorine Compounds in a Flocculated Algae Photo-Bioreactor”, International Symposium on Low Cost and Energy Saving Wastewater Treatment Technologies (IAWPRC Conference), Kyoto, Japan, August 1990 • “Maximizing Production and Utilization of Landfill Gas”, 2009 Wuhan International Conference on Environment (CAWPRC Conference, sponsored by US EPA), Wuhan, China, October 2009 • “WXF Bio-Energy: A Green, Sustainable Waste to Energy Process”, 9th International Conference on Cooperation for Waste Issues, Kharkiv, Ukraine, March 2012 • “A Landfill Site Can Be Recycled Indefinitely”, 28th International Conference on Solid Waste Technology and Management, Philadelphia, Pennsylvania, USA, March 2013, hosted by The Journal of Solid Waste Technology and Management.

Keywords: green fuel, waste management, bio-energy, sustainable development, methanol

Procedia PDF Downloads 276
25 From Linear to Circular Model: An Artificial Intelligence-Powered Approach in Fosso Imperatore

Authors: Carlotta D’Alessandro, Giuseppe Ioppolo, Katarzyna Szopik-Depczyńska

Abstract:

The growing scarcity of resources and the mounting pressures of climate change, water pollution, and chemical contamination have prompted societies, governments, and businesses to seek ways to minimize their environmental impact. To combat climate change and foster sustainability, Industrial Symbiosis (IS) offers a powerful approach, facilitating the shift toward a circular economic model. IS has gained prominence in the European Union's policy framework as a crucial enabler of resource efficiency and circular economy practices. The essence of IS lies in the collaborative sharing of resources such as energy, material by-products, waste, and water, made possible by geographic proximity. It is exemplified by eco-industrial parks (EIPs), which are natural environments for boosting cooperation and resource sharing between businesses. EIPs are characterized by a group of businesses situated in proximity, connected by a network of both cooperative and competitive interactions. They represent a sustainable industrial model aimed at reducing resource use, waste, and environmental impact while fostering economic and social wellbeing. IS, combined with Artificial Intelligence (AI)-driven technologies, can further optimize resource sharing and efficiency within EIPs. This research, supported by the “CE_IPs” project, aims to analyze the potential of IS and AI in advancing circularity and sustainability at Fosso Imperatore. The Fosso Imperatore Industrial Park in Nocera Inferiore, Italy, specializes in agriculture and the industrial transformation of agricultural products, particularly tomatoes, tobacco, and textile fibers. This unique industrial cluster, centered around tomato cultivation and processing, also includes mechanical engineering enterprises and agricultural packaging firms. To stimulate the shift from a traditional to a circular economic model, an AI-powered Local Development Plan (LDP) is developed for Fosso Imperatore.
The plan leverages data analytics, predictive modeling, and stakeholder engagement to optimize resource utilization, reduce waste, and promote sustainable industrial practices. A comprehensive SWOT analysis of the AI-powered LDP revealed several key factors influencing its potential success and challenges. Among the notable strengths and opportunities arising from AI implementation are reduced processing times, fewer human errors, and increased revenue generation. Furthermore, predictive analytics minimize downtime, bolster productivity, and elevate quality while mitigating workplace hazards. However, the integration of AI also presents potential weaknesses and threats, including significant financial investment, since implementing and maintaining AI systems can be costly. The widespread adoption of AI could lead to job losses in certain sectors. Lastly, AI systems are susceptible to cyberattacks, posing risks to data security and operational continuity. Moreover, an Analytic Hierarchy Process (AHP) analysis was employed to yield a prioritized ranking of the outlined AI-driven LDP practices based on stakeholder input, ensuring a more comprehensive and representative understanding of their relative significance for achieving sustainability in Fosso Imperatore Industrial Park. While this study provides valuable insights into the potential of an AI-powered LDP at Fosso Imperatore, it is important to note that the findings may not be directly applicable to all industrial parks, particularly those with different sizes, geographic locations, or industry compositions. Additional study is necessary to scrutinize the generalizability of these results and to identify best practices for implementing AI-driven LDPs in diverse contexts.
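
The AHP step described above can be sketched in a few lines. The following is our own illustration of the geometric-mean priority calculation; the pairwise comparison matrix is hypothetical, not the stakeholder data from the study.

```python
import math

def ahp_priorities(matrix):
    """Normalized priority vector for an AHP pairwise comparison matrix,
    using the geometric-mean approximation of the principal eigenvector."""
    n = len(matrix)
    # Geometric mean of each row, then normalize so the weights sum to 1.
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical judgements over three LDP practices (not the study's data):
# practice A moderately preferred to B (3), strongly preferred to C (5).
comparisons = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]
weights = ahp_priorities(comparisons)
```

Sorting practices by these weights yields the kind of prioritized ranking the abstract refers to; a full AHP would also check the consistency ratio of each matrix before accepting the judgements.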

Keywords: artificial intelligence, climate change, Fosso Imperatore, industrial park, industrial symbiosis

Procedia PDF Downloads 23
24 Structural Characteristics of HPDSP Concrete on Beam Column Joints

Authors: Hari Krishan Sharma, Sanjay Kumar Sharma, Sushil Kumar Swar

Abstract:

Inadequate transverse reinforcement is considered the main reason for the beam-column joint shear failures observed during recent earthquakes. The DSP matrix consists of cement and a high content of micro-silica with a low water-to-cement ratio, while the aggregates are graded quartz sand. The use of reinforcing fibres leads not only to an increase in tensile/bending strength and specific fracture energy, but also to a reduction in brittleness and, consequently, to non-explosive ruptures. Besides, fibre-reinforced materials are more homogeneous and less sensitive to small defects and flaws. Recent works on the freeze-thaw durability (also in the presence of de-icing salts) of fibre-reinforced DSP confirm its excellent behaviour over the expected long-term service life. DSP materials, including fibre-reinforced DSP and CRC (Compact Reinforced Composites), are obtained by using high quantities of superplasticizers and high volumes of micro-silica. Steel fibres with high tensile yield strength, of smaller diameter and short length, in different fibre volume percentages and aspect ratios, are utilized to improve performance by reducing the brittleness of the matrix material. In the case of High Performance Densified Small Particle Concrete (HPDSPC), the concrete is dense at the micro-structural level, and its tensile strain capacity is much higher than that of conventional SFRC, SIFCON and SIMCON. Beam-column sub-assemblages used as moment-resisting connections were constructed using HPDSPC in the joint region, with varying quantities of steel fibres, fibre aspect ratios and fibre orientations in the critical section. These sub-assemblages were tested under cyclic/earthquake loading.
Besides load measurements, frame displacements, diagonal joint strain and rebar strain adjacent to the joint were also measured to investigate the stress-strain behaviour, load-deformation characteristics, joint shear strength, failure mechanism, ductility-associated parameters, stiffness and energy-dissipation parameters of the beam-column sub-assemblages. Finally, a design procedure for the optimum design of HPDSPC beam-column joint sub-assemblages corresponding to moments, shear forces and axial forces is proposed. It is well recognized in structural design and research that implementing a material brittleness measure in the design of RC structures can improve structural reliability by providing uniform safety margins over a wide range of structural sizes and material compositions. This has led to the development of high performance concrete optimized for a combination of various structural properties. The structural applications of HPDSPC, because of its extremely high strength, will reduce dead load significantly compared to normal-weight concrete, thereby offering substantial cost savings through improved seismic response, longer spans, thinner sections, less reinforcing steel and lower foundation costs. These cost-effective parameters will make this material versatile for use in various structural applications such as beam-column joints in industries, airports, parking areas, docks and harbours, as well as containers for hazardous material, safety boxes, and moulds and tools for polymer composites and metals.

Keywords: high performance densified small particle concrete (HPDSPC), steel fibre reinforced concrete (SFRC), slurry infiltrated concrete (SIFCON), Slurry infiltrated mat concrete (SIMCON)

Procedia PDF Downloads 301
23 Ultra-Rapid and Efficient Immunomagnetic Separation of Listeria Monocytogenes from Complex Samples in High-Gradient Magnetic Field Using Disposable Magnetic Microfluidic Device

Authors: L. Malic, X. Zhang, D. Brassard, L. Clime, J. Daoud, C. Luebbert, V. Barrere, A. Boutin, S. Bidawid, N. Corneau, J. Farber, T. Veres

Abstract:

The incidence of infections caused by foodborne pathogens such as Listeria monocytogenes (L. monocytogenes) poses a great potential threat to public health and safety. These issues are further exacerbated by legal repercussions due to “zero tolerance” food safety standards adopted in developed countries. Unfortunately, a large number of related disease outbreaks are caused by pathogens present in extremely low counts currently undetectable by available techniques. The development of highly sensitive and rapid detection of foodborne pathogens is therefore crucial, and requires robust and efficient pre-analytical sample preparation. Immunomagnetic separation is a popular approach to sample preparation. Microfluidic chips combined with external magnets have emerged as viable high throughput methods. However, external magnets alone are not suitable for the capture of nanoparticles, as very strong magnetic fields are required. Devices that incorporate externally applied magnetic field and microstructures of a soft magnetic material have thus been used for local field amplification. Unfortunately, very complex and costly fabrication processes used for integration of soft magnetic materials in the reported proof-of-concept devices would prohibit their use as disposable tools for food and water safety or diagnostic applications. We present a sample preparation magnetic microfluidic device implemented in low-cost thermoplastic polymers using fabrication techniques suitable for mass-production. The developed magnetic capture chip (M-chip) was employed for rapid capture and release of L. monocytogenes conjugated to immunomagnetic nanoparticles (IMNs) in buffer and beef filtrate. The M-chip relies on a dense array of Nickel-coated high-aspect ratio pillars for capture with controlled magnetic field distribution and a microfluidic channel network for sample delivery, waste, wash and recovery. 
The developed nickel-coating and passivation process allows the generation of switchable local perturbations within the uniform magnetic field produced by a pair of permanent magnets placed at opposite edges of the chip. This leads to a strong and reversible trapping force, wherein high local magnetic field gradients allow efficient capture of IMNs conjugated to L. monocytogenes flowing through the microfluidic chamber. The experimental optimization of the M-chip was performed using commercially available magnetic microparticles and fabricated silica-coated iron-oxide nanoparticles. The fabricated nanoparticles were optimized to achieve the desired magnetic moment, and their surface functionalization was tailored to allow efficient capture-antibody immobilization. The integration, validation and further optimization of the capture and release protocol are demonstrated using both dead and live L. monocytogenes through fluorescence microscopy and the plate-culture method. The capture efficiency of the chip was found to vary as a function of the Listeria-to-nanoparticle concentration ratio. A maximum capture efficiency of 30% was obtained, and the 24-hour plate-culture method allowed the detection of an initial sample concentration of only 16 cfu/ml. The device was also very efficient in concentrating the sample from a 10 ml initial volume: specifically, 280% concentration efficiency was achieved in only 17 minutes, demonstrating the suitability of the system for food safety applications. In addition, the flexible design and low-cost fabrication process will allow rapid sample preparation for applications beyond food and water safety, including point-of-care diagnosis.
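
The efficiency figures quoted above follow from simple ratios. As a sketch (the function names and the worked numbers are ours, chosen to be consistent with the abstract's reported values):

```python
def capture_efficiency(captured, loaded):
    """Fraction of loaded cells retained on the pillar array, as a percentage."""
    return 100.0 * captured / loaded

def concentration_efficiency(c_out, c_in):
    """Ratio of recovered to initial concentration, as a percentage;
    values above 100% mean the sample was concentrated."""
    return 100.0 * c_out / c_in

# Illustrative: a sample concentrated 2.8-fold relative to its initial
# concentration corresponds to the reported 280% figure.
eff = concentration_efficiency(2.8, 1.0)
```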

Keywords: array of pillars, bacteria isolation, immunomagnetic sample preparation, polymer microfluidic device

Procedia PDF Downloads 279
22 Assessing Diagnostic and Evaluation Tools for Use in Urban Immunisation Programming: A Critical Narrative Review and Proposed Framework

Authors: Tim Crocker-Buque, Sandra Mounier-Jack, Natasha Howard

Abstract:

Background: Due to both the increasing scale and speed of urbanisation, urban areas in low- and middle-income countries (LMICs) host increasingly large populations of under-immunised children, with the additional associated risks of rapid disease transmission in high-density living environments. Multiple interdependent factors are associated with these coverage disparities in urban areas, and most evidence comes from relatively few countries, predominantly India, Kenya, and Nigeria, with some from Pakistan, Iran, and Brazil. This study aimed to identify, describe, and assess the main tools used to measure or improve coverage of immunisation services in poor urban areas. Methods: Authors used a qualitative review design, including academic and non-academic literature, to identify tools used to improve coverage of public health interventions in urban areas. Authors selected and extracted sources that provided good examples of specific tools, or categories of tools, used in a context relevant to urban immunisation. Diagnostic tools (e.g., for data collection, analysis, and insight generation), programme tools (e.g., for investigating or improving ongoing programmes), and interventions (e.g., multi-component or stand-alone with evidence) were selected for inclusion to provide a range of types and availability of relevant tools. These were then prioritised using a decision-analysis framework, and a tool selection guide for programme managers was developed. Results: Authors reviewed tools used in urban immunisation contexts and tools designed for (i) non-immunisation and/or non-health interventions in urban areas, and (ii) immunisation in rural contexts that had relevance for urban areas (e.g., Reaching Every District/Child/Zone). Many approaches combined several tools and methods, which authors categorised as diagnostic, programme, and intervention.
The most common diagnostic tools were cross-sectional surveys, key informant interviews, focus group discussions, secondary analysis of routine data, and geographical mapping of outcomes, resources, and services. Programme tools involved multiple stages of data collection, analysis, insight generation, and intervention planning, and included guidance documents from WHO (World Health Organisation), UNICEF (United Nations Children's Fund), USAID (United States Agency for International Development), and governments, as well as articles reporting on diagnostics, interventions, and/or evaluations to improve urban immunisation. Interventions involved service improvement, education, reminder/recall, incentives, outreach, or mass media, or were multi-component. The main gaps in existing tools were assessment of macro/policy-level factors, exploration of effective immunisation communication channels, and measurement of in/out-migration. The proposed framework uses a problem-tree approach to suggest tools addressing five common challenges (i.e., identifying populations, understanding communities, issues with service access and use, improving services, improving coverage), based on context and available data. Conclusion: This study identified many tools relevant to evaluating urban LMIC immunisation programmes, including significant crossover between tools. This was encouraging in terms of supporting the identification of common areas, but problematic in that data volumes, instructions, and activities could overwhelm managers, and tools are not always applied in suitable contexts. Further research is needed on how best to combine tools and methods to suit local contexts. Authors' initial framework can be tested and developed further.
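
The problem-tree lookup described above can be pictured as a simple mapping from challenge to candidate tools. The five challenge categories below are from the framework; the tool suggestions shown are illustrative examples drawn from the categories the review discusses, not the authors' exact mapping.

```python
# Hypothetical sketch of the framework's challenge-to-tool lookup.
TOOL_SUGGESTIONS = {
    "identifying populations": [
        "geographical mapping", "secondary analysis of routine data"],
    "understanding communities": [
        "key informant interviews", "focus group discussions"],
    "service access and use": ["cross-sectional surveys"],
    "improving services": ["programme guidance documents"],
    "improving coverage": [
        "reminder/recall", "outreach", "multi-component interventions"],
}

def suggest_tools(challenge):
    """Return candidate tools for one of the five common challenges,
    or an empty list if the challenge is not recognised."""
    return TOOL_SUGGESTIONS.get(challenge.lower(), [])
```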

Keywords: health equity, immunisation, low and middle-income countries, poverty, urban health

Procedia PDF Downloads 139
21 Blockchain Based Hydrogen Market (BBH₂): A Paradigm-Shifting Innovative Solution for Climate-Friendly and Sustainable Structural Change

Authors: Volker Wannack

Abstract:

Regional, national, and international strategies focusing on hydrogen (H₂) and blockchain are driving significant advancements in hydrogen and blockchain technology worldwide. These strategies lay the foundation for the groundbreaking "Blockchain Based Hydrogen Market (BBH₂)" project. The primary goal of this project is to develop a functional Blockchain Minimum Viable Product (B-MVP) for the hydrogen market. The B-MVP will leverage blockchain as an enabling technology with a common database and platform, facilitating secure and automated transactions through smart contracts. This innovation will revolutionize logistics, trading, and transactions within the hydrogen market. The B-MVP has transformative potential across various sectors. It benefits renewable energy producers, surplus energy-based hydrogen producers, hydrogen transport and distribution grid operators, and hydrogen consumers. By implementing standardized, automated, and tamper-proof processes, the B-MVP enhances cost efficiency and enables transparent and traceable transactions. Its key objective is to establish the verifiable integrity of climate-friendly "green" hydrogen by tracing its supply chain from renewable energy producers to end users. This emphasis on transparency and accountability promotes economic, ecological, and social sustainability while fostering a secure and transparent market environment. A notable feature of the B-MVP is its cross-border operability, eliminating the need for country-specific data storage and expanding its global applicability. This flexibility not only broadens its reach but also creates opportunities for long-term job creation through the establishment of a dedicated blockchain operating company. By attracting skilled workers and supporting their training, the B-MVP strengthens the workforce in the growing hydrogen sector. 
Moreover, it drives the emergence of innovative business models that attract additional company establishments and startups and contributes to long-term job creation. For instance, data evaluation can be utilized to develop customized tariffs and provide demand-oriented network capacities to producers and network operators, benefitting redistributors and end customers with tamper-proof pricing options. The B-MVP not only brings technological and economic advancements but also enhances the visibility of national and international standard-setting efforts. Regions implementing the B-MVP become pioneers in climate-friendly, sustainable, and forward-thinking practices, generating interest beyond their geographic boundaries. Additionally, the B-MVP serves as a catalyst for research and development, facilitating knowledge transfer between universities and companies. This collaborative environment fosters scientific progress, aligns with strategic innovation management, and cultivates an innovation culture within the hydrogen market. Through the integration of blockchain and hydrogen technologies, the B-MVP promotes holistic innovation and contributes to a sustainable future in the hydrogen industry. The implementation process involves evaluating and mapping suitable blockchain technology and architecture, developing and implementing the blockchain, smart contracts, and depositing certificates of origin. It also includes creating interfaces to existing systems such as nomination, portfolio management, trading, and billing systems, testing the scalability of the B-MVP to other markets and user groups, developing data formats for process-relevant data exchange, and conducting field studies to validate the B-MVP. BBH₂ is part of the "Technology Offensive Hydrogen" funding call within the research funding of the Federal Ministry of Economics and Climate Protection in the 7th Energy Research Programme of the Federal Government.
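
The certificate-of-origin tracing described above can be illustrated with a minimal hash-chained ledger. This is our own Python sketch, not the project's B-MVP code; the record field names are assumptions, and a real deployment would use a distributed blockchain with smart contracts rather than a single in-memory list.

```python
import hashlib
import json

def make_block(prev_hash, record):
    """Append-only ledger entry: a hydrogen certificate-of-origin record
    chained to its predecessor by a SHA-256 hash."""
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return {"prev": prev_hash, "record": record,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def chain_is_valid(chain):
    """Recompute each hash and check the prev-links; tampering with any
    earlier record breaks every later link."""
    for i, block in enumerate(chain):
        payload = json.dumps({"prev": block["prev"], "record": block["record"]},
                             sort_keys=True)
        if hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Hypothetical supply-chain trace: green production, then delivery.
genesis = make_block("0" * 64, {"producer": "wind-farm-A", "kg_h2": 120})
chain = [genesis,
         make_block(genesis["hash"], {"consumer": "steel-plant-B", "kg_h2": 120})]
```

The point of the sketch is the integrity property the abstract relies on: once a "green" certificate is chained, changing its origin retroactively invalidates the whole chain, which is what makes the supply trace tamper-proof.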

Keywords: hydrogen, blockchain, sustainability, innovation, structural change

Procedia PDF Downloads 167
20 Developing a Place-Name Gazetteer for Singapore by Mining Historical Planning Archives and Selective Crowd-Sourcing

Authors: Kevin F. Hsu, Alvin Chua, Sarah X. Lin

Abstract:

Singapore is a multilingual society, and names for different parts of the city have changed over time. Residents have included Indigenous Malays, dialect-speakers from China, European settler-colonists, and Tamil-speakers from South India. Each group would name locations in their own languages. Today, as ancestral tongues are increasingly supplanted by English, contemporary Singaporeans' understanding of once-common place names is disappearing. After demolition or redevelopment, some urban places will only exist in archival records or in human memory. United Nations conferences on the standardization of geographic names have called attention to how place names relate to identity, well-being, and a sense of belonging. The Singapore Place-Naming Project responds to these imperatives by capturing past and present place names through digitizing historical maps, mining archival records, and applying selective crowd-sourcing to trace the evolution of place names throughout the city. The project ensures that both formal and vernacular geographical names remain accessible to historians, city planners, and the public. The project is compiling a gazetteer, a geospatial archive of place names, with streets, buildings, landmarks, and other points of interest (POI) appearing in the historic maps and planning documents of Singapore, currently held by the National Archives of Singapore, the National Library Board, university departments, and the Urban Redevelopment Authority. To create a spatial layer of information, the project links each place name to either a geo-referenced point, line segment, or polygon, along with the original source material in which the name appears.
This record is supplemented by crowd-sourced contributions from civil service officers and heritage specialists, drawing from their collective memory to (1) define geospatial boundaries of historic places that appear in past documents but may be unfamiliar to users today, and (2) identify and record vernacular place names not captured in formal planning documents. An intuitive interface allows participants to demarcate feature classes, vernacular phrasings, time periods, and other knowledge related to historical or forgotten spaces. Participants are stratified by age band and ethnicity to improve representativeness. Future iterations could allow additional public contributions. Names reveal the meanings that communities assign to each place. While existing historical maps of Singapore allow users to toggle between present-day and historical raster files, this project goes a step further by adding layers of social understanding and planning documents. Tracking place names illuminates linguistic, cultural, commercial, and demographic shifts in Singapore, in the context of transformations of the urban environment. The project also demonstrates how a moderated, selectively crowd-sourced effort can solicit useful geospatial data at scale, sourced from different generations and at higher granularity than traditional surveys, while mitigating the negative impacts of unmoderated crowd-sourcing. Stakeholder agencies believe the project will achieve several objectives, including: supporting heritage conservation and public education; safeguarding intangible cultural heritage; providing historical context for street, place, or development-renaming requests; enhancing place-making with deeper historical knowledge; facilitating emergency and social services by tagging legal addresses to vernacular place names; and encouraging public engagement with heritage by eliciting multi-stakeholder input.

Keywords: collective memory, crowd-sourced, digital heritage, geospatial, geographical names, linguistic heritage, place-naming, Singapore, Southeast Asia

Procedia PDF Downloads 127
19 XAI Implemented Prognostic Framework: Condition Monitoring and Alert System Based on RUL and Sensory Data

Authors: Faruk Ozdemir, Roy Kalawsky, Peter Hubbard

Abstract:

Accurate estimation of remaining useful life (RUL) provides a basis for effective predictive maintenance, reducing unexpected downtime for industrial equipment. However, while models such as Random Forests have strong predictive capabilities, they are so-called ‘black box’ models whose limited interpretability is a barrier to the critical diagnostic decisions required in industries such as aviation. The purpose of this work is to present a prognostic framework that embeds Explainable Artificial Intelligence (XAI) techniques in order to provide essential transparency into the decision-making mechanisms of machine learning methods based on sensor data, with the objective of procuring actionable insights for the aviation industry. Sensor readings are gathered from critical equipment such as turbofan jet engines and landing gear, and RUL is predicted by a Random Forest model. The workflow involves data gathering, feature engineering, model training, and evaluation, with each critical component’s dataset trained and evaluated independently. Although the resulting models deliver suitable predictions with reasonably good performance metrics, such complex models obscure the reasoning behind their predictions and may undermine the confidence of decision-makers and maintenance teams. To bridge this reliability gap in industrial contexts, the second phase provides global explanations using SHAP and local explanations using LIME. These tools analyze model decisions, highlighting feature importance and explaining how each input variable affects the output. This dual approach offers a general comprehension of the overall model behavior and detailed insight into specific predictions. In its third component, the proposed framework incorporates causal analysis in the form of Granger causality tests in order to move beyond correlation toward causation.
This allows the model not only to predict failures but also to present reasons to relevant personnel, from the key sensor features linked to possible failure mechanisms. Establishing causality between sensor behaviors and equipment failures creates considerable value for maintenance teams through better root cause identification and more effective preventive measures, and this step makes the system more explainable. In a further stage, several simple surrogate models, including decision trees and linear models, are used to approximate the complex Random Forest model. These simpler models replicate key aspects of the original model's behavior; when the feature explanations obtained from a surrogate model are cross-validated against the primary model, the derived insights become more reliable and provide an intuitive sense of how the input variables affect the predictions. We then create an iterative explainable feedback loop, where the knowledge learned from the explainability methods feeds back into the training of the models. This supports a cycle of continuous improvement in both model accuracy and interpretability over time. By systematically integrating new findings, the model is expected to adapt to changed conditions and further develop its prognostic capability. These components are then presented to decision-makers through a fully transparent condition monitoring and alert system. The system provides a holistic tool for maintenance operations by leveraging RUL predictions, feature importance scores, persistent sensor threshold values, and autonomous alert mechanisms. Because the system provides explanations for its predictions along with active alerts, maintenance personnel can make informed decisions regarding the correct interventions to extend the life of critical machinery.
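The surrogate-model stage can be illustrated with a brief scikit-learn sketch (SHAP and LIME are omitted for brevity; permutation importance stands in for the global feature-attribution step, and the sensor data are synthetic). The key idea is that the shallow decision tree is fitted to the Random Forest's predictions rather than to the ground truth, so its fidelity score measures how faithfully it mimics the black-box model:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.metrics import r2_score
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for degradation data: columns = sensor channels, y = RUL
X, y = make_regression(n_samples=500, n_features=6, n_informative=3, random_state=0)

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Global attribution (stand-in for SHAP): permutation importance
imp = permutation_importance(rf, X, y, n_repeats=5, random_state=0)
ranked = np.argsort(imp.importances_mean)[::-1]

# Surrogate: a shallow tree fitted to the forest's predictions, not to y
surrogate = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, rf.predict(X))
fidelity = r2_score(rf.predict(X), surrogate.predict(X))

print("most influential sensors:", ranked[:3].tolist())
print("surrogate fidelity R^2:", round(fidelity, 3))
```

Explanations read off the surrogate are trusted only while the fidelity score stays high; in the framework described above they would additionally be cross-validated against the attributions from the primary model.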

Keywords: predictive maintenance, explainable artificial intelligence, prognostic, RUL, machine learning, turbofan engines, C-MAPSS dataset

18 Full Characterization of Heterogeneous Antibody Samples under Denaturing and Native Conditions on a Hybrid Quadrupole-Orbitrap Mass Spectrometer

Authors: Rowan Moore, Kai Scheffler, Eugen Damoc, Jennifer Sutton, Aaron Bailey, Stephane Houel, Simon Cubbon, Jonathan Josephs

Abstract:

Purpose: MS analysis of monoclonal antibodies (mAbs) at the protein and peptide levels is critical during development and production of biopharmaceuticals. The compositions of current-generation therapeutic proteins are often complex due to various modifications which may affect efficacy. Intact proteins analyzed by MS are detected in higher charge states, which adds complexity to mass spectra. Protein analysis in native or native-like conditions, with zero or minimal organic solvent and neutral or weakly acidic pH, decreases charge state values, resulting in mAb detection at higher m/z ranges with more spatial resolution. Methods: Three commercially available mAbs were used for all experiments. Intact proteins were desalted online using size exclusion chromatography (SEC) or reversed-phase chromatography coupled on-line with a mass spectrometer. For streamlined use of the LC-MS platform, we used a single SEC column and alternately selected specific mobile phases to perform separations in either denaturing or native-like conditions: buffer A (20% ACN, 0.1% FA) or buffer B (100 mM ammonium acetate). For peptide analysis, mAbs were proteolytically digested with and without prior reduction and alkylation. The mass spectrometer used for all experiments was a commercially available Thermo Scientific™ hybrid Quadrupole-Orbitrap™ mass spectrometer, equipped with the new BioPharma option, which includes a new High Mass Range (HMR) mode that allows for improved high-mass transmission and mass detection up to 8000 m/z. Results: We have analyzed the profiles of three mAbs under reducing and native conditions by direct infusion with offline desalting and with on-line desalting via size exclusion and reversed-phase columns. The presence of high salt under denaturing conditions was found to influence the observed charge state envelope and impact mass accuracy after spectral deconvolution.
The significantly lower charge states observed under native conditions improve the spatial resolution of protein signals and have significant benefits for the analysis of antibody mixtures, e.g. lysine variants, degradants, or sequence variants. This type of analysis requires the detection of masses beyond the standard mass range, extending up to 6000 m/z, and therefore requires the extended capabilities available in the new HMR mode. We have compared each antibody sample analyzed individually with mixtures in various relative concentrations. For this type of analysis, we observed that apparent native structures persist and that ESI benefits from the addition of low amounts of acetonitrile and formic acid in combination with the ammonium acetate-buffered mobile phase. For analyses at the peptide level, we analyzed reduced/alkylated and non-reduced proteolytic digests of the individual antibodies, separated via reversed-phase chromatography, aiming to retrieve as much information as possible regarding sequence coverage, disulfide bridges, post-translational modifications such as various glycans, sequence variants, and their relative quantification. All data acquired were submitted to a single software package for analysis, aiming to obtain a complete picture of the molecules analyzed. Here we demonstrate the capabilities of the mass spectrometer to fully characterize homogeneous and heterogeneous therapeutic proteins on a single platform. Conclusion: Full characterization of heterogeneous intact protein mixtures by improved mass separation on a quadrupole-Orbitrap™ mass spectrometer with extended capabilities has been demonstrated.
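Why native conditions push signals to higher m/z follows directly from the electrospray relation m/z = (M + z·mH)/z. A short arithmetic sketch (the ~148 kDa intact IgG mass and the two representative charge states are illustrative assumptions, not values from this study) shows the effect:

```python
PROTON_MASS = 1.00728  # Da

def mz(mass_da, charge):
    """m/z of a protonated ion: (M + z * mH) / z."""
    return (mass_da + charge * PROTON_MASS) / charge

MAB_MASS = 148_000.0  # Da; illustrative intact IgG mass (assumption)

denatured = mz(MAB_MASS, 50)  # high charge, typical of denaturing ESI
native = mz(MAB_MASS, 25)     # low charge, typical of native ESI

print(f"denatured 50+: m/z ~ {denatured:.0f}")  # ~2961
print(f"native 25+:    m/z ~ {native:.0f}")     # ~5921
```

Halving the charge roughly doubles the m/z, which is why native-mode signals for an intact mAb land near 6000 m/z and require the extended HMR detection range.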

Keywords: disulfide bond analysis, intact analysis, native analysis, mass spectrometry, monoclonal antibodies, peptide mapping, post-translational modifications, sequence variants, size exclusion chromatography, therapeutic protein analysis, UHPLC

17 Achieving Sustainable Lifestyles Based on the Spiritual Teaching and Values of Buddhism from Lumbini, Nepal

Authors: Purna Prasad Acharya, Madhav Karki, Sunta B. Tamang, Uttam Basnet, Chhatra Katwal

Abstract:

The paper outlines the idea behind achieving sustainable lifestyles based on the spiritual values and teachings of Lord Buddha. This objective is to be achieved by spreading the tenets and teachings of Buddhism throughout the Asia-Pacific region and the world from the sacred birthplace of Buddha: Lumbini, Nepal. There is an urgent need to advance the relevance of Buddhist philosophy in tackling the triple planetary crisis of climate change, nature’s decline, and pollution. Today, the world faces an existential threat from these crises, exacerbated by hunger, poverty, and armed conflict. To address their multi-dimensional impacts, global communities must adopt simple lifestyles that respect nature and universal human values. These were the basic teachings of Gautam Buddha. Lumbini, Nepal has the moral obligation to disseminate Buddha’s teaching widely to the world, and to receive constant feedback and learning, in order to develop human and ecosystem resilience by molding the lifestyles of current and future generations through adaptive learning and simplicity across geographies and nationalities, based on spirituality and environmental stewardship. By promoting Buddhism, Nepal has developed a pro-nature tourism industry that focuses on both its spiritual and bio-cultural heritage. Nepal is a country rich in ancient wisdom, where sages have sought knowledge, practiced meditation, and followed spiritual paths for thousands of years. It can spread the teachings of Buddha in a way that helps people find and adopt ways of living in harmony with nature. Using tools of the natural and social sciences, the team will package knowledge and share the idea of community well-being within a framework of environmental sustainability, social harmony, and universal respect for nature and people in a more holistic manner.
This notion takes into account key elements of sustainable development such as food-energy-water-biodiversity interconnections, environmental conservation, ecological integrity, ecosystem health, community resiliency, adaptation capacity, and indigenous culture, knowledge, and values. This inclusive concept has garnered a strong network of supporters locally, regionally, and internationally. The key objectives behind this concept are: a) to leverage the expertise and passion of a network of global collaborators to advance research, education, and policy outreach in the areas of human sustainability based on lifestyle change using the power of spirituality and Buddha’s teaching, resilient lifestyles, and adaptive living; b) to help develop creative short courses for multi-disciplinary teaching in educational institutions worldwide in collaboration with Lumbini Buddha University and other relevant partners in Nepal; c) to help build local and regional intellectual and cultural teaching and learning capacity by improving professional collaborations to promote nature-based and Buddhist value-based lifestyles, connecting Lumbini to Nepal’s rich nature; d) to promote research avenues that provide policy-relevant knowledge that is creative and innovative as well as practical and locally viable; and e) to connect local research and outreach work with academic and cultural partners in South Korea so as to open up Lumbini-based Buddhist heritage and the unique natural landscape of Nepal’s Karnali River basin to Korean scholars and students, promoting sustainable lifestyles and human living in harmony with nature.

Keywords: triple planetary crisis, spirituality, sustainable lifestyles, living in harmony with nature, resilience

16 An Integrated Multisensor/Modeling Approach Addressing Climate Related Extreme Events

Authors: H. M. El-Askary, S. A. Abd El-Mawla, M. Allali, M. M. El-Hattab, M. El-Raey, A. M. Farahat, M. Kafatos, S. Nickovic, S. K. Park, A. K. Prasad, C. Rakovski, W. Sprigg, D. Struppa, A. Vukovic

Abstract:

A clear distinction between weather and climate is necessary because, while they are closely related, there are still important differences. Climate change is identified when we compute the statistics of observed changes in weather over space and time. In this work we show how the changing climate contributes to the frequency, magnitude, and extent of different extreme events, using a multi-sensor approach with synergistic modeling activities. We explore satellite observations of dust over North Africa, the Gulf Region, and the Indo-Gangetic basin, as well as dust versus anthropogenic pollution events over the Delta region in Egypt and over Seoul, through remote sensing, and examine the effects of dust and haze on aerosol optical properties. The impact of dust on the retreat of the glaciers in the Himalayas is also presented. In this study we also focus on the identification and monitoring of a massive dust plume that blew off the western coast of Africa towards the Atlantic on October 8th, 2012, right before the development of Hurricane Sandy. There is evidence that dust aerosols played a non-trivial role in the cyclogenesis of Sandy. Moreover, a special dust event, an "American Haboob" in Arizona, is discussed, as it was predicted hours in advance thanks to great improvements in numerical land-atmosphere modeling, computing power, and remote sensing of dust events. We therefore performed a full numerical simulation of that event using the coupled atmospheric-dust model NMME-DREAM, after generating a mask of the potentially dust-productive regions using land cover and vegetation data obtained from satellites. Climate change also contributes to the deterioration of different marine habitats. In that regard we also present work dealing with change detection analysis of marine habitats around the city of Hurghada, Red Sea, Egypt.
The motivation for this work came from the fact that the coral reefs at Hurghada have undergone significant decline. They are damaged, displaced, polluted, stepped on, and blasted off, in addition to the effects of climate change on the reefs. One of the most pressing issues affecting reef health is mass coral bleaching, which results from an interaction between human activities and climatic changes. At another location, namely California, we have observed highly variable amounts of precipitation across many timescales, from the hourly to the climatic. Heavy precipitation frequently occurs, causing damage to property and life (floods, landslides, etc.). These extreme events, this variability, and the lack of good medium- to long-range predictability of precipitation are already a challenge to those who manage wetlands, coastal infrastructure, agriculture, and fresh water supply. Adding to the current challenges for long-range planning is the issue of climate change. It is known that La Niña and El Niño affect precipitation patterns, which in turn are entwined with global climate patterns. We have studied the impact of ENSO on precipitation variability over different climate divisions in California. The Nile Delta, on the other hand, has lately experienced a rise in the underground water table as well as waterlogging, bogging, and soil salinization. These impacts pose a major threat to the Delta region's heritage and existing communities. There has been an ongoing effort to address these vulnerabilities by looking into many adaptation strategies.

Keywords: remote sensing, modeling, long range transport, dust storms, North Africa, Gulf Region, India, California, climate extremes, sea level rise, coral reefs

15 Hydrocarbon Source Rocks of the Maragh Low

Authors: Elhadi Nasr, Ibrahim Ramadan

Abstract:

Biostratigraphical analyses of well sections from the Maragh Low in the Eastern Sirt Basin have allowed high-resolution correlations to be undertaken. Full integration of these data with available palaeoenvironmental, lithological, gravity, seismic, aeromagnetic, igneous, radiometric, and wireline log information, together with a geochemical analysis of source rock quality and distribution, has led to a more detailed understanding of the geological and structural history of this area. Below the Sirt Unconformity, two superimposed rifting cycles have been identified. The older is represented by the Amal Group of sediments and is of Late Carboniferous (Kasimovian/Gzhelian) to Middle Triassic (Anisian) age. Unconformably overlying it is a younger rift cycle, represented by the Sarir Group of sediments, of Early Cretaceous (late Neocomian to Aptian) age. Overlying the Sirt Unconformity is the marine Late Cretaceous section. An assessment of pyrolysis results and a palynofacies analysis has allowed hydrocarbon source facies and quality to be determined. There are a number of hydrocarbon source rock horizons in the Maragh Low; these are sometimes vertically stacked, and they are of fair to excellent quality. The oldest identified source rock is the Triassic Shale; this unit is unconformably overlain by sandstones belonging to the Sarir Group and conformably overlies a Triassic Siltstone unit. Palynological dating of the Triassic Shale unit indicates a Middle Triassic (Anisian) age. The Triassic Shale is interpreted to have been deposited in a lacustrine palaeoenvironment. This is evidenced in particular by the dark, fine-grained, organic-rich nature of the sediment and is supported by palynofacies analysis and by the recovery of fish fossils. Geochemical analysis of the Triassic Shale indicates total organic carbon varying between 1.37% and 3.53%, S2 pyrolysate yields between 2.15 mg/g and 6.61 mg/g, and hydrogen indices between 156.91 and 278.91.
The source quality of the Triassic Shale varies from fair to very good/rich. Linked to thermal maturity, it is now a very good source for light oil and gas; it was once a very good to rich oil source. The Early Barremian Shale was also deposited in a lacustrine palaeoenvironment. Recovered palynomorphs indicate an Early Cretaceous, late Neocomian to early Barremian age. The Early Barremian Shale is conformably underlain and overlain by sandstone units belonging to the Sarir Group of sediments, which are also of Early Cretaceous age. Geochemical analysis of the Early Barremian Shale indicates that it is a good oil source and was originally very good. Total organic carbon varies between 3.59% and 7%, S2 between 6.30 mg/g and 10.39 mg/g, and hydrogen indices between 148.4 and 175.5. A Late Barremian Shale unit has also been identified in the central Maragh Low. Geochemical analyses indicate that total organic carbon varies between 1.05% and 2.38%, S2 pyrolysate between 1.6 and 5.34 mg/g, and the hydrogen index between 152.4 and 224.4. It is a good oil source rock which is now mature. In addition to the non-marine hydrocarbon source rocks below the Sirt Unconformity, three formations in the overlying Late Cretaceous section also provide hydrocarbon-quality source rocks. Interbedded shales within the Rachmat Formation, of Late Cretaceous (early Campanian) age, have total organic carbon ranging between 0.7% and 1.47%, S2 pyrolysate varying between 1.37 and 4.00 mg/g, and hydrogen indices varying between 195.7 and 272.1. The indication is that this unit would provide a fair gas source to a good oil source. Geochemical analyses of the overlying Tagrifet Limestone indicate that total organic carbon varies between 0.26% and 1.01%, S2 pyrolysate between 1.21 and 2.16 mg/g, and hydrogen indices between 195.7 and 465.4.
For the overlying Sirt Shale Formation, of Late Cretaceous (late Campanian) age, total organic carbon varies between 1.04% and 1.51%, S2 pyrolysate varies between 4.65 mg/g and 6.99 mg/g, and the hydrogen indices vary between 151 and 462.9. The study has proven that both the Sirt Shale Formation and the Tagrifet Limestone are good to very good, rich sources for oil in the Maragh Low. High-resolution biostratigraphical interpretations have been integrated and calibrated with thermal maturity determinations (vitrinite reflectance (%Ro), Spore Colour Index (SCI), and Tmax (°C)) and with the determined present-day geothermal gradient of 25 °C/km for the Maragh Low. Interpretation of the generated basin modelling profiles allows a detailed prediction of the timing of maturation of these source horizons and leads to a determination of the amounts of missing section at major unconformities. From the results, the top of the oil window (0.72% Ro) is picked as high as 10,700 ft, and the base of the oil window (1.35% Ro), assuming a linear trend and by projection, is picked as low as 18,000 ft in the Maragh Low. For the Triassic Shale, the early phase of oil generation was in the Late Palaeocene/Early to Middle Eocene, and the main phase of oil generation was in the Middle to Late Eocene. The Early Barremian Shale reached the main phase of oil generation in the Early Oligocene, with late generation being reached in the Middle Miocene. For the Rakb Group section (Rachmat Formation, Tagrifet Limestone, and Sirt Shale Formation), the early phase of oil generation started in the Late Eocene, with the main phase of generation between the Early Oligocene and the Early Miocene. From studying maturity profiles and from regional considerations, it can be predicted that up to 500 ft of sediment may have been deposited and then eroded at the Sirt Unconformity in the central Maragh Low, while up to 2,000 ft of sediment may have been deposited and then eroded to the south of the trough.
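The hydrogen indices quoted throughout this abstract follow from the measured TOC and S2 values via the standard Rock-Eval definition HI = 100 × S2 / TOC (mg HC per g TOC). A minimal sketch using the lean Triassic Shale end-member values reproduces the reported figure (to within rounding of the underlying measurements):

```python
def hydrogen_index(s2_mg_per_g, toc_percent):
    """Rock-Eval hydrogen index: mg hydrocarbons (S2) per g of organic carbon."""
    return 100.0 * s2_mg_per_g / toc_percent

# Lean Triassic Shale end-member quoted in the text: TOC 1.37 %, S2 2.15 mg/g
hi = hydrogen_index(2.15, 1.37)
print(f"HI = {hi:.1f} mg HC / g TOC")  # ~156.9, consistent with the reported 156.91
```

The same relation links each TOC/S2/HI triplet given for the Early Barremian, Late Barremian, Rachmat, Tagrifet, and Sirt Shale intervals.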

Keywords: biostratigraphy, geochemical analysis, hydrocarbon source rocks, Maragh Low, Eastern Sirt Basin

14 Unleashing Potential in Pedagogical Innovation for STEM Education: Applying Knowledge Transfer Technology to Guide a Co-Creation Learning Mechanism for the Lingering Effects Amid COVID-19

Authors: Lan Cheng, Harry Qin, Yang Wang

Abstract:

Background: COVID-19 has induced the largest digital learning experiment in history. There is also emerging research evidence that students have paid a high cost in learning loss from virtual learning. University-wide survey results demonstrate that digital learning remains difficult for students who struggle with learning challenges, isolation, or a lack of resources. Large-scale efforts are therefore increasingly devoted to digital education. To better prepare students in higher education for this grand scientific and technological transformation, STEM education has been prioritized and promoted as a strategic imperative in the ongoing curriculum reform, essential for unfinished learning needs and whole-person development. Building upon five key elements identified in the STEM education literature (problem-based learning, community and belonging, technology skills, personalization of learning, and connection to the external community), this case study explores the potential of pedagogical innovation that integrates computational and experimental methodologies to support, enrich, and navigate STEM education. Objectives: The goal of this case study is to create a high-fidelity prototype design for STEM education with knowledge transfer technology that contains a Cooperative Multi-Agent System (CMAS), with the objectives of (1) conducting an assessment to reveal the virtual learning mechanism and establish strategies to facilitate scientific learning engagement, accessibility, and connection within and beyond the university setting; (2) exploring and validating an interactional co-creation approach embedded in project-based learning activities in the STEM learning context, which is being transformed by both digital technology and student behavior change; and (3) formulating and implementing a STEM-oriented campaign to guide learning network mapping, mitigate learning loss, enhance the learning experience, and scale up inclusive participation.
Methods: This study applied a case study strategy and a methodology informed by social network analysis theory within a cross-disciplinary communication paradigm (students, peers, educators). Knowledge transfer technology is introduced to address learning challenges and to increase the efficiency of Reinforcement Learning (RL) algorithms. A co-creation learning framework was identified and investigated in a context-specific way with a learning analytics tool designed in this study. Findings: The results show that (1) CMAS-empowered learning support reduced students’ confusion, difficulties, and gaps during problem-solving scenarios while increasing learner empowerment; (2) the co-creation learning phenomenon, examined through the lens of the campaign, reveals that an interactive virtual learning environment helps students navigate scientific challenges independently and collaboratively; and (3) the deliverables of the STEM educational campaign provide a methodological framework both within the context of curriculum design and for external community engagement. Conclusion: This study brings a holistic and coherent pedagogy to cultivate students’ interest in STEM and help them develop a knowledge base for integrating and applying knowledge across different STEM disciplines. Through co-designed, cross-disciplinary educational content and campaign promotion, the findings suggest factors that empower evidence-based learning practice, while also piloting and tracking the scholastic value of co-creation in a dynamic learning environment. The data nested in the knowledge transfer technology situate learners’ scientific journeys and could pave the way for theoretical advancement and broader scientific endeavors within larger datasets, projects, and communities.

Keywords: co-creation, cross-disciplinary, knowledge transfer, STEM education, social network analysis

13 The Impact of Neighborhood Effects on the Economic Mobility of the Inhabitants of Three Segregated Communities in Salvador (Brazil)

Authors: Stephan Treuke

Abstract:

The paper analyses the neighbourhood effects on the economic mobility of the inhabitants of three segregated communities of Salvador (Brazil); in other words, the socio-economic advantages and disadvantages affecting the lives of poor people due to their embeddedness in specific socio-residential contexts. Recent studies performed in Brazilian metropolises have concentrated on the structural dimensions of negative externalities in order to explain neighbourhood-level variations in a range of phenomena (delinquency, violence, access to the labour market and education) in spatially isolated and socially homogeneous slum areas (favelas). However, major disagreement remains over whether the contiguity between residents of poor neighbourhoods and higher-class condominium dwellers provides structures of opportunity or fosters socio-spatial stigmatization. Based on a set of interviews investigating the variability of interpersonal networks and their activation in the struggle for economic inclusion, the study confirms that the proximity of Nordeste de Amaralina to middle- and upper-class communities positively affects access to labour opportunities. Nevertheless, residential stigmatization, as well as structures of social segmentation, annihilates these potentials. The lack of exposure to individuals and groups outside the favela’s social, educational, and cultural context restricts the structures of opportunity to the local level. Therefore, residents’ interpersonal networks reveal a high degree of redundancy and localism, based on bonding ties connecting family and neighbourhood members. The resilience of segregational structures in Plataforma contributes to the naturalization of social distance patterns.
Its embeddedness in a socially homogeneous residential area (Subúrbio Ferroviário), growing informally and beyond official urban politics, encourages the construction of isotopic patterns of sociability, sharing the same values, social preferences, perspectives, and behaviour models. Whereas its spatial isolation correlates with the scarcity of economic opportunities, the social heterogeneity of the Fazenda Grande II interviewees and the socialising effects of public institutions mitigate the negative repercussions of segregation. The networks’ composition admits a higher degree of heterophily and a greater proportion of bridging ties, accounting for access to broader information assets and facilitating economic mobility. The variability observed across the three scenarios urges reflection on the responsibility of urban politics when it comes to the prevention or consolidation of the social segregation process in Salvador. Instead of promoting the local development of the favela Plataforma, public housing programs prioritize technocratic housing solutions without providing for the residents’ socio-economic integration. The impact of negative externalities related to the homogeneously poor neighbourhood is amplified in peripheral areas, rendering its inhabitants socially invisible and thus isolated from other social groups. The example of Nordeste de Amaralina portrays the failing interest of urban politics in bridging the social distances structuring Brazilian society’s rigid stratification model, founded on mechanisms of segmentation (unequal access to the labour market and education system, public transport, social security, and legal protection) and generating permanent conflicts between the two socioeconomically distant groups living in geographic contiguity. Finally, in the case of Fazenda Grande II, public investment in both housing projects and complementary infrastructure (e.g. schools, hospitals, community centres, police stations, recreation areas) contributes to the residents’ socio-economic inclusion.
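The bonding/bridging distinction underlying this network analysis can be made concrete with a toy sketch (the alters and community labels below are hypothetical, not drawn from the interview data): ties into the respondent's own family or neighbourhood count as bonding, ties reaching outside that context count as bridging, and the bridging share serves as a crude indicator of network heterogeneity.

```python
# Hypothetical ego network: each alter is labelled with the social context
# through which the respondent knows them (illustrative data only).
ego_network = {
    "alter_1": "neighbourhood",
    "alter_2": "neighbourhood",
    "alter_3": "family",
    "alter_4": "workplace_outside",   # bridging: reaches beyond the favela
    "alter_5": "public_institution",  # bridging: e.g. school or health post
}
BONDING_CONTEXTS = {"neighbourhood", "family"}

bridging = [a for a, ctx in ego_network.items() if ctx not in BONDING_CONTEXTS]
share = len(bridging) / len(ego_network)
print(f"bridging ties: {len(bridging)}/{len(ego_network)} ({share:.0%})")  # 2/5 (40%)
```

In the terms of the abstract, a network dominated by the first three entries would exhibit the redundancy and localism observed in Plataforma, while a higher bridging share corresponds to the heterophily found in Fazenda Grande II.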

Keywords: economic mobility, neighborhood effects, Salvador, segregation

12 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building a uniform working environment that supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods of grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect on the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus, of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to text material, this corpus includes multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these programs can be trained by manual coding in a close reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
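The (semi-)automated coding step described above can be illustrated with a minimal Python sketch. This is a hypothetical illustration, not the D-WISE codebase: it proposes coding candidates from named entities that co-occur across documents, which a researcher would then confirm or reject in a close reading pass.

```python
from collections import Counter
from itertools import combinations

def propose_codes(documents, min_count=2):
    """Propose coding candidates from entity co-occurrence.

    `documents` is a list of entity lists, one per document, as they
    might come out of an NER + entity-linking pipeline.  Unordered
    entity pairs that co-occur in at least `min_count` documents are
    returned as candidate codes for the researcher to review.
    """
    pair_counts = Counter()
    for entities in documents:
        # Count each unordered pair at most once per document.
        for pair in combinations(sorted(set(entities)), 2):
            pair_counts[pair] += 1
    return [pair for pair, n in pair_counts.items() if n >= min_count]

# Toy corpus: entities "extracted" from three documents on digitization
# in the healthcare sector (illustrative labels only).
docs = [
    ["electronic patient record", "data protection", "health ministry"],
    ["data protection", "electronic patient record"],
    ["health ministry", "telemedicine"],
]
candidates = propose_codes(docs)
```

Confirmed or corrected candidates can then feed back into the proposal step, mirroring the blended reading loop described above.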

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 224
11 Influence of Oil Prices on the Central Caucasus State of Georgia

Authors: Charaia Vakhtang

Abstract:

Global oil prices are hitting new lows every day and have already collapsed below the psychological threshold of 30 USD. This tendency would be fully acceptable to Georgian consumers, but there is one detail: two of Georgia's neighboring countries (one friendly and one hostile) depend heavily on hydrocarbon revenues. Namely, Azerbaijan's share of Georgia's total FDI inflows in 2014 was 20%, and it reached 40% in January to September 2015. Azerbaijan is also Georgia's leading export market: in 2014, Georgia's exports to Azerbaijan amounted to 544 million USD, i.e., 19% of Georgia's total exports, and in January to November 2015, the share still exceeded 11%. Moreover, Azerbaijan is Georgia's strategic partner in many long-term regional projects, for example, the Baku-Tbilisi-Kars railroad, the Black Sea terminal, and preferential gas tariffs for Georgia. The Russian contribution to the Georgian economy is also considerable, despite the losses that Russia's hostile policy has inflicted on the country. Georgian emigrants are mainly employed in the Russian Federation, and they transfer considerable funds to Georgia every year. These transfers amount to about 1 billion USD and have previously equaled total FDI inflows. Moreover, despite the difficulties in the Russian market, Russia still remains the leader in money transfers to Georgia. According to the latest reports, money transfers from Russia to Georgia fell by 276 million USD in 2015 compared to 2014 (-39%). At the same time, total money transfers to Georgia in 2015 amounted to 1.08 billion USD, down 25% from 1.44 billion USD in 2014. 
This signifies that roughly three quarters of the contraction in money transfers is attributable to the Russian factor (in this case, falling oil prices and the devaluation of the Russian ruble directly depress money transfers to Georgia). As for other countries, it is interesting that money transfers from Italy have also slipped (to 109 million USD from 121 million USD); nevertheless, the country's share in total money transfers to Georgia has increased from 8% to 10%. Money transfers from the USA have increased by 22% (+18 million USD). Money transfers from Greece have halved, to 117 million USD from 205 million USD. As for Turkey, money transfers to Georgia from Turkey have increased by 1%, to 69 million USD. Moreover, the problems with the national currencies of Russia and Azerbaijan, along with the above-mentioned developments, point to an unfavorable outlook for the Georgian economy, and the depreciation of those currencies is expected to bring unfavorable results. Even more so, the statement released by the Russian Finance Ministry on an expected default bears directly on the welfare of the whole region, and these tendencies will have direct and indirect negative impacts on Georgia's economic indicators. Amid the economic slowdown in Armenia, Turkey, and Ukraine, Georgia should try to enhance economic ties with comparatively stronger and more flexible economies such as the EU and the USA; otherwise, the Georgian economy will enter a zone of serious turbulence. Georgia should derive maximum benefit from the EU association agreement. It should be noted that the Russian economic slowdown, which evokes mixed feelings in Georgia, will have a negative impact on the Georgian economy; the same forecasts are made in relation to Azerbaijan. However, Georgia has many partner countries, and enhancing and developing economic relations with them may substantially alleviate the negative impacts of the declining economies. 
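The three-quarters estimate follows directly from the figures quoted above, as a quick check in Python shows:

```python
# Figures reported above, in USD.
total_2014 = 1.44e9    # total money transfers to Georgia, 2014
total_2015 = 1.08e9    # total money transfers to Georgia, 2015
russia_drop = 276e6    # decline in transfers from Russia, 2015 vs. 2014

total_drop = total_2014 - total_2015      # 360 million USD overall decline
russia_share = russia_drop / total_drop   # fraction of the decline due to Russia

print(f"Overall decline: {total_drop / 1e6:.0f} M USD")
print(f"Russian share of the decline: {russia_share:.0%}")
```

The Russian decline of 276 million USD against a total decline of 360 million USD gives about 77%, i.e., roughly three quarters.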
First of all, the EU association agreement should be mentioned as a main source of Georgia's economic stabilization, and it is the Georgian government's responsibility to successfully fulfill its requirements. In any case, imports must be replaced by domestic products, and exports should be stimulated through government support programs. The authorities should attract more foreign investment and money resources, accumulate more tourism revenues, and reduce external debt; budget expenditures should be balanced, and the National Bank should carry out a strict monetary policy. Moreover, the government should develop a long-term state economic policy and implement it across the various ministries. It is also of crucial importance to pursue a consistent policy and promote promising sectors at the domestic level.

Keywords: oil prices, economic growth, foreign direct investments, international trade

Procedia PDF Downloads 269
10 Light Sensitive Plasmonic Nanostructures for Photonic Applications

Authors: Istvan Csarnovics, Attila Bonyar, Miklos Veres, Laszlo Himics, Attila Csik, Judit Kaman, Julia Burunkova, Geza Szanto, Laszlo Balazs, Sandor Kokenyesi

Abstract:

In this work, the performance of gold nanoparticles was investigated for the stimulation of photosensitive materials for photonic applications. Gold is widely used in surface plasmon resonance experiments, not least because its optical resonances fall in the visible spectral region. Localized surface plasmon resonance is rather easily observed in nanometer-sized metallic structures and is widely used for measurements and sensing, in semiconductor devices, and even in optical data storage. Firstly, gold nanoparticles on a silica glass substrate satisfy the conditions for surface plasmon resonance in the green-red spectral range, where chalcogenide glasses have their highest sensitivity. The gold nanostructures influence and enhance the optical, structural, and volume changes and promote exciton generation in the gold nanoparticle/chalcogenide layer structure. The experimental results support the importance of localized electric fields in the photo-induced transformation of chalcogenide glasses and suggest new approaches to improve the performance of these optical recording media. The results may be utilized for direct, micrometre- or submicron-scale geometrical and optical pattern formation and for further developing the explanation of these effects in chalcogenide glasses. Besides that, gold nanoparticles can be added to organic light-sensitive materials. Acrylate-based materials are frequently used for the optical, holographic recording of optoelectronic elements due to photo-stimulated structural transformations. The holographic recording process and the photo-polymerization effect can be enhanced by the localized plasmon field of the created gold nanostructures. Finally, gold nanoparticles are widely used in electrochemical and optical sensor applications. 
Although these NPs can be synthesized in several ways, perhaps one of the simplest methods is the thermal annealing of pre-deposited thin films on glass or silicon surfaces. With this method, the parameters of the annealing process (time, temperature) and the thickness of the pre-deposited thin film define the resulting size and distribution of the NPs on the surface. Localized surface plasmon resonance (LSPR) is a very sensitive optical phenomenon and can be utilized for a large variety of sensing purposes (chemical sensors, gas sensors, biosensors, etc.). Surface-enhanced Raman spectroscopy (SERS) is an analytical method that can significantly increase the yield of Raman scattering from target molecules adsorbed on the surface of metallic nanoparticles. The sensitivity of LSPR- and SERS-based devices depends strongly on the material used and on the size and geometry of the metallic nanoparticles. By controlling these parameters, the plasmon absorption band can be tuned and the sensitivity optimized. The technological parameters of the generated gold nanoparticles were investigated, and their influence on the SERS and LSPR sensitivity was established. The LSPR sensitivity was simulated for gold nanocubes and nanospheres with the MNPBEM Matlab toolbox. It was found that the enhancement factor (which characterizes the increase in the peak shift for multi-particle arrangements compared to single-particle models) depends on the size of the nanoparticles and on the distance between the particles. This work was supported by the GINOP-2.3.2-15-2016-00041 project, which is co-financed by the European Union and the European Social Fund. Istvan Csarnovics is grateful for support through the New National Excellence Program of the Ministry of Human Capacities (ÚNKP-17-4). Attila Bonyár and Miklós Veres are grateful for the support of the János Bolyai Research Scholarship of the Hungarian Academy of Sciences.
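The tuning of the plasmon band with the particle environment can be illustrated with a quasistatic (dipole) approximation for a small gold sphere, far simpler than the boundary-element method used in the study. The Drude parameters below are rough illustrative literature values, not fitted values from this work:

```python
# Illustrative Drude model for gold: eps(E) = eps_inf - wp^2 / (E^2 + i*gamma*E),
# with photon energy E in eV.  Parameters are rough literature values.
EPS_INF, WP, GAMMA = 9.5, 9.0, 0.07

def eps_gold(E):
    return EPS_INF - WP**2 / (E**2 + 1j * GAMMA * E)

def lspr_peak_nm(eps_medium):
    """Wavelength (nm) maximizing the quasistatic polarizability factor
    |(eps - eps_m) / (eps + 2*eps_m)| for a small metal sphere."""
    best_lam, best_val = None, -1.0
    for lam in range(400, 801):        # scan the 400-800 nm window
        E = 1239.84 / lam              # convert nm to eV
        eps = eps_gold(E)
        val = abs((eps - eps_medium) / (eps + 2 * eps_medium))
        if val > best_val:
            best_lam, best_val = lam, val
    return best_lam

peak_air = lspr_peak_nm(1.00)    # sphere in air
peak_glass = lspr_peak_nm(2.25)  # sphere embedded in glass (n = 1.5)
```

The sketch reproduces the qualitative point above: increasing the refractive index of the surroundings red-shifts the resonance toward the green-red range where chalcogenide glasses are most sensitive.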

Keywords: light sensitive nanocomposites, metallic nanoparticles, photonic application, plasmonic nanostructures

Procedia PDF Downloads 304
9 Design and Construction of a Solar Dehydration System as a Technological Strategy for Food Sustainability in Difficult-to-Access Territories

Authors: Erika T. Fajardo-Ariza, Luis A. Castillo-Sanabria, Andrea Nieto-Veloza, Carlos M. Zuluaga-Domínguez

Abstract:

The growing emphasis on sustainable food production and preservation has driven the development of innovative solutions to minimize postharvest losses and improve market access for small-scale farmers. This project focuses on designing, constructing, and selecting materials for solar dryers in regions of Colombia where inadequate infrastructure limits access to major commercial hubs. Postharvest losses pose a significant challenge, impacting food security and farmer income, and addressing them is crucial for enhancing the value of agricultural products and supporting local economies. A comprehensive survey of local farmers revealed substantial challenges, including limited market access, inefficient transportation, and significant postharvest losses. For crops such as coffee, bananas, and citrus fruits, losses range from 0% to 50%, driven by factors like labor shortages, adverse climatic conditions, and transportation difficulties. To address these issues, the project prioritized selecting effective materials for the solar dryer. Various materials (recovered acrylic, original acrylic, glass, and polystyrene) were tested for their performance. The tests showed that recovered acrylic and glass were the most effective in increasing the temperature difference between the interior and the external environment. The solar dryer was designed using Fusion 360® software (Autodesk, USA) and adhered to guidelines from Architectural Graphic Standards. It features up to sixteen aluminum trays, each with a maximum load capacity of 3.5 kg, arranged on two levels to optimize drying efficiency. The constructed dryer was then tested with two locally available plant materials: green plantains (Musa paradisiaca L.) and snack bananas (Musa AA Simonds). To monitor performance, thermo-hygrometers and an Arduino system recorded internal and external temperature and humidity at one-minute intervals. 
Despite challenges such as adverse weather conditions and delays in local government funding, the active involvement of local producers was a significant advantage, fostering ownership and understanding of the project. The solar dryer operated under conditions of 31°C dry bulb temperature (Tbs), 55% relative humidity, and 21°C wet bulb temperature (Tbh). The drying curves showed a consistent drying period with critical moisture content observed between 200 and 300 minutes, followed by a sharp decrease in moisture loss, reaching an equilibrium point after 3,400 minutes. Although the solar dryer requires more time and is highly dependent on atmospheric conditions, it can approach the efficiency of an electric dryer when properly optimized. The successful design and construction of solar dryer systems in difficult-to-access areas represent a significant advancement in agricultural sustainability and postharvest loss reduction. By choosing effective materials such as recovered acrylic and implementing a carefully planned design, the project provides a valuable tool for local farmers. The initiative not only improves the quality and marketability of agricultural products but also offers broader environmental benefits, such as reduced reliance on fossil fuels and decreased waste. Additionally, it supports economic growth by enhancing the value of crops and potentially increasing farmer income. The successful implementation and testing of the dryer, combined with the engagement of local stakeholders, highlight its potential for replication and positive impact in similar contexts.
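The drying-curve analysis described above (a constant-rate period followed by a falling-rate period after the critical moisture content) can be sketched in a few lines of Python. The mass readings below are hypothetical placeholders, coarser than the one-minute Arduino log, and only a simple dry-basis moisture calculation is shown:

```python
def moisture_dry_basis(mass, dry_mass):
    """Moisture content on a dry basis: kg water per kg dry solids."""
    return (mass - dry_mass) / dry_mass

def drying_rates(times_min, masses_g):
    """Average drying rate (g water / min) over each logging interval."""
    rates = []
    for i in range(1, len(times_min)):
        dm = masses_g[i - 1] - masses_g[i]
        dt = times_min[i] - times_min[i - 1]
        rates.append(dm / dt)
    return rates

# Hypothetical tray log for plantain slices (grams), sampled every
# 100 min for readability; the Arduino logged at 1-min intervals.
times = [0, 100, 200, 300, 400]
masses = [3500, 3100, 2700, 2450, 2300]
dry_mass = 1400  # assumed bone-dry solids mass of the load

X = [moisture_dry_basis(m, dry_mass) for m in masses]
rates = drying_rates(times, masses)
```

With these illustrative numbers, the rate is constant (4 g/min) up to 200 min and then falls, placing the critical moisture content between 200 and 300 min, the same qualitative behavior reported above.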

Keywords: drying technology, postharvest loss reduction, solar dryers, sustainable agriculture

Procedia PDF Downloads 27
8 Recent Trends in Transportable First Response Healthcare Architecture

Authors: Stephen Verderber

Abstract:

The World Health Organization (WHO) calls for research and development on ecologically sustainable, resilient structures capable of effectively responding to disaster events worldwide, in response to climate change, politically driven diasporas, earthquakes, and other adverse events upending the rhythms of everyday life. By 2050, nearly 80% of the world’s population will reside in coastal zones, and this, coupled with the increasingly dire impacts of climate change, constitutes a recipe for further chaos and disruption; in light of these events, architects have yet to rise to meet the challenge. In the arena of healthcare, rapidly deployable clinics and field hospitals can provide immediate assistance in medically underserved disaster strike zones. Transportable facilities offer multiple advantages over conventional, fixed-site hospitals as lightweight, comparatively unencumbered alternatives. These attributes have been proven repeatedly in 20th-century vehicular and tent-based structures deployed in frontline combat theaters and in prior natural disasters. Prefab transportable clinics and trauma centers recently responded adroitly to medical emergencies in the aftermath of the Haitian (2010) and Ecuadorian (2016) earthquakes and in North American post-hurricane relief efforts (2017), while architects continue to be castigated by their engineer colleagues as chronically poor first responders. Architecturally based portable structures for healthcare currently include Redeployable Health Centers (RHCs), Redeployable Trauma Centers (RTCs), and Permanent Modular Installations (PMIs). Five tectonic variants within this typology have recently been operationalized in the field: 1. Vehicular-Based Nomadics: prefab modules installed on a truck chassis, with interior compartments dropped in prior to final assembly; alternately, a two-component apparatus is preferred, with a truck cab pulling a modular medical unit with an independent transiting component; 2. 
Tent and Pneumatic Systems: tent/yurt precursors and inflatable systems, lightweight and responsive to topographically challenging terrain and diverse climates; 3. Containerized Systems: the standard intermodal shipping container affords structural strength and resiliency in difficult transiting conditions, can be densely close-packed, and can be custom-built or hold flat-pack systems; 4. Flat-Packs and Pop-Up Systems: kit-of-parts assemblies shipped in standardized or specially designed ISO containers; and 5. Hybrid Systems: composite facilities representing a synthesis of mobile vehicular components and/or tent or shipping containers, fused with conventional or pneumatically activated tent systems; hybrids are advantageous in many installation contexts from an aesthetic, fabrication, and transiting perspective. The advantages and disadvantages of the various modular systems are comparatively examined, followed by a compendium of 80 evidence (research)-based planning and design considerations addressing site/context, transiting and commissioning, triage, decontamination/intake, diagnostics and treatment, facility tectonics, and administration/total environment. The benefits of offsite pre-manufactured fabrication are examined, as is the anticipated growth in international demand for transportable healthcare facilities to meet the challenges posed by accelerating climate change and global conflicts. This investigation into rapid-response facilities for pre- and post-disaster zones is drawn from a recent book by the author, the first on this topic in architecture (Innovations in Transportable Healthcare Architecture).

Keywords: disaster mitigation, rapid response healthcare architecture, offsite prefabrication

Procedia PDF Downloads 117
7 A Comprehensive Study of Spread Models of Wildland Fires

Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran

Abstract:

These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. 
Fire spread models provide insights into potential fire behavior, enabling authorities to make informed decisions about evacuation activities, the allocation of resources for firefighting efforts, and planning for preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies, as they help in assessing a fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and advances our understanding of the way forest fires spread. Some of the known models in this field are Rothermel’s wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, the cellular automata model, and others. The key characteristics that these models consider include weather (factors such as wind speed and direction), topography (factors like landscape elevation), and fuel availability (factors like types of vegetation), among others. The models discussed are physics-based, data-driven, or hybrid models, some utilizing ML techniques such as attention-based neural networks to enhance model performance. In order to lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders: access to enhanced early warning systems enables decision-makers to take prompt action, and emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
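As a concrete instance of the cellular automata family mentioned above, the following deliberately minimal Python sketch spreads fire deterministically on a uniform fuel bed. Real CA fire models make ignition probabilistic and weight the probability by wind, slope, and fuel moisture; this toy version shows only the state-update mechanics:

```python
FUEL, BURNING, BURNED = 1, 2, 3

def step(grid):
    """One CA update: burning cells burn out, and fuel cells with a
    burning 4-neighbor ignite.  Deterministic for clarity."""
    n = len(grid)
    new = [row[:] for row in grid]
    for r in range(n):
        for c in range(n):
            if grid[r][c] == BURNING:
                new[r][c] = BURNED
            elif grid[r][c] == FUEL:
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < n and 0 <= cc < n and grid[rr][cc] == BURNING:
                        new[r][c] = BURNING
                        break
    return new

# 11x11 uniform fuel bed ignited at the center cell.
n = 11
grid = [[FUEL] * n for _ in range(n)]
grid[n // 2][n // 2] = BURNING
for _ in range(3):
    grid = step(grid)

burned = sum(row.count(BURNED) for row in grid)
burning = sum(row.count(BURNING) for row in grid)
```

With 4-neighbor spread, the fire front after k steps is the ring of cells at Manhattan distance k from the ignition point, which is the kind of geometric growth that wind and topography terms then distort in operational models.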

Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling

Procedia PDF Downloads 81
6 An Intelligent Search and Retrieval System for Mining Clinical Data Repositories Based on Computational Imaging Markers and Genomic Expression Signatures for Investigative Research and Decision Support

Authors: David J. Foran, Nhan Do, Samuel Ajjarapu, Wenjin Chen, Tahsin Kurc, Joel H. Saltz

Abstract:

The large-scale data and computational requirements of investigators throughout the clinical and research communities demand an informatics infrastructure that supports both existing and new investigative and translational projects in a robust, secure environment. In some subspecialties of medicine and research, the capacity to generate data has outpaced the methods and technology used to aggregate, organize, access, and reliably retrieve this information. Leading health care centers now recognize the utility of establishing an enterprise-wide clinical data warehouse. The primary benefits that can be realized through such efforts include cost savings, efficient tracking of outcomes, advanced clinical decision support, improved prognostic accuracy, and more reliable clinical trial matching. The overarching objective of the work presented here is the development and implementation of a flexible Intelligent Retrieval and Interrogation System (IRIS) that exploits the combined use of computational imaging, genomics, and data-mining capabilities to facilitate clinical assessments and translational research in oncology. The proposed System includes a multi-modal Clinical & Research Data Warehouse (CRDW) that is tightly integrated with a suite of computational and machine-learning tools to provide insight into underlying tumor characteristics that may not be apparent from human inspection alone. A key distinguishing feature of the System is a configurable Extract, Transform and Load (ETL) interface that enables it to adapt to different clinical and research data environments. This project is motivated by the growing emphasis on establishing Learning Health Systems, in which cyclical hypothesis generation and evidence evaluation become integral to improving the quality of patient care. 
To facilitate iterative prototyping and optimization of the algorithms and workflows for the System, the team has already implemented a fully functional Warehouse that can reliably aggregate information originating from multiple data sources, including EHRs, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology PACS, Digital Pathology archives, unstructured clinical documents, and Next Generation Sequencing services. The System enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information about patient tumors, individually or as part of large cohorts, to identify patterns that may influence treatment decisions and outcomes. The CRDW core system has facilitated peer-reviewed publications and funded projects, including an NIH-sponsored collaboration to enhance the cancer registries in Georgia, Kentucky, New Jersey, and New York with machine-learning-based classifications and quantitative pathomics feature sets. The CRDW has also resulted in a collaboration with the Massachusetts Veterans Epidemiology Research and Information Center (MAVERIC) at the U.S. Department of Veterans Affairs to develop algorithms and workflows to automate the analysis of lung adenocarcinoma. Those studies showed that combining computational nuclear signatures with traditional WHO criteria through the use of deep convolutional neural networks (CNNs) led to improved discrimination among tumor growth patterns. The team has also leveraged the Warehouse to support studies investigating the potential of utilizing a combination of genomic and computational imaging signatures to characterize prostate cancer. The results of those studies show that integrating image biomarkers with genomic pathway scores is more strongly correlated with disease recurrence than using standard clinical markers.
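A configurable ETL interface of the kind described can be illustrated with a minimal Python sketch. The field names and transforms below are hypothetical, not the actual IRIS/CRDW schema; the point is that each source system is described declaratively, so adapting to a new data environment means swapping the configuration rather than rewriting code:

```python
# Each source system gets a declarative mapping:
#   warehouse field -> (source field, transform function)
# Swapping the config adapts the ETL to a different clinical or
# research data environment without touching the loader logic.
EHR_CONFIG = {
    "patient_id": ("mrn", str.strip),
    "diagnosis": ("icd10_code", str.upper),
    "age": ("age_years", int),
}

def extract_transform(record, config):
    """Apply one source-specific mapping to a raw source record."""
    return {dest: fn(record[src]) for dest, (src, fn) in config.items()}

# A raw record as it might arrive from a (hypothetical) EHR export.
raw = {"mrn": " 00123 ", "icd10_code": "c61", "age_years": "67"}
row = extract_transform(raw, EHR_CONFIG)
```

The same `extract_transform` loader would serve a tumor registry or sequencing feed with its own config, which is the essence of the configurability claimed above.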

Keywords: clinical data warehouse, decision support, data-mining, intelligent databases, machine-learning

Procedia PDF Downloads 126
5 Revealing Celtic and Norse Mythological Depths through Dragon Age’s Tattoos and Narratives

Authors: Charles W. MacQuarrie, Rachel R. Tatro Duarte

Abstract:

This paper explores the representation of medieval identity within games such as Dragon Age, Elden Ring, and Hellblade: Senua’s Sacrifice, fantasy role-playing games that draw effectively, and problematically, on Celtic and Norse mythologies. Focusing on tattoos, onomastics, and accent as visual and oral markers of status and ethnicity, this study analyzes how the games' interplay between mythology, character narratives, and visual storytelling enriches their themes and offers players an immersive, but sometimes baldly ahistorical, connection between ancient mythologies and contemporary digital storytelling. Dragon Age is a triple-A game series that, together with Hellblade: Senua’s Sacrifice and Elden Ring, has captivated gamers worldwide with its presentation of an idealized medieval world inspired by the lore of Celtic and Norse mythologies. This paper sets out to explore the intricate relationships between tattoos, accent, and character narratives in the games, drawing parallels to themes, heroic figures, and gods from Celtic and Norse mythologies. Tattoos as Mythic and Ethnic Markers: this study analyzes how tattoos in Dragon Age visually represent mythological elements from both Celtic and Norse cultures, serving as conduits of cultural identity and narrative. The nature of these tattoos reflects the slave, criminal, and warrior associations made in classical and medieval literature, and some of the episodes concerning tattoos in the games have close analogs or sources in literature. For example, the elvish character Solas, in Dragon Age: Inquisition, removes a slave tattoo from the face of a lower-status elf in an episode that is reminiscent of Bridget removing the stigmata from Connallus in the Vita Prima of Saint Bridget. Character Narratives: the paper examines how characters' personal narratives in the games parallel the archetypal journeys of Celtic heroes and Norse gods, with a focus on their relationships to mythic themes. 
In these games, the Elves usually have Welsh or Irish accents, are close to nature, magically powerful, and oppressed by apparently Anglo-Saxon humans and Norse dwarves, and they wear facial tattoos. The association of Welsh voices with fairies and demons is older than the reference in Shakespeare’s Merry Wives of Windsor or even the Anglo-Saxon Life of Saint Guthlac; the English-speaking world, and the fantasy genre of literature and gaming, undoubtedly driven by Tolkien, see Elves as Welsh speakers and as having Welsh accents when speaking English. Comparative Analysis: a comparative approach is employed to reveal connections, adaptations, and unique interpretations of the motifs of tattoos and narrative themes in Dragon Age compared to those found in Celtic and Norse mythologies. Methodology: the study uses a comparative approach to examine the similarities and distinctions between Celtic and Norse mythologies and their counterparts in video games; the analysis encompasses character studies, narrative exploration, visual symbolism, and the historical context of Celtic and Norse cultures. Mythic Visuals: this study showcases how tattoos, as visual symbols, encapsulate mythic narratives, beliefs, and cultural identity, echoing Celtic and Norse visual motifs. Archetypal Journeys: the paper analyzes how character arcs mirror the heroic journeys of Celtic and Norse mythological figures, allowing players to engage with mythic narratives on a personal level. Cultural Interplay: the study discusses how the games' portrayal of tattoos and narratives both preserves and reinterprets elements from Celtic and Norse mythologies, fostering a connection between ancient cultures and modern digital storytelling. Conclusion: by exploring the interconnectedness of tattoos and character narratives in Dragon Age, this paper reveals the game series' ability to act as a bridge between ancient mythologies and contemporary gaming. 
By drawing inspiration from Celtic heroes and Norse gods and translating them into digital narratives and visual motifs, Dragon Age offers players a multi-dimensional engagement with mythic themes and a unique lens through which to appreciate the enduring allure of these cultures.

Keywords: comparative analysis, character narratives, video games and literature, tattoos, immersive storytelling, character development, mythological influences, Celtic mythology, Norse mythology

Procedia PDF Downloads 68
4 Effect of Inoculation with Consortia of Plant-Growth Promoting Bacteria on Biomass Production of the Halophyte Salicornia ramosissima

Authors: Maria João Ferreira, Natalia Sierra-Garcia, Javier Cremades, Carla António, Ana M. Rodrigues, Helena Silva, Ângela Cunha

Abstract:

Salicornia ramosissima, a halophyte that grows naturally in coastal areas of the northern hemisphere, is often considered the most promising halophyte candidate for extensive crop cultivation and saline agriculture practices. The expanding interest in this plant surpasses its use as gourmet food and includes its potential application as a source of bioactive compounds for the pharmaceutical industry. Despite the plant growing well in saline soils, sustainable and ecologically friendly techniques to enhance crop production and its nutritional value are still needed. The root microbiome of S. ramosissima proved to be a source of taxonomically diverse plant growth-promoting bacteria (PGPB). Halotolerant strains of Bacillus, Salinicola, Pseudomonas, and Brevibacterium, among other genera, exhibit a broad spectrum of plant-growth-promotion traits [e.g., 3-indole acetic acid (IAA), 1-aminocyclopropane-1-carboxylic acid (ACC) deaminase, siderophores, phosphate solubilization, nitrogen fixation] and express a wide range of extracellular enzyme activities. In this work, three plant growth-promoting bacterial strains (Brevibacterium casei EB3, Pseudomonas oryzihabitans RL18, and Bacillus aryabhattai SP20) isolated from the rhizosphere and the endosphere of S. ramosissima roots from different saltmarshes along the Portuguese coast were inoculated into S. ramosissima seeds. Plants germinated from inoculated seeds were grown for three months in pots filled with a mixture of perlite and estuarine sediment (1:1) under greenhouse conditions and later transferred to a growth chamber, where they were maintained for two months with controlled photoperiod, temperature, and humidity. Pots were placed on trays containing the irrigation solution (20% Hoagland’s solution supplemented with 10‰ marine salt). Before reaching the flowering stage, plants were collected, and the fresh and dry weight of aerial parts was determined. Non-inoculated seeds were used as a negative control. 
Selected dried stems from the most promising treatments were later analyzed by GC-TOF-MS for primary metabolite composition. The efficiency of inoculation and the persistence of the inoculum were assessed by next-generation sequencing. Inoculation with the single strain EB3 and co-inoculations with EB3+RL18 and EB3+RL18+SP20 (the "All" treatment) resulted in significantly higher biomass production (fresh and dry weight) compared to non-inoculated plants. Considering fresh weight alone, inoculation with isolates SP20 and RL18 also had a significant positive effect. Combined inoculation with the consortia SP20+EB3 or SP20+RL18 did not significantly improve biomass production. The analysis of the primary metabolite profile will provide clues about the mechanisms by which the growth-enhancement effect of the inoculants operates in the plants. These results support promising prospects for the use of rhizospheric and endophytic PGPB as biofertilizers, reducing the environmental impacts and operational costs of agrochemicals and contributing to the sustainability and cost-effectiveness of saline agriculture. Acknowledgments: This work was supported by project Rhizomis PTDC/BIA-MIC/29736/2017, financed by Fundação para a Ciência e Tecnologia (FCT) through the Regional Operational Program of the Center (02/SAICT/2017) with FEDER funds (European Regional Development Fund, FNR, and OE), and by FCT through CESAM (UIDP/50017/2020 + UIDB/50017/2020) and LAQV-REQUIMTE (UIDB/50006/2020). We also acknowledge FCT/FSE for the financial support to Maria João Ferreira through a PhD grant (PD/BD/150363/2019). We are grateful to Horta dos Peixinhos for their help and support during sampling and seed collection. 
We also thank Glória Pinto for her collaboration in providing access to the growth chambers during the final months of the experiment, and Enrique Mateos-Naranjo and Jennifer Mesa-Marín of the Departamento de Biología Vegetal y Ecología, University of Sevilla, for their advice regarding the growth of Salicornia plants under greenhouse conditions.
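The biomass result reported above amounts to testing whether mean fresh (or dry) weight differs between inoculated plants and non-inoculated controls. A minimal sketch of such a comparison with Welch's t-test; the weights below are illustrative made-up values, not the study's data, and the abstract does not state which test the authors used:

```python
# Hypothetical per-plant fresh weights (g); NOT the study's measurements.
from scipy import stats

control = [1.8, 2.1, 1.9, 2.0, 1.7, 2.2]   # non-inoculated plants
eb3 = [2.6, 2.9, 2.7, 3.1, 2.8, 2.5]       # plants inoculated with strain EB3

# Welch's t-test (no equal-variance assumption between treatments).
t, p = stats.ttest_ind(eb3, control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
if p < 0.05:
    print("inoculation significantly changed mean biomass")
```

With more than two treatments (EB3, RL18, SP20, consortia), a one-way ANOVA followed by a post-hoc test would be the analogous choice.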

Keywords: halophytes, PGPB, rhizosphere engineering, biofertilizers, primary metabolite profiling, plant inoculation, Salicornia ramosissima

Procedia PDF Downloads 157
3 Glycyrrhizic Acid Inhibits Lipopolysaccharide-Stimulated Bovine Fibroblast-Like Synoviocyte Invasion through Suppression of TLR4/NF-κB-Mediated Matrix Metalloproteinase-9 Expression

Authors: Hosein Maghsoudi

Abstract:

Rheumatoid arthritis (RA) is a progressive inflammatory autoimmune disease that primarily affects the joints. It is characterized by synovial hyperplasia, inflammatory cell infiltration, and deformed and painful joints, and can lead to tissue destruction, functional disability, systemic complications, early death, and considerable socioeconomic costs. The cause of rheumatoid arthritis is unknown, but genetic and environmental factors are contributory, and the prognosis is guarded. However, advances in understanding the pathogenesis of the disease have fostered the development of new therapeutics, with improved outcomes. The current treatment strategy, which reflects this progress, is to initiate aggressive therapy soon after diagnosis and to escalate the therapy, guided by an assessment of disease activity, in pursuit of clinical remission. The pathobiology of RA is multifaceted and involves T cells, B cells, fibroblast-like synoviocytes (FLSs), and the complex interaction of many pro-inflammatory cytokines. Novel biologic agents that target tumor necrosis factor or interleukin (IL)-1 and IL-6, in addition to T- and B-cell inhibitors, have resulted in favorable clinical outcomes in patients with RA. Despite this, at least 30% of RA patients are resistant to available therapies, suggesting that novel mediators should be identified that can target other disease-specific pathways or cell lineages. Among the inflammatory cell populations that might participate in RA pathogenesis, FLSs are crucial in initiating and driving RA, damaging cartilage and bone by secreting matrix metalloproteinases (MMPs) into the synovial fluid and by direct invasion into the extracellular matrix (ECM), further exacerbating joint damage. Invasion by FLSs is thus critical in the pathogenesis of rheumatoid arthritis. MMPs and activation of the Toll-like receptor 4 (TLR4)/nuclear factor-κB (NF-κB) pathway play a critical role in RA-FLS invasion induced by lipopolysaccharide (LPS). 
The present study aimed to explore the anti-invasion activity of glycyrrhizic acid, a pharmacologically safe phytochemical agent with potent anti-inflammatory properties acting on IL-1beta and TNF-alpha signalling pathways, in bovine fibroblast-like synoviocytes ex vivo, examining LPS-stimulated bovine FLS migration and invasion as well as MMP expression, and to explore the upstream signal transduction. Results showed that glycyrrhizic acid suppressed LPS-stimulated bovine FLS migration and invasion by inhibiting MMP-9 expression and activity. In addition, our results revealed that glycyrrhizic acid inhibited the transcriptional activity of MMP-9 by suppressing the binding activity of NF-κB in the MMP-9 promoter. The extract of licorice (Glycyrrhiza glabra L.) has been widely used for many centuries in traditional Chinese medicine as a native anti-allergic agent. Glycyrrhizin (GL), a triterpenoid saponin extracted from the roots of licorice, is the most effective compound for inflammation and allergic diseases in the human body. Biological and pharmacological studies have revealed that GL possesses many pharmacological effects, such as anti-inflammatory, anti-viral, and liver-protective effects, and biological effects such as the induction of cytokines (interferon-γ and IL-12) and chemokines, as well as extrathymic T and anti-type 2 T cells. GL is known in traditional Chinese medicine for its anti-inflammatory effect, originally described by Finney in 1959. The mechanism of the GL-induced anti-inflammatory effect involves several pathways: selective inhibition of prostaglandin E2 production, CK-II-mediated activation of both the GL-binding lipoxygenase (gbLOX) and PLA2, an anti-thrombin action of GL, and production of reactive oxygen species (ROS). GL exerts liver-protective properties by inhibiting PLA2 or by hydroxyl radical trapping, leading to the lowering of serum alanine and aspartate transaminase levels. 
The present study was undertaken to examine the possible mechanism of the anti-inflammatory properties of GL on IL-1beta and TNF-alpha signalling pathways in bovine fibroblast-like synoviocytes ex vivo, on LPS-stimulated bovine FLS migration and invasion as well as MMP expression, and to explore the upstream signal transduction. Our results clearly showed that treatment of bovine fibroblast-like synoviocytes with GL suppressed LPS-induced cell migration and invasion. Furthermore, GL inhibited the transcriptional activity of MMP-9 by suppressing the binding activity of NF-κB in the MMP-9 promoter. MMP-9 is an important ECM-degrading enzyme, and overexpression of MMPs is important in RA-FLSs. LPS can stimulate bovine FLSs to secrete MMPs, and this induction is regulated at the transcriptional and translational levels. In this study, LPS treatment of bovine FLSs caused an increase in MMP-2 and MMP-9 levels. The increase in MMP-9 expression and secretion was inhibited by GL ex vivo. Furthermore, these effects were mimicked by MMP-9 siRNA. These results therefore indicate that the inhibition of LPS-induced bovine FLS invasion by GL occurs primarily through inhibition of MMP-9 expression and activity. Next, we analyzed the functional significance of NF-κB in the transcriptional activation of MMP-9 in bovine FLSs. Results from EMSA showed that GL suppressed LPS-induced NF-κB binding to the MMP-9 promoter. As NF-κB regulates transcriptional activation of multiple inflammatory cytokines, we predicted that GL might target NF-κB to suppress LPS-induced MMP-9 transcription. Myeloid differentiation factor 88 (MyD88) and TIR-domain-containing adaptor protein (TIRAP) are critical proteins in the LPS-induced NF-κB and apoptotic signaling pathways; GL inhibited the expression of TLR4 and MyD88. These results demonstrate that GL suppresses LPS-induced MMP-9 expression through inhibition of the induced TLR4/NF-κB signaling pathway. 
Taken together, our results provide evidence that GL exerts anti-inflammatory effects by inhibiting LPS-induced bovine FLS migration and invasion, and the mechanisms may involve the suppression of TLR4/NF-κB-mediated MMP-9 expression. Although further work is needed to clarify the complicated mechanism of GL-induced anti-invasion of bovine FLSs, GL might be used as an anti-invasion drug with therapeutic efficacy in the treatment of immune-mediated inflammatory diseases such as RA.

Keywords: glycyrrhizic acid, bovine fibroblast-like synoviocyte, tlr4/nf-κb, metalloproteinase-9

Procedia PDF Downloads 390
2 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, aid thermoregulation, and impart resonance to the voice, among other roles. Thus, the real function of the MS is still uncertain. Furthermore, MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and when possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Currently, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, and shape. 
The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. In the statistical comparison between the two methods, linear regression showed a strong association and low dispersion between variables. The Bland-Altman analyses showed no significant differences between the methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
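The core steps described above (seed-based region growing, conversion of the segmented mask to a volume, and Jaccard validation against a manual mask) can be sketched as follows. The original tool was implemented in Matlab® with an SVM seed detector; this NumPy version uses a hand-placed seed, a synthetic toy volume, and an intensity tolerance that are illustrative assumptions only:

```python
# Sketch of region-growing segmentation + volume + Jaccard validation.
from collections import deque

import numpy as np


def region_grow(volume, seed, tol):
    """Grow a 6-connected region from `seed`, accepting voxels whose
    intensity lies within `tol` of the seed intensity."""
    mask = np.zeros(volume.shape, dtype=bool)
    seed_val = volume[seed]
    mask[seed] = True
    queue = deque([seed])
    neighbours = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                  (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in neighbours:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2]
                    and not mask[nz, ny, nx]
                    and abs(volume[nz, ny, nx] - seed_val) <= tol):
                mask[nz, ny, nx] = True
                queue.append((nz, ny, nx))
    return mask


def mask_volume_ml(mask, spacing_mm):
    """Convert a voxel count into millilitres using the voxel spacing."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return mask.sum() * voxel_mm3 / 1000.0


def jaccard(a, b):
    """Jaccard similarity |A ∩ B| / |A ∪ B| between two binary masks."""
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()


# Synthetic CT-like volume: an air-filled cavity (≈ -1000 HU) inside bone.
vol = np.full((20, 20, 20), 300.0)
vol[5:15, 5:15, 5:15] = -1000.0              # the toy "sinus"
auto = region_grow(vol, seed=(10, 10, 10), tol=100.0)
manual = vol < -500                          # stand-in for a radiologist's mask
print(auto.sum())                            # 1000 voxels
print(mask_volume_ml(auto, (0.5, 0.5, 0.5)))  # 0.125 ml
print(jaccard(auto, manual))                 # 1.0 on this toy volume
```

On real exams the two masks differ at the boundaries, which is why the study reports Jaccard values > 0.90 rather than exactly 1.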

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 503
1 “MaxSALIVA”: A Nano-Sized Dual-Drug Delivery System for Salivary Gland Radioprotection and Repair in Head and Neck Cancer

Authors: Ziyad S. Haidar

Abstract:

Background: Saliva plays a major role in maintaining oral and dental health (and, consequently, general health and well-being), normally bathing the oral cavity and acting as a clearing agent. This becomes more apparent when the amount and quality of saliva are significantly reduced due to medications, salivary gland neoplasms, disorders such as Sjögren’s syndrome, and especially ionizing radiation therapy for tumors of the head and neck, the fifth most common malignancy worldwide, during which the salivary glands are included within the radiation field or zone. Clinically, patients affected by salivary gland dysfunction often opt to terminate their radiotherapy course prematurely because they become malnourished and experience a significant decrease in their quality of life. Accordingly, the development of an alternative treatment to restore or regenerate damaged salivary gland tissue is eagerly awaited. Likewise, the formulation of a radioprotection modality and early damage prevention strategy is also highly desirable. Objectives: To assess the pre-clinical radio-protective effect as well as the reparative/regenerative potential of layer-by-layer self-assembled lipid-polymer-based core-shell nanocapsules designed and fine-tuned in this experimental work for the sequential (ordered) release of dual cytokines, following a single local administration (direct injection) into a murine sub-mandibular salivary gland model of irradiation. Methods: The formulated core-shell nanocapsules were characterized physically, chemically, and mechanically, pre- and post-loading with the drugs (in solution and powder formats), followed by optimization of the pharmacokinetic profile. Then, nanosuspensions were administered directly into the salivary glands 24 h pre-irradiation (PBS, unloaded nanocapsules, and individual and combined vehicle-free cytokines were injected into the control glands for an in-depth comparative analysis). 
The head-and-neck region of C57BL/6 mice was then exposed to external irradiation at an elevated dose of 18 Gy (revised from our previous 15 Gy model). Salivary flow rate (un-stimulated) and salivary protein content/excretion were regularly assessed using an enzyme-linked immunosorbent assay (over a 3-month period). Histological and histomorphometric evaluation and apoptosis/proliferation analysis, followed by local versus systemic bio-distribution and immuno-histochemical assays, were then performed on all harvested major organs (at the distinct experimental end-points). Results: Monodisperse, stable, and cytocompatible nanocapsules resulted, capable of maintaining the bioactivity of the encapsulant within the different compartments of the core and shell, and with controlled/customizable pharmacokinetics, as illustrated in the graphical abstract (Figure) below. The experimental animals demonstrated a significant increase in salivary flow rates when compared to the controls. Herein, salivary protein content was comparable to the pre-irradiation (baseline) level. Histomorphometry further confirmed the biocompatibility of the nanocapsules and their localization, in vivo, at the site of injection. Acinar cells showed fewer vacuoles and less nuclear aberration in the experimental group, while the amount of mucin was higher in controls. Overall, fewer apoptotic activities were detected by a Terminal deoxynucleotidyl Transferase (TdT) dUTP Nick-End Labeling (TUNEL) assay, and proliferative rates were similar to the controls, suggesting an interesting reparative and regenerative potential for irradiation-damaged/dysfunctional salivary glands. The Figure below exemplifies some of these findings. Conclusions: A biocompatible, reproducible, and customizable self-assembling layer-by-layer core-shell delivery system is formulated and presented. 
Our findings suggest that localized sequential bioactive delivery of dual cytokines (in a specific dose and order) can prevent irradiation-induced damage by reducing apoptosis and also has the potential to promote in situ proliferation of salivary gland cells; maxSALIVA is scalable (Good Manufacturing Practice or GMP production for human clinical trials) and patent-pending.
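The "sequential (ordered) release" of the two cytokines can be pictured with a simple first-order release model: a shell-loaded drug with a fast release constant empties first, while a core-loaded drug with a slow constant follows, so the two release windows are ordered in time. This is a conceptual sketch only; the rate constants below are hypothetical and not measured values from this work:

```python
# Toy first-order release model for a dual-loaded core-shell carrier.
import math


def released_fraction(t_hours, k_per_hour):
    """Cumulative fraction released under first-order kinetics:
    f(t) = 1 - exp(-k * t)."""
    return 1.0 - math.exp(-k_per_hour * t_hours)


# Hypothetical constants: shell-loaded cytokine releases ~10x faster.
k_shell, k_core = 0.30, 0.03

for t in (6, 24, 72):  # hours post-injection
    print(f"t = {t:3d} h  shell: {released_fraction(t, k_shell):.2f}  "
          f"core: {released_fraction(t, k_core):.2f}")
```

Under these assumed constants the shell payload is nearly exhausted within a day while the core payload is still releasing at 72 h, which is the kind of ordered profile the nanocapsule design aims for.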

Keywords: saliva, head and neck cancer, nanotechnology, controlled drug delivery, xerostomia, mucositis, biopolymers, innovation

Procedia PDF Downloads 85