Search results for: axial and lateral resolution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2651

11 Finite Element Method (FEM) Simulation, Design, and 3D Printing of a Novel Highly Integrated PV-TEG Device with Improved Solar Energy Harvest Efficiency

Authors: Jaden Lu, Olivia Lu

Abstract:

Despite the remarkable advancement of solar cell technology, the challenge of optimizing total solar energy harvest efficiency persists, primarily due to significant heat loss. This excess heat not only diminishes solar panel output efficiency but also curtails its operational lifespan. A promising approach to address this issue is the conversion of surplus heat into electricity. In recent years, there has been growing interest in the use of thermoelectric generators (TEG) as a potential solution. The integration of efficient TEG devices holds the promise of augmenting overall energy harvest efficiency while prolonging the longevity of solar panels. While certain research groups have proposed the integration of solar cells and TEG devices, a substantial gap between conceptualization and practical implementation remains, largely attributed to the low thermal energy conversion efficiency of TEG devices. To bridge this gap and meet the requisites of practical application, a feasible strategy involves the incorporation of a substantial number of p-n junctions within a confined unit volume. However, the manufacturing of high-density TEG p-n junctions presents a formidable challenge. The prevalent solution often leads to large device sizes to accommodate enough p-n junctions, consequently complicating integration with solar cells. Recently, the adoption of 3D printing technology has emerged as a promising solution to this challenge by fabricating high-density p-n arrays. Despite this, further developmental efforts are necessary. Presently, the primary focus is on the 3D printing of vertically layered TEG devices, wherein p-n junction density remains constrained by spatial limitations and the constraints of 3D printing techniques. This study proposes a novel device configuration featuring horizontally arrayed p-n junctions of Bi2Te3. The structural design of the device is simulated using the Finite Element Method (FEM) in COMSOL Multiphysics software. Various device configurations are simulated to identify the optimal device structure. Based on the simulation results, a new TEG device is fabricated using 3D selective laser melting (SLM) printing technology. Fusion 360 facilitates the translation of the COMSOL device structure into a 3D print file. The horizontal design offers a unique advantage, enabling the fabrication of densely packed, three-dimensional p-n junction arrays. The fabrication process entails printing a single row of horizontal p-n junctions using the 3D SLM printing technique in a single layer. Subsequently, successive rows of p-n junction arrays are printed within the same layer, interconnected by thermally conductive copper. This sequence is replicated across multiple layers, separated by thermally insulating glass. This process yields a highly compact three-dimensional TEG device with high-density p-n junctions. The fabricated TEG device is then attached to the bottom of the solar cell using thermal glue. The whole device is characterized, with output data closely matching the COMSOL simulation results. Future research endeavors will encompass the refinement of thermoelectric materials. This includes the advancement of high-resolution 3D printing techniques tailored to diverse thermoelectric materials, along with the optimization of material microstructures such as porosity and doping. The objective is to achieve an optimal and highly integrated PV-TEG device that can substantially increase solar energy harvest efficiency.
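To make the device-level figures concrete, the following is a minimal back-of-the-envelope sketch of how the matched-load output of a series-connected Bi2Te3 p-n junction array can be estimated from the Seebeck effect. The Seebeck coefficient, junction resistance, junction count, and temperature difference are illustrative assumptions, not values reported in the study.

```python
# Rough estimate of TEG output for a series array of Bi2Te3 p-n junctions.
# All parameter values are illustrative assumptions, not data from the study.

SEEBECK_PN = 400e-6   # combined |S_p| + |S_n| per junction, V/K (typical Bi2Te3 order)
R_JUNCTION = 0.01     # internal electrical resistance per junction, ohm (assumed)
N_JUNCTIONS = 500     # junctions packed into the unit volume (assumed)
DT = 40.0             # temperature difference across the device, K (assumed)

def teg_power_matched_load(n, seebeck, r_junc, dt):
    """Maximum power transfer occurs when the load matches the internal
    resistance: P_max = V_oc**2 / (4 * R_internal), with V_oc = n * S * dT
    for n junctions in series."""
    v_oc = n * seebeck * dt
    r_int = n * r_junc
    return v_oc ** 2 / (4.0 * r_int)

print(f"Open-circuit voltage: {N_JUNCTIONS * SEEBECK_PN * DT:.2f} V")
print(f"Matched-load power:   {teg_power_matched_load(N_JUNCTIONS, SEEBECK_PN, R_JUNCTION, DT):.2f} W")
```

The sketch shows why packing more junctions into a unit volume matters: open-circuit voltage scales linearly with the junction count, so matched-load power grows with it as well.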

Keywords: thermoelectric, finite element method, 3D printing, energy conversion

Procedia PDF Downloads 35
10 Pharmacophore-Based Modeling of a Series of Human Glutaminyl Cyclase Inhibitors to Identify Lead Molecules by Virtual Screening, Molecular Docking and Molecular Dynamics Simulation Study

Authors: Ankur Chaudhuri, Sibani Sen Chakraborty

Abstract:

In humans, glutaminyl cyclase activity is highly abundant in neuronal and secretory tissues and is preferentially restricted to the hypothalamus and pituitary. The N-terminal modification of β-amyloid (Aβ) peptides by the generation of pyro-glutamyl (pGlu)-modified Aβs (pE-Aβs) is an important process in the initiation of the formation of neurotoxic plaques in Alzheimer’s disease (AD). This process is catalyzed by glutaminyl cyclase (QC). The expression of QC is characteristically up-regulated in the early stage of AD, and the hallmark of the inhibition of QC is the prevention of the formation of pE-Aβs and plaques. A computer-aided drug design (CADD) process was employed to guide the design of potentially active compounds and to understand their inhibitory potency against human glutaminyl cyclase (QC). This work elaborates on the ligand-based and structure-based pharmacophore exploration of glutaminyl cyclase (QC) using the known inhibitors. Three-dimensional (3D) quantitative structure-activity relationship (QSAR) methods were applied to 154 compounds with known IC50 values. All the inhibitors were divided into two sets: a training set and a test set. The training set was used to build the quantitative pharmacophore model based on the principle of structural diversity, whereas the test set was employed to evaluate the predictive ability of the pharmacophore hypotheses. A chemical feature-based pharmacophore model was generated from the 92 known training-set compounds with the HypoGen module implemented in the Discovery Studio 2017 R2 software package. The best hypothesis (Hypo1) was selected based upon the highest correlation coefficient (0.8906), lowest total cost (463.72), and lowest root mean square deviation (2.24 Å). The highest correlation coefficient value indicates greater predictive activity of the hypothesis, whereas a lower root mean square deviation signifies a small deviation of experimental activity from the predicted one. The best pharmacophore model (Hypo1) comprised four features: two hydrogen bond acceptors, one hydrogen bond donor, and one hydrophobic feature. Hypo1 was validated by several parameters, such as test-set activity prediction, cost analysis, Fischer's randomization test, the leave-one-out method, and the ligand profiler heat map. The predicted features were then used for virtual screening of potential compounds from the NCI, ASINEX, Maybridge, and Chembridge databases. More than seven million compounds were used for this purpose. The hit compounds were filtered by drug-likeness and pharmacokinetic properties. The selected hits were docked to the high-resolution three-dimensional structure of the target protein glutaminyl cyclase (PDB ID: 2AFU/2AFW) to filter them further. To validate the molecular docking results, the most active compound from the dataset was selected as a reference molecule. From a density functional theory (DFT) study, ten molecules were selected based on their highest HOMO (highest occupied molecular orbital) energies and lowest band-gap values. Molecular dynamics simulations with explicit solvation systems of the final ten hit compounds revealed that a large number of non-covalent interactions were formed with the binding site of human glutaminyl cyclase. It is suggested that the hit compounds reported in this study could help in the future design of potent lead inhibitors against human glutaminyl cyclase.
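The drug-likeness filtering described above was performed within Discovery Studio; as a rough open-source analogue of that step, a sketch using RDKit and Lipinski's rule of five might look like the following. The SMILES strings are placeholders, not actual screening hits from the study.

```python
# Drug-likeness pre-filter for virtual-screening hits (Lipinski's rule of five).
# A rough open-source stand-in for the Discovery Studio step described in the
# abstract; thresholds follow the standard rule of five.
from rdkit import Chem
from rdkit.Chem import Descriptors

def passes_lipinski(smiles: str) -> bool:
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:          # unparsable structure: reject
        return False
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Descriptors.NumHDonors(mol) <= 5
            and Descriptors.NumHAcceptors(mol) <= 10)

# Placeholder structures; a real run would stream millions of database compounds.
hits = ["CCOC(=O)c1ccccc1N", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]
print([s for s in hits if passes_lipinski(s)])
```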

Keywords: glutaminyl cyclase, hit lead, pharmacophore model, simulation

Procedia PDF Downloads 110
9 Development of Anti-Fouling Surface Features Bioinspired by the Patterned Micro-Textures of the Scophthalmus rhombus (Brill)

Authors: Ivan Maguire, Alan Barrett, Alex Forte, Sandra Kwiatkowska, Rohit Mishra, Jens Ducrèe, Fiona Regan

Abstract:

Biofouling is defined as the gradual accumulation of unwanted organisms on submerged surfaces. Biomimetics refers to the use and imitation of principles copied from nature and has found interest across many commercial disciplines. Among many biological objects and their functions, aquatic animals deserve special attention due to their antimicrobial capabilities resulting from chemical composition, surface topography, or other behavioural defences, which can be used as an inspiration for antifouling technology. Marine biofouling has detrimental effects on seagoing vessels, both commercial and leisure, as well as on oceanographic sensors, offshore drilling rigs, and aquaculture installations. Sensor optics, membranes, housings, and platforms can become fouled, leading to problems with sensor performance and data integrity. While many anti-fouling solutions are currently being investigated as a cost-cutting measure, biofouling settlement may also be prevented by creating a surface that does not satisfy the settlement conditions. Brill (Scophthalmus rhombus) is a small flatfish occurring in marine waters of the Mediterranean as well as Norway and Iceland. It inhabits sandy and muddy coastal waters from 5 to 80 meters. Its skin colour changes depending on its environment but is generally brownish with light and dark freckles and a creamy underside. Brill is oval in shape, and its flesh is white. The aim of this study is to translate the unique micro-topography of the brill scale into a marine-inspired biomimetic surface coating and to test it against a typical fouling organism. Following an extensive SEM study of the scale topography of the brill fish (Scophthalmus rhombus) and the settlement behaviour of the diatom species Psammodictyon sp., two state-of-the-art antifouling surface solutions were designed and investigated: a brill fish scale bioinspired surface pattern platform (BFD), and a generic, uniformly arrayed circular micropillar platform (MPD), with offsets based on diatom species settlement behaviour. The BFD approach consists of different ~5 μm by ~90 μm brill-replica patterns, grown to a 5 μm height, in a linear array pattern. The MPD approach utilises hexagonally packed cylindrical pillars 10.6 μm in diameter, grown to a height of 5 μm, with a vertical offset of 15 μm and a horizontal offset of 26.6 μm. Photolithography was employed for microstructure growth, with a polydimethylsiloxane (PDMS) chip used as a testbed for diatom adhesion on both platforms. Settlement and adhesion tests were performed by subjecting this PDMS microfluidic chip to centrifugal force via an in-house developed ‘spin-stand’, which features a motor in combination with a high-resolution camera for real-time observation of diatom release from the PDMS material. Diatom adhesion strength can therefore be determined from the centrifugal force generated at varying rotational speeds. It is hoped that both the replica and bio-inspired solutions will give comparable anti-fouling results to existing synthetic surfaces, whilst also helping to determine whether anti-fouling solutions should predominantly pursue fully bioreplica-based designs or bioinspired, synthetically based ones.
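The adhesion-strength measurement behind the spin-stand test reduces to the centrifugal force on a cell, F = m * omega^2 * r. A minimal sketch of that arithmetic follows; the spin speed, radial position, and cell properties are illustrative assumptions, not measured values from the study.

```python
# Centrifugal detachment force on a diatom during a spin-stand test.
# F = m * omega**2 * r ; all physical values below are illustrative assumptions.
import math

RPM = 3000.0              # spindle speed at which release is observed (assumed)
RADIUS = 0.03             # radial position of the cell on the chip, m (assumed)
CELL_VOLUME = 1e-15       # diatom volume ~ (10 um)**3, m^3 (order of magnitude)
DENSITY_DIFF = 100.0      # excess density over seawater, kg/m^3 (assumed)

omega = 2.0 * math.pi * RPM / 60.0     # angular velocity, rad/s
mass = CELL_VOLUME * DENSITY_DIFF      # effective (buoyancy-corrected) mass, kg
force = mass * omega ** 2 * RADIUS     # detachment force at release, N

print(f"omega = {omega:.1f} rad/s, detachment force = {force:.3e} N")
```

Sweeping the rotational speed and recording the RPM at which each cell detaches converts the camera observations into a per-cell adhesion force in the piconewton-to-nanonewton range.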

Keywords: anti-fouling applications, bio-inspired microstructures, centrifugal microfluidics, surface modification

Procedia PDF Downloads 289
8 ARGO: An Open-Design Unmanned Surface Vehicle Autonomous Mapping Platform

Authors: Papakonstantinou Apostolos, Argyrios Moustakas, Panagiotis Zervos, Dimitrios Stefanakis, Manolis Tsapakis, Nektarios Spyridakis, Mary Paspaliari, Christos Kontos, Antonis Legakis, Sarantis Houzouris, Konstantinos Topouzelis

Abstract:

For years unmanned and remotely operated robots have been used as tools in industry research and education. The rapid development and miniaturization of sensors that can be attached to remotely operated vehicles in recent years allowed industry leaders and researchers to utilize them as an affordable means for data acquisition in air, land, and sea. Despite the recent developments in the ground and unmanned airborne vehicles, a small number of Unmanned Surface Vehicle (USV) platforms are targeted for mapping and monitoring environmental parameters for research and industry purposes. The ARGO project is developed an open-design USV equipped with multi-level control hardware architecture and state-of-the-art sensors and payloads for the autonomous monitoring of environmental parameters in large sea areas. The proposed USV is a catamaran-type USV controlled over a wireless radio link (5G) for long-range mapping capabilities and control for a ground-based control station. The ARGO USV has a propulsion control using 2x fully redundant electric trolling motors with active vector thrust for omnidirectional movement, navigation with opensource autopilot system with high accuracy GNSS device, and communication with the 2.4Ghz digital link able to provide 20km of Line of Sight (Los) range distance. The 3-meter dual hull design and composite structure offer well above 80kg of usable payload capacity. Furthermore, sun and friction energy harvesting methods provide clean energy to the propulsion system. The design is highly modular, where each component or payload can be replaced or modified according to the desired task (industrial or research). The system can be equipped with Multiparameter Sonde, measuring up to 20 water parameters simultaneously, such as conductivity, salinity, turbidity, dissolved oxygen, etc. Furthermore, a high-end multibeam echo sounder can be installed in a specific boat datum for shallow water high-resolution seabed mapping. The system is designed to operate in the Aegean Sea. The developed USV is planned to be utilized as a system for autonomous data acquisition, mapping, and monitoring bathymetry and various environmental parameters. ARGO USV can operate in small or large ports with high maneuverability and endurance to map large geographical extends at sea. The system presents state of the art solutions in the following areas i) the on-board/real-time data processing/analysis capabilities, ii) the energy-independent and environmentally friendly platform entirely made using the latest aeronautical and marine materials, iii) the integration of advanced technology sensors, all in one system (photogrammetric and radiometric footprint, as well as its connection with various environmental and inertial sensors) and iv) the information management application. The ARGO web-based application enables the system to depict the results of the data acquisition process in near real-time. All the recorded environmental variables and indices are presented, allowing users to remotely access all the raw and processed information using the implemented web-based GIS application.
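The quoted 20 km line-of-sight figure depends strongly on antenna heights. A minimal sketch of the standard 4/3-Earth radio-horizon estimate is shown below; the antenna heights are illustrative assumptions, not ARGO specifications.

```python
# Radio line-of-sight range between the USV and the ground control station,
# using the standard 4/3-Earth radio-horizon approximation:
#   d_km ~= 4.12 * (sqrt(h1_m) + sqrt(h2_m))
# Antenna heights below are illustrative assumptions, not ARGO specifications.
import math

def radio_horizon_km(h_tx_m: float, h_rx_m: float) -> float:
    return 4.12 * (math.sqrt(h_tx_m) + math.sqrt(h_rx_m))

usv_antenna = 2.0       # mast height above the waterline, m (assumed)
shore_antenna = 15.0    # ground-station antenna height, m (assumed)
print(f"Estimated LoS range: {radio_horizon_km(usv_antenna, shore_antenna):.1f} km")
```

With these assumed heights the estimate comes out near 22 km, consistent with the ~20 km LoS range claimed for the 2.4 GHz link.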

Keywords: monitor marine environment, unmanned surface vehicle, mapping bathymetry, sea environmental monitoring

Procedia PDF Downloads 89
7 Risks for Cyanobacteria Harmful Algal Blooms in Georgia Piedmont Waterbodies Due to Land Management and Climate Interactions

Authors: Sam Weber, Deepak Mishra, Susan Wilde, Elizabeth Kramer

Abstract:

The frequency and severity of cyanobacteria harmful algal blooms (CyanoHABs) have been increasing over time, with point and non-point source eutrophication and shifting climate paradigms being blamed as the primary culprits. Excessive nutrients, warm temperatures, quiescent water, and heavy and less regular rainfall create more conducive environments for CyanoHABs. CyanoHABs have the potential to produce a spectrum of toxins that cause gastrointestinal stress, organ failure, and even death in humans and animals. To promote enhanced, proactive CyanoHAB management, risk modeling using geospatial tools can act as a predictive mechanism to supplement current CyanoHAB monitoring, management, and mitigation efforts. The risk maps would empower water managers to focus their efforts on high-risk waterbodies in an attempt to prevent CyanoHABs before they occur, and/or to observe those waterbodies more diligently. For this research, exploratory spatial data analysis techniques were used to identify the strongest predictors of CyanoHABs based on remote sensing-derived cyanobacteria cell density values for 771 waterbodies in the Georgia Piedmont and landscape characteristics of their watersheds. In-situ datasets for cyanobacteria cell density, nutrients, temperature, and rainfall patterns are not widely available, so free gridded geospatial datasets were used as proxy variables for assessing CyanoHAB risk. For example, the percent of a watershed that is agriculture was used as a proxy for nutrient loading, and the summer precipitation within a watershed was used as a proxy for water quiescence. Cyanobacteria cell density values were calculated using atmospherically corrected images from the European Space Agency’s Sentinel-2A satellite’s MultiSpectral Instrument (MSI) at a 10-meter ground resolution. Seventeen explanatory variables were calculated for each watershed utilizing the multi-petabyte geospatial catalogs available within the Google Earth Engine cloud computing interface. The seventeen variables were then used in a multiple linear regression model, and the strongest predictors of cyanobacteria cell density were selected for the final regression model. The seventeen explanatory variables included land cover composition, winter and summer temperature and precipitation data, topographic derivatives, vegetation index anomalies, and soil characteristics. Watershed maximum summer temperature, percent agriculture, percent forest, percent impervious, and waterbody area emerged as the strongest predictors of cyanobacteria cell density, with an adjusted R-squared value of 0.31 and a p-value ≈ 0. The final regression equation was used to make a normalized cyanobacteria cell density index, and a Jenks natural breaks classification was used to assign waterbodies designations of low, medium, or high risk. Of the 771 waterbodies, 24.38% were low risk, 37.35% were medium risk, and 38.26% were high risk. This study showed that there are significant relationships between cyanobacteria cell density and free geospatial datasets representing summer maximum temperatures, nutrient loading associated with land use and land cover, and waterbody area. This data analytics approach to CyanoHAB risk assessment corroborated the literature-established environmental triggers for CyanoHABs and presents a novel approach for CyanoHAB risk mapping in waterbodies across the greater southeastern United States.
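A minimal sketch of the final risk-mapping step (multiple linear regression, a normalized index, and a three-class assignment) is shown below, using synthetic placeholder data. The study used a Jenks natural-breaks classification; tercile cut points stand in for the Jenks breaks here to keep the sketch dependency-free.

```python
# Sketch of the final risk-mapping step: fit a multiple linear regression of
# cyanobacteria cell density on watershed predictors, normalize the prediction
# into an index, and bin waterbodies into low/medium/high risk. The study used
# Jenks natural breaks; terciles are a simple stand-in. Data are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 771
# Columns: max summer temp, % agriculture, % forest, % impervious, waterbody area
X = rng.random((n, 5))
y = X @ np.array([0.5, 0.4, -0.3, 0.2, 0.1]) + rng.normal(0, 0.3, n)

model = LinearRegression().fit(X, y)
index = model.predict(X)
index = (index - index.min()) / (index.max() - index.min())  # normalize to [0, 1]

cuts = np.quantile(index, [1 / 3, 2 / 3])   # stand-in for Jenks natural breaks
risk = np.digitize(index, cuts)             # 0 = low, 1 = medium, 2 = high
print({label: int((risk == k).sum()) for k, label in enumerate(["low", "medium", "high"])})
```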

Keywords: cyanobacteria, land use/land cover, remote sensing, risk mapping

Procedia PDF Downloads 183
6 Clinical Course and Prognosis of Cutaneous Manifestations of COVID-19: A Systematic Review of Reported Cases

Authors: Hilary Modir, Kyle Dutton, Michelle Swab, Shabnam Asghari

Abstract:

Since its emergence, the cutaneous manifestations of COVID-19 have been documented in the literature. However, the majority are case reports with significant limitations in appraisal quality, thus leaving the role of dermatological manifestations of COVID-19 erroneously underexplored. The primary aim of this review was to systematically examine clinical patterns of dermatological manifestations as reported in the literature. This study was designed as a systematic review of case reports. The inclusion criteria consisted of all reports and articles regarding COVID-19 published in English from September 1st, 2019, until June 22nd, 2020. The population consisted of confirmed cases of COVID-19 with associated cutaneous signs and symptoms. Exclusion criteria included research in planning stages, protocols, book reviews, news articles, review studies, and policy analyses. With the collaboration of a librarian, a search strategy was created consisting of a mixture of keyword terms and controlled vocabulary. Electronic databases searched were MEDLINE via PubMed, EMBASE, CINAHL, Web of Science, LILACS, PsycINFO, WHO Global Literature on Coronavirus Disease, Cochrane Library, Campbell Collaboration, Prospero, WHO International Clinical Trials Registry Platform, Australian and New Zealand Clinical Trials Registry, U.S. National Institutes of Health Ongoing Trials Register, AAD Registry, OSF Preprints, SSRN, MedRxiv, and BioRxiv. The study selection featured an initial pre-screening of titles and abstracts by one independent reviewer. Results were verified by re-examining a random sample of 1% of excluded articles. Eligible studies progressed to full-text review by two calibrated independent reviewers. Covidence was used to store and extract data, such as citation information and findings pertaining to COVID-19 and cutaneous signs and symptoms. Data analysis and summarization methodology reflect the framework proposed by PRISMA and recommendations set out by Cochrane and the Joanna Briggs Institute for conducting systematic reviews. The Oxford Centre for Evidence-Based Medicine's levels of evidence were used to appraise the quality of individual studies. The literature search revealed a total of 1221 articles. After abstract and full-text screening, only 95 studies met the eligibility criteria and proceeded to data extraction. Studies comprised case reports (58%) and case series (42%). A total of 833 manifestations were reported in 723 confirmed COVID-19 cases. The most frequent lesions were maculopapular (23%), urticarial (15%), and pseudo-chilblains (13%), with pruritus reported for 46% of lesions, erythema for 16%, pain for 14%, burning sensation for 12%, and edema for 4%. The most common lesion locations were the trunk (20%), lower limbs (19.5%), and upper limbs (17.7%). The time to resolution of lesions was between one and twenty-one days. In conclusion, over half of the reported cutaneous presentations in COVID-19-positive patients were maculopapular, urticarial, and pseudo-chilblains, with the majority of lesions distributed on the extremities and trunk. As this review's sample only contained COVID-19-confirmed cases with skin presentations, it is difficult to deduce a direct relationship between skin findings and COVID-19. However, acute onset of skin lesions, such as chilblains-like lesions, may be associated with COVID-19 or may warrant its consideration as part of the differential diagnosis.

Keywords: COVID-19, cutaneous manifestations, cutaneous signs, general dermatology, medical dermatology, SARS-CoV-2, skin and infectious disease, skin findings, skin manifestations

Procedia PDF Downloads 156
5 Identifying the Conservation Gaps in a Poorly Studied Protected Area in the Philippines: A Case Study of Sibuyan Island

Authors: Roven Tumaneng, Angelica Kristina Monzon, Ralph Sedricke Lapuz, Jose Don De Alban, Jennica Paula Masigan, Joanne Rae Pales, Laila Monera Pornel, Dennis Tablazon, Rizza Karen Veridiano, Jackie Lou Wenceslao, Edmund Leo Rico, Neil Aldrin Mallari

Abstract:

Most protected area management plans in the Philippines, particularly for the smaller and more remote islands, suffer from insufficient baseline data, which should provide the basis for formulating measurable conservation targets and appropriate management interventions for these protected areas. Attempts to synthesize available data, particularly on the cultural and socio-economic characteristics of local peoples within and outside protected areas, also suffer from the lack of comprehensive and detailed inventories, which should be considered in designing adaptive management interventions for those protected areas. Mt Guiting-guiting Natural Park (MGGNP), located on Sibuyan Island, is one of the poorly studied protected areas in the Philippines. In this study, we determined the highly biologically important areas of the protected area using the Maximum Entropy approach (MaxEnt), from environmental predictors (i.e., topographic, bioclimatic, land cover, and soil image layers) derived from global remotely sensed data and point occurrence data of species of birds and trees recorded during field surveys on the island. A total of 23 trigger species of birds and trees were modeled and stacked to generate species richness maps for biological high conservation value areas (HCVAs). Forest habitat change was delineated using dual-polarised L-band ALOS-PALSAR mosaic data at 25-meter spatial resolution, acquired in 2007 and 2009, to provide information on forest cover and habitat change on the island between those years. Livelihood guilds were determined using data gathered from 171 household interviews, from which demographic and livelihood variables were extracted (i.e., age, gender, number of household members, educational attainment, years of residency, distance from forest edge, main occupation, alternative sources of food and resources during scarcity months, and sources of these alternative resources). Using Principal Component Analysis (PCA) and the Kruskal-Wallis test, the diversity and patterns of forest resource use by people on the island were determined, with particular focus on the economic activities that directly and indirectly affect the populations of key species, as well as on the levels of forest resource use by people in different areas of the park. Results showed that there are gaps in the area occupied by the natural park, as evidenced by the mismatch between the proposed HCVAs and the existing perimeter of the park. We found that subsistence forest gathering was the likely main driver of forest degradation among the eight livelihood guilds identified in the park. Determining the high conservation value areas and identifying the anthropogenic factors that influence the richness and abundance of key species in the different management zones of MGGNP provide guidance for the design of a protected area management plan and future monitoring programs. Moreover, through intensive communication and consultation with government stakeholders and local communities, our results led to the setting of conservation targets in local development plans and serve as a basis for the repositioning of the boundaries and the reconfiguration of the management zones of MGGNP.
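A minimal sketch of the stacking step (combining per-species MaxEnt suitability surfaces into a richness map and flagging candidate HCVAs) follows. The rasters, the 0.7 presence threshold, and the top-decile HCVA cut-off are illustrative assumptions; a real workflow would read the 23 modeled surfaces from disk (e.g., as GeoTIFFs).

```python
# Sketch of stacking per-species MaxEnt outputs into a species richness map and
# flagging high conservation value areas (HCVAs). Rasters here are synthetic
# placeholders standing in for the 23 trigger-species suitability surfaces.
import numpy as np

rng = np.random.default_rng(42)
n_species, height, width = 23, 200, 200

# Continuous habitat-suitability surfaces in [0, 1], one per trigger species.
suitability = rng.random((n_species, height, width))

# Convert to presence/absence with a per-species threshold (0.7 assumed here;
# MaxEnt offers several threshold rules) and stack into species richness.
presence = suitability > 0.7
richness = presence.sum(axis=0)

# Flag the richest decile of cells as candidate HCVAs (assumed cut-off).
hcva = richness >= np.quantile(richness, 0.9)
print(f"max richness: {richness.max()}, HCVA cells: {int(hcva.sum())}")
```

Overlaying the resulting HCVA mask on the park's legal perimeter is what exposes the mismatch the abstract describes.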

Keywords: conservation gaps, livelihood guilds, MaxEnt, protected area

Procedia PDF Downloads 379
4 Salmon Diseases Connectivity between Fish Farm Management Areas in Chile

Authors: Pablo Reche

Abstract:

Since the 1980s, aquaculture has become the biggest economic activity in southern Chile, with Salmo salar and Oncorhynchus mykiss being the main finfish species. High fish densities make both species prone to contracting diseases, which drives large losses for the industry and greatly affects the local economy. The three most concerning infective agents are the infectious salmon anemia virus (ISAv), the bacterium Piscirickettsia salmonis, and the copepod Caligus rogercresseyi. To regulate the industry, the government arranged the salmon farms within management areas named barrios, which coordinate the fallowing periods and antibiotic treatments of their salmon farms. In turn, barrios are gathered into larger management areas, named macrozonas, whose purpose is to minimize the risk of disease transmission between them and to enclose outbreaks within their boundaries. However, disease outbreaks still happen, and transmission to neighboring sites enlarges the initial event. Salmon disease agents are mostly transported passively by local currents. Thus, to understand how transmission occurs, the physical environment must first be studied. In Chile, salmon farming takes place in the inner seas of the southernmost regions of western Patagonia, between 41.5ºS and 55ºS. This coastal marine system is characterised by western winds latitudinally modulated by the position of the South-East Pacific high-pressure centre, high precipitation rates, and freshwater inflows from the numerous glaciers (including the largest ice cap outside Antarctica and Greenland). All of these forcings meet in a complex bathymetry and coastline system (deep fjords, shallow sills, narrow straits, channels, archipelagos, inlets, and isolated inner seas), driving an estuarine circulation (fast surface outflows westwards and slow deeper inflows eastwards). Such a complex system is modelled with the numerical model MIKE3, upon whose 3D current fields particle-track biological models (one for each infective agent) are run in decoupled (offline) mode. Each agent's biology is parameterized by functions for maturation and mortality (reproduction not included). These parameterizations depend upon environmental factors, like temperature and salinity, so the lifespan of each agent depends upon the environmental conditions the virtual agents encounter on their way while passively transported. CLIC (Connectivity-Lagrangian-IFOP-Chile) is a service platform that supports the graphical visualization of the connectivity matrices calculated from the particle trajectory files produced by the particle-track biological models. On CLIC, users can select from a high-resolution grid (~1 km) the areas between which connectivity will be calculated. These areas can be barrios or macrozonas. Users can also select which nodes of these areas are allowed to release and scatter particles, the depth and frequency of the initial particle release, the climatic scenario (winter/summer), and the type of particle (ISAv, Piscirickettsia salmonis, or Caligus rogercresseyi, plus an option for lifeless particles). Results include probabilities downstream (where the particles go) and upstream (where the particles come from), particle age, and vertical distribution, all aimed at understanding how connectivity currently works, in order to eventually propose a minimum-risk zonation for aquaculture purposes. Preliminary results in the Chiloe inner sea show that risk depends not only upon dynamic conditions but also upon a barrio's location with respect to its neighbors.
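A minimal sketch of how a downstream connectivity matrix between management areas can be assembled from particle-tracking output is shown below. The release and arrival records and the survival fraction are synthetic placeholders; in the actual workflow they would come from the MIKE3 particle-track biological model output, with survival determined by each agent's temperature- and salinity-dependent mortality function.

```python
# Sketch of a downstream connectivity matrix between management areas:
# P[i, j] = fraction of particles released in barrio i that arrive in barrio j
# while still alive. Records below are synthetic placeholders for the
# trajectory files produced by the particle-track biological models.
import numpy as np

n_areas = 5
n_particles = 10_000
rng = np.random.default_rng(1)

release_area = rng.integers(0, n_areas, size=n_particles)  # barrio of release
arrival_area = rng.integers(0, n_areas, size=n_particles)  # barrio at track end
alive = rng.random(n_particles) < 0.6   # survived the transit, per the agent's
                                        # environment-dependent mortality function

counts = np.zeros((n_areas, n_areas))
np.add.at(counts, (release_area[alive], arrival_area[alive]), 1)

released = np.bincount(release_area, minlength=n_areas).astype(float)
P = counts / released[:, None]          # row i: where area i's live particles go
print(np.round(P, 3))
```

Reading the matrix row-wise gives the downstream probabilities the abstract mentions; column-wise, the upstream ones.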

Keywords: aquaculture zonation, Caligus rogercresseyi, Chilean Patagonia, coastal oceanography, connectivity, infectious salmon anemia virus, Piscirickettsia salmonis

Procedia PDF Downloads 131
3 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and the modelling of meaning plurality have yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis, together with new IT-analysis approaches based on context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi-)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods of grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect on the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus in which the relevant actors and discourse positions are analysed by conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps.
In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the calculated entities and their relationships. Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
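A minimal sketch of the automated annotation step is shown below, using spaCy for dependency parsing and named entity recognition. Co-reference resolution, entity linking, and sentiment analysis require additional components and are omitted; the sample sentence is an invented placeholder on the healthcare-digitization topic.

```python
# Minimal sketch of the (semi-)automated annotation step with spaCy:
# dependency parsing and named-entity recognition out of the box.
# Co-reference, entity linking, and sentiment need extra components
# (project-specific pipelines) and are omitted here.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed
text = ("The health ministry's new e-patient record raised data protection "
        "concerns among physicians in Hamburg.")
doc = nlp(text)

for ent in doc.ents:                 # named entities as coding candidates
    print("ENTITY:", ent.text, ent.label_)
for token in doc:                    # dependency relations supply context
    print(f"{token.text:12} {token.dep_:10} -> {token.head.text}")
```

The entities and dependency relations extracted this way are the raw material from which the coding programs described above can propose coding paradigms.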

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 201
2 Acute Severe Hyponatremia in a Patient with Psychogenic Polydipsia, Learning Disability and Epilepsy

Authors: Anisa Suraya Ab Razak, Izza Hayat

Abstract:

Introduction: The diagnosis and management of severe hyponatremia in neuropsychiatric patients present a significant challenge to physicians. Several factors contribute, including diagnostic overshadowing and the attribution of abnormal behavior to intellectual disability or psychiatric conditions. Hyponatremia is the commonest electrolyte abnormality in the inpatient population, ranging from mild and asymptomatic to moderate and severe levels with life-threatening symptoms such as seizures, coma, and death. There are several documented fatal case reports in the literature of severe hyponatremia secondary to psychogenic polydipsia, often diagnosed only at autopsy. This paper presents a case study of acute severe hyponatremia in a neuropsychiatric patient with early diagnosis and admission to intensive care. Case study: A 21-year-old Caucasian male with known epilepsy and learning disability was admitted from residential living with self-terminating generalized tonic-clonic seizures after refusing medications for several weeks. Evidence of superficial head injury was detected on physical examination. His laboratory data demonstrated mild hyponatremia (125 mmol/L). Computed tomography imaging of his brain demonstrated no acute bleed or space-occupying lesion. He exhibited abnormal behavior: restlessness, drinking water from bathroom taps, inability to engage, paranoia, and hypersexuality. No collateral history was available to establish his baseline behavior. He was loaded with intravenous sodium valproate and levetiracetam. Three hours later, he developed vomiting and a generalized tonic-clonic seizure lasting forty seconds. He remained drowsy for several hours and made only a minimal recovery of consciousness. A repeat set of blood tests demonstrated profound hyponatremia (117 mmol/L). Outcomes: He was referred to intensive care for peripheral intravenous infusion of 2.7% sodium chloride solution with two-hourly laboratory monitoring of sodium concentration. Laboratory monitoring identified dangerously rapid correction of serum sodium concentration, and hypertonic saline was switched to a 5% dextrose solution to reduce the risk of acute large-volume fluid shifts from the cerebral intracellular compartment to the extracellular compartment. He underwent urethral catheterization and produced 8 liters of urine over 24 hours. Serum sodium concentration remained stable after 24 hours of correction fluids. His GCS recovered to baseline after 48 hours, with improvement in behavior: he engaged with healthcare professionals, understood the importance of taking medications, and admitted to illicit drug use and drinking massive amounts of water. He was transferred from high-dependency care to ward level and was initiated on multiple trials of anti-epileptics before achieving seizure-free days two weeks after resolution of the acute hyponatremia. Conclusion: Psychogenic polydipsia is often found in young patients with intellectual disability or psychiatric disorders. Patients drink large volumes of water daily, ranging from ten to forty liters, resulting in acute severe hyponatremia with mortality rates as high as 20%. Poor outcomes are due to the challenges faced by physicians in making an early diagnosis and treating acute hyponatremia safely. A high index of suspicion for water intoxication is required in this population, including in patients with known epilepsy. Monitoring urine output proved to be clinically effective in aiding diagnosis. Early referral and admission to intensive care should be considered for safe correction of sodium concentration while minimizing the risk of fatal complications, e.g., central pontine myelinolysis.
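The correction arithmetic the team had to monitor can be illustrated with the widely used Adrogué-Madias estimate of the expected serum-sodium change per litre of infusate. The sketch below uses illustrative values, not this patient's data, and is not clinical guidance.

```python
# Illustrative Adrogue-Madias estimate of the change in serum sodium per litre
# of infusate: delta_Na = (Na_infusate - Na_serum) / (TBW + 1).
# Values are illustrative, not this patient's data; typical correction targets
# (commonly < 8-10 mmol/L per 24 h) guard against osmotic demyelination.

def delta_na_per_litre(na_infusate: float, na_serum: float, tbw_litres: float) -> float:
    return (na_infusate - na_serum) / (tbw_litres + 1.0)

serum_na = 117.0        # mmol/L, profound hyponatremia as in the case
tbw = 0.6 * 70.0        # total body water ~ 0.6 x body weight for a young man (assumed 70 kg)

for label, na_inf in [("2.7% NaCl", 462.0), ("0.9% NaCl", 154.0)]:
    print(f"{label}: ~{delta_na_per_litre(na_inf, serum_na, tbw):+.1f} mmol/L per litre")
```

With these assumed numbers, a single litre of 2.7% saline raises serum sodium by roughly 8 mmol/L, which is why two-hourly laboratory monitoring during hypertonic infusion is so important.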

Keywords: epilepsy, psychogenic polydipsia, seizure, severe hyponatremia

Procedia PDF Downloads 97
1 The Road Ahead: Merging Human Cyber Security Expertise with Generative AI

Authors: Brennan Lodge

Abstract:

Cybersecurity professionals have long been embroiled in a digital arms race, confronting increasingly sophisticated threats with innovative solutions. The field of cybersecurity is in an unending race against malicious adversaries. As threats evolve in complexity, the tools used to defend against them need to advance even faster. Burdened with a vast arsenal of tools and an expansive scope of threat intelligence, analysts frequently navigate a complex web, trying to discern patterns amidst information overload. Herein lies the potential of Retrieval Augmented Generation (RAG). By combining the capabilities of Large Language Models (LLMs) with a generative AI facet, RAG brings to the table an unparalleled ability for real-time cross-referencing, bridging the gap between raw data and actionable insights. Imagine an analyst named Sarah working at a global Fortune 500 company. Every day, Sarah navigates a maze of diverse knowledge bases, real-time threat intelligence, and her company's vast proprietary data, from network specifics to intricate technical blueprints. One day, she's challenged by a potential breach through a personal device due to the company's global "Bring Your Own Device" policy. With the clock ticking, Sarah has mere minutes to trace the malware's origin, all while considering complex regional regulations. As she races against the benchmark of Mean Time To Resolution (MTTR), she wonders: could "Cozy Bear", with its notorious malware tactic HAMMERTOSS, be behind this? Balancing policy intricacies, global network considerations, and ever-emerging cyber threats, Sarah's role epitomizes the intense challenges faced by today's cybersecurity analysts. While analysts grapple with this array of intricate, time-sensitive challenges, precision and efficiency are key. RAG technology, a cutting-edge advancement in Gen AI, is a promising solution. Designed to assimilate diverse data sources such as cyber advisory notices, phishing email sentiment, secure and insecure code examples, information security policy documentation, and the MITRE ATT&CK framework, RAG equips analysts with real-time querying capabilities through a vector database and a cross-referenced, concise response from a Gen AI model. Traditional relational databases often necessitate a tedious process of filtering through numerous entries. Now, with the synergy of vector databases and Gen AI models, analysts can rapidly access contextually or semantically akin data points. This augmented approach equips analysts with a comprehensive understanding of the prevailing cyber threats, elevating the robustness of cybersecurity defenses while also upskilling the analyst and the team. Vector databases underpin the knowledge translation in Gen AI. They bridge the gap between raw data and meaningful insights, ensuring that analysts are equipped with comprehensive and relevant information. This superior capability of the RAG framework, with its impressive depth and precision, finds application across a broad spectrum of cybersecurity challenges. Let's delve into some use cases where its potential becomes particularly evident: Phishing Email Sentiment Analysis: Phishing remains a predominant vector for cybersecurity breaches. Leveraging RAG's capabilities, analysts can not only assess the potential malevolence of an email but can also understand the context behind it.
By cross-referencing patterns from varied data sources in real time, the detection process evolves from mere content evaluation to a holistic understanding of attacker tactics, behaviors, and evolving profiles. This allows for the identification of nuanced phishing strategies that might otherwise go undetected. Insecure Code Analysis: Software vulnerabilities form a critical entry point for cyber adversaries. With RAG, the process of code evaluation undergoes a transformation. Instead of manual code reviews, the system pulls insights from vector databases and historical code snippets marked as insecure, enabling detection of vulnerabilities based on historical patterns, emerging threat vectors, and even predictive threat modeling. This ensures that even the most obfuscated or embedded vulnerabilities are identified, and corrective measures can be promptly implemented. Vulnerability and Upskill Advisory: In the fast-paced world of cybersecurity, staying updated is paramount. Through RAG's capabilities, analysts are not only made aware of real-time vulnerabilities but are also guided on the necessary skills and tools needed to combat them. By dynamically sourcing data from vulnerability advisories, news on advanced persistent threats, and defensive tactics, RAG ensures that analysts are not only reactive to threats but are also proactively upskilled, thereby bolstering their defense mechanisms. Information Security Policies for Compliance Teams: Compliance remains at the heart of many organizational cybersecurity strategies. However, with ever-shifting regulatory landscapes, staying compliant becomes a moving target. RAG's ability to source real-time data ensures that compliance teams always have access to the latest policy changes, guidelines, and best practices. This not only facilitates adherence to current standards but also anticipates future shifts, assists with audits, and ensures that organizations remain ahead of the compliance curve. Fusing a RAG architecture with platforms like Slack amplifies its practical utility. Slack, known for its real-time communication prowess, seamlessly evolves into more than just a messaging platform in this context. Cybersecurity analysts can pose intricate queries within Slack and, almost instantaneously, receive comprehensive feedback powered by the harmonious interplay of RAG and Gen AI. This integration effectively transforms Slack into an AI-augmented, chatbot-like assistant for cybersecurity professionals, always ready to provide informed insights on demand, making it an indispensable ally in the ever-evolving cyber battlefield. Navigating the vast landscape of cybersecurity, analysts often encounter unfamiliar terminologies and techniques; they require tools that not only detect or inform them of threats, like CISA (U.S. Cybersecurity and Infrastructure Security Agency) advisories, but also interpret and communicate them effectively. Consider a junior cybersecurity analyst named Alex, who comes across the term "Kerberoasting" while reviewing a network log. Unfamiliar with its intricacies, Alex turns to Slack to pose a query: "chat, explain what Kerberoasting is, using CISA." Almost instantaneously, Slack, powered by the harmonious interplay of RAG and Gen AI, provides a detailed response, cross-referencing a recent cyber advisory on the technique. It explains how attackers can exploit the Kerberos Ticket Granting Service to decipher service account passwords, potentially compromising a network.
In this dynamic realm of cybersecurity, the blend of RAG and Generative AI represents more than just a technological leap. It embodies a paradigm shift, promising a future where human expertise and AI-driven precision join forces. As cyber threats continue their relentless advance, this synergy ensures that defenders are equipped with an arsenal that is not just reactive but also profoundly insightful. No longer should analysts be submerged in a deluge of data without direction. Instead, they should be empowered to discern, act, and preempt with unparalleled clarity and confidence. By harmoniously intertwining human discernment with AI capabilities, we can chart a path towards a future where cybersecurity is not just about defense but about achieving a strategic advantage, paving the way for a safer, more informed, and more secure digital horizon.
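As an illustration of the retrieval mechanics described above, a minimal sketch follows: advisory snippets are indexed, the top matches for the analyst's question are retrieved, and a grounded prompt is assembled. TF-IDF cosine similarity stands in for a vector database of embeddings, the LLM call is stubbed, and the documents are invented placeholders.

```python
# Minimal RAG retrieval sketch: index advisory snippets, retrieve the top-k
# most similar to the analyst's question, and assemble a grounded prompt.
# TF-IDF cosine similarity stands in for a vector database of embeddings;
# the Gen AI call is stubbed, and the documents are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "CISA advisory: Kerberoasting abuses the Kerberos Ticket Granting Service "
    "to crack service account passwords offline.",
    "APT29 (Cozy Bear) used the HAMMERTOSS backdoor with staged command-and-"
    "control over social media and cloud storage.",
    "BYOD policy: personal devices must enroll in device management before "
    "accessing corporate resources.",
]

vectorizer = TfidfVectorizer().fit(corpus)
doc_vectors = vectorizer.transform(corpus)

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = cosine_similarity(vectorizer.transform([query]), doc_vectors)[0]
    return [corpus[i] for i in scores.argsort()[::-1][:k]]

question = "Explain Kerberoasting using CISA guidance."
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)   # in production, this prompt would be sent to the Gen AI model
```

Swapping the TF-IDF index for a true embedding store is what turns this sketch into the semantic, real-time cross-referencing the abstract describes.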

Keywords: cybersecurity, gen AI, retrieval augmented generation, cybersecurity defense strategies

Procedia PDF Downloads 46