Search results for: David Silva Gutiérrez
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1164

114 Preoperative Anxiety Evaluation: Comparing the Visual Facial Anxiety Scale/Yumul Faces Anxiety Scale, Numerical Verbal Rating Scale, Categorization Scale, and the State-Trait Anxiety Inventory

Authors: Roya Yumul, CHSE, Ofelia Loani Elvir Lazo, David Chernobylsky, Omar Durra

Abstract:

Background: Preoperative anxiety has been shown to be caused by the fear associated with surgical and anesthetic complications; however, the current gold standard for assessing patient anxiety, the STAI, is problematic to use in the preoperative setting given the duration and concentration required to complete the extensive 40-item questionnaire. Our primary aim in the study is to investigate the correlation of the Visual Facial Anxiety Scale (VFAS) and Numerical Verbal Rating Scale (NVRS) to the State-Trait Anxiety Inventory (STAI) to determine the optimal anxiety scale to use in the perioperative setting. Methods: A clinical study of patients undergoing various surgeries was conducted utilizing each of the preoperative anxiety scales. Inclusion criteria included patients undergoing elective surgeries, while exclusion criteria included patients with anesthesia contraindications, inability to comprehend instructions, impaired judgement, substance abuse history, and those pregnant or lactating. 293 patients were analyzed in terms of demographics, anxiety scale survey results, and anesthesia data via Spearman Coefficients, Chi-Squared Analysis, and Fisher’s exact test utilized for comparative analysis. Results: Statistical analysis showed that VFAS had a higher correlation to STAI than NVRS (rs=0.66, p<0.0001 vs. rs=0.64, p<0.0001). The combined VFAS-Categorization Scores showed the highest correlation with the gold standard (rs=0.72, p<0.0001). Subgroup analysis showed similar results. STAI evaluation time (247.7 ± 54.81 sec) far exceeds VFAS (7.29 ± 1.61 sec), NVRS (7.23 ± 1.60 sec), and Categorization scales (7.29 ± 1.99 sec). Patients preferred VFAS (54.4%), Categorization (11.6%), and NVRS (8.8%). Anesthesiologists preferred VFAS (63.9%), NVRS (22.1%), and Categorization Scales (14.0%). Of note, the top five causes of preoperative anxiety were determined to be waiting (56.5%), pain (42.5%), family concerns (40.5%), no information about surgery (40.1%), and anesthesia (31.6%). Conclusions: The combined VFAS-Categorization Score (VCS) demonstrates the highest correlation to the gold standard, STAI. Both VFAS and Categorization tests also take significantly less time than STAI, which is critical in the preoperative setting. Among both patients and anesthesiologists, VFAS was the most preferred scale. This forms the basis of the Yumul FACES Anxiety Scale, designed for quick quantification and assessment in the preoperative setting while maintaining a high correlation to the gold standard. Additional studies using the formulated Yumul FACES Anxiety Scale are merited.
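
As a brief illustration of the correlation analysis described in this abstract, the sketch below computes Spearman coefficients between each brief scale and the STAI with SciPy. The CSV file and column names are hypothetical placeholders, not the study's actual data; this is only a minimal sketch of the kind of comparison reported above.

```python
# Minimal sketch of the correlation analysis described above.
# The input file and column names are hypothetical; the study's data are not public.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("preop_anxiety_scores.csv")  # hypothetical file, one row per patient

for scale in ["VFAS", "NVRS", "Categorization", "VFAS_Categorization_combined"]:
    rs, p = spearmanr(df[scale], df["STAI"], nan_policy="omit")
    print(f"{scale} vs STAI: rs = {rs:.2f}, p = {p:.4g}")
```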

Keywords: numerical verbal anxiety scale, preoperative anxiety, state-trait anxiety inventory, visual facial anxiety scale

Procedia PDF Downloads 112
113 Learning from Long COVID: How Healthcare Needs to Change for Contested Illnesses

Authors: David Tennison

Abstract:

In the wake of the Covid-19 pandemic, a new chronic illness emerged onto the global stage: Long Covid. Long Covid presents with several symptoms commonly seen in other poorly understood illnesses, such as fibromyalgia (FM) and myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS). However, while Long Covid has swiftly become a recognised illness, FM and ME/CFS are still seen as contested, which impacts patient care and healthcare experiences. This study aims to examine the differences between Long Covid and FM, and whether the Long Covid case can provide guidance on how to address the healthcare challenge of contested illnesses. To address this question, this study performed comprehensive research into the history of FM; our current biomedical understanding of it; and available healthcare interventions (within the context of the UK NHS). Analysis was undertaken of the stigma and stereotypes around FM, and a comparison was made between FM and the emerging Long Covid literature, along with the healthcare response to Long Covid. This study finds that healthcare for chronic contested illnesses in the UK is vastly insufficient - in terms of pharmaceutical and holistic interventions, and the provision of secondary care options. Interestingly, for Long Covid, many of the treatment suggestions are pulled directly from those used for contested illnesses. The key difference is in terms of funding and momentum – Long Covid has generated exponentially more interest and research in a short time than there has been in the last few decades of contested illness research. This stands to help people with FM and ME/CFS – for example, research has recently been funded into “brain fog”, a previously elusive and misunderstood symptom. FM is culturally regarded as a “women’s disease” and FM stigma stems from notions of “hysteria”. A key finding is that the idea of FM affecting women disproportionately is not reflected in modern population studies. Emerging data on Long Covid also suggest a slight leaning towards more female patients; however, Long Covid is less feminised, potentially because it emerged in the global historical moment of the pandemic. Another key difference is that FM is rated as an extremely low-prestige illness by healthcare professionals, while it was in large part due to the advocacy of affected healthcare professionals that Long Covid was so quickly recognised by science and medicine. In conclusion, Long Covid (and the risk of future pandemics and post-viral illnesses) highlights a crucial need for implementing new, and reinforcing existing, care networks for chronic illnesses. The difference in how contested illnesses like FM and new ones like Long Covid are treated has a lot to do with the historical moment in which they emerge – but cultural stereotypes, from within and without medicine, need updating, particularly as they contribute to disease stigma that causes genuine harm to patients. However, widespread understanding and acceptance of Long Covid could help fight contested illness stigma, and the attention, funding and research into Long Covid may actually help raise the profile of contested illnesses and uncover answers about their symptomatology.

Keywords: long COVID, fibromyalgia, myalgic encephalomyelitis, chronic fatigue syndrome, NHS, healthcare, contested illnesses, chronic illnesses, COVID-19 pandemic

Procedia PDF Downloads 49
112 Drug Delivery Cationic Nano-Containers Based on Pseudo-Proteins

Authors: Sophio Kobauri, Temur Kantaria, Nina Kulikova, David Tugushi, Ramaz Katsarava

Abstract:

The elaboration of effective drug delivery vehicles remains topical, since targeted drug delivery is one of the most important challenges of modern nanomedicine. The last decade has witnessed enormous research focused on synthetic cationic polymers (CPs) due to their flexible properties, facile synthesis, robustness, lack of oncogenicity, and proven gene delivery efficiency, in particular as non-viral gene delivery systems. However, their toxicity is still an obstacle to application in pharmacotherapy. To overcome this problem, the creation of new cationic compounds, including polymeric nano-sized particles – nano-containers (NCs) – loaded with different pharmaceuticals and biologicals, is still relevant. In this regard, a variety of NC-based drug delivery systems have been developed. We have found that amino acid-based biodegradable polymers called pseudo-proteins (PPs), which can be cleared from the body after the fulfillment of their function, are highly suitable for designing pharmaceutical NCs. Among them, some of the most promising are NCs made of biodegradable cationic PPs (CPPs). For preparing new cationic NCs (CNCs), we used CPPs composed of the positively charged amino acid L-arginine (R). The CNCs were fabricated by two approaches using: (1) R-based homo-CPPs; (2) blends of R-based CPPs with regular (neutral) PPs. According to the first approach, NCs were prepared from CPPs 8R3 (composed of R, sebacic acid and 1,3-propanediol) and 8R6 (composed of R, sebacic acid and 1,6-hexanediol). The NCs prepared from these CPPs were 72-101 nm in size with zeta potential within +30 ÷ +35 mV at a concentration of 6 mg/mL. According to the second approach, CPP 8R6 was blended in the organic phase with the neutral PP 8L6 (composed of leucine, sebacic acid and 1,6-hexanediol). The NCs prepared from the blends were 130-140 nm in size with zeta potential within +20 ÷ +28 mV depending on the 8R6/8L6 ratio. Stability studies of the fabricated NCs showed that no substantial change in particle size or distribution and no formation of large particles was observed after three months of storage. An in vitro biocompatibility study of the obtained NCs with four different stable cell lines, A549 (human), U-937 (human), RAW264.7 (murine), and Hepa 1-6 (murine), showed that both types of cationic NCs are biocompatible. The obtained data allow us to conclude that the obtained CNCs are promising for application as biodegradable drug delivery vehicles. This work was supported by the joint grant from the Science and Technology Center in Ukraine and Shota Rustaveli National Science Foundation of Georgia #6298 'New biodegradable cationic polymers composed of arginine and spermine-versatile biomaterials for various biomedical applications'.

Keywords: biodegradable polymers, cationic pseudo-proteins, nano-containers, drug delivery vehicles

Procedia PDF Downloads 132
111 Smart and Active Package Integrating Printed Electronics

Authors: Joana Pimenta, Lorena Coelho, José Silva, Vanessa Miranda, Jorge Laranjeira, Rui Soares

Abstract:

In this paper, the results of R&D on an innovative food package for increased shelf-life are presented. SAP4MA aims at the development of a printed active device that enables smart packaging solutions for food preservation, targeting the extension of the shelf-life of the packed food through the controlled release of active natural antioxidant agents at the onset of the food degradation process. To do so, SAP4MA focuses on the development of active devices such as printed heaters and batteries/supercapacitors in a label format to be integrated on packaging lids during the injection molding process, promoting the release of natural antioxidants passively after the product is packed, during transportation and on the shelves, and actively when the end-user activates the package, just prior to consuming the product at home. When the active device present on the lid is activated, the release of the natural antioxidants embedded in the inner layer of the packaging lid, in direct contact with the headspace atmosphere of the food package, starts. This approach is based on the use of active functional coatings composed of nanoencapsulated active agents (natural antioxidant species) that prevent the oxidation of lipid compounds in food by agents such as oxygen, thus keeping the product quality during the shelf-life, not only when the user opens the packaging but also during the period from food packaging up until purchase by the consumer. The active systems that make up the printed smart label, the heating circuit and the battery, were developed using screen-printing technology. These systems must operate under the working conditions associated with this application. The printed heating circuit was studied using three different substrates and two different conductive inks. Inks were selected taking into consideration that the printed circuits will be subjected to high pressures and temperatures during the injection molding process. The circuit must reach a homogeneous temperature of 40 °C over the entire area of the lid of the food tub, promoting a gradual and controlled release of the antioxidant agents. In addition, the circuit design required extensive study in order to guarantee maximum performance after the injection process and to meet the specifications required by the control electronics. Furthermore, to characterize the different heating circuits, the electrical resistance resulting from the conductive ink and the circuit design, as well as the thermal behavior of printed circuits on different substrates, were evaluated. In the injection molding process, the serpentine-shaped design developed for the heating circuit was able to resolve the issues connected to the injection point; in addition, the materials used in the support and printing had high mechanical resistance against the pressure and temperature inherent to the injection process. Acknowledgment: This research has been carried out within the Project “Smart and Active Packing for Margarine Product” (SAP4MA) running under the EURIPIDES Program, being co-financed by COMPETE 2020 – the Operational Programme for Competitiveness and Internationalization and under Portugal 2020 through the European Regional Development Fund (ERDF).

Keywords: smart package, printed heat circuits, printed batteries, flexible and printed electronic

Procedia PDF Downloads 85
110 Insect Manure (Frass) as a Complementary Fertilizer to Enhance Soil Mineralization Function: Application to Cranberry and Field Crops

Authors: Joël Passicousset, David Gilbert, Chloé Chervier-Legourd, Emmanuel Caron-Garant, Didier Labarre

Abstract:

Living soil agriculture tries to reconcile food production with improving soil health, soil biodiversity, and soil fertility, and more generally with attenuating the inherent environmental drawbacks induced by modern agriculture. Using appropriate organic materials as soil amendments has a role to play in increasing soil organic matter, improving soil fertility, sequestering carbon, and diminishing the dependence on both mineral fertilizers and pesticides. Insect farming consists of producing insects that can be used as a protein-rich, entomo-based food. Usually, detritivores are chosen so that they can be fed with food waste, which contributes to the circular economy while producing low-carbon food. This process also produces frass, made of insect feces, exuvial material, and non-digested fibrous material, which has valuable fertilizer and biostimulation properties. But frass, used as a sole fertilizer on a crop, may not be completely adequate for plants’ needs. This is why this project considers black soldier fly (termed BSF, one of the three main insect species grown commercially) frass as a complementary fertilizer, both in organic and in conventional contexts. Three kinds of experiments were carried out to understand the behaviour of fertilizer treatments based on frass incorporation. Lab-scale mineralization experiments suggest that BSF frass alone mineralizes more slowly than chicken manure (CM) alone, but at a ratio of 90% CM-10% BSF frass, the mineralization rate of the mixture is higher than that of both frass and CM individually. For example, in the 7 days following fertilization with the same nitrogen amount introduced across treatments, around 80% of the nitrogen content supplied through 90% CM-10% BSF frass fertilization is present in the soil under mineral forms, compared to roughly 60% for commercial CM fertilization and 45% for BSF frass. This suggests that BSF frass contains a more recalcitrant form of organic nitrogen than CM, but also that BSF frass has a highly active microbiota that can increase the CM mineralization rate. Consequently, when progressive mineralization is needed, pure BSF frass may be a consistent option from an agronomic perspective, whereas, for specific crops that require spikes of readily available nitrogen sources (like cranberry), the fast-release 90CM-10BSF frass biofertilizer is more appropriate. Field experiments on cranberry suggest that, indeed, 90CM-10BSF frass is a potent candidate for organic cranberry production, as currently, organic growers rely solely on CM, whose mineralization kinetics are known to imperfectly match the plant’s needs, which is known to be a major reason that sustains the current yield gap between the conventional and organic cranberry sectors.

Keywords: soil mineralization, biofertilizer, BSF-frass, chicken manure, soil functions, nitrogen, soil microbiota

Procedia PDF Downloads 42
109 Quantum Dots Incorporated in Biomembrane Models for Cancer Marker

Authors: Thiago E. Goto, Carla C. Lopes, Helena B. Nader, Anielle C. A. Silva, Noelio O. Dantas, José R. Siqueira Jr., Luciano Caseli

Abstract:

Quantum dots (QD) are semiconductor nanocrystals that can be employed in biological research as a tool for fluorescence imaging, having the potential to expand in vivo and in vitro analysis as cancerous cell biomarkers. Particularly, cadmium selenide (CdSe) magic-sized quantum dots (MSQDs) exhibit stable luminescence that is feasible for biological applications, especially for imaging of tumor cells. Given these facts, it is interesting to know the mechanisms of action of how such QDs mark biological cells. For that, simplified models are a suitable strategy. Among these models, Langmuir films of lipids formed at the air-water interface seem to be adequate since they can mimic half a membrane. They are monomolecular films that form spontaneously when organic solutions of amphiphilic compounds are spread on a liquid-gas interface. After solvent evaporation, the monomolecular film is formed, and a variety of techniques, including tensiometric, spectroscopic, and optical techniques, can be applied. When the monolayer is formed by membrane lipids at the air-water interface, a model for half a membrane can be inferred where the aqueous subphase serves as a model for the external or internal compartment of the cell. These films can be transferred to solid supports, forming the so-called Langmuir-Blodgett (LB) films, and a wider variety of techniques can be additionally used to characterize the film, allowing for the formation of devices and sensors. With these ideas in mind, the objective of this work was to investigate the specific interactions of CdSe MSQDs with tumorigenic and non-tumorigenic cells using Langmuir monolayers and LB films of lipids and specific cell extracts as membrane models for diagnosis of cancerous cells. Surface pressure-area isotherms and polarization-modulation infrared reflection-absorption spectroscopy (PM-IRRAS) showed an intrinsic interaction between the quantum dots, inserted in the aqueous subphase, and Langmuir monolayers, constructed either of selected lipids or of non-tumorigenic and tumorigenic cell extracts. The quantum dots expanded the monolayers and changed the PM-IRRAS spectra for the lipid monolayers. The mixed films were then compressed to high surface pressures and transferred from the floating monolayer to solid supports by using the LB technique. Images of the films were then obtained with atomic force microscopy (AFM) and confocal microscopy, which provided information about the morphology of the films. Similarities and differences between films with different compositions representing cell membranes, with or without CdSe MSQDs, were analyzed. The results indicated that the interaction of quantum dots with the bioinspired films is modulated by the lipid composition. The properties of the normal cell monolayer were not significantly altered, whereas for the tumorigenic cell monolayer models, the films presented significant alteration. The images therefore exhibited a stronger effect of CdSe MSQDs on the models representing cancerous cells. As an important implication of these findings, one may envisage new bioinspired surfaces based on molecular recognition for biomedical applications.

Keywords: biomembrane, langmuir monolayers, quantum dots, surfaces

Procedia PDF Downloads 173
108 Comparison of the Yumul Faces Anxiety Scale to the Categorization Scale, the Numerical Verbal Rating Scale, and the State-Trait Anxiety Inventory for Preoperative Anxiety Evaluation

Authors: Ofelia Loani Elvir Lazo, Roya Yumul, David Chernobylsky, Omar Durra

Abstract:

Background: It is crucial to detect a patient’s existing anxiety in order to assist patients in the perioperative setting; such anxiety is commonly caused by the fear associated with surgical and anesthetic complications. However, the current gold standard for assessing patient anxiety, the STAI, is problematic to use in the preoperative setting, given the duration and concentration required to complete the 40-item questionnaire. Our primary aim in the study is to investigate the correlation of the Yumul Visual Facial Anxiety Scale (VFAS) and Numerical Verbal Rating Scale (NVRS) to the State-Trait Anxiety Inventory (STAI) to determine the optimal anxiety scale to use in the perioperative setting. Methods: A clinical study of patients undergoing various surgeries was conducted utilizing each of the preoperative anxiety scales. Inclusion criteria included patients undergoing elective surgeries, while exclusion criteria included patients with anesthesia contraindications, inability to comprehend instructions, impaired judgement, substance abuse history, and those pregnant or lactating. 293 patients were analyzed in terms of demographics, anxiety scale survey results, and anesthesia data via Spearman Coefficients, Chi-Squared Analysis, and Fisher’s exact test utilized for comparative analysis. Results: Statistical analysis showed that VFAS had a higher correlation to STAI than NVRS (rs=0.66, p<0.0001 vs. rs=0.64, p<0.0001). The combined VFAS-Categorization Scores showed the highest correlation with the gold standard (rs=0.72, p<0.0001). Subgroup analysis showed similar results. STAI evaluation time (247.7 ± 54.81 sec) far exceeds VFAS (7.29 ± 1.61 sec), NVRS (7.23 ± 1.60 sec), and Categorization scales (7.29 ± 1.99 sec). Patients preferred VFAS (54.4%), Categorization (11.6%), and NVRS (8.8%). Anesthesiologists preferred VFAS (63.9%), NVRS (22.1%), and Categorization Scales (14.0%). Of note, the top five causes of preoperative anxiety were determined to be waiting (56.5%), pain (42.5%), family concerns (40.5%), no information about surgery (40.1%), and anesthesia (31.6%). Conclusions: Both VFAS and Categorization tests take significantly less time than STAI, which is critical in the preoperative setting. The combined VFAS-Categorization Score (VCS) demonstrates the highest correlation to the gold standard, STAI. Among both patients and anesthesiologists, VFAS was the most preferred scale. This forms the basis of the Yumul Faces Anxiety Scale, designed for quick quantification and assessment in the preoperative setting while maintaining a high correlation to the gold standard. Additional studies using the formulated Yumul Faces Anxiety Scale are merited.

Keywords: numerical verbal anxiety scale, preoperative anxiety, state-trait anxiety inventory, visual facial anxiety scale

Procedia PDF Downloads 93
107 Design of a Low-Cost, Portable, Sensor Device for Longitudinal, At-Home Analysis of Gait and Balance

Authors: Claudia Norambuena, Myissa Weiss, Maria Ruiz Maya, Matthew Straley, Elijah Hammond, Benjamin Chesebrough, David Grow

Abstract:

The purpose of this project is to develop a low-cost, portable sensor device that can be used at home for long-term analysis of gait and balance abnormalities. One area of particular concern involves the asymmetries in movement and balance that can accompany certain types of injuries and/or the associated devices used in the repair and rehabilitation process (e.g., the use of splints and casts), which can often increase the chances of falls and additional injuries. This device has the capacity to monitor a patient during the rehabilitation process after injury or operation, increasing the patient’s access to healthcare while decreasing the number of visits to the patient’s clinician. The sensor device may thereby improve the quality of the patient’s care, particularly in rural areas where access to the clinician could be limited, while simultaneously decreasing the overall cost associated with the patient’s care. The device consists of nine interconnected accelerometer/gyroscope/compass chips (9-DOF IMU, Adafruit, New York, NY). The sensors attach to and are used to determine the orientation and acceleration of the patient’s lower abdomen, C7 vertebra (lower neck), L1 vertebra (middle back), anterior side of each thigh and tibia, and dorsal side of each foot. In addition, pressure sensors are embedded in shoe inserts with one sensor (ESS301, Tekscan, Boston, MA) beneath the heel and three sensors (Interlink 402, Interlink Electronics, Westlake Village, CA) beneath the metatarsal bones of each foot. These sensors measure the distribution of the weight applied to each foot as well as stride duration. A small microcontroller (Arduino Mega, Arduino, Ivrea, Italy) is used to collect data from these sensors in a CSV file. MATLAB is then used to analyze the data and output the hip, knee, ankle, and trunk angles projected on the sagittal plane. The open-source program Processing is then used to generate an animation of the patient’s gait. The accuracy of the sensors was validated through comparison to goniometric measurements (±2° error). The sensor device was also shown to have sufficient sensitivity to observe various gait abnormalities. Several patients used the sensor device, and the data collected from each represented that patient’s movements. Further, the sensors were found to have the ability to observe gait abnormalities caused by the addition of a small amount of weight (4.5 - 9.1 kg) to one side of the patient. The user-friendly interface and portability of the sensor device will help to construct a bridge between patients and their clinicians with fewer necessary inpatient visits.
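
A rough sketch of how sagittal-plane joint angles might be derived from the logged IMU data: segment pitch is estimated from accelerometer readings, and the knee angle is taken as the difference between thigh and shank pitch. The CSV layout, column names, and use of pandas are assumptions for illustration; the authors' actual analysis was performed in MATLAB.

```python
# Illustrative sketch: estimate sagittal-plane segment pitch from accelerometer readings
# and compute a knee angle as thigh pitch minus shank pitch.
# The CSV layout and column names are hypothetical; the original analysis used MATLAB.
import math
import pandas as pd

def pitch_deg(ax, ay, az):
    """Pitch angle (degrees) of a segment estimated from the gravity direction."""
    return math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))

log = pd.read_csv("gait_log.csv")  # hypothetical Arduino CSV export

thigh = log.apply(lambda r: pitch_deg(r["thigh_ax"], r["thigh_ay"], r["thigh_az"]), axis=1)
shank = log.apply(lambda r: pitch_deg(r["tibia_ax"], r["tibia_ay"], r["tibia_az"]), axis=1)
knee_angle = thigh - shank  # knee flexion angle projected on the sagittal plane

print(knee_angle.describe())
```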

Keywords: biomedical sensing, gait analysis, outpatient, rehabilitation

Procedia PDF Downloads 259
106 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow

Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat

Abstract:

Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted, including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification accuracy for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature for the classification of engagement and distraction was shown to be eye gaze. It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability or human observation and is not reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
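
To make the evaluation step concrete, here is a minimal sketch of a random forest classifier assessed with leave-one-out cross-validation, as described above. The use of scikit-learn is an assumption, and the feature matrix and labels are synthetic placeholders standing in for the nine extracted features and the engagement labels derived from the continuous performance test.

```python
# Sketch of the evaluation described above: random forest with leave-one-out cross-validation.
# X and y are synthetic placeholders for the nine extracted features and engagement labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(59, 9))        # 59 sessions x 9 features (placeholder values)
y = rng.integers(0, 2, size=59)     # engaged / not engaged (placeholder labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"Leave-one-out accuracy: {scores.mean():.3f}")
```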

Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement

Procedia PDF Downloads 72
105 Neural Synchronization - The Brain’s Transfer of Sensory Data

Authors: David Edgar

Abstract:

To understand how the brain’s subconscious and conscious processes function, we must conquer the physics of Unity, which leads to duality’s algorithm. Where the subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence, we use terms like ‘time is relative,’ but we really do understand the meaning. In the brain, there are different processes and, therefore, different observers. These different processes experience time at different rates. A sensory system such as the eyes cycles its measurements at around 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds. Three different observers experience time differently. To bridge observers, the thalamus, which is the fastest of the processes, maintains a synchronous state and entangles the different components of the brain’s physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain’s linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components. Only unpredictable motion is transferred through the synchronous state because predictable motion already exists in the shared framework. The brain’s synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds, the eyes dump their sensory data into the thalamus. The thalamus then performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick. The thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms). This creates a data payload of synchronous motion that preserves the original sensory observation. Basically, a frozen moment in time (Flat 4D). The single moment in time can then be processed through the single state maintained by the synchronous process. Other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Now, synchronous data traveling through a separate, faster synchronous process creates a theoretical time tunnel where observation time is tunneled through the synchronous process and is reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus, a linear subconscious process generating sensory perception and thought production is being executed. It all just occurs in the time available because other observation times are slower than thalamic measurement time. For life to exist in the physical universe requires a linear measurement process; it just hides by operating at a faster time relativity. What’s interesting is that time dilation is not the problem; it’s the solution. Einstein said there was no universal time.

Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)

Procedia PDF Downloads 105
104 Optimization of Operational Water Quality Parameters in a Drinking Water Distribution System Using Response Surface Methodology

Authors: Sina Moradi, Christopher W. K. Chow, John Van Leeuwen, David Cook, Mary Drikas, Patrick Hayde, Rose Amal

Abstract:

Chloramine is commonly used as a disinfectant in drinking water distribution systems (DWDSs), particularly in Australia and the USA. Maintaining a chloramine residual throughout the DWDS is important in ensuring microbiologically safe water is supplied at the customer’s tap. In order to simulate how chloramine behaves when it moves through the distribution system, a water quality network model (WQNM) can be applied. In this work, the WQNM was based on mono-chloramine decomposition reactions, which enabled prediction of mono-chloramine residual at different locations through a DWDS in Australia, using the Bentley commercial hydraulic package (Water GEMS). The accuracy of WQNM predictions is influenced by a number of water quality parameters. Optimization of these parameters in order to obtain the closest results in comparison with actual measured data in a real DWDS would result in both cost reduction as well as reduction in consumption of valuable resources such as energy and materials. In this work, the optimum operating conditions of water quality parameters (i.e. temperature, pH, and initial mono-chloramine concentration) to maximize the accuracy of mono-chloramine residual predictions for two water supply scenarios in an entire network were determined using response surface methodology (RSM). To obtain feasible and economical water quality parameters for highest model predictability, Design Expert 8.0 software (Stat-Ease, Inc.) was applied to conduct the optimization of three independent water quality parameters. High and low levels of the water quality parameters were considered, inevitably, as explicit constraints, in order to avoid extrapolation. The independent variables were pH, temperature and initial mono-chloramine concentration. The lower and upper limits of each variable for two water supply scenarios were defined and the experimental levels for each variable were selected based on the actual conditions in studied DWDS. It was found that at pH of 7.75, temperature of 34.16 ºC, and initial mono-chloramine concentration of 3.89 (mg/L) during peak water supply patterns, root mean square error (RMSE) of WQNM for the whole network would be minimized to 0.189, and the optimum conditions for averaged water supply occurred at pH of 7.71, temperature of 18.12 ºC, and initial mono-chloramine concentration of 4.60 (mg/L). The proposed methodology to predict mono-chloramine residual can have a great potential for water treatment plant operators in accurately estimating the mono-chloramine residual through a water distribution network. Additional studies from other water distribution systems are warranted to confirm the applicability of the proposed methodology for other water samples.
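
The sketch below illustrates the general response-surface workflow described above: fit a second-order (quadratic) model of the prediction error in pH, temperature, and initial mono-chloramine concentration, then minimize it within explicit bounds. The study itself used Design Expert 8.0 and the WaterGEMS-based WQNM; the sample points, responses, and the scikit-learn/SciPy tooling here are illustrative assumptions only.

```python
# Sketch of a response-surface step: fit a quadratic model of WQNM RMSE in pH,
# temperature, and initial mono-chloramine dose, then minimize it within bounds.
# The design points and responses are synthetic placeholders, not the study's data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

X = np.array([[7.2, 15.0, 3.0], [7.6, 20.0, 4.0], [8.0, 25.0, 5.0], [7.4, 30.0, 3.5],
              [7.8, 35.0, 4.5], [7.5, 18.0, 4.2], [7.9, 28.0, 3.8], [7.3, 33.0, 4.8],
              [7.7, 22.0, 3.2], [7.45, 26.0, 4.6]])      # pH, temperature (C), dose (mg/L)
rmse = np.array([0.35, 0.28, 0.31, 0.25, 0.22, 0.27, 0.24, 0.23, 0.30, 0.26])  # placeholders

poly = PolynomialFeatures(degree=2, include_bias=False)   # second-order response surface
model = LinearRegression().fit(poly.fit_transform(X), rmse)

def predicted_rmse(x):
    return model.predict(poly.transform(x.reshape(1, -1)))[0]

bounds = [(7.2, 8.0), (15.0, 35.0), (3.0, 5.0)]  # explicit constraints to avoid extrapolation
res = minimize(predicted_rmse, x0=np.array([7.6, 25.0, 4.0]), bounds=bounds)
print("Optimum (pH, T, dose):", res.x, "predicted RMSE:", res.fun)
```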

Keywords: chloramine decay, modelling, response surface methodology, water quality parameters

Procedia PDF Downloads 204
103 Evolutionary Advantages of Loneliness with an Agent-Based Model

Authors: David Gottlieb, Jason Yoder

Abstract:

The feeling of loneliness is not uncommon in modern society, and yet, there is a fundamental lack of understanding in its origins and purpose in nature. One interpretation of loneliness is that it is a subjective experience that punishes a lack of social behavior, and thus its emergence in human evolution is seemingly tied to the survival of early human tribes. Still, a common counterintuitive response to loneliness is a state of hypervigilance, resulting in social withdrawal, which may appear maladaptive to modern society. So far, no computational model of loneliness’ effect during evolution yet exists; however, agent-based models (ABM) can be used to investigate social behavior, and applying evolution to agents’ behaviors can demonstrate selective advantages for particular behaviors. We propose an ABM where each agent contains four social behaviors, and one goal-seeking behavior, letting evolution select the best behavioral patterns for resource allocation. In our paper, we use an algorithm similar to the boid model to guide the behavior of agents, but expand the set of rules that govern their behavior. While we use cohesion, separation, and alignment for simple social movement, our expanded model adds goal-oriented behavior, which is inspired by particle swarm optimization, such that agents move relative to their personal best position. Since agents are given the ability to form connections by interacting with each other, our final behavior guides agent movement toward its social connections. Finally, we introduce a mechanism to represent a state of loneliness, which engages when an agent's perceived social involvement does not meet its expected social involvement. This enables us to investigate a minimal model of loneliness, and using evolution we attempt to elucidate its value in human survival. Agents are placed in an environment in which they must acquire resources, as their fitness is based on the total resource collected. With these rules in place, we are able to run evolution under various conditions, including resource-rich environments, and when disease is present. Our simulations indicate that there is strong selection pressure for social behavior under circumstances where there is a clear discrepancy between initial resource locations, and against social behavior when disease is present, mirroring hypervigilance. This not only provides an explanation for the emergence of loneliness, but also reflects the diversity of response to loneliness in the real world. In addition, there is evidence of a richness of social behavior when loneliness was present. By introducing just two resource locations, we observed a divergence in social motivation after agents became lonely, where one agent learned to move to the other, who was in a better resource position. The results and ongoing work from this project show that it is possible to glean insight into the evolutionary advantages of even simple mechanisms of loneliness. The model we developed has produced unexpected results and has led to more questions, such as the impact loneliness would have at a larger scale, or the effect of creating a set of rules governing interaction beyond adjacency.
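
Below is a minimal sketch of the kind of agent update described above, combining boid-style cohesion, separation, and alignment with a PSO-like pull toward each agent's personal-best position, attraction toward social connections, and a simple loneliness trigger when perceived social involvement falls below an expected level. All weights, thresholds, and the single-resource fitness are illustrative assumptions, not the parameters evolved in the study.

```python
# Toy agent-based sketch: boid rules + goal-seeking (personal best) + social attraction,
# with a loneliness trigger that reweights the social pull. Parameters are illustrative.
import numpy as np

N, STEPS = 30, 100
rng = np.random.default_rng(1)
pos = rng.uniform(0, 10, (N, 2))
vel = rng.normal(0, 0.1, (N, 2))
best = pos.copy()                           # personal-best positions (PSO-like memory)
connections = rng.integers(0, 2, (N, N))    # placeholder social ties
np.fill_diagonal(connections, 0)
expected_social = 3                         # expected social involvement (assumption)

def resource(p):                            # toy fitness: closeness to a resource at (8, 8)
    return -np.linalg.norm(p - np.array([8.0, 8.0]), axis=-1)

for _ in range(STEPS):
    cohesion = pos.mean(axis=0) - pos
    separation = np.zeros_like(pos)
    for i in range(N):                      # push away from agents closer than 0.5 units
        d = pos[i] - pos
        near = (np.linalg.norm(d, axis=1) < 0.5) & (np.arange(N) != i)
        separation[i] = d[near].sum(axis=0)
    alignment = vel.mean(axis=0) - vel
    goal = best - pos                       # pull toward personal best (goal-oriented behavior)
    social = (connections @ pos) / np.maximum(connections.sum(axis=1, keepdims=True), 1) - pos

    lonely = connections.sum(axis=1) < expected_social   # perceived < expected involvement
    w_social = np.where(lonely, 1.5, 0.5)[:, None]       # lonely agents weight social pull more

    vel = 0.5 * vel + 0.05 * cohesion + 0.1 * separation + 0.05 * alignment \
          + 0.1 * goal + w_social * 0.05 * social
    pos = pos + vel
    improved = resource(pos) > resource(best)
    best[improved] = pos[improved]

print("Mean final distance to resource:", -resource(pos).mean())
```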

Keywords: agent-based, behavior, evolution, loneliness, social

Procedia PDF Downloads 74
102 Somatic Delusional Disorder Subsequent to Phantogeusia: A Case Report

Authors: Pedro Felgueiras, Ana Miguel, Nélson Almeida, Raquel Silva

Abstract:

Objective: Through the study of a clinical case of somatic delusional disorder secondary to phantogeusia, we aim to highlight the importance of considering psychosomatic conditions in differential diagnosis, as well as to emphasize the complexity of its comprehension, treatment, and respective impact on patients’ functioning. Methods: Bearing this in mind, we conducted a critical analysis of the case based on patient observations, clinical data, and complementary diagnostic methods, as well as a non-systematic review of the literature on the subject. Results: A 61-year-old female patient with no history of psychiatric conditions. Family psychiatric history of mood disorder (depression) with psychotic features in her mother. Medical history of many comorbidities affecting different organ systems (endocrine, gastrointestinal, genitourinary, ophthalmological). Neuroticism personality traits were documented. The patient’s family described a persistent concern about several physical symptoms across her life, with a continuous effort to obtain explanations about any sensation outside her normal perception. After undergoing endoscopy in 2018, she began to complain of persistent phantogeusia (an acid taste) and developed excessive thoughts, feelings, and behaviors associated with this somatic symptom. The patient was evaluated by several medical specialties, and an extensive panel of medical exams was carried out, excluding any disease. Despite all the investigation and with no evidence of disease, the acute anxiety, time, and energy devoted to this symptom culminated in severe psychosocial impairment. The patient was admitted to a psychiatric ward for investigation and treatment of this clinical picture, leading to the diagnosis of somatic delusional disorder. In order to exclude an acute organic etiology of this psychotic disorder, an analytic panel was carried out, with no abnormal results. In the context of a psychotic clinical picture, a CT scan was performed, which revealed a right cortical vascular lesion. Neuropsychological evaluation was performed, with cognitive functioning described as globally normative. During treatment with an antipsychotic (pimozide), complete remission of the somatic delusion was associated with the disappearance of the gustatory perception disturbance. In follow-up, a relapse of the gustatory sensation was documented, and her thoughts and speech were dominated by concerns about multiple somatic symptoms. Conclusion: In terms of abnormal bodily sensations, the oral cavity is one of the most frequent sites of delusional disorder. Patients with these gustatory perception distortions complain about unusual sensations without corresponding abnormal findings in the oral area. Its pathophysiology has not been fully elucidated yet. In terms of its comprehensive psychopathology, this case was hypothesized as a paranoid development of a somatic delusional disorder triggered by post-procedural phantogeusia (which is described as a possible side effect of endoscopy) in a patient with an anankastic personality. This case presents interesting psychopathology, reinforcing the complexity of psychosomatic disorders in terms of their etiopathogenesis, clinical treatment, and long-term prognosis.

Keywords: psychosomatics, delusional somatic disorder, phantogeusia, paranoid development

Procedia PDF Downloads 102
101 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data

Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau

Abstract:

Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within the contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consisted of three Python scripts that could all be easily accessed through a Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We next compared our automated pipeline outputs with the outputs of manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that our automated pipeline efficiently pinpoints neuronal cell body location and neuronal contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using binary thresholding and grayscale image conversion to allow computer vision to better distinguish between cells and non-cells. Its results were also comparable to manually analyzed results but with significantly reduced result acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline’s cell body and contour detection to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer vision-based analysis of calcium imaging recordings from neuronal cell bodies in neuronal cell cultures. Our new goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
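
A condensed sketch of the three steps described above, using OpenCV: grayscale frames are thresholded to a binary image, neuron contours are extracted, and the mean fluorescence inside each contour is tracked across frames. The file names, threshold value, and area filter are assumptions for illustration; the actual pipeline consists of three separate Python scripts.

```python
# Sketch of the pipeline steps described above: (1) detect neuron contours via binary
# thresholding, (2) extract mean fluorescence within each contour, (3) track it per frame.
# File names, threshold, and area filter are hypothetical choices.
import cv2
import numpy as np

frames = [cv2.imread(f"frame_{i:03d}.tif", cv2.IMREAD_GRAYSCALE) for i in range(100)]
reference = frames[0]

# (1) binary thresholding on the reference frame to separate cells from background
_, binary = cv2.threshold(reference, 40, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
contours = [c for c in contours if cv2.contourArea(c) > 30]  # drop non-cell specks

# (2) and (3): mean fluorescence inside each contour, for every frame
traces = np.zeros((len(contours), len(frames)))
for j, c in enumerate(contours):
    mask = np.zeros_like(reference)
    cv2.drawContours(mask, [c], -1, 255, thickness=-1)   # filled contour mask
    for t, frame in enumerate(frames):
        traces[j, t] = cv2.mean(frame, mask=mask)[0]

print(traces.shape)  # (n_cells, n_frames) fluorescence time series
```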

Keywords: calcium imaging, computer vision, neural activity, neural networks

Procedia PDF Downloads 64
100 Structural Fluxionality of Luminescent Coordination Compounds with Lanthanide Ions

Authors: Juliana A. B. Silva, Caio H. T. L. Albuquerque, Leonardo L. dos Santos, Cristiane K. Oliveira, Ivani Malvestiti, Fernando Hallwass, Ricardo L. Longo

Abstract:

Complexes with lanthanide ions have been extensively studied due to their applications as luminescent, magnetic, and catalytic materials as molecular or extended crystals, thin films, glasses, polymeric matrices, ionic liquids, and in solution. NMR chemical shift data in solution have been reported and suggest fluxional structures in a wide range of coordination compounds with rare earth ions. However, the fluxional mechanisms for these compounds are still not established. This structural fluxionality may affect the photophysical, catalytic, and magnetic properties in solution. Thus, understanding the structural interconversion mechanisms may aid the design of coordination compounds with, for instance, improved (electro)luminescence, catalytic and magnetic behaviors. The [Eu(btfa)₃bipy] complex, where btfa = 4,4,4-trifluoro-1-phenyl-1,3-butanedionate and bipy = 2,2’-bipyridyl, has a well-defined X-ray crystallographic structure, and preliminary 1H NMR data suggested structural fluxionality. Thus, we have investigated a series of coordination compounds with lanthanide ions, [Ln(btfa)₃L], where Ln = La, Eu, Gd or Yb and L = bipy or phen (phen = 1,10-phenanthroline), using a combined theoretical-experimental approach. These complexes were synthesized and fully characterized, and detailed NMR measurements were obtained. They were also studied by quantum chemical computational methods (DFT-PBE0). The aim was to determine the relevant factors in the structure of these compounds that favor or disfavor fluxional behavior. Measurements of the 1H NMR signals at variable temperature in CD₂Cl₂ of the [Eu(btfa)₃L] complexes suggest that these compounds have a fluxional structure, because the crystal structure has non-equivalent btfa ligands that should lead to non-equivalent hydrogen atoms and thus to more signals in the NMR spectra than those obtained at room temperature, where all hydrogen atoms of the btfa ligands are equivalent and the phen ligand has an effective vertical symmetry plane. For the [Eu(btfa)₃bipy] complex, the broadening of the signals at –70 °C provides a lower bound for the coalescence temperature, which indicates that the energy barriers involved in the structural interconversion mechanisms are quite small. These barriers and, consequently, the coalescence temperature depend upon the radius of the lanthanide ion as well as on its paramagnetic effects. The PBE0-calculated structures are in very good agreement with the crystallographic data and, for the [Eu(btfa)₃bipy] complex, this method provided several distinct structures with almost the same energy. However, the energy barriers for structural interconversion via dissociative pathways were found to be quite high and could not explain the experimental observations, whereas the pseudo-rotation pathways, involving the btfa and bipy ligands, have very small activation barriers, in excellent agreement with the NMR data. The results also showed an increase in the activation barrier along the lanthanide series due to the decrease of the ionic radii and the consequent increase of steric effects. TD-DFT calculations showed a dependence of the ligand donor state energy on the structure of the complex [Eu(btfa)₃phen], which can affect the energy transfer rates and the luminescence. The energy required to promote the structural fluxionality may also enhance the luminescence quenching in solution. These results can aid the design of more luminescent compounds and more efficient devices.

Keywords: computational chemistry, lanthanide-based compounds, NMR, structural fluxionality

Procedia PDF Downloads 176
99 Violent, Psychological, Sexual and Abuse-Related Emergency Department Usage amongst Pediatric Victims of Physical Assault and Gun Violence: A Case-Control Study

Authors: Mary Elizabeth Bernardin, Margie Batek, Joseph Moen, David Schnadower

Abstract:

Background: Injuries due to interpersonal violence are a common reason for emergency department (ED) visits amongst the American pediatric population. Gun violence, in particular, is associated with high morbidity and mortality as well as financial costs. Patterns of pediatric ED usage may be an indicator of risk for future violence, but very little data on the topic exists. Objective: The aims of this study were to assess the frequency of ED usage for previous interpersonal violence, mental/behavioral issues, sexual/reproductive issues, and concerns for abuse in youths presenting to EDs due to physical assault injuries (PAIs) compared to firearm injuries (FIs). Methods: In this retrospective case-control study, ED charts of children ages 8-19 years who presented with injuries due to interpersonal violent encounters from 2014-2017 were reviewed. Data were collected regarding all previous ED visits for injuries due to interpersonal violence (including physical assaults and firearm injuries), mental/behavioral health visits (including depression, suicidal ideation, suicide attempt, homicidal ideation, and violent behavior), sexual/reproductive health visits (including sexually transmitted infections and pregnancy-related issues), and concerns for abuse (including physical abuse or domestic violence, neglect, sexual abuse, sexual assault, and intimate partner violence). Logistic regression was used to identify predictors of gun violence based on previous ED visits amongst physical-assault-injured versus firearm-injured youths. Results: A total of 407 patients presenting to the ED for an interpersonal violent encounter were analyzed, 251 (62%) of which were due to physical assault injuries (PAIs) and 156 (38%) due to firearm injuries (FIs). The majority of both PAI and FI patients had no previous history of ED visits for violence, mental/behavioral health, sexual/reproductive health, or concern for abuse (60.8% PAI, 76.3% FI). 19.2% of PAI and 13.5% of FI youths had previous ED visits for physical assault injuries (OR 0.68, P=0.24, 95% CI 0.36 to 1.29). 1.6% of PAI and 3.2% of FI youths had a history of ED visits for previous firearm injuries (OR 3.6, P=0.34, 95% CI 0.04 to 2.95). 10% of PAI and 3.8% of FI youths had previous ED visits for mental/behavioral health issues (OR 0.91, P=0.80, 95% CI 0.43 to 1.93). 10% of PAI and 2.6% of FI youths had previous ED visits due to concerns for abuse (OR 0.76, P=0.55, 95% CI 0.31 to 1.86). Conclusions: There are no statistically significant differences between physical-assault-injured and firearm-injured youths in terms of ED usage for previous violent injuries, mental/behavioral health visits, sexual/reproductive health visits, or concerns for abuse. However, violently injured youths in this study had more than twice as many previous ED visits for physical assaults and mental health issues as previous literature indicates. Data comparing ED usage of victims of interpersonal violence to nonviolent ED patients are needed, but this study supports the notion that EDs may be a useful place for identification of and enrollment in interventions for youths most at risk for future violence.
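
For illustration, one of the odds ratios reported above can be approximately reproduced from the published percentages (19.2% of 251 PAI youths vs. 13.5% of 156 FI youths with a previous assault-related ED visit) using Fisher's exact test in SciPy. The study itself used logistic regression, and the reconstructed counts below are rounded approximations, not the raw data.

```python
# Illustrative odds-ratio computation from a 2x2 table. Counts are approximately
# reconstructed from the reported percentages, so this is an approximation only;
# the study's actual analysis used logistic regression.
from scipy.stats import fisher_exact

prior_visit_fi, no_visit_fi = 21, 135     # ~13.5% of 156 firearm-injured youths
prior_visit_pai, no_visit_pai = 48, 203   # ~19.2% of 251 physical-assault-injured youths

table = [[prior_visit_fi, no_visit_fi],
         [prior_visit_pai, no_visit_pai]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2f}")  # close to the reported OR of 0.68
```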

Keywords: child abuse, emergency department usage, pediatric gun violence, pediatric interpersonal violence, pediatric mental health, pediatric reproductive health

Procedia PDF Downloads 211
98 Metacognitive Processing in Early Readers: The Role of Metacognition in Monitoring Linguistic and Non-Linguistic Performance and Regulating Students' Learning

Authors: Ioanna Taouki, Marie Lallier, David Soto

Abstract:

Metacognition refers to the capacity to reflect upon our own cognitive processes. Although there is an ongoing discussion in the literature on the role of metacognition in learning and academic achievement, little is known about its neurodevelopmental trajectories in early childhood, when children begin to receive formal education in reading. Here, we evaluate the metacognitive ability, estimated under a recently developed Signal Detection Theory model, of a cohort of children aged between 6 and 7 (N=60), who performed three two-alternative-forced-choice tasks (two linguistic: lexical decision task, visual attention span task, and one non-linguistic: emotion recognition task) including trial-by-trial confidence judgements. Our study has three aims. First, we investigated how metacognitive ability (i.e., how confidence ratings track accuracy in the task) relates to performance in general standardized tasks related to students' reading and general cognitive abilities using Spearman's and Bayesian correlation analysis. Second, we assessed whether or not young children recruit common mechanisms supporting metacognition across the different task domains or whether there is evidence for domain-specific metacognition at this early stage of development. This was done by examining correlations in metacognitive measures across different task domains and evaluating cross-task covariance by applying a hierarchical Bayesian model. Third, using robust linear regression and Bayesian regression models, we assessed whether metacognitive ability in this early stage is related to the longitudinal learning of children in a linguistic and a non-linguistic task. Notably, we did not observe any association between students’ reading skills and metacognitive processing in this early stage of reading acquisition. Some evidence consistent with domain-general metacognition was found, with significant positive correlations between metacognitive efficiency between lexical and emotion recognition tasks and substantial covariance indicated by the Bayesian model. However, no reliable correlations were found between metacognitive performance in the visual attention span and the remaining tasks. Remarkably, metacognitive ability significantly predicted children's learning in linguistic and non-linguistic domains a year later. These results suggest that metacognitive skill may be dissociated to some extent from general (i.e., language and attention) abilities and further stress the importance of creating educational programs that foster students’ metacognitive ability as a tool for long term learning. More research is crucial to understand whether these programs can enhance metacognitive ability as a transferable skill across distinct domains or whether unique domains should be targeted separately.

Keywords: confidence ratings, development, metacognitive efficiency, reading acquisition

Procedia PDF Downloads 128
97 Honneth, Feenberg, and the Redemption of Critical Theory of Technology

Authors: David Schafer

Abstract:

Critical Theory is in sore need of a workable account of technology. It had one in the writings of Herbert Marcuse, or so it seemed until Jürgen Habermas mounted a critique in 'Technology and Science as Ideology' (Habermas, 1970) that decisively put it away. Ever since, Marcuse’s work has been regarded as outdated – a 'philosophy of consciousness' no longer seriously tenable. But with Marcuse’s view has gone the important insight that technology is no norm-free system (as Habermas portrays it) but can be laden with social bias. Andrew Feenberg is among a few serious scholars who have perceived this problem in post-Habermasian critical theory and has sought to revive a basically Marcusean account of technology. On his view, while the so-called ‘technical elements’ that physically make up technologies are neutral with regard to social interests, there is a sense in which we may speak of a normative grammar or ‘technical code’ built into technology that can be socially biased in favor of certain groups over others (Feenberg, 2002). According to Feenberg, perspectives on technology are reified when they consider technology only in terms of its technical elements, to the neglect of its technical codes. Nevertheless, Feenberg’s account fails to explain what is normatively problematic with such reified views of technology. His plausible claim that they represent false perspectives on technology does not by itself explain how such views may be oppressive, even though Feenberg surely wants to be doing that stronger level of normative theorizing. Perceiving this deficit in his own account of reification, he tries to adopt Habermas’s version of systems theory to ground his own critical theory of technology (Feenberg, 1999). But this is a curious move in light of Feenberg’s own legitimate critiques of Habermas’s portrayals of technology as reified or ‘norm-free.’ This paper argues that a better foundation may be found in Axel Honneth’s recent text, Freedom’s Right (Honneth, 2014). Though Honneth there says little explicitly about technology, he offers an implicit account of reification formulated in opposition to Habermas’s systems-theoretic approach. On this ‘normative functionalist’ account of reification, social spheres are reified when participants prioritize individualist ideals of freedom (moral and legal freedom) to the neglect of an intersubjective form of freedom-through-recognition that Honneth calls ‘social freedom.’ Such misprioritization is ultimately problematic because it is unsustainable: individual freedom is philosophically and institutionally dependent upon social freedom. The main difficulty in adopting Honneth’s social theory for the purposes of a theory of technology, however, is that the notion of social freedom is predicable only of social institutions, whereas it appears difficult to conceive of technology as an institution. Nevertheless, in light of Feenberg’s work, the idea that technology includes within itself a normative grammar (technical code) takes on much plausibility. To the extent that this normative grammar may be understood through the category of social freedom, Honneth’s dialectical account of the relationship between individual and social forms of freedom provides a more solid basis from which to ground the normative claims of Feenberg’s sociological account of technology than Habermas’s systems theory.

Keywords: Habermas, Honneth, technology, Feenberg

Procedia PDF Downloads 168
96 Development of an Automatic Computational Machine Learning Pipeline to Process Confocal Fluorescence Images for Virtual Cell Generation

Authors: Miguel Contreras, David Long, Will Bachman

Abstract:

Background: Microscopy plays a central role in cell and developmental biology. In particular, fluorescence microscopy can be used to visualize specific cellular components and subsequently quantify their morphology through the development of virtual-cell models for studying the effects of mechanical forces on cells. However, these imaging experiments present challenges that can make it difficult to quantify cell morphology: inconsistent results, time-consuming and potentially costly protocols, and a limit on the number of labels due to spectral overlap. To address these challenges, the objective of this project is to develop an automatic computational machine learning pipeline that predicts the morphology of cellular components for virtual-cell generation from fluorescence cell-membrane confocal z-stacks. Methods: Registered confocal z-stacks of the nuclei and cell membranes of endothelial cells, consisting of 20 images each, were obtained from fluorescence confocal microscopy and normalized through a software pipeline so that each image had a mean pixel intensity of 0.5. An open-source machine learning algorithm, originally developed to predict fluorescence labels on unlabeled transmitted-light microscopy cell images, was trained on this set of normalized z-stacks on a single-CPU machine. Through transfer learning, the algorithm used knowledge acquired from its previous training sessions to learn the new task. Once trained, the algorithm was used to predict the morphology of nuclei using normalized cell-membrane fluorescence images as input. Predictions were compared to the ground-truth fluorescence nuclei images. Results: After one week of training using one cell-membrane z-stack (20 images) and the corresponding nuclei label, the algorithm showed qualitatively good predictions on the training set. It was able to accurately predict nuclei locations as well as shape when fed only fluorescence membrane images. Similar training sessions with improved membrane image quality (clear outlines and membrane shapes showing the boundaries of each cell) proportionally improved the nuclei predictions, reducing errors relative to the ground truth. Discussion: These results show the potential of pre-trained machine learning algorithms to predict cell morphology from relatively small amounts of data and training time, eliminating the need to use multiple labels in immunofluorescence experiments. With further training, the algorithm is expected to predict additional labels (e.g., focal-adhesion sites, cytoskeleton), which can be added to the automatic machine learning pipeline for direct input into Principal Component Analysis (PCA) for the generation of virtual-cell mechanical models.
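
Below is a minimal Python sketch of the normalization and comparison steps described above, assuming a simple multiplicative rescaling to the stated mean intensity of 0.5 and a pixel-wise Pearson correlation as the comparison metric; the array shapes, the rescaling choice, and the correlation measure are illustrative assumptions rather than the authors' exact pipeline.

```python
import numpy as np

def normalize_stack(stack, target_mean=0.5):
    """Rescale each image of a z-stack so its mean pixel intensity is target_mean.

    A multiplicative rescaling is one simple option; the abstract only states
    the target mean, not the exact transform used in the pipeline.
    """
    stack = stack.astype(np.float64)
    normalized = np.empty_like(stack)
    for i, img in enumerate(stack):
        normalized[i] = img * (target_mean / img.mean())
    return normalized

def compare_to_ground_truth(predicted, ground_truth):
    """Pearson correlation between predicted and true nuclei intensities."""
    return np.corrcoef(predicted.ravel(), ground_truth.ravel())[0, 1]

# Example with synthetic data: a z-stack of 20 images, 256x256 pixels.
membrane_stack = np.random.rand(20, 256, 256)
membrane_norm = normalize_stack(membrane_stack)
print(membrane_norm.mean(axis=(1, 2)))  # each value close to 0.5
```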

Keywords: cell morphology prediction, computational machine learning, fluorescence microscopy, virtual-cell models

Procedia PDF Downloads 179
95 The Confluence between Autism Spectrum Disorder and the Schizoid Personality

Authors: Murray David Schane

Abstract:

Through years of clinical encounters with patients with autism spectrum disorders and with those with a schizoid personality, the many defining diagnostic features shared between these conditions have been explored, current neurobiological differences have been reviewed, and critical, distinct treatment strategies for each have been devised. The paper compares and contrasts the apparent similarities between autism spectrum disorders and the schizoid personality found in the following DSM descriptive categories: restricted range of social-emotional reciprocity; poor non-verbal communicative behavior in social interactions; difficulty developing and maintaining relationships; detachment from social relationships; lack of the desire for or enjoyment of close relationships; and preference for solitary activities. In this paper, autism, fundamentally a communicative disorder, is revealed to present clinically as a pervasive aversive response to efforts to engage with or be engaged by others. Autists with the Asperger presentation typically have language but have difficulty understanding humor, irony, sarcasm, metaphoric speech, and even narratives about social relationships. They also tend to seek sameness, possibly to avoid problems of social interpretation. Repetitive behaviors engage many autists as a screen against ambient noise, social activity, and challenging interactions. Also in this paper, the schizoid personality is revealed as a pattern of social avoidance, self-sufficiency, and apparent indifference to others as a complex psychological defense against a deep, long-abiding fear of appropriation and perverse manipulation. Neither genetic nor MRI studies have yet located the explanatory data that identify the cause or the neurobiology of autism. Similarly, studies of the schizoid have yet to group that condition with those found in schizophrenia. Through presentations of clinical examples, the treatment of autists of the Asperger type will be shown to address the autist’s extreme social aversion, which also precludes the experience of empathy. Autists will be revealed as forming social attachments but without the capacity to interact with mutual concern. Empathy will be shown to be teachable and, as social avoidance relents, autists will be shown to recognize and acknowledge the meaning and signs of empathic needs. Treatment of schizoids will be shown to revolve around joining empathically with the schizoid’s apprehensions about interpersonal, interactive proximity. Models of both autism and schizoid personality traits have yet to be replicated in animals, thereby eliminating the role of translational research in providing the kind of clues to behavioral patterns that can be related to genetic, epigenetic, and neurobiological measures. But as these clinical examples will attest, treatment strategies have significant impact.

Keywords: autism spectrum, schizoid personality traits, neurobiological implications, critical diagnostic distinctions

Procedia PDF Downloads 94
94 Assessment of Surface Water Quality near Landfill Sites Using a Water Pollution Index

Authors: Alejandro Cittadino, David Allende

Abstract:

Landfilling of municipal solid waste is a common waste management practice in Argentina, as in many parts of the world. There is extensive scientific literature on the potential negative effects of landfill leachates on the environment, so it is necessary to be rigorous with control and monitoring systems. Due to the specific composition of municipal solid waste in Argentina, local landfill leachates contain large amounts of organic matter (biodegradable, but also refractory to biodegradation), as well as ammonia-nitrogen, small traces of some heavy metals, and inorganic salts. In order to investigate the surface water quality in the Reconquista river adjacent to the Norte III landfill, water samples both upstream and downstream of the dumpsite are collected quarterly and analyzed for 43 parameters, including organic matter, heavy metals, and inorganic salts, as required by the local standards. The objective of this study is to apply a water quality index that considers the leachate characteristics in order to determine the quality status of the watercourse through the landfill. The water pollution index method has been widely used in water quality assessments, particularly of rivers, and it has played an increasingly important role in water resource management, since it provides a single number, simple enough for the public to understand, that states the overall water quality at a certain location and time. The chosen water quality index (ICA) is based on the values of six parameters: dissolved oxygen (in mg/l and percent saturation), temperature, biochemical oxygen demand (BOD5), ammonia-nitrogen, and chloride (Cl-) concentration. The ICA index was determined both upstream and downstream on the Reconquista river, with the rating scale ranging from 0 (very poor water quality) to 10 (excellent water quality). The monitoring results indicated that the water quality was unaffected by possible leachate runoff, since the index scores upstream and downstream were ranked in the same category, although in general most of the samples were classified as having poor water quality according to the index’s scale. The annual averaged ICA index scores (computed quarterly) were 4.9, 3.9, 4.4 and 5.0 upstream and 3.9, 5.0, 5.1 and 5.0 downstream during the study period between 2014 and 2017. Additionally, the water quality seemed to exhibit distinct seasonal variations, probably due to annual precipitation patterns in the study area. The ICA water quality index appears appropriate for evaluating landfill impacts, since it accounts mainly for organic pollution and inorganic salts, consistent with the absence of heavy metals in the local leachate composition; however, the inclusion of other parameters could be more decisive in discerning the stream reaches affected by landfill activities. Future work may consider adding other parameters to the index, such as total organic carbon (TOC) and total suspended solids (TSS), since they are present in the leachate in high concentrations.
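
The abstract does not reproduce the ICA rating curves, so the following Python sketch only illustrates the general structure of such an index: each of the six parameters is mapped to a 0-10 sub-score through placeholder piecewise curves, and the sub-scores are averaged. The breakpoints, the equal weighting, and the sample values are hypothetical and do not reproduce the published ICA formulation.

```python
import numpy as np

def rate(value, breakpoints, scores):
    """Piecewise rating: return the score attached to the first breakpoint not exceeded."""
    for limit, score in zip(breakpoints, scores):
        if value <= limit:
            return score
    return scores[-1]

def ica_score(sample):
    """Average six sub-scores into a single 0-10 quality index (placeholder curves)."""
    subscores = [
        rate(sample["do_mgl"], [2, 4, 6, 8], [2, 4, 6, 8, 10]),              # higher DO is better
        rate(sample["do_sat"], [30, 50, 70, 90], [2, 4, 6, 8, 10]),
        10 - rate(sample["bod5"], [3, 5, 10, 20], [0, 2, 5, 8, 10]),         # higher BOD5 is worse
        10 - rate(sample["nh3_n"], [0.5, 1, 2, 5], [0, 2, 5, 8, 10]),
        10 - rate(sample["chloride"], [100, 250, 500, 1000], [0, 2, 5, 8, 10]),
        10 - rate(abs(sample["temp_dev"]), [1, 3, 5, 10], [0, 1, 3, 6, 10]), # deviation from natural temperature
    ]
    return np.mean(subscores)

sample = {"do_mgl": 4.5, "do_sat": 55, "bod5": 8, "nh3_n": 1.5,
          "chloride": 180, "temp_dev": 2}
print(round(ica_score(sample), 1))  # single 0-10 score for this hypothetical sample
```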

Keywords: landfill, leachate, surface water, water quality index

Procedia PDF Downloads 129
93 Learning Recomposition after the Remote Period with Finalist Students of the Technical Course in the Environment of the IFPA, Paragominas Campus, Pará State, Brazilian Amazon

Authors: Liz Carmem Silva-Pereira, Raffael Alencar Mesquita Rodrigues, Francisco Helton Mendes Barbosa, Emerson de Freitas Ferreira

Abstract:

Due to the Covid-19 pandemic, declared in March 2020 by the World Health Organization, social coexistence across the planet was affected, especially in educational processes, through the implementation of the remote modality as a teaching strategy. This teaching-learning modality changed the routine and learning of basic education students, with serious consequences for the return to face-to-face teaching in 2021. The finalist students of the environmental technical course who reached 2022 at the Federal Institute of Education, Science and Technology of Pará (IFPA) – Campus Paragominas had their training process severely affected, having studied the initial half of their training in the remote modality, which compromised the practical classes, technical visits, and field classes essential to the formation of an environmental technician. With the objective of promoting the recomposition of these students' learning after the return to the face-to-face modality, an educational strategy was developed in the last period of the course. The teaching methodologies used were research as an educational principle, the integrative project, and parallel recovery, applied jointly, aiming to recompose basic knowledge of the natural sciences together with the technical knowledge of the environmental area applied to the course. The project assisted 58 finalist students of the environmental technical course. A research instrument was developed with environmental quality evaluation parameters for 19 collection points in the Uraim River urban hydrographic basin, in the city of Paragominas – Pará – Brazilian Amazon. Students were separated into groups under the orientation of professors and laboratory assistants; in the field, they observed and evaluated the environmental conditions of the sites and collected physical data and water samples, which were taken to the chemistry and biology laboratories at Campus Paragominas for further analysis. With the results obtained, each group prepared a technical report on the environmental conditions of each evaluated point. This methodology enabled the practical application of the theoretical knowledge received in various disciplines during the remote teaching modality, integrating knowledge, people, skills, and abilities for the best technical training of the finalist students. At the end of the activity, the satisfaction of the students involved in the project was evaluated through a form, with signed informed consent, using a Likert scale as the evaluation parameter. The satisfaction survey results were: 82% satisfaction with the use of research projects within the disciplines attended; 84% with the revision of contents during the execution of the project; 76.9% with the field experience acquired; 86.2% with the laboratory experience; and 71.8% with the use of this methodology as parallel recovery. In addition to the students' excellent performance in acquiring knowledge, it was possible to remedy the deficiencies caused by the absence of practical classes, technical visits, and field classes during the remote teaching modality, fulfilling the desired educational recomposition.

Keywords: integrative project, parallel recovery, research as an educational principle, teaching-learning

Procedia PDF Downloads 39
92 Characterization of Thin Woven Composites Used in Printed Circuit Boards by Combining Numerical and Experimental Approaches

Authors: Gautier Girard, Marion Martiny, Sebastien Mercier, Mohamad Jrad, Mohamed-Slim Bahi, Laurent Bodin, Francois Lechleiter, David Nevo, Sophie Dareys

Abstract:

The reliability of electronic devices has always been of the highest interest for Aero-MIL and space applications. In any electronic device, the Printed Circuit Board (PCB), providing the interconnection between components, is key to reliability. During the last decades, PCB technologies have evolved to sustain and/or fulfill increased original equipment manufacturer (OEM) requirements and specifications: higher densities and better performance, faster time to market and longer lifetime, newer materials and mixed build-ups. From the very beginning of the PCB industry up to recently, qualification, experiments, and trial and error were the most popular methods to assess system (PCB) reliability. Nowadays, OEMs, PCB manufacturers, and scientists are working together closely to develop predictive models for PCB reliability and lifetime. To achieve that goal, it is fundamental to characterize the base materials precisely (laminates, electrolytic copper, …) in order to understand failure mechanisms and simulate PCB aging under environmental constraints, for example by means of the finite element method. The laminates are woven composites and thus have orthotropic behaviour. The in-plane properties can be measured by combining classical uniaxial testing and digital image correlation. Nevertheless, the out-of-plane properties cannot be evaluated in this way due to the thickness of the laminate (a few hundred microns). It should be noted that knowledge of the out-of-plane properties is fundamental to investigating the lifetime of high-density printed circuit boards. A homogenization method combining analytical and numerical approaches has been developed in order to obtain the complete elastic orthotropic behaviour of a woven composite from its precise 3D internal structure and its experimentally measured in-plane elastic properties. Since the mechanical properties of the resin surrounding the fibres are unknown, an inverse method is proposed to estimate them. The methodology has been applied to one laminate used in hyperfrequency spatial applications in order to obtain its elastic orthotropic behaviour at different temperatures in the range [-55°C; +125°C]. Next, numerical simulations of a plated through-hole in a double-sided PCB were performed. Results show the major influence of the out-of-plane properties, and of their temperature dependency, on the lifetime of a printed circuit board. Acknowledgements: The support of the French ANR agency through the Labcom program ANR-14-LAB7-0003-01 and the support of CNES, Thales Alenia Space, and Cimulec are acknowledged.
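
As a rough illustration of the inverse method mentioned above, the sketch below fits a resin modulus so that a homogenized in-plane modulus matches a measured value. The rule-of-mixtures stand-in, the fibre modulus, the fibre volume fraction, and the measured value are all hypothetical placeholders; the actual study uses analytical and finite element homogenization of the real 3D weave geometry.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def predicted_inplane_modulus(e_resin, e_fibre=70e9, vf=0.5):
    """Placeholder homogenization: crude in-plane modulus from fibre/resin moduli."""
    e_long = vf * e_fibre + (1 - vf) * e_resin            # Voigt (parallel) bound
    e_trans = 1.0 / (vf / e_fibre + (1 - vf) / e_resin)   # Reuss (series) bound
    return 0.5 * (e_long + e_trans)                       # simple average of the two

def estimate_resin_modulus(e_measured):
    """Find the resin modulus that best reproduces the measured in-plane modulus."""
    cost = lambda e_r: (predicted_inplane_modulus(e_r) - e_measured) ** 2
    res = minimize_scalar(cost, bounds=(1e8, 2e10), method="bounded")
    return res.x

e_measured = 25e9  # hypothetical measured in-plane modulus (Pa)
print(estimate_resin_modulus(e_measured) / 1e9, "GPa")
```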

Keywords: homogenization, orthotropic behaviour, printed circuit board, woven composites

Procedia PDF Downloads 174
91 Precancerous Lesions Related to Human Papillomavirus: Importance of Cervicography as a Complementary Diagnosis Method

Authors: Denise De Fátima Fernandes Barbosa, Tyane Mayara Ferreira Oliveira, Diego Jorge Maia Lima, Paula Renata Amorim Lessa, Ana Karina Bezerra Pinheiro, Cintia Gondim Pereira Calou, Glauberto Da Silva Quirino, Hellen Lívia Oliveira Catunda, Tatiana Gomes Guedes, Nicolau Da Costa

Abstract:

The aim of this study is to evaluate the use of Digital Cervicography (DC) in the diagnosis of precancerous lesions related to Human Papillomavirus (HPV). This was a cross-sectional, evaluative study with a quantitative approach, conducted in a health unit linked to the Pro-Rectory of Extension of the Federal University of Ceará from July to August 2015, with a sample of 33 women. Data collection was conducted through interviews using a structured instrument. The technique used for DC was standardized by Franco (2005). Polymerase Chain Reaction (PCR) was performed to identify high-risk HPV genotypes. The DC images were evaluated and classified by three judges. The results of DC and PCR were classified as positive, negative, or inconclusive. The data from the collection instruments were compiled and analyzed with the Statistical Package for the Social Sciences (SPSS) using descriptive statistics and cross-tabulations. Sociodemographic, sexual, and reproductive variables were analyzed through absolute frequencies (N) and their respective percentages (%). The Kappa coefficient (κ) was applied to determine the agreement between the evaluators' DC reports and PCR, and also among the judges regarding the DC results. Pearson's chi-square test was used to analyze the sociodemographic, sexual, and reproductive variables against the PCR reports. Values of p<0.05 were considered statistically significant. The ethical aspects of research involving human beings were respected, according to Resolution 466/2012. Regarding the sociodemographic profile, the most prevalent age groups, equally represented, were 21-30 and 41-50 years (24.2% each). Most participants self-identified as brown (84.8%), and 96.9% had completed or were attending primary or secondary school. 51.5% were married, 72.7% Catholic, 54.5% employed, and 48.5% had an income between one and two minimum wages. As for the sexual and reproductive characteristics, most were heterosexual (93.9%) and did not use condoms during sexual intercourse (72.7%). 51.5% had a previous history of a Sexually Transmitted Infection (STI), with HPV being the most prevalent STI (76.5%). 57.6% did not use contraception, 78.8% had undergone the cervical cancer prevention examination (PCCU) within an interval of one year or less, 72.7% had no family history of cervical cancer, 63.6% were multiparous, and 97% were not vaccinated against HPV. DC showed a good level of agreement between raters (κ=0.542) and had a specificity of 77.8% and a sensitivity of 25% when its results were compared with PCR. Only the variable race showed a statistically significant association with PCR (p=0.042). DC had 100% acceptance among the women in the sample, indicating the feasibility of further studies using this method to establish it as a viable technique. The DC positivity criteria were developed by nurses, and these professionals also perform the PCCU in Brazil, which means that DC can be an important complementary diagnostic method for improving the quality of the examinations performed by these professionals.
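
For readers who wish to reproduce the type of agreement statistics reported above, the following Python sketch computes sensitivity, specificity, and Cohen's kappa for binary test results against a PCR reference; the example data are hypothetical and not the study's raw results.

```python
import numpy as np

def sensitivity_specificity(test, reference):
    """Sensitivity and specificity of a binary test against a reference (e.g., PCR)."""
    test, reference = np.asarray(test, bool), np.asarray(reference, bool)
    tp = np.sum(test & reference)
    tn = np.sum(~test & ~reference)
    fp = np.sum(test & ~reference)
    fn = np.sum(~test & reference)
    return tp / (tp + fn), tn / (tn + fp)

def cohen_kappa(r1, r2):
    """Cohen's kappa for agreement between two binary ratings."""
    r1, r2 = np.asarray(r1, bool), np.asarray(r2, bool)
    po = np.mean(r1 == r2)                               # observed agreement
    pe = np.mean(r1) * np.mean(r2) + np.mean(~r1) * np.mean(~r2)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical data: 1 = positive, 0 = negative (not the study's raw data).
dc = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
pcr = [1, 1, 0, 0, 0, 1, 0, 1, 0, 0]
sens, spec = sensitivity_specificity(dc, pcr)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, kappa={cohen_kappa(dc, pcr):.2f}")
```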

Keywords: gynecological examination, human papillomavirus, nursing, papillomavirus infections, uterine neoplasms

Procedia PDF Downloads 274
90 Development of Bilayer Coating System for Mitigating Corrosion of Offshore Wind Turbines

Authors: Adamantini Loukodimou, David Weston, Shiladitya Paul

Abstract:

Offshore structures are subjected to harsh environments. It is well documented that carbon steel needs protection from corrosion. The combined effect of UV radiation, seawater splash, and fluctuating temperatures diminishes the integrity of these structures. In addition, the possibility of damage caused by floating ice, seaborne debris, and maintenance boats makes them even more vulnerable. Their inspection and maintenance far out at sea are difficult, risky, and expensive. The best-known method of mitigating corrosion of offshore structures is the use of cathodic protection. There are several zones in an offshore wind turbine. In the atmospheric zone, due to the lack of a continuous electrolyte (seawater) layer between the structure and the anode at all times, this method proves inefficient. Thus, the use of protective coatings becomes indispensable. This research focuses on the atmospheric zone. The conversion of a commercially available, conventional epoxy paint system into an autonomous self-healing paint system via the addition of suitable encapsulated healing agents and a catalyst is investigated in this work. These coating systems, which can self-heal when damaged, can provide a cost-effective engineering solution to corrosion and related problems. When damage to the paint coating occurs, the microcapsules are designed to rupture and release the self-healing liquid (monomer), which then reacts in the presence of the catalyst and solidifies (polymerization), resulting in healing. The catalyst must be compatible with the system, because otherwise the self-healing process will not occur. The carbon steel substrate will be exposed to a corrosive environment, so the use of a sacrificial layer of Zn is also investigated. More specifically, the first layer of this new coating system will be TSZA (Thermally Sprayed Zn85/Al15), applied to carbon steel samples with dimensions of 100 x 150 mm after blasting with alumina (size F24) as part of the surface preparation. Based on the literature, this layer corrodes readily, so an additional paint layer enriched with microcapsules will be added. The reaction and curing times are also of high importance for this bilayer coating system to work successfully. For the first experiments, polystyrene microcapsules loaded with 3-octanoylthio-1-propyltriethoxysilane were prepared. Electrochemical experiments such as Electrochemical Impedance Spectroscopy (EIS) confirmed the corrosion-inhibiting properties of the silane. The diameter of the microcapsules was about 150-200 microns. Further experiments were conducted with different reagents and methods in order to obtain diameters of about 50 microns, and their self-healing properties were tested in synthetic seawater using electrochemical techniques. The use of combined paint/electrodeposited coatings allows for the further novel development of composite coating systems. The potential for the application of these coatings to offshore structures will be discussed.

Keywords: corrosion mitigation, microcapsules, offshore wind turbines, self-healing

Procedia PDF Downloads 93
89 Finite Element Analysis of the Anaconda Device: Efficiently Predicting the Location and Shape of a Deployed Stent

Authors: Faidon Kyriakou, William Dempster, David Nash

Abstract:

Abdominal Aortic Aneurysm (AAA) is a major life-threatening pathology for which modern approaches reduce the need for open surgery through the use of stenting. However, the success of stenting is sometimes jeopardized by the final position of the stent graft inside the human artery, which may result in migration, endoleaks, or blood flow occlusion. Herein, a finite element (FE) model of the commercial medical device AnacondaTM (Vascutek, Terumo) has been developed and validated in order to create a numerical tool able to provide useful clinical insight before the surgical procedure takes place. The AnacondaTM device consists of a series of NiTi rings sewn onto woven polyester fabric, a structure that, despite its column stiffness, is flexible enough to be used in very tortuous geometries. For the purposes of this study, a FE model of the device was built in Abaqus® (version 6.13-2) with a combination of beam, shell, and surface elements; these building blocks were chosen to keep the computational cost to a minimum. The validation of the numerical model was performed by comparing the deployed position of a full stent graft device inside a constructed AAA with a duplicate set-up in Abaqus®. Specifically, an AAA geometry was built in CAD software and included regions of both high and low tortuosity. Subsequently, the CAD model was 3D printed into a transparent aneurysm, and a stent was deployed in the lab following the steps of the clinical procedure. Images in the frontal and sagittal planes of the experiment allowed comparison with the results of the numerical model. By overlapping the experimental and computational images, the mean and maximum distances between the rings of the two models were measured in the longitudinal and transverse directions, and a 5 mm upper bound was set as the limit commonly used by clinicians when working with simulations. The two models showed very good agreement in their spatial positioning, especially in the less tortuous regions. As a result, and despite the inherent uncertainties of a surgical procedure, the FE model gives confidence that the final position of the stent graft, when deployed in vivo, can also be predicted with significant accuracy. Moreover, the numerical model runs in just a few hours, an encouraging result for applications in the clinical routine. In conclusion, the efficient modelling of a complicated structure that combines thin scaffolding and fabric has been demonstrated to be feasible. Furthermore, the capability to predict the location of each stent ring, as well as the global shape of the graft, has been shown. This can allow surgeons to better plan their procedures and medical device manufacturers to optimize their designs. The current model can further be used as a starting point for patient-specific CFD analysis.
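
A small Python sketch of the kind of comparison described above, assuming ring-centre coordinates have already been extracted from the overlapped images and from the FE model; the coordinate layout, the axis convention, and the example values are illustrative assumptions.

```python
import numpy as np

def ring_offsets(rings_experiment, rings_simulation):
    """Mean and maximum distances between corresponding ring centres (mm).

    Each input is an (n_rings, 3) array of ring-centre coordinates.
    Here the z axis is taken as longitudinal and (x, y) as the transverse plane.
    """
    exp = np.asarray(rings_experiment, float)
    sim = np.asarray(rings_simulation, float)
    longitudinal = np.abs(exp[:, 2] - sim[:, 2])
    transverse = np.linalg.norm(exp[:, :2] - sim[:, :2], axis=1)
    return {
        "longitudinal_mean": longitudinal.mean(), "longitudinal_max": longitudinal.max(),
        "transverse_mean": transverse.mean(), "transverse_max": transverse.max(),
    }

# Hypothetical ring centres (mm) for a 5-ring segment.
exp = np.array([[0.5, 0.2, 10], [0.8, 0.1, 25], [1.1, 0.4, 40], [0.9, 0.3, 55], [0.6, 0.2, 70]])
sim = np.array([[0.3, 0.1, 11], [0.7, 0.3, 26], [1.4, 0.2, 39], [1.2, 0.5, 57], [0.8, 0.1, 71]])
offsets = ring_offsets(exp, sim)
within_bound = all(v <= 5.0 for v in offsets.values())  # 5 mm clinical upper bound
print(offsets, within_bound)
```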

Keywords: AAA, efficiency, finite element analysis, stent deployment

Procedia PDF Downloads 170
88 Ecological and Historical Components of the Cultural Code of the City of Florence as Part of the Edutainment Project Velonotte International

Authors: Natalia Zhabo, Sergey Nikitin, Marina Avdonina, Mariya Nikitina

Abstract:

This paper analyses one of the events of the international educational and entertainment project Velonotte: an evening bicycle tour with children around Florence. The aim of the project is to develop methods and techniques for increasing the sensitivity of the cycling participants and the listeners of the radio broadcasts to the treasures of the national heritage, in this case to the historical layers of the city and the ecology of the Renaissance epoch. The block of educational tasks is considered, and the issues of preserving the identity of the city are discussed. Methods. The Florentine event was prepared over more than a year. First, the creative team selected events from the city's history that seemed important for revealing the city's specific character and spirit, from antiquity to the present day, drawing also on Internet forums reflecting broad public opinion. Then a seven-kilometer route was developed and proposed to the city's authorities and organizations. The selection of speakers was conducted according to several criteria: they should be authors of books, well-known scholars, or connoisseurs in a particular sphere (toponymy, the history of urban gardens, art history), capable of and willing to talk with participants directly at the stops, so that a dialogue could take place and performances could be organized with their participation. Music was chosen for each part of the itinerary to prepare the audience emotionally. Coloring cards with images of the main content of each stop were created for children. A website was created to inform the participants and to archive photos, videos, and audio files of the speakers' talks afterward. Results: Held in April 2017, the event was dedicated to the 640th anniversary of the Florentine architect Filippo Brunelleschi and to the 190th anniversary of the publication of Stendhal's guide to Florence. It was supported by the City of Florence and the Florence Bike Festival. Florence was explored to transmit traditional elements of culture, some unfairly forgotten, from ancient times to Brunelleschi and Michelangelo, Tchaikovsky and David Bowie, with lectures by university professors. Memorable art boards were installed in public spaces. Elements of the cultural code are deeply internalized in the minds of the townspeople; the perception of the city in everyday life and human communication is comparable to such fundamental concepts of the townspeople's self-awareness as mental comfort and the level of happiness. The format of a fun and playful ride with ICT support offers new opportunities for enriching each citizen's cultural code of the city with new components, associations, and connotations.

Keywords: edutainment, cultural code, cycling, sensitization, Florence

Procedia PDF Downloads 194
87 The Roots of Amazonia’s Droughts and Floods: Complex Interactions of Pacific and Atlantic Sea-Surface Temperatures

Authors: Rosimeire Araújo Silva, Philip Martin Fearnside

Abstract:

Extreme droughts and floods in the Amazon have serious consequences for natural ecosystems and the human population of the region. The frequency of these events has increased in recent years, and climate change projections predict greater frequency and intensity of these events. Understanding the links between these extreme events and different patterns of sea surface temperature in the Atlantic and Pacific Oceans is essential, both to improve the modeling of climate change and its consequences and to support adaptation efforts in the region. The relationship between sea temperatures and events in the Amazon is much more complex than is usually assumed in climatic models. Warming and cooling of different parts of the oceans, as well as the interaction between simultaneous temperature changes in different parts of each ocean and between the two oceans, have specific consequences for the Amazon, with effects on precipitation that vary in different parts of the region. Simplistic generalities, such as the association between El Niño events and droughts in the Amazon, do not capture this complexity. We investigated the variability of Sea Surface Temperature (SST) in the Tropical Pacific Ocean over the period 1950-2022, using Empirical Orthogonal Functions (EOF), spectral and coherence analysis, and wavelet phase analysis. The first two modes of variability were identified, explaining about 53.9% and 13.3%, respectively, of the total variance of the data. The spectral, coherence, and wavelet phase analyses showed that the first mode represents warming in the central part of the Pacific Ocean (the “Central El Niño”), while the second mode represents warming in the eastern part of the Pacific (the “Eastern El Niño”). Although both the 1982-1983 and 1976-1977 El Niño events were characterized by an increase in sea surface temperatures in the Equatorial Pacific, their impacts on rainfall in the Amazon were distinct. In the rainy season, from December to March, the sub-basins of the Japurá, Jutaí, Jatapu, Tapajós, Trombetas, and Xingu rivers showed the greatest reductions in rainfall associated with the Central El Niño (1982-1983), while the sub-basins of the Javari, Purus, Negro, and Madeira rivers had the most pronounced reductions in the Eastern El Niño year (1976-1977). In the transition to the dry season, in April, the greatest reductions were associated with the Eastern El Niño year for most of the study region, with the exception of the sub-basins of the Madeira, Trombetas, and Xingu rivers, whose reductions were associated with the Central El Niño. In the dry season, from July to September, the sub-basins of the Japurá, Jutaí, Jatapu, Javari, Trombetas, and Madeira rivers showed the greatest reductions in rainfall associated with the Central El Niño, while the sub-basins of the Tapajós, Purus, Negro, and Xingu rivers had the most pronounced reductions in the Eastern El Niño year in this season. It is thus possible to conclude that the Central (Eastern) El Niño controlled the reductions in soil moisture in the dry (rainy) season for all sub-basins examined in this study. Extreme drought events associated with these meteorological phenomena can lead to a significant increase in the occurrence of forest fires. These fires have a devastating impact on Amazonian vegetation, resulting in the irreparable loss of biodiversity and the release of large amounts of carbon stored in the forest, contributing to the increase in the greenhouse effect and global climate change.
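
A minimal Python sketch of an EOF decomposition of an SST anomaly field via singular value decomposition, of the kind used to isolate the two leading modes above; the synthetic data, grid size, and the assumption of already de-seasonalized anomalies are placeholders, not the study's dataset.

```python
import numpy as np

def eof_analysis(sst_anomaly, n_modes=2):
    """Leading EOF modes of a gridded SST anomaly field via SVD.

    sst_anomaly: array of shape (n_time, n_lat, n_lon), already de-seasonalized.
    Returns spatial patterns, principal component time series, and the fraction
    of total variance explained by each retained mode.
    """
    nt, ny, nx = sst_anomaly.shape
    x = sst_anomaly.reshape(nt, ny * nx)
    x = x - x.mean(axis=0)                             # remove the time mean at each grid point
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    variance_fraction = s**2 / np.sum(s**2)
    patterns = vt[:n_modes].reshape(n_modes, ny, nx)   # spatial EOFs
    pcs = u[:, :n_modes] * s[:n_modes]                 # PC time series
    return patterns, pcs, variance_fraction[:n_modes]

# Synthetic example: 73 years of monthly anomalies on a small tropical Pacific grid.
rng = np.random.default_rng(0)
sst = rng.standard_normal((73 * 12, 20, 60))
patterns, pcs, var_frac = eof_analysis(sst, n_modes=2)
print(var_frac)  # fractions of variance explained by the first two modes
```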

Keywords: sea surface temperature, variability, climate, Amazon

Procedia PDF Downloads 39
86 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression

Authors: Anne M. Denton, Rahul Gomes, David W. Franzen

Abstract:

High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic of the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. The relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data that was constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than existing techniques.
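
The sketch below illustrates the idea of selecting, per location, the slope estimate at the window size with minimal residual variance. For brevity it refits a plane on non-overlapping blocks at each scale instead of accumulating regression sums over overlapping windows as the paper does, so it is an illustration of the selection principle rather than the published algorithm.

```python
import numpy as np

def plane_fit_blocks(dem, block):
    """Least-squares plane fit z = a*x + b*y + c on non-overlapping blocks.

    Returns per-block slope magnitude sqrt(a^2 + b^2) and residual variance.
    """
    ny, nx = dem.shape
    ny, nx = ny - ny % block, nx - nx % block            # trim to a multiple of block
    z = dem[:ny, :nx].reshape(ny // block, block, nx // block, block)
    z = z.transpose(0, 2, 1, 3).reshape(-1, block * block)
    yy, xx = np.mgrid[0:block, 0:block]
    a_mat = np.column_stack([xx.ravel(), yy.ravel(), np.ones(block * block)])
    coeffs, *_ = np.linalg.lstsq(a_mat, z.T, rcond=None)  # solve all blocks at once
    slope = np.hypot(coeffs[0], coeffs[1])
    variance = (z.T - a_mat @ coeffs).var(axis=0)
    shape = (ny // block, nx // block)
    return slope.reshape(shape), variance.reshape(shape)

def scale_adaptive_slope(dem, max_block=16):
    """Keep, per cell, the slope from the block size with minimal residual variance."""
    best_slope, best_var, best_scale = None, None, None
    block = 2
    while block <= max_block:
        slope, var = plane_fit_blocks(dem, block)
        slope_full = np.kron(slope, np.ones((block, block)))   # upsample for comparison
        var_full = np.kron(var, np.ones((block, block)))
        if best_slope is None:
            best_slope, best_var = slope_full.copy(), var_full.copy()
            best_scale = np.full_like(var_full, block)
        else:
            h = min(best_var.shape[0], var_full.shape[0])
            w = min(best_var.shape[1], var_full.shape[1])
            better = var_full[:h, :w] < best_var[:h, :w]
            best_slope[:h, :w][better] = slope_full[:h, :w][better]
            best_var[:h, :w][better] = var_full[:h, :w][better]
            best_scale[:h, :w][better] = block
        block *= 2
    return best_slope, best_scale

dem = np.cumsum(np.random.rand(128, 128), axis=0) * 0.05   # synthetic sloping surface with noise
slope, scale = scale_adaptive_slope(dem)
print(slope.mean(), np.unique(scale))
```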

Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression

Procedia PDF Downloads 104
85 Interacting with Multi-Scale Structures of Online Political Debates by Visualizing Phylomemies

Authors: Quentin Lobbe, David Chavalarias, Alexandre Delanoe

Abstract:

The ICT revolution has given birth to an unprecedented world of digital traces and has impacted a wide range of knowledge-driven domains such as science, education or policy making. Nowadays, we are daily fueled by unlimited flows of articles, blogs, messages, tweets, etc. The internet itself can thus be considered an unsteady hyper-textual environment where websites emerge and expand every day. But there are structures inside knowledge. A given text can always be studied in relation to others or in light of a specific socio-cultural context. By way of their textual traces, human beings are constantly calling out to one another: hypertext citations, retweets, vocabulary similarity, etc. We are in fact the architects of a giant web of elements of knowledge whose structures and shapes convey their own information. The global shapes of these digital traces represent a source of collective knowledge, and the question of their visualization remains an open challenge. How can we explore, browse and interact with such shapes? In order to navigate across these growing constellations of words and texts, interdisciplinary innovations are emerging at the crossroads between the social and computational sciences. In particular, complex systems approaches make it now possible to reconstruct the hidden structures of textual knowledge by means of multi-scale objects of research such as semantic maps and phylomemies. Phylomemy reconstruction is a generic method related to the co-word analysis framework. Phylomemies aim to reveal the temporal dynamics of large corpora of textual contents by performing inter-temporal matching on extracted knowledge domains in order to identify their conceptual lineages. This study aims to address the question of visualizing the global shapes of online political discussions related to the French presidential and legislative elections of 2017. We aim to build phylomemies on top of a dedicated collection of thousands of French political tweets enriched with archived contemporary news web articles. Our goal is to reconstruct the temporal evolution of online debates fueled by each political community during the elections. To that end, we introduce an iterative data exploration methodology implemented and tested within the free software Gargantext. There, we combine synchronic and diachronic axes of visualization to reveal the dynamics of our corpora of tweets and web pages as well as their inner syntagmatic and paradigmatic relationships. In doing so, we aim to provide researchers with innovative methodological means to explore online semantic landscapes in a collaborative and reflective way.
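
As a toy illustration of the inter-temporal matching step behind phylomemy reconstruction, the sketch below links term clusters of consecutive periods whose Jaccard similarity exceeds a threshold; the clusters, the similarity measure, and the threshold are simplified assumptions and do not reproduce Gargantext's implementation.

```python
from itertools import product

def jaccard(a, b):
    """Jaccard similarity between two term sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def build_lineages(fields_by_period, threshold=0.3):
    """Link knowledge domains (term sets) across consecutive time periods.

    fields_by_period: one list of term sets per period.
    Returns edges (period, i, period+1, j, similarity) forming the lineages.
    """
    edges = []
    for t in range(len(fields_by_period) - 1):
        current, following = fields_by_period[t], fields_by_period[t + 1]
        for (i, a), (j, b) in product(enumerate(current), enumerate(following)):
            sim = jaccard(a, b)
            if sim >= threshold:
                edges.append((t, i, t + 1, j, sim))
    return edges

# Hypothetical term clusters extracted from tweets/web articles, one list per month.
periods = [
    [{"primaire", "debat", "programme"}, {"europe", "euro", "frexit"}],
    [{"debat", "programme", "electorat"}, {"europe", "euro", "souverainete"}],
    [{"legislatives", "programme", "electorat"}, {"europe", "souverainete"}],
]
for edge in build_lineages(periods):
    print(edge)
```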

Keywords: online political debate, French election, hyper-text, phylomemy

Procedia PDF Downloads 169