Search results for: Andreas Christoph Weber

18 Development of an Instrument Assessing Participants’ Motivation on Assigning Monetary Value to Quality of Life

Authors: Afentoula Mavrodi, Andreas Georgiou, Georgios Tsiotras, Vassilis Aletras

Abstract:

Placing a monetary value on a quality-adjusted life-year (QALY) is of utmost importance in economic evaluation. Identifying the population’s preferences is critical in order to understand some of the reasons driving variations in the assigned monetary value. Yet, evidence of the motives behind value assignment to a QALY by the general public is limited. Developing an instrument that captures the population’s motives could prove valuable to policy-makers, guiding them in allocating different values to a QALY based on users’ motivations. The aim of this study was to identify the most relevant motives and develop an appropriate instrument to assess them. To design the instrument, we employed: a) the EQ-5D-3L tool to assess participants’ current health status, and b) the Willingness-to-Pay (WTP) approach, within the Contingent Valuation (CV) Method framework, to elicit the monetary value. Advancing the open-ended approach adopted so far to assess solely protest bidders’ motives, a variety of follow-up item-specific statements was designed (deductive approach), aiming to evaluate the motives of both protest bidders and participants willing to pay for the hypothetical treatment under consideration. The initial design of the survey instrument was the outcome of an extensive literature review. This instrument was revised based on 15 semi-structured interviews that took place in September 2018 and a pilot study held over two months (October-November 2018). Individuals with different educational, occupational and economic backgrounds and adequate verbal skills were recruited to complete the semi-structured interviews. The follow-up motivation statements of both protest bidders and those willing to pay were revised and rephrased after the semi-structured interviews. In total, 4 statements for protest bidders and 3 statements for those willing to pay for the treatment were chosen to be included in the survey tool. Using the CATI (Computer Assisted Telephone Interview) method, a randomly selected sample of 97 persons living in Thessaloniki, Greece, completed the questionnaire on two occasions over a period of 4 weeks. Based on the pilot study results, a test-retest reliability assessment was performed using the intra-class correlation coefficient (ICC). All statements formulated for protest bidders showed acceptable reliability (ICC values of 0.84 (95% CI: 0.67, 0.92) and above). Similarly, all statements for those willing to pay for the treatment showed high reliability (ICC values of 0.86 (95% CI: 0.78, 0.91) and above). Overall, the instrument designed in this study was reliable with regard to the item-specific statements assessing participants’ motivation. Validation of the instrument will take place in a future study. For a holistic WTP per QALY instrument, participants’ motivation must be addressed broadly. The instrument developed in this study captured a variety of motives and provided insight into the method through which the latter are evaluated. Last but not least, it extended motive assessment to all study participants, not only protest bidders.
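
For readers who want to reproduce this kind of test-retest analysis, a minimal sketch of a two-way consistency ICC, often denoted ICC(3,1), computed from a subjects-by-occasions matrix is shown below. The data array and variable names are hypothetical, and the original study may have used a different ICC variant or statistical package.

```python
import numpy as np

def icc_3_1(scores: np.ndarray) -> float:
    """Two-way mixed, consistency ICC(3,1) for a (subjects x occasions) matrix."""
    n, k = scores.shape
    grand_mean = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-occasion means
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_total = ((scores - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# Hypothetical test-retest ratings (97 respondents x 2 occasions) on a 5-point scale
rng = np.random.default_rng(0)
occasion1 = rng.integers(1, 6, size=97)
occasion2 = np.clip(occasion1 + rng.integers(-1, 2, size=97), 1, 5)
print(round(icc_3_1(np.column_stack([occasion1, occasion2])), 2))
```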

Keywords: contingent valuation method, instrument, motives, quality-adjusted life-year, willingness-to-pay

Procedia PDF Downloads 111
17 Contribution of Research to Innovation Management in the Traditional Fruit Production

Authors: Camille Aouinaït, Danilo Christen, Christoph Carlen

Abstract:

Introduction: Small and Medium-sized Enterprises (SMEs) are facing different challenges such as pressures on environmental resources, the rise of downstream power, and trade liberalization. Remaining competitive by implementing innovations and engaging in collaborations could be a strategic solution. In Switzerland, the Federal Institute for Research in Agriculture (Agroscope), the Federal schools of technology (EPFL and ETHZ), cantonal universities and Universities of Applied Sciences (UAS) can provide substantial inputs. UAS were developed with specific missions to match labor market and societal needs. Research projects produce patents, publications and improved networks of scientific expertise. The study’s goal is to measure the contribution of UAS and research organizations to innovation and the impact of collaborations with partners in the non-academic environment in Swiss traditional fruit production. Materials and methods: The European projects Traditional Food Network to improve the transfer of knowledge for innovation (TRAFOON) and Social Impact Assessment of Productive Interactions between science and society (SIAMPI) frame the present study. The former aims to fill the gap between the needs of traditional food producing SMEs and innovations implemented following European projects. The latter developed a method to assess the impacts of scientific research. On the one hand, interviews with market players have been performed to make an inventory of the needs of Swiss SMEs producing apricots and berries. The participative method allowed matching the current needs with existing innovations coming from past European projects. Swiss stakeholders (e.g. producers, retailers, an inter-branch organization of fruits and vegetables) directly rated the needs on a five-point Likert scale. To transfer the knowledge to SMEs, training workshops have been organized for apricot and berry actors separately, on specific topics. On the other hand, a map of the social network is drawn to characterize the links between actors, with a focus on the Swiss canton of Valais and the UAS Valais Wallis. Type and frequency of interactions among actors have been identified through interviews. Preliminary results: A list of 369 SME needs grouped in 22 categories was produced from 37 completed questionnaires. Swiss stakeholders rated 31 needs as very important. Training workshops on apricots focus on varietal innovations, storage, disease (bacterial blight), pest (Drosophila suzukii), sorting and rootstocks. Entrepreneurship was targeted through trademark discussions in berry production. The UAS Valais Wallis collaborated on a few projects with Agroscope along with industries, at European and national levels. Political and public bodies intervene in the central area of agricultural extension, which induces close relationships between the research and the practical side. Conclusions: The needs identified by Swiss stakeholders are becoming part of training workshops to incentivize innovations. The UAS Valais Wallis takes part in collaboration projects with the research environment and market players that bring innovations helping SMEs in their contextual environment. A Strategic Research and Innovation Agenda will then be created in order to pursue research and address the issues faced by SMEs.

Keywords: agriculture, innovation, knowledge transfer, university and research collaboration

Procedia PDF Downloads 362
16 Approach for the Mathematical Calculation of the Damping Factor of Railway Bridges with Ballasted Track

Authors: Andreas Stollwitzer, Lara Bettinelli, Josef Fink

Abstract:

The expansion of the high-speed rail network over the past decades has resulted in new challenges for engineers, including traffic-induced resonance vibrations of railway bridges. Excessive resonance-induced, speed-dependent accelerations of railway bridges during high-speed traffic can lead to negative consequences such as fatigue symptoms, distortion of the track, destabilisation of the ballast bed, and potentially even derailment. A realistic prognosis of bridge vibrations during high-speed traffic must not only rely on the right choice of an adequate calculation model for both bridge and train but first and foremost on the use of dynamic model parameters which reflect reality appropriately. However, comparisons between measured and calculated bridge vibrations are often characterised by considerable discrepancies, with dynamic calculations overestimating the actual responses and therefore leading to uneconomical results. This gap between measurement and calculation constitutes a complex research issue and can be traced to several causes. One major cause is found in the dynamic properties of the ballasted track, more specifically in the persisting, substantial uncertainties regarding the consideration of the ballasted track (mechanical model and input parameters) in dynamic calculations. Furthermore, the discrepancy is particularly pronounced concerning the damping values of the bridge, as conservative values have to be used in the calculations due to normative specifications and a lack of knowledge. By using a large-scale test facility, the analysis of the dynamic behaviour of ballasted track has been a major research topic at the Institute of Structural Engineering/Steel Construction at TU Wien in recent years. This highly specialised test facility is designed for isolated research of the ballasted track's dynamic stiffness and damping properties – independent of the bearing structure. Several mechanical models for the ballasted track consisting of one or more continuous spring-damper elements were developed based on the knowledge gained. These mechanical models can subsequently be integrated into bridge models for dynamic calculations. Furthermore, based on measurements at the test facility, model-dependent stiffness and damping parameters were determined for these mechanical models. As a result, realistic mechanical models of the railway bridge with different levels of detail and sufficiently precise characteristic values are available for bridge engineers. Besides that, this contribution also presents another practical application of such a bridge model: based on the bridge model, determination equations for the damping factor (as Lehr's damping factor) can be derived. This approach constitutes a novel method that makes the damping factor of a railway bridge calculable. A comparison of this mathematical approach with measured dynamic parameters of existing railway bridges illustrates, on the one hand, the apparent deviation between normatively prescribed and in-situ measured damping factors. On the other hand, it also shows that the new approach, which makes it possible to calculate the damping factor, provides results that are close to reality and thus offers potential for minimising the discrepancy between measurement and calculation.
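
The abstract does not restate the derived determination equations. As background only, Lehr's damping factor ζ of a single-degree-of-freedom oscillator is conventionally defined as below, together with the logarithmic-decrement relation commonly used to extract it from measured decay curves; the bridge-specific equations developed in the paper are not reproduced here.

```latex
\zeta \;=\; \frac{c}{c_{\mathrm{crit}}} \;=\; \frac{c}{2\sqrt{k\,m}},
\qquad
\zeta \;\approx\; \frac{\delta}{2\pi},
\quad
\delta \;=\; \ln\frac{a_n}{a_{n+1}}
```

Here c is the viscous damping coefficient, k the stiffness, m the mass, and a_n, a_{n+1} successive vibration amplitudes of the decaying response.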

Keywords: ballasted track, bridge dynamics, damping, model design, railway bridges

Procedia PDF Downloads 140
15 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure

Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer

Abstract:

The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed based on video data, aiming to uncover so-called “multimodal gestalts”, patterns of linguistic and embodied conduct that reoccur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using videos) have so far depended on time- and resource-intensive processes of manually transcribing each component from video materials. Automating these tasks requires advanced programming skills, which are often not within the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, as well as grammatical annotation for adding morphological and syntactic information to the verbal content. In the current instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats in such a way that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search of many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (which can also extend to other fields using video materials). It will allow the automatic processing of large amounts of data and the implementation of quantitative analyses, combining them with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, to correlate those different levels, and to perform queries and analyses.
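
As an illustration of the kind of multi-layer querying described above, a minimal sketch of a tier-based annotation structure with a combined token-level and chronological (overlap) search is given below. The data model and function names are hypothetical and do not reflect the actual VIAN-DH implementation.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    tier: str      # e.g. "token", "gesture", "gaze"
    start: float   # seconds
    end: float
    label: str

def overlapping(a: Annotation, b: Annotation) -> bool:
    """True if the two annotation spans overlap in time."""
    return a.start < b.end and b.start < a.end

def cooccurrences(annotations, token_label, gesture_tier="gesture"):
    """Find gesture annotations that temporally overlap a given token."""
    tokens = [a for a in annotations if a.tier == "token" and a.label == token_label]
    gestures = [a for a in annotations if a.tier == gesture_tier]
    return [(t, g) for t in tokens for g in gestures if overlapping(t, g)]

# Hypothetical mini-corpus: the token "there" co-occurring with a pointing gesture
corpus = [
    Annotation("token", 1.2, 1.5, "there"),
    Annotation("gesture", 1.1, 1.8, "pointing"),
    Annotation("token", 2.0, 2.3, "yes"),
]
print(cooccurrences(corpus, "there"))
```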

Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition

Procedia PDF Downloads 73
14 Recovery of Polyphenolic Phytochemicals From Greek Grape Pomace (Vitis vinifera L.)

Authors: Christina Drosou, Konstantina E. Kyriakopoulou, Andreas Bimpilas, Dimitrios Tsimogiannis, Magdalini C. Krokida

Abstract:

Rationale: Agiorgitiko is one of the most widely grown and commercially well-established red wine varieties in Greece. Each year the viticulture industry produces a large amount of waste consisting of grape skins and seeds (pomace) during a short period. Grapes contain polyphenolic compounds which are only partially transferred to wine during winemaking. Therefore, winery wastes could be an alternative cheap source for obtaining such compounds with important antioxidant activity. Specifically, red grape waste contains anthocyanins and flavonols which are characterized by multiple biological activities, including cardioprotective, anti-inflammatory, anti-carcinogenic, antiviral and antibacterial properties attributed mainly to their antioxidant activity. Ultrasound assisted extraction (UAE) is considered an effective way to recover phenolic compounds, since it combines the advantage of mechanical effect with low temperature. Moreover, green solvents can be used in order to recover extracts intended for use in the food and nutraceutical industry. Apart from the extraction, pre-treatment processes such as drying can play an important role in the preservation of the grape pomace and the enhancement of its antioxidant capacity. Objective: The aim of this study is to recover natural extracts from winery waste with high antioxidant capacity using green solvents so that they can be exploited and utilized as enhancers in foods or nutraceuticals. Methods: Agiorgitiko grape pomace was dehydrated by air drying (AD) and accelerated solar drying (ASD) in order to explore the effect of the pre-treatment on the recovery of bioactive compounds. UAE was applied to untreated and dried samples using water and water:ethanol (1:1) as solvents. The total antioxidant potential and phenolic content of the extracts were determined using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging assay and the Folin-Ciocalteu method, respectively. Finally, the profile of anthocyanins and flavonols was specified using HPLC-DAD analysis. The efficiency of the processes was determined in terms of extraction yield, antioxidant activity, phenolic content and the anthocyanin and flavonol profile. Results & Discussion: The experiments indicated that the pre-treatment was essential for the recovery of highly nutritious compounds from the pomace, since the extracts of pre-treated samples showed higher phenolic content and antioxidant capacity. Water:ethanol (1:1) was considered the more effective solvent for the recovery of phenolic compounds. Moreover, ASD grape pomace extracted with the solvent system exhibited the highest antioxidant activity (IC50 = 0.36 ± 0.01 mg/mL) and phenolic content (TPC = 172.68 ± 0.01 mg GAE/g dry extract), followed by AD and untreated pomace. The major compounds recovered were malvidin 3-O-glucoside and quercetin 3-O-glucoside according to the HPLC analysis. Conclusions: Winery waste can be exploited for the recovery of nutritious compounds using green solvents such as water or ethanol. The pre-treatment of the pomace can significantly affect the concentration of phenolic compounds, while UAE is considered a highly effective extraction process.
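
As a rough illustration of how an IC50 such as the reported 0.36 mg/mL can be obtained from DPPH assay readings, a minimal sketch interpolating the 50% inhibition point from hypothetical absorbance data is shown below. The concentrations and absorbances are invented, and the authors' exact calculation procedure may differ.

```python
import numpy as np

# Hypothetical DPPH assay data: extract concentrations (mg/mL) and absorbances at 517 nm
conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
abs_sample = np.array([0.78, 0.66, 0.48, 0.27, 0.12])
abs_control = 0.85  # DPPH solution without extract

inhibition = (abs_control - abs_sample) / abs_control * 100.0  # % radical scavenging

# IC50: concentration giving 50% inhibition, by linear interpolation
ic50 = np.interp(50.0, inhibition, conc)
print(f"IC50 = {ic50:.2f} mg/mL")
```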

Keywords: agiorgitiko grape pomace, antioxidants, phenolic compounds, ultrasound assisted extraction

Procedia PDF Downloads 372
13 Development of a Bead-Based Fully Automated Multiplex Tool to Simultaneously Diagnose FIV, FeLV and FIP/FCoV

Authors: Andreas Latz, Daniela Heinz, Fatima Hashemi, Melek Baygül

Abstract:

Introduction: Feline leukemia virus (FeLV), feline immunodeficiency virus (FIV), and feline coronavirus (FCoV) are serious infectious diseases affecting cats worldwide. Transmission of these viruses occurs primarily through close contact with infected cats (via saliva, nasal secretions, faeces, etc.). FeLV, FIV, and FCoV infections can occur in combination and are expressed in similar clinical symptoms. Diagnosis can therefore be challenging: symptoms are variable and often non-specific. Sick cats show very similar clinical signs: apathy, anorexia, fever, immunodeficiency syndrome, anemia, etc. Collecting sufficient sample volume from small companion animals for diagnostic purposes can be challenging. In addition, multiplex diagnosis of diseases can contribute to an easier, cheaper, and faster workflow in the lab as well as to better differential diagnosis of diseases. For this reason, we wanted to develop a new diagnostic tool that utilizes less sample volume, fewer reagents, and fewer consumables than multiple singleplex ELISA assays. Methods: The Multiplier from Dynex Technologies (USA) has been used as the platform to develop a multiplex diagnostic tool for the detection of antibodies against FIV and FCoV/FIP and antigens of FeLV. The Dynex® Multiplier® is a fully automated chemiluminescence immunoassay analyzer that significantly simplifies laboratory workflow. The Multiplier®'s ease of use reduces pre-analytical steps by combining the power of efficiently multiplexing multiple assays with the simplicity of automated microplate processing. Plastic beads have been coated with antigens for FIV and FCoV/FIP, as well as antibodies for FeLV. Feline blood samples are incubated with the beads. Readout of results is performed via chemiluminescence. Results: Bead coating was optimized for each individual antigen or capture antibody and then combined in the multiplex diagnostic tool. HRP-antibody conjugates for FIV and FCoV antibodies, as well as detection antibodies for FeLV antigen, have been adjusted and mixed. Three individual prototype batches of the assay have been produced. For each disease, we analyzed 50 well-defined positive and negative samples. Results show an excellent diagnostic performance of the simultaneous detection of antibodies or antigens against these feline diseases in a fully automated system. A 100% concordance with singleplex methods like ELISA or IFA can be observed. Intra- and inter-assay tests showed a high precision of the test, with CV values below 10% for each individual bead. Accelerated stability testing indicates a shelf life of at least 1 year. Conclusion: The new tool can be used for multiplex diagnostics of the most important feline infectious diseases. Only a very small sample volume is required. Full automation results in a very convenient and fast method for diagnosing animal diseases. With its large specimen capacity to process over 576 samples per 8-hour shift and provide up to 3,456 results, very high laboratory productivity and reagent savings can be achieved.
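
To make the reported precision figures concrete, a minimal sketch of how intra- and inter-assay coefficients of variation (CV = SD/mean x 100) could be computed from replicate bead signals is given below; the replicate values are hypothetical.

```python
import numpy as np

def cv_percent(values) -> float:
    """Coefficient of variation in percent: 100 * sample SD / mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical chemiluminescence signals for one positive control
intra_run = [10450, 10230, 10610, 10380]    # replicates within one run
inter_run_means = [10420, 9980, 10750]      # run means across three runs

print(f"intra-assay CV: {cv_percent(intra_run):.1f}%")
print(f"inter-assay CV: {cv_percent(inter_run_means):.1f}%")
```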

Keywords: Multiplex, FIV, FeLV, FCoV, FIP

Procedia PDF Downloads 73
12 Surviral: An Agent-Based Simulation Framework for SARS-CoV-2 Outcome Prediction

Authors: Sabrina Neururer, Marco Schweitzer, Werner Hackl, Bernhard Tilg, Patrick Raudaschl, Andreas Huber, Bernhard Pfeifer

Abstract:

History and the current outbreak of Covid-19 have shown the deadly potential of infectious diseases. However, infectious diseases also have a serious impact on areas other than health and healthcare, such as the economy or social life. These areas are strongly codependent. Therefore, disease control measures, such as social distancing, quarantines, curfews, or lockdowns, have to be adopted in a very considerate manner. Infectious disease modeling can support policy- and decision-makers with adequate information regarding the dynamics of the pandemic and therefore assist in planning and enforcing appropriate measures that will prevent the healthcare system from collapsing. In this work, an agent-based simulation package named “survival” for simulating infectious diseases is presented, with a special focus on SARS-CoV-2. The presented simulation package was used in Austria to model the SARS-CoV-2 outbreak from the beginning of 2020. Agent-based modeling is a relatively recent modeling approach. Since our world is getting more and more complex, the complexity of the underlying systems is also increasing. The development of tools and frameworks and increasing computational power advance the application of agent-based models. For parametrizing the presented model, different data sources, such as known infections, wastewater virus load, blood donor antibodies, circulating virus variants and the occupied hospitalization capacity, as well as the availability of medical materials like ventilators, were integrated with a database system and used. The simulation results of the model were used for predicting the dynamics and the possible outcomes and were used by the health authorities to decide on the measures to be taken in order to control the pandemic situation. The survival package was implemented in the programming language Java, and the analytics were performed with R Studio. During the first run in March 2020, the simulation showed that without measures other than individual personal behavior and appropriate medication, the death toll would have been about 27 million people worldwide within the first year. The model predicted the hospitalization rates (standard and intensive care) for Tyrol and South Tyrol with an average error of about 1.5%. They were calculated to provide 10-day forecasts. The state government and the hospitals were provided with the 10-day forecasts to support their decision-making. This ensured that standard care was maintained for as long as possible without restrictions. Furthermore, various measures were estimated and thereafter enforced. Among other things, communities were quarantined based on the calculations while, in accordance with the calculations, the curfews for the entire population were reduced. With this framework, which is used in the national crisis team of the Austrian province of Tyrol, a very accurate model could be created at the federal state level as well as at the district and municipal level, which was able to provide decision-makers with a solid information basis. This framework can be transferred to various infectious diseases and thus can be used as a basis for future monitoring.
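
The package itself was implemented in Java; purely as a language-neutral illustration of the agent-based approach described above, a minimal Python sketch of a stochastic agent update loop (susceptible-infected-recovered, with a crude contact and recovery rule) is shown below. All parameter values are invented and none of this reproduces the actual model.

```python
import random

random.seed(1)
N, DAYS = 1000, 60
BETA, GAMMA, CONTACTS = 0.05, 0.1, 10  # infection prob. per contact, daily recovery prob., contacts/day

state = ["S"] * N
state[0] = "I"  # seed one infected agent

for day in range(DAYS):
    infected = [i for i, s in enumerate(state) if s == "I"]
    for i in infected:
        # each infectious agent meets a few random agents and may infect susceptibles
        for j in random.sample(range(N), CONTACTS):
            if state[j] == "S" and random.random() < BETA:
                state[j] = "I"
        # recovery with a constant daily probability
        if random.random() < GAMMA:
            state[i] = "R"
    if day % 10 == 0:
        print(day, state.count("S"), state.count("I"), state.count("R"))
```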

Keywords: modelling, simulation, agent-based, SARS-CoV-2, COVID-19

Procedia PDF Downloads 147
11 Multi-Objective Optimization of Assembly Manufacturing Factory Setups

Authors: Andreas Lind, Aitor Iriondo Pascual, Dan Hogberg, Lars Hanson

Abstract:

Factory setup lifecycles are most often described and prepared in CAD environments; the preparation is based on experience and inputs from several cross-disciplinary processes. Early in the factory setup preparation, a so-called block layout is created. The intention is to describe a high-level view of the intended factory setup and to claim area reservations and allocations. Factory areas are then blocked, i.e., targeted to be used for specific intended resources and processes, and later redefined with detailed factory setup layouts. Each detailed layout is based on the block layout and inputs from cross-disciplinary preparation processes, such as manufacturing sequence, productivity, workers’ workplace requirements, and resource setup preparation. However, this activity is often not carried out with all variables considered simultaneously, which entails a risk of sub-optimizing the detailed layout based on manual decisions. Therefore, this work aims to realize a digital method for assembly manufacturing layout planning where productivity, area utilization, and ergonomics can be considered simultaneously in a cross-disciplinary manner. The purpose of the digital method is to support engineers in finding optimized designs of detailed layouts for assembly manufacturing factories, thereby facilitating better decisions regarding setups of future factories. Input datasets are company-specific descriptions of required dimensions for specific area reservations, such as defined dimensions of a worker’s workplace, material façades, aisles, and the sequence to realize the product assembly manufacturing process. To test and iteratively develop the digital method, a demonstrator has been developed as an adaptation of existing software that simulates and proposes optimized designs of detailed layouts. Since the method is to consider productivity, ergonomics, area utilization, and constraints from the automatically generated block layout, a multi-objective optimization approach is utilized. In the demonstrator, the input data are sent to the simulation software Industrial Path Solutions (IPS). Based on the input and Lua scripts, the IPS software generates a block layout in compliance with the company’s defined dimensions of area reservations. Communication is then established between IPS and the software EPP (Ergonomics in Productivity Platform), including intended resource descriptions, the assembly manufacturing process, and manikin (digital human) resources. Using multi-objective optimization approaches, the EPP software then calculates layout proposals that are iteratively sent back, simulated, and rendered in IPS, following the rules and regulations defined in the block layout as well as productivity and ergonomics constraints and objectives. The software demonstrator is promising. The software can handle several parameters to optimize the detailed layout simultaneously and can put forward several proposals. It can optimize multiple parameters or weight the parameters to fine-tune the optimal result of the detailed layout. The intention of the demonstrator is to make the preparation between cross-disciplinary silos transparent and to achieve a common preparation of the assembly manufacturing factory setup, thereby facilitating better decisions.
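
As a toy illustration of weighting multiple objectives (productivity, ergonomics, area utilization) when ranking layout candidates, a minimal weighted-sum scalarization sketch is given below; the candidate layouts, scores and weights are invented, and this is not the IPS/EPP implementation (which may instead use Pareto-based optimization).

```python
# Hypothetical layout candidates with normalized objective scores in [0, 1] (higher is better)
candidates = {
    "layout_A": {"productivity": 0.82, "ergonomics": 0.61, "area_utilization": 0.74},
    "layout_B": {"productivity": 0.75, "ergonomics": 0.83, "area_utilization": 0.69},
    "layout_C": {"productivity": 0.88, "ergonomics": 0.55, "area_utilization": 0.80},
}
weights = {"productivity": 0.4, "ergonomics": 0.4, "area_utilization": 0.2}

def weighted_score(scores: dict) -> float:
    """Scalarize the multi-objective vector with a simple weighted sum."""
    return sum(weights[k] * v for k, v in scores.items())

ranked = sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.3f}")
```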

Keywords: factory setup, multi-objective, optimization, simulation

Procedia PDF Downloads 120
10 Rapid, Automated Characterization of Microplastics Using Laser Direct Infrared Imaging and Spectroscopy

Authors: Andreas Kerstan, Darren Robey, Wesam Alvan, David Troiani

Abstract:

Over the last 3.5 years, quantum cascade laser (QCL) technology has become increasingly important in infrared (IR) microscopy. The advantages over Fourier transform infrared (FTIR) are that large areas of a few square centimeters can be measured in minutes and that the light-intensive QCL makes it possible to obtain spectra with excellent S/N, even with just one scan. A firmly established application of the laser direct infrared imaging (LDIR) 8700 system is the analysis of microplastics. The presence of microplastics in the environment, drinking water, and food chains is gaining significant public interest. To study their presence, rapid and reliable characterization of microplastic particles is essential. Significant technical hurdles in microplastic analysis stem from the sheer number of particles to be analyzed in each sample. Total particle counts of several thousand are common in environmental samples, while well-treated bottled drinking water may contain relatively few. While visual microscopy has been used extensively, it is prone to operator error and bias and is limited to particles larger than 300 µm. As a result, vibrational spectroscopic techniques such as Raman and FTIR microscopy have become more popular; however, they are time-consuming. There is a demand for rapid and highly automated techniques to measure particle count and size and provide high-quality polymer identification. Analysis directly on the filter that often forms the last stage in sample preparation is highly desirable as, by removing a sample preparation step, it can both improve laboratory efficiency and decrease opportunities for error. Recent advances in infrared micro-spectroscopy combining a QCL with scanning optics have created a new paradigm, LDIR. It offers improved speed of analysis as well as high levels of automation. Its mode of operation, however, requires an IR-reflective background, and this has, to date, limited the ability to perform direct “on-filter” analysis. This study explores the potential of combining the filter with an infrared-reflective surface. By combining an IR-reflective material or coating on a filter membrane with advanced image analysis and detection algorithms, it is demonstrated that such filters can indeed be used in this way. Vibrational spectroscopic techniques play a vital role in the investigation and understanding of microplastics in the environment and food chain. While vibrational spectroscopy is widely deployed, improvements and novel innovations in these techniques that can increase the speed of analysis and ease of use can provide pathways to higher testing rates and, hence, improved understanding of the impacts of microplastics in the environment. Due to its capability to measure large areas in minutes, its speed, degree of automation and excellent S/N, the LDIR could also be implemented for various other samples like food adulteration, coatings, laminates, fabrics, textiles and tissues. This presentation will highlight a few of them and focus on the benefits of the LDIR vs. classical techniques.
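
Polymer identification in such workflows typically reduces to matching each particle's IR spectrum against a reference library. A minimal correlation-based matching sketch is given below, with invented spectra standing in for real library entries; the instrument's actual matching algorithm is not described here and may differ.

```python
import numpy as np

def best_match(spectrum: np.ndarray, library: dict) -> tuple:
    """Return the library polymer whose spectrum correlates best with the measurement."""
    scores = {name: np.corrcoef(spectrum, ref)[0, 1] for name, ref in library.items()}
    name = max(scores, key=scores.get)
    return name, scores[name]

# Hypothetical, heavily simplified 'spectra' (absorbance on a common wavenumber grid)
rng = np.random.default_rng(42)
library = {
    "polyethylene": rng.random(200),
    "polypropylene": rng.random(200),
    "PET": rng.random(200),
}
measured = library["PET"] + 0.05 * rng.random(200)  # noisy copy of a PET reference
print(best_match(measured, library))
```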

Keywords: QCL, automation, microplastics, tissues, infrared, speed

Procedia PDF Downloads 34
9 Mixed-Methods Analyses of Subjective Strategies of Most Unlikely but Successful Transitions from Social Benefits to Work

Authors: Andreas Hirseland, Lukas Kerschbaumer

Abstract:

In the case of Germany, there are about one million long-term unemployed – a figure that has not varied much during the past years. These long-term unemployed did not benefit from the prospering labor market while most short-term unemployed did. Instead, they are continuously dependent on welfare and sometimes precarious short-term employment, experiencing in-work poverty. Long-term unemployment thus turns into a main obstacle to becoming employed again, especially if it is accompanied by other impediments such as low-level education (school/vocational), poor health (especially chronic illness), advanced age (older than fifty), immigrant status, motherhood or engagement in care for other relatives. As can be shown by this current research project, in these cases the chance to regain employment decreases to near nil. Almost two-thirds of all welfare recipients have multiple impediments which hinder a successful transition from welfare back to sustainable and sufficient employment. Prospective employers are unlikely to hire the long-term unemployed with additional impediments because they evaluate potential employees on their negative signaling (e.g. low-level education) and the implicit assumption of unproductiveness (e.g. poor health, age). Some findings of the panel survey “Labor market and social security” (PASS), carried out by the Institute for Employment Research (the research institute of the German Federal Labor Agency), spread a ray of hope, showing that unlikely does not necessarily mean impossible. The presentation reports on current research on these very scarce “success stories” of unlikely transitions from long-term unemployment to work and how these cases were able to perform this switch against all odds. The study is based on a mixed-methods design. Within the panel survey (~15,000 respondents in ~10,000 households), only 66 cases of such unlikely transitions were observed. These cases have been explored by qualitative inquiry – in-depth interviews and qualitative network techniques. There is strong evidence that sustainable transitions are influenced by certain biographical resources like habits of network use, a set of informal skills and particularly a resilient way of dealing with obstacles, combined with contextual factors, rather than by job-placement procedures promoted by job centers according to activation rules or by following formal paths of application. On the employers’ side, small and medium-sized enterprises are often found to give job opportunities to a wider variety of applicants, often based on a slow but steadily growing relationship leading to employment. According to these results, it is possible to show and discuss some limitations of (German) activation policies targeting the labor market and their impact on welfare dependency and long-term unemployment. Based on these findings, indications for more supportive small-scale measures in the field of labor-market policies are suggested to help the long-term unemployed with multiple impediments overcome their situation (e.g. organizing small-scale structures and low-threshold services to encounter possible employers on a more informal basis, like “meet and greet”).

Keywords: against-all-odds, mixed-methods, welfare state, long-term unemployment

Procedia PDF Downloads 341
8 Company-Independent Standardization of Timber Construction to Promote Urban Redensification of Housing Stock

Authors: Andreas Schweiger, Matthias Gnigler, Elisabeth Wieder, Michael Grobbauer

Abstract:

Especially in the alpine region, available areas for new residential development are limited. One possible solution is to exploit the potential of existing settlements. Urban redensification, especially the addition of floors to existing buildings, requires efficient, lightweight constructions with short construction times. This topic is being addressed in the five-year Alpine Building Centre. The focus of this cooperation between Salzburg University of Applied Sciences and RSA GH Studio iSPACE is on transdisciplinary research in the fields of building and energy technology, building envelopes and geoinformation, as well as the transfer of research results to industry. One development objective is a wood panel construction system with a high degree of prefabrication to optimize the construction quality, the construction time and the applicability for small and medium-sized enterprises. The system serves as a reliable working basis for mastering the complex building task of redensification. The technical solution is the development of an open system in timber frame and solid wood construction, which is suitable for a maximum two-storey addition to residential buildings. The applicability of the system is mainly influenced by the existing building stock. Therefore, timber frame and solid timber construction are combined where necessary to bridge large spans of the existing structure while keeping the dead weight as low as possible. Escape routes are usually constructed in reinforced concrete and are located outside the system boundary. Thus, within the framework of the legal and normative requirements of timber construction, a hybrid construction method for redensification is created. Component structure, load-bearing structure and detail constructions are developed in accordance with the relevant requirements. The results are directly applicable in individual cases, with the exception of the required verifications. In order to verify the practical suitability of the developed system, stakeholder workshops are held on the one hand, and the system is applied in the planning of a two-storey extension on the other hand. A company-independent construction standard offers the possibility of cooperation and the bundling of capacities in order to be able to handle larger construction volumes in collaboration with several companies. Numerous further developments can take place on the basis of the system, which is under an open license. The construction system will support planners and contractors from design to execution. In this context, open means publicly published and freely usable and modifiable for one’s own use as long as the authorship and deviations are mentioned. The companies are provided with a system manual, which contains the system description and an application manual. This manual will facilitate the selection of the correct component cross-sections for specific construction projects by means of all component and detail specifications. This presentation highlights the initial situation, the motivation and the approach, but especially the technical solution as well as the possibilities for application. After an explanation of the objectives and working methods, the component and detail specifications are presented as work results, together with their application.

Keywords: redensification, SME, urban development, wood building system

Procedia PDF Downloads 75
7 Critiquing Israel as Child Abuse: How Colonial White Feminism Disrupts Critical Pedagogies of Culturally Responsive and Relevant Practices and Inclusion through Ongoing and Historical Maternalism and Neoliberal Settler Colonialism

Authors: Wafaa Hasan

Abstract:

In May of 2022, Palestinian parents in Toronto, Canada, became aware that educators and staff in the Toronto District School Board (TDSB), the largest school board in Canada, were attempting to include the International Holocaust Remembrance Alliance (IHRA) definition of antisemitism in the board's Child Abuse and Neglect Policy. The idea was that if students were to express any form of antisemitism, as defined by the IHRA, then an investigation could follow with Child Protective Services (CPS). That is, the student’s parents could be reported to the state and investigated for custodial rights to their children. The TDSB has set apparent goals for “Decolonizing Pedagogy” (“TDSB Equity Leadership Competencies”), Culturally Responsive and Relevant Practices (CRRP) and inclusive education. These goals promote the centering of colonized, racialized and marginalized voices. CRRP cannot be effective without the application of anti-racist and settler colonial analyses. In order for CRRP to be effective, school boards need a comprehensive understanding of the ways in which the vilification of Palestinians operates through anti-indigenous and white supremacist systems and logic. Otherwise, their inclusion will always be in tension with the inclusion of settler colonial agendas and worldviews. Feminist maternalism frames racial mothering as degenerate (viewing the contributions of racialized students and their parents as products of primitive and violent cultures) and also indirectly inhibits the actualization of the tenets of CRRP and inclusive education through its extensions into the welfare state and public education. The contradiction between the tenets of CRRP and settler colonial systems of erasure and repression is resolved by the continuation of tactics to 1) force assimilation, 2) punish those who push back on that assimilation and 3) literally fragment familial and community structures of racialized students, educators and parents. This paper draws on interdisciplinary (history, philosophy, anthropology) critiques of white feminist “maternalism” from the 19th century onwards in North America and Europe (Jacobs, Weber), as well as “anti-racist education” theory (Dei), and more specifically “culturally responsive learning” (Muhammad) and “bandwidth” pedagogy theory (Verschelden), to make its claims. This research contributes to vibrant debates about anti-racist and decolonial pedagogies in public education systems globally. This paper also documents first-hand interviews and experiences of diasporic Palestinian mothers and motherhoods and situates their experiences within longstanding histories of white feminist maternalist (and eugenicist) politics. This informal qualitative data from “participatory conversations” (Swain) is situated within a set of formal interview data collected with Palestinian women in the West Bank (approved by the McMaster University Humanities Research Ethics Board) relating to white feminist maternalism in the peace and dialogue industry.

Keywords: decolonial feminism, maternal feminism, anti-racist pedagogies, settler colonial studies, motherhood studies, pedagogy theory, cultural theory

Procedia PDF Downloads 39
6 Simulation of Multistage Extraction Process of Co-Ni Separation Using Ionic Liquids

Authors: Hongyan Chen, Megan Jobson, Andrew J. Masters, Maria Gonzalez-Miquel, Simon Halstead, Mayri Diaz de Rienzo

Abstract:

Ionic liquids offer excellent advantages over conventional solvents for industrial extraction of metals from aqueous solutions, where such extraction processes bring opportunities for recovery, reuse, and recycling of valuable resources and more sustainable production pathways. Recent research on the use of ionic liquids for extraction confirms their high selectivity and low volatility, but there is relatively little focus on how their properties can be best exploited in practice. This work addresses gaps in research on process modelling and simulation, to support the development, design, and optimisation of these processes, focusing on the separation of the highly similar transition metals cobalt and nickel. The study exploits published experimental results, as well as new experimental results, relating to the separation of Co and Ni using trihexyl(tetradecyl)phosphonium chloride. This extraction agent is attractive because it is cheaper, more stable and less toxic than fluorinated hydrophobic ionic liquids. This process modelling work concerns the selection and/or development of suitable models for the physical properties, the distribution coefficients, the mass transfer phenomena, the extractor unit and the multi-stage extraction flowsheet. The distribution coefficient model for cobalt and HCl represents an anion exchange mechanism, supported by the literature and by COSMO-RS calculations. Parameters of the distribution coefficient models are estimated by fitting the model to published experimental extraction equilibrium results. The mass transfer model applies Newman’s hard sphere model. Diffusion coefficients in the aqueous phase are obtained from the literature, while diffusion coefficients in the ionic liquid phase are fitted to dynamic experimental results. The mass transfer area is calculated from the surface mean diameter of liquid droplets of the dispersed phase, estimated from the Weber number inside the extractor. New experiments measure the interfacial tension between the aqueous and ionic liquid phases. The empirical models for predicting the density and viscosity of solutions under different metal loadings are also fitted to new experimental data. The extractor is modelled as a continuous stirred tank reactor with mass transfer between the two phases and perfect phase separation of the outlet flows. A multistage separation flowsheet simulation is set up to replicate a published experiment and compare model predictions with the experimental results. This simulation model is implemented in the gPROMS software for dynamic process simulation. The results of single-stage and multi-stage flowsheet simulations are shown to be in good agreement with the published experimental results. The estimated diffusion coefficient of cobalt in the ionic liquid phase is in reasonable agreement with published data for the diffusion coefficients of various metals in this ionic liquid. A sensitivity study with this simulation model demonstrates the usefulness of the models for process design. The simulation approach has the potential to be extended to account for other metals, acids, and solvents for process development, design, and optimisation of extraction processes applying ionic liquids for metal separations, although a lack of experimental data currently limits the accuracy of the models within the whole framework. Future work will focus on process development more generally and on the extractive separation of rare earths using ionic liquids.
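
As a simplified illustration of the stirred-extractor mass-transfer modelling described above, a minimal sketch integrating a two-film-type transfer rate for cobalt between the aqueous and ionic liquid phases in a single stirred stage is shown below. All parameter values, and the use of a constant distribution coefficient, are invented assumptions rather than the fitted models of this study.

```python
import numpy as np
from scipy.integrate import solve_ivp

D = 50.0               # distribution coefficient [Co]_IL / [Co]_aq at equilibrium (assumed constant)
kLa = 0.01             # overall volumetric mass-transfer coefficient, 1/s (hypothetical)
V_aq, V_il = 1.0, 1.0  # phase volumes, L

def rates(t, y):
    c_aq, c_il = y                          # cobalt concentrations, mol/L
    c_aq_eq = c_il / D                      # aqueous concentration in equilibrium with the IL phase
    n_dot = kLa * V_aq * (c_aq - c_aq_eq)   # transfer rate aq -> IL, mol/s
    return [-n_dot / V_aq, n_dot / V_il]

sol = solve_ivp(rates, (0.0, 1200.0), [0.1, 0.0])
print("final aq / IL concentrations:", sol.y[:, -1])
```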

Keywords: distribution coefficient, mass transfer, COSMO-RS, flowsheet simulation, phosphonium

Procedia PDF Downloads 154
5 Investigation of Chemical Effects on the Lγ2,3 and Lγ4 X-ray Production Cross Sections for Some Compounds of 66Dy at Photon Energies Close to the L1 Absorption-Edge Energy

Authors: Anil Kumar, Rajnish Kaur, Mateusz Czyzycki, Alessandro Migilori, Andreas Germanos Karydas, Sanjiv Puri

Abstract:

The radiative decay of Li (i = 1-3) sub-shell vacancies produced through photoionization results in the production of a characteristic emission spectrum comprising several X-ray lines, whereas non-radiative vacancy decay results in an Auger electron spectrum. Accurate and reliable data on the Li (i = 1-3) sub-shell X-ray production (XRP) cross sections are of considerable importance for the investigation of atomic inner-shell ionization processes as well as for quantitative elemental analysis of different types of samples employing the energy dispersive X-ray fluorescence (EDXRF) analysis technique. At incident photon energies in the vicinity of the absorption edge energies of an element, many-body effects including electron correlation, core relaxation, inter-channel coupling and post-collision interactions become significant in the photoionization of atomic inner shells. Further, in the case of compounds, the characteristic emission spectrum of a specific element is expected to be influenced by the chemical environment (coordination number, oxidation state, nature of ligands/functional groups attached to the central atom, etc.). These chemical effects on L X-ray fluorescence parameters have been investigated by performing measurements at incident photon energies much higher than the Li (i = 1-3) sub-shell absorption edge energies using EDXRF spectrometers. In the present work, the cross sections for production of the Lk (k = γ2,3, γ4) X-rays have been measured for some compounds of 66Dy, namely, Dy2O3, Dy2(CO3)3, Dy2(SO4)3.8H2O, DyI2 and Dy metal, by tuning the incident photon energies a few eV above the L1 absorption-edge energy in order to investigate the influence of chemical effects on these cross sections in the presence of the many-body effects which become significant at photon energies close to the absorption-edge energies. The present measurements have been performed under vacuum at the IAEA end-station of the X-ray fluorescence beam line (10.1L) of the ELETTRA synchrotron radiation facility (Trieste, Italy) using self-supporting pressed pellet targets (1.3 cm diameter, nominal thicknesses ~ 176 mg/cm2) of 66Dy compounds (procured from Sigma Aldrich) and a metallic foil of 66Dy (nominal thickness ~ 3.9 mg/cm2, procured from Good Fellow, UK). The present measured cross sections have been compared with theoretical values calculated using the Dirac-Hartree-Slater (DHS) model based fluorescence and Coster-Kronig yields, Dirac-Fock (DF) model based X-ray emission rates and two sets of L1 sub-shell photoionization cross sections, one based on the non-relativistic Hartree-Fock-Slater (HFS) model and the other deduced from the self-consistent Dirac-Hartree-Fock (DHF) model based total photoionization cross sections. The present measured XRP cross sections for 66Dy as well as for its compounds for the Lγ2,3 and Lγ4 X-rays are found to be higher by ~14-36% than the two calculated sets of values. It is worth mentioning that the Lγ2,3 and Lγ4 X-ray lines originate from the filling of L1 sub-shell vacancies by the outer sub-shell (N2,3 and O2,3) electrons, which are much more sensitive to the chemical environment around the central atom. The present observed differences between measured and theoretical values are expected to result from the combined influence of the many-body effects and the chemical effects.
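
For orientation, the Lγ2,3 and Lγ4 X-ray production cross sections originating from L1 vacancies are commonly expressed through the L1 photoionization cross section, the L1 fluorescence yield and the fractional radiative emission rates, roughly as below; this is the standard textbook relation, not necessarily the specific formulation used by the authors.

```latex
\sigma_{L\gamma_{2,3}}(E) \;=\; \sigma_{1}(E)\,\omega_{1}\,F_{1,\gamma_{2,3}},
\qquad
\sigma_{L\gamma_{4}}(E) \;=\; \sigma_{1}(E)\,\omega_{1}\,F_{1,\gamma_{4}}
```

Here σ1(E) is the L1 sub-shell photoionization cross section at incident energy E, ω1 the L1 fluorescence yield, and F1,x the fractional radiative emission rate of line x.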

Keywords: chemical effects, L X-ray production cross sections, many-body effects, synchrotron radiation

Procedia PDF Downloads 108
4 Risks for Cyanobacteria Harmful Algal Blooms in Georgia Piedmont Waterbodies Due to Land Management and Climate Interactions

Authors: Sam Weber, Deepak Mishra, Susan Wilde, Elizabeth Kramer

Abstract:

The frequency and severity of cyanobacteria harmful algal blooms (CyanoHABs) have been increasing over time, with point and non-point source eutrophication and shifting climate paradigms being blamed as the primary culprits. Excessive nutrients, warm temperatures, quiescent water, and heavy and less regular rainfall create more conducive environments for CyanoHABs. CyanoHABs have the potential to produce a spectrum of toxins that cause gastrointestinal stress, organ failure, and even death in humans and animals. To promote enhanced, proactive CyanoHAB management, risk modeling using geospatial tools can act as a predictive mechanism to supplement current CyanoHAB monitoring, management and mitigation efforts. The risk maps would empower water managers to focus their efforts on high-risk waterbodies in an attempt to prevent CyanoHABs before they occur, and/or to observe those waterbodies more diligently. For this research, exploratory spatial data analysis techniques were used to identify the strongest predictors of CyanoHAB blooms based on remote sensing-derived cyanobacteria cell density values for 771 waterbodies in the Georgia Piedmont and landscape characteristics of their watersheds. In-situ datasets for cyanobacteria cell density, nutrients, temperature, and rainfall patterns are not widely available, so free gridded geospatial datasets were used as proxy variables for assessing CyanoHAB risk. For example, the percentage of a watershed that is agriculture was used as a proxy for nutrient loading, and the summer precipitation within a watershed was used as a proxy for water quiescence. Cyanobacteria cell density values were calculated using atmospherically corrected images from the European Space Agency’s Sentinel-2A satellite and its multispectral instrument sensor at a 10-meter ground resolution. Seventeen explanatory variables were calculated for each watershed utilizing the multi-petabyte geospatial catalogs available within the Google Earth Engine cloud computing interface. The seventeen variables were then used in a multiple linear regression model, and the strongest predictors of cyanobacteria cell density were selected for the final regression model. The seventeen explanatory variables included land cover composition, winter and summer temperature and precipitation data, topographic derivatives, vegetation index anomalies, and soil characteristics. Watershed maximum summer temperature, percent agriculture, percent forest, percent impervious surface, and waterbody area emerged as the strongest predictors of cyanobacteria cell density, with an adjusted R-squared value of 0.31 and a p-value ~ 0. The final regression equation was used to construct a normalized cyanobacteria cell density index, and a Jenks natural breaks classification was used to assign waterbodies designations of low, medium, or high risk. Of the 771 waterbodies, 24.38% were low risk, 37.35% were medium risk, and 38.26% were high risk. This study showed that there are significant relationships between free geospatial datasets representing summer maximum temperatures, nutrient loading associated with land use and land cover, and the area of a waterbody on the one hand and cyanobacteria cell density on the other. This data analytics approach to CyanoHAB risk assessment corroborated the literature-established environmental triggers for CyanoHABs and presents a novel approach for CyanoHAB risk mapping in waterbodies across the greater southeastern United States.
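
A compact sketch of the workflow described above (fit a multiple linear regression of cyanobacteria cell density on watershed predictors, normalize the prediction into an index, and bin it into low/medium/high risk) is given below. It uses an ordinary least-squares fit on invented data and simple tercile breaks as a stand-in for the Jenks natural breaks classification actually used.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 771
# Hypothetical watershed predictors: max summer temp, % agriculture, % forest, % impervious, waterbody area
X = np.column_stack([
    rng.normal(31, 2, n), rng.uniform(0, 60, n), rng.uniform(10, 90, n),
    rng.uniform(0, 30, n), rng.lognormal(1.0, 0.8, n),
])
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] - 0.1 * X[:, 2] + 0.4 * X[:, 3] + rng.normal(0, 5, n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

# Normalized risk index and tercile-based risk classes (Jenks breaks would be used in practice)
index = (pred - pred.min()) / (pred.max() - pred.min())
low, high = np.quantile(index, [1 / 3, 2 / 3])
risk = np.where(index < low, "low", np.where(index < high, "medium", "high"))
print({c: int((risk == c).sum()) for c in ("low", "medium", "high")})
```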

Keywords: cyanobacteria, land use/land cover, remote sensing, risk mapping

Procedia PDF Downloads 183
3 Case Study: Hyperbaric Oxygen Therapy for Idiopathic Sudden Sensorineural Hearing Loss

Authors: Magdy I. A. Alshourbagi

Abstract:

Background: The National Institute for Deafness and Communication Disorders defines idiopathic sudden sensorineural hearing loss (ISSNHL) as the idiopathic loss of hearing of at least 30 dB across 3 contiguous frequencies occurring within 3 days. The most common clinical presentation involves an individual experiencing a sudden unilateral hearing loss, tinnitus, a sensation of aural fullness and vertigo. The etiologies and pathologies of ISSNHL remain unclear. Several pathophysiological mechanisms have been described, including: vascular occlusion, viral infections, labyrinthine membrane breaks, immune-associated disease, abnormal cochlear stress response, trauma, abnormal tissue growth, toxins, ototoxic drugs and cochlear membrane damage. The rationale for the use of hyperbaric oxygen to treat ISSNHL is supported by an understanding of the high metabolism and paucity of vascularity of the cochlea. The cochlea and the structures within it require a high oxygen supply. The direct vascular supply, particularly to the organ of Corti, is minimal. Tissue oxygenation of the structures within the cochlea occurs via oxygen diffusion from cochlear capillary networks into the perilymph and the cortilymph. The perilymph is the primary oxygen source for these intracochlear structures. Unfortunately, perilymph oxygen tension is decreased significantly in patients with ISSNHL. To achieve a consistent rise of perilymph oxygen content, the arterial-perilymphatic oxygen concentration difference must be extremely high. This can be restored with hyperbaric oxygen therapy. Subject and Methods: A 37-year-old man presented at the clinic with a five-day history of muffled hearing and tinnitus of the right ear. Symptoms were of sudden onset, with no associated pain, dizziness or otorrhea and no past history of hearing problems or medical illness. Family history was negative. Physical examination was normal. Otologic examination revealed normal tympanic membranes bilaterally, with no evidence of cerumen or middle ear effusion. Tuning fork examination showed a positive Rinne test bilaterally but with lateralization of the Weber test to the left side, indicating right ear sensorineural hearing loss. Audiometric analysis confirmed sensorineural hearing loss across all frequencies of about 70 dB in the right ear. Routine lab work was all within normal limits. A clinical diagnosis of idiopathic sudden sensorineural hearing loss of the right ear was made, and the patient began medical treatment (corticosteroid, vasodilator and HBO therapy). The recommended treatment profile consists of 100% O2 at 2.5 atmospheres absolute for 60 minutes daily (six days per week) for 40 treatments. The optimal number of HBOT treatments will vary, depending on the severity and duration of symptomatology and the response to treatment. Results: As HBOT is not yet a standard for idiopathic sudden sensorineural hearing loss, it was introduced to this patient as an adjuvant therapy. The HBOT program was scheduled for 40 sessions; we used a 12-seat multi-place chamber for the HBOT, which was started at day seven after the hearing loss onset. After the tenth session of HBOT, improvement of both hearing (by audiogram) and tinnitus was obtained in the affected (right) ear. Conclusions: In conclusion, HBOT may be used for idiopathic sudden sensorineural hearing loss as an adjuvant therapy. It may promote oxygenation to the inner ear apparatus and revive hearing ability. Patients who fail to respond to oral and intratympanic steroids may benefit from this treatment. Further investigation is warranted, including animal studies to understand the molecular and histopathological aspects of HBOT and randomized controlled clinical studies.

Keywords: idiopathic sudden sensorineural hearing loss (ISSNHL), hyperbaric oxygen therapy (HBOT), decibel (dB), oxygen (O2)

Procedia PDF Downloads 401
2 Potential Benefits and Adaptation of Climate Smart Practices by Small Farmers Under a Three-Crop Rice Production System in Vietnam

Authors: Azeem Tariq, Stephane De Tourdonnet, Lars Stoumann Jensen, Reiner Wassmann, Bjoern Ole Sander, Quynh Duong Vu, Trinh Van Mai, Andreas De Neergaard

Abstract:

The rice-growing area is increasing to meet the food demand of a growing population. In most parts of the world, rice is grown on lowland, smallholder fields, which are one of the major sources of greenhouse gas (GHG) emissions from agricultural land. Strategies such as altering water and residue (carbon) management practices are considered essential to mitigate GHG emissions from flooded rice systems. The actual implementation and potential of these measures on small farmers' fields remain challenging. A field study was conducted in the Red River Delta in Northern Vietnam to identify the potential challenges and barriers that small rice farmers face in implementing climate-smart rice practices. The objective of this study was to develop climate-smart rice prototypes and assess their feasibility under actual farmer conditions. A field- and science-oriented framework was used to meet this objective. The methodological framework comprised six steps: i) identification of stakeholders and possible options, ii) assessment of barriers and the drawbacks/advantages of new technologies, iii) prototype design, iv) assessment of the mitigation potential of each prototype, v) scenario building and vi) scenario assessment. A farm survey was conducted to identify the existing farm practices and major constraints of small rice farmers. Keeping in view the farmers' constraints and barriers to implementation, we proposed two water management options (pre-transplant plus midseason drainage, and early plus midseason drainage) and one straw management option (full residue incorporation). To test the new typologies against existing prototypes (midseason drainage, partial residue incorporation) under local farmer conditions, a participatory field experiment was conducted on farmers' fields for two consecutive rice seasons. Following the results of each season, a workshop was held with stakeholders (farmers, village leaders, cooperatives, irrigation staff, extensionists, agricultural officers) at the local and district levels to obtain feedback on the newly tested prototypes and to develop possible scenarios for climate-smart rice production practices. The farm survey showed that the unavailability of cheap labor and the lack of alternatives for straw management lead small farmers to burn residues in the fields, apart from what is used for composting or other purposes. Our field results revealed that early-season drainage significantly mitigates (by 40-60%) the methane emissions arising from residue incorporation. Early-season drainage was more efficient and easier to control under a cooperatively managed water system than under individually managed systems, and it led to both economic (9-11% higher rice yield, lower production costs, reduced nutrient losses) and environmental (mitigated methane emissions) benefits. The participatory field study allowed the assessment of the adaptation potential and possible benefits of climate-smart practices on small farmers' fields. If farmers have no other residue management option, full residue incorporation combined with early plus midseason drainage is an adoptable and beneficial (both environmentally and economically) management option for small rice farmers.
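To make the drainage effect on seasonal methane concrete, the following minimal Python sketch compares a continuously flooded, straw-incorporated season with an early plus midseason drainage scenario using an IPCC Tier-1 style calculation (seasonal CH4 = daily emission factor x water-regime scaling factor x organic-amendment scaling factor x season length x area). All numeric values below (baseline emission factor, scaling factors, season length) are illustrative placeholders roughly in the range of published defaults, not values measured in this study.

```python
# Illustrative comparison of two water regimes for seasonal CH4 emissions.
# Values are placeholders for demonstration, not study measurements.

def seasonal_ch4(baseline_ef=1.30, sf_water=1.0, sf_organic=1.0,
                 days=100, area_ha=1.0):
    """Seasonal CH4 (kg) = daily EF * water SF * organic-amendment SF * days * area."""
    return baseline_ef * sf_water * sf_organic * days * area_ha

# Both scenarios assume full straw incorporation (organic scaling factor > 1,
# here assumed 2.0); drainage is represented by a reduced water scaling factor
# (multiple aeration assumed ~0.5).
continuous = seasonal_ch4(sf_water=1.0, sf_organic=2.0)
drained = seasonal_ch4(sf_water=0.5, sf_organic=2.0)

reduction = 100 * (continuous - drained) / continuous
print(f"Assumed mitigation from early + midseason drainage: {reduction:.0f}%")
```

With the assumed scaling factors, the sketch yields a reduction of about 50%, consistent in order of magnitude with the 40-60% mitigation reported above; the actual field estimates depend on measured fluxes, not on these default factors.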

Keywords: adaptation, climate smart agriculture, constraints, smallholders

Procedia PDF Downloads 239
1 Multimodal Integration of EEG, fMRI and Positron Emission Tomography Data Using Principal Component Analysis for Prognosis in Coma Patients

Authors: Denis Jordan, Daniel Golkowski, Mathias Lukas, Katharina Merz, Caroline Mlynarcik, Max Maurer, Valentin Riedl, Stefan Foerster, Eberhard F. Kochs, Andreas Bender, Ruediger Ilg

Abstract:

Introduction: So far, clinical assessments that rely on behavioral responses to differentiate coma states or even predict outcome in coma patients are unreliable, e.g. because of some patients' motor disabilities. The present study aimed to provide prognosis in coma patients using markers from electroencephalogram (EEG), blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI) and [18F]-fluorodeoxyglucose (FDG) positron emission tomography (PET). Unsupervised principal component analysis (PCA) was used for multimodal integration of the markers. Methods: With approval of the local ethics committee of the Technical University of Munich (Germany), 20 patients (aged 18-89) with severe brain damage were recruited through intensive care units at the Klinikum rechts der Isar in Munich and at the Therapiezentrum Burgau (Germany). On the day of the EEG/fMRI/PET measurement (date I), patients (<3.5 months in coma) were grouped into minimally conscious state (MCS) or vegetative state (VS) on the basis of their clinical presentation (coma recovery scale-revised, CRS-R). Follow-up assessment (date II) was also based on the CRS-R, in a period of 8 to 24 months after date I. At date I, 63-channel EEG (Brain Products, Gilching, Germany) was recorded outside the scanner, and subsequently simultaneous FDG-PET/fMRI was acquired on an integrated Siemens Biograph mMR 3T scanner (Siemens Healthineers, Erlangen, Germany). Power spectral densities, permutation entropy (PE) and symbolic transfer entropy (STE) were calculated in/between frontal, temporal, parietal and occipital EEG channels. PE and STE are based on symbolic time series analysis and have already been introduced as robust markers separating wakefulness from unconsciousness in EEG during general anesthesia. While PE quantifies the regularity structure of the neighboring order of signal values (a surrogate of cortical information processing), STE reflects information transfer between two signals (a surrogate of directed connectivity in cortical networks). fMRI analysis was carried out using SPM12 (Wellcome Trust Centre for Neuroimaging, University College London, UK). Functional images were realigned, segmented, normalized and smoothed. PET was acquired for 45 minutes in list mode. For absolute quantification of the brain's glucose consumption rate in FDG-PET, kinetic modelling was performed with Patlak's plot method. BOLD signal intensity in fMRI and glucose uptake in PET were calculated in 8 distinct cortical areas. PCA was performed over all markers from EEG/fMRI/PET. Prognosis (persistent VS and deceased patients vs. recovery to MCS/awake from date I to date II) was evaluated using the area under the curve (AUC), including bootstrap confidence intervals (CI, *: p<0.05). Results: Prognosis was reliably indicated by the first component of the PCA (AUC=0.99*, CI=0.92-1.00), showing a higher AUC than the best single markers (EEG: AUC<0.96*, fMRI: AUC<0.86*, PET: AUC<0.60). The CRS-R did not show predictive value (AUC=0.51, CI=0.29-0.78). Conclusion: In a multimodal analysis of EEG/fMRI/PET in coma patients, PCA led to a reliable prognosis. The impact of this result is evident, as clinical estimates of prognosis are at times inadequate and could be supported by quantitative biomarkers from EEG, fMRI and PET. Due to the small sample size, further investigations are required, in particular allowing supervised learning instead of the basic approach of unsupervised PCA.
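The analysis pipeline described above (EEG markers such as permutation entropy, fusion of multimodal markers by unsupervised PCA, and evaluation of the first component's prognostic value by AUC) can be illustrated with a minimal Python sketch. The data, region counts, embedding parameters and outcome labels below are random placeholders and assumptions for illustration only; they are not the study's data, settings or results.

```python
# Minimal sketch: permutation entropy as an EEG marker, unsupervised PCA over
# multimodal markers, AUC of the first component. Placeholder data only.
import numpy as np
from itertools import permutations
from sklearn.decomposition import PCA
from sklearn.metrics import roc_auc_score

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal."""
    patterns = list(permutations(range(order)))
    counts = np.zeros(len(patterns))
    for i in range(len(x) - delay * (order - 1)):
        window = x[i:i + delay * order:delay]
        pattern = tuple(int(v) for v in np.argsort(window))
        counts[patterns.index(pattern)] += 1
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p)) / np.log2(len(patterns))

rng = np.random.default_rng(0)
n_patients = 20

# Placeholder multimodal marker matrix: EEG PE per region, fMRI BOLD intensity
# and PET glucose uptake per cortical area (all random, illustrative only).
eeg_pe = np.array([[permutation_entropy(rng.standard_normal(1000))
                    for _ in range(4)] for _ in range(n_patients)])
fmri_bold = rng.standard_normal((n_patients, 8))
pet_glucose = rng.standard_normal((n_patients, 8))
markers = np.hstack([eeg_pe, fmri_bold, pet_glucose])

# Unsupervised PCA over z-scored markers; outcome labels (recovery vs.
# persistent VS/deceased) are random here and purely for demonstration.
z = (markers - markers.mean(axis=0)) / markers.std(axis=0)
pc1 = PCA(n_components=1).fit_transform(z).ravel()
outcome = rng.integers(0, 2, n_patients)
print("AUC of first principal component:", roc_auc_score(outcome, pc1))
```

With random placeholder data the printed AUC carries no meaning; the sketch only shows the structure of the fusion step, in which the first principal component of z-scored EEG/fMRI/PET markers is scored against the dichotomized outcome.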

Keywords: coma states and prognosis, electroencephalogram, entropy, functional magnetic resonance imaging, machine learning, positron emission tomography, principal component analysis

Procedia PDF Downloads 310