Search results for: Pablo Martin
141 Social Protection Reforms in Indonesia: Towards a Life Cycle Based Social Protection System
Authors: Dyah Larasati, Karishma Alize Huda, Sri Kusumastuti Rahayu, Martin Daniel Siyaranamual
Abstract:
Indonesia continues to reform its social protection system to provide the needed protection for its citizens. Indonesia's social protection system consists of social assistance programs (non-contributory/tax-financed), specifically targeted at the poor and at-risk, and social security/insurance programs (contributory). The social assistance programs have mostly been implemented since 1998. The national health insurance has been implemented since 2014 and the employment social insurance since 2015. One major reform has been improving the targeting performance of the major social assistance portfolios, including (1) Food Assistance for poor families (Rastra and BPNT/non-cash food assistance); (2) Education Assistance for poor children; (3) the Conditional Cash Transfer for poor families (PKH); and (4) Subsidized beneficiaries of the National Health Insurance (JKN-PBI) for poor and at-risk individuals. For the social insurance (through the BPJS Employment program), several initiatives have been implemented to expand the program's contributing membership, although it mostly benefits formal sector workers. However, major gaps still exist, especially for the emerging middle-income groups who typically work in the informal sector. They have yet to receive the protection needed to sustain their social and economic growth. Since 2017, TNP2K (the National Team for Poverty Reduction) under the Vice President's office has led the social protection discourse, as the government understands the need to address vulnerabilities across the life cycle and to prioritize support for the most at-risk populations, particularly the elderly, young children and people with disabilities. Discussion and advocacy recommending more investment continue, so that the government can establish a comprehensive social protection system in the near future (2020-2024) that protects children through an inclusive child benefit program, builds a system to benefit more working-age adults (including individuals with disabilities), and provides three-tier protection for the elderly once they reach 65 years.
Keywords: poverty reduction, social assistance, social insurance, social protection
Procedia PDF Downloads 179
140 Gender Justice and Feminist Self-Management Practices in the Solidarity Economy: A Quantitative Analysis of the Factors that Impact Enterprises Formed by Women in Brazil
Authors: Maria de Nazaré Moraes Soares, Silvia Maria Dias Pedro Rebouças, José Carlos Lázaro
Abstract:
The Solidarity Economy (SE) acts to re-articulate the economic field with the other spheres of social action. The significant participation of women in SE resulted in the formation of a national network of self-managed enterprises in Brazil: the Solidarity and Feminist Economy Network (SFEN). The objective of the research is to identify factors of gender justice and feminist self-management practices that adhere to the reality of women in SE enterprises. The conceptual apparatus related to feminist studies in this research covers Nancy Fraser's approach to gender justice, Patricia Yancey Martin's approach to feminist management practices, and authors of postcolonial feminism such as Mohanty and Maria Lugones, who bring the discussion to peripheral contexts, a necessary perspective when observing the women's movement in SE. The research is quantitative in its data collection and analysis phases. Data collection was performed through two sources: the database mapped in Brazil in 2010-2013 by the National Information System on the Solidarity Economy, and 150 questionnaires completed by women from 16 enterprises in SFEN, in a state of the Brazilian northeast. The data were analyzed using the multivariate statistical technique of factor analysis. The results show that the factors defining gender justice and feminist self-management practices in SE are interrelated at several levels, statistically demonstrating the intersectional condition of women's issues. The evidence from the quantitative analysis allowed us to understand the intersectionality of the gender justice and feminist management practice dimensions; in this sense, the non-redistribution of domestic work contributes to the under-representation of women in public spaces, especially in peripheral contexts. The study contributes important reflections to this area of research and can be complemented in the future with qualitative research that approaches the perspective of women in the context of the SE self-management paradigm.
Keywords: feminist management practices, gender justice, self-management, solidarity economy
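As an illustration of the factor analysis step described above, the following is a minimal sketch in Python with scikit-learn; the abstract does not name the software used, and the response matrix, item count and number of factors here are hypothetical stand-ins, not the study's data:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

# Hypothetical questionnaire data: 150 respondents x 20 Likert items
# (a stand-in for the SFEN questionnaires).
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(150, 20)).astype(float)

# Standardize the items, then extract varimax-rotated factors.
X = StandardScaler().fit_transform(responses)
fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0).fit(X)

# Loadings (items x factors) indicate which items define each factor,
# e.g., clusters of items on domestic work or public representation.
loadings = fa.components_.T
for j in range(loadings.shape[1]):
    top_items = np.argsort(-np.abs(loadings[:, j]))[:5]
    print(f"factor {j}: strongest items {top_items.tolist()}")
```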
Procedia PDF Downloads 129
139 From Avatars to Humans: A Hybrid World Theory and Human Computer Interaction Experimentations with Virtual Reality Technologies
Authors: Juan Pablo Bertuzzi, Mauro Chiarella
Abstract:
Employing a communication studies perspective and a socio-technological approach, this paper introduces a theoretical framework for understanding the concept of the hybrid world, the avatarization phenomenon, and the communicational archetype of co-hybridization. This analysis intends to contribute to the future design of virtual reality experimental applications. Ultimately, the paper presents an ongoing research project that proposes the study of human-avatar interactions in digital educational environments, as well as an innovative reflection on inner digital communication. The project analyzes human-avatar interactions through the development of an interactive experience in virtual reality. The goal is to generate an innovative communicational dimension that could reinforce the hypotheses presented throughout this paper. Conceived for initial application in educational environments, the analysis and results of this research depend on the meticulous planning of: the conception of a 3D digital platform; the interactive game objects; the AI or computer avatars; the human representation as hybrid avatars; and, lastly, the potential for immersion, ergonomics and control diversity that the chosen virtual reality system and game engine can provide. The project is divided into two main axes. The first part is structural, as it is mandatory for the construction of an original prototype. The 3D model is inspired by the physical space of an academic institution. The prototype will incorporate smart objects, avatars, game mechanics, game objects, and a dialogue system, all with the objective of gamifying the educational environment. To generate continuous participation and a large number of interactions, the digital world will be navigable both on a conventional device and in a virtual reality system. This decision is made, practically, to facilitate communication between students and teachers, and, strategically, because it will help populate the digital environment faster. The second part concentrates on content production and further data analysis. The challenge is to offer a diversity of scenarios that compels users to interact and to question their digital embodiment. The multipath narrative content being applied focuses on the subjects covered in this paper. Furthermore, the experience with virtual reality devices invites users to experiment within a mixture of a seemingly infinite digital world and a small physical area of movement. This combination will guide the narrative content and will be crucial for restricting users' interactions. The main point is to stimulate and cultivate in users the need for their hybrid avatar's help. By building an inner communication between the user's physicality and the user's digital extension, the interactions will serve as a self-guide through the gameworld. This is the first attempt to make the avatarization phenomenon explicit and to further analyze the communicational archetype of co-hybridization. The challenge of the upcoming years will be to take advantage of these forms of generalized avatarization in order to create awareness and establish innovative forms of hybridization.
Keywords: avatar, hybrid worlds, socio-technology, virtual reality
Procedia PDF Downloads 142
138 Micromechanical Compatibility Between Cells and Scaffold Mediates the Efficacy of Regenerative Medicine
Authors: Li Yang, Yang Song, Martin Y. M. Chiang
Abstract:
Objective: To experimentally substantiate that the micromechanical compatibility between cells and scaffold, in the regenerative medicine approach for restoring bone volume, is essential for phenotypic transitions. Methods: Through nanotechnology and an electrospinning process, nanofibrous scaffolds were fabricated to host dental follicle stem cells (DFSCs). Blends (50:50) of polycaprolactone (PCL) and silk fibroin (SF), mixed with various contents of cellulose nanocrystals (CNC, up to 5% by weight), were electrospun to prepare nanofibrous scaffolds with a heterogeneous microstructure in terms of fiber size. Colloidal probe atomic force microscopy (AFM) and conventional uniaxial tensile tests measured the scaffold stiffness at the micro- and macro-scale, respectively. The cell elastic modulus and the cell-scaffold adhesive interaction (i.e., a chemical function) were examined through single-cell force spectroscopy using AFM. Quantitative reverse transcription-polymerase chain reaction (qRT-PCR) was used to determine whether the mechanotransduction signals (i.e., Yap1, Wwr2, Rac1, MAPK8, Ptk2 and Wnt5a) are upregulated by the scaffold stiffness at the micro-scale (cellular scale). Results: The presence of CNC produces fibrous scaffolds with a bimodal distribution of fiber diameter. This structural heterogeneity, which is CNC-composition dependent, remarkably modulates the mechanical functionality of the scaffolds at the microscale and macroscale simultaneously, but not the chemical functionality (i.e., only a single material property is varied). In in vitro tests, the osteogenic differentiation and the gene expression associated with mechano-sensitive cell markers correlate with the degree of micromechanical compatibility between the DFSCs and the scaffold. Conclusion: Cells require compliant scaffolds to encourage energetically favorable interactions for mechanotransduction, which are converted into changes in cellular biochemistry that direct the phenotypic evolution. Micromechanical compatibility is indeed important to the efficacy of regenerative medicine.
Keywords: phenotype transition, scaffold stiffness, electrospinning, cellulose nanocrystals, single-cell force spectroscopy
Procedia PDF Downloads 190
137 Determination of Influence Lines for Train Crossings on a Tied Arch Bridge to Optimize the Construction of the Hangers
Authors: Martin Mensinger, Marjolaine Pfaffinger, Matthias Haslbeck
Abstract:
The maintenance and expansion of the railway network represent a central task for transport planning in the future. In addition to the ultimate limit states, it is increasingly necessary to include the aspects of resource conservation and sustainability in the basic engineering. Therefore, as part of the AiF research project 'Integrated assessment of steel and composite railway bridges in accordance with sustainability criteria', the entire life cycle of engineering structures is involved in planning and evaluation, offering a way to optimize the design of steel bridges. In order to reduce the life cycle costs and increase the profitability of steel structures, it is particularly necessary to consider the demands on hanger connections resulting from fatigue. To obtain accurate analyses, a number of simulations were conducted as part of the research project on a finite element model of a reference bridge, which gives an indication of the internal forces of the individual structural components of a tied arch bridge, depending on the stress induced by various types of trains. The calculations were carried out on a detailed FE model, which allows extraordinarily accurate modeling of the stiffness of all parts of the construction, as it is made up of surface elements. The results point, on the one hand, to a large impact of detailing on fatigue-related changes in stress and, on the other hand, depict construction-specific behavior over the course of loading. Comparative calculations with varied axle-load distributions also provide information about the sensitivity of the stress-resultant development to the imposition of load and the axle distribution. The calculated diagrams help to achieve an optimized hanger connection design with improved durability, which helps to reduce the maintenance costs of rail networks, and they provide practical application notes for detailing.
Keywords: fatigue, influence line, life cycle, tied arch bridge
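As background, an influence line gives the value of an internal force at a fixed section as a function of the position of a moving unit load; the effect of a train crossing follows by superposing its axle loads along the line. A minimal sketch on a simply supported beam, not the paper's tied arch FE model; span, section and axle values are hypothetical:

```python
import numpy as np

L, a = 30.0, 10.0                 # span and section of interest, m
x = np.linspace(0.0, L, 301)      # positions of the moving unit load

# Closed-form influence ordinates for the bending moment at section a:
#   x <= a: M = x (L - a) / L,   x > a: M = a (L - x) / L
eta = np.where(x <= a, x * (L - a) / L, a * (L - x) / L)

# Superpose a train's axle loads P_i placed at positions x_i on the line.
axle_positions = np.array([3.0, 5.6, 14.0, 16.6])    # m (hypothetical)
axle_loads = np.array([110.0, 110.0, 110.0, 110.0])  # kN (hypothetical)
M_a = np.sum(axle_loads * np.interp(axle_positions, x, eta))
print(f"bending moment at a: {M_a:.1f} kNm")
```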
Procedia PDF Downloads 330
136 Biodegradation of Endoxifen in Wastewater: Isolation and Identification of Bacteria Degraders, Kinetics, and By-Products
Authors: Marina Arino Martin, John McEvoy, Eakalak Khan
Abstract:
Endoxifen is an active metabolite responsible for the effectiveness of tamoxifen, a chemotherapeutic drug widely used for endocrine-responsive breast cancer and for chemo-preventive long-term treatment. Tamoxifen and endoxifen are not completely metabolized in the human body and are actively excreted. As a result, they are released to the water environment via wastewater treatment plants (WWTPs). The presence of tamoxifen in the environment produces negative effects on aquatic life due to its antiestrogenic activity. Because endoxifen is 30-100 times more potent than tamoxifen itself and also presents antiestrogenic activity, its presence in the water environment could result in even more toxic effects on aquatic life than tamoxifen. Data on actual concentrations of endoxifen in the environment are limited owing to the recent discovery of endoxifen's pharmaceutical activity. However, endoxifen has been detected in hospital and municipal wastewater effluents. The detection of endoxifen in wastewater effluents calls into question the treatment efficiency of WWTPs. Studies reporting information about endoxifen removal in WWTPs are also scarce. One study used chlorination to eliminate endoxifen in wastewater; however, inefficient degradation of endoxifen by chlorination and the production of hazardous disinfection by-products were observed. Therefore, there is a need to remove endoxifen from wastewater prior to chlorination in order to reduce the potential release of endoxifen into the environment and its possible effects. The aim of this research is to isolate and identify bacterial strain(s) capable of degrading endoxifen into less hazardous compound(s). For this purpose, bacterial strains from WWTPs were exposed to endoxifen as a sole carbon and nitrogen source for 40 days. Bacteria presenting positive growth were isolated and tested for endoxifen biodegradation. Endoxifen concentration and by-product formation were monitored. The Monod kinetic model was used to determine the endoxifen biodegradation rate. Preliminary results of the study suggest that the bacteria isolated from WWTPs are able to grow in the presence of endoxifen as a sole carbon and nitrogen source. Ongoing work includes the identification of these bacterial strains and of the by-product(s) of endoxifen biodegradation.
Keywords: biodegradation, bacterial degraders, endoxifen, wastewater
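For reference, the Monod model mentioned above expresses the specific degradation rate as mu(S) = mu_max * S / (Ks + S). A minimal sketch of fitting it with SciPy; the concentrations and rates below are invented for illustration and are not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Monod kinetics: specific degradation rate as a function of
# substrate (endoxifen) concentration S.
def monod(S, mu_max, Ks):
    """mu_max: maximum specific rate; Ks: half-saturation constant."""
    return mu_max * S / (Ks + S)

# Hypothetical endoxifen concentrations (mg/L) and observed rates (1/day).
S_obs = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
rate_obs = np.array([0.08, 0.14, 0.21, 0.30, 0.36, 0.40])

# Least-squares fit of the two Monod parameters.
(mu_max, Ks), _ = curve_fit(monod, S_obs, rate_obs, p0=[0.5, 5.0])
print(f"mu_max = {mu_max:.3f} 1/day, Ks = {Ks:.2f} mg/L")
```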
Procedia PDF Downloads 215
135 Research on the Conservation Strategy of Territorial Landscape Based on Characteristics: The Case of Fujian, China
Authors: Tingting Huang, Sha Li, Geoffrey Griffiths, Martin Lukac, Jianning Zhu
Abstract:
Territorial landscapes have experienced a gradual loss of their typical characteristics during long-term human activities. In order to protect the integrity of regional landscapes, it is necessary to characterize, evaluate and protect them in a graded manner. The study takes Fujian, China, as an example and classifies the landscape characters of the site at the regional, middle, and detailed scales. A multi-scale approach combining parametric and holistic methods is used to classify and partition the landscape character types (LCTs) and landscape character areas (LCAs) at different scales, and a multi-element landscape assessment approach is adopted to explore conservation strategies for the landscape character. Firstly, multiple fields and elements of geography, nature and the humanities were selected as the basis of assessment according to the scales. Secondly, the study takes a parametric approach to the classification and partitioning of landscape character, applying Principal Component Analysis and two-stage cluster analysis (K-means and GMM) in MATLAB to obtain the LCTs, combining these with the Canny operator edge detection algorithm to obtain landscape character contours, and correcting the LCTs and LCAs through field survey and manual identification. Finally, the study adopts the landscape sensitivity assessment method to perform landscape character conservation analysis and formulates five strategies for different LCAs: conservation, enhancement, restoration, creation, and combination. This multi-scale identification approach can efficiently integrate multiple types of landscape character elements, reduce the difficulty of broad-scale operations in the process of landscape character conservation, and provide a basis for landscape character conservation strategies. Based on the natural background and the restoration of regional characteristics, the results of the landscape character assessment are scientific and objective and can provide a strong reference for regional and national scale territorial spatial planning.
Keywords: parameterization, multi-scale, landscape character identification, landscape character assessment
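A minimal sketch of the parametric classification chain described above (PCA followed by two-stage K-means/GMM clustering), written in Python with scikit-learn rather than the authors' MATLAB workflow; the input variables, cluster counts and data are hypothetical:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Hypothetical raster stack: one row per map cell, one column per
# landscape variable (elevation, land cover shares, etc.).
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 12))

X_std = StandardScaler().fit_transform(X)
scores = PCA(n_components=5).fit_transform(X_std)  # reduce collinearity

# Stage 1: K-means over-segments the cells into many small prototypes.
proto_labels = KMeans(n_clusters=50, n_init=10, random_state=0).fit_predict(scores)
prototypes = np.vstack([scores[proto_labels == k].mean(axis=0) for k in range(50)])

# Stage 2: a Gaussian mixture groups the prototypes into the final LCTs.
gmm = GaussianMixture(n_components=8, random_state=0).fit(prototypes)
lct_of_prototype = gmm.predict(prototypes)
lct_per_cell = lct_of_prototype[proto_labels]  # map LCT labels back to cells
print(np.bincount(lct_per_cell))
```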
Procedia PDF Downloads 99
134 A Facile One Step Modification of Poly(dimethylsiloxane) via Smart Polymers for Biomicrofluidics
Authors: A. Aslihan Gokaltun, Martin L. Yarmush, Ayse Asatekin, O. Berk Usta
Abstract:
Poly(dimethylsiloxane) (PDMS) is one of the most widely used materials in the fabrication of microfluidic devices. It is easily patterned and can replicate features down to nanometers. Its flexibility, gas permeability that allows oxygenation, and low cost also drive its wide adoption. However, a major drawback of PDMS is its hydrophobicity and fast hydrophobic recovery after surface hydrophilization. This results in significant non-specific adsorption of proteins as well as of small hydrophobic molecules such as therapeutic drugs, limiting the utility of PDMS in biomedical microfluidic circuitry. While silicon, glass, and thermoplastics have been used, they come with problems of their own, such as rigidity, high cost, and special tooling needs, which limit their use to a smaller user base. Many strategies to alleviate these common problems with PDMS lack general practical applicability or have limited shelf lives in terms of the modifications they achieve. This restricts large-scale implementation and adoption by industrial and research communities. Accordingly, we aim to tailor biocompatible PDMS surfaces by developing a simple, one-step bulk modification approach with novel smart materials to reduce non-specific molecular adsorption and to stabilize long-term cell analysis with PDMS substrates. Smart polymers blended with PDMS during device manufacture spontaneously segregate to the surfaces when in contact with aqueous solutions and create a < 1 nm layer that reduces the non-specific adsorption of organic molecules and biomolecules. Our methods are fully compatible with existing PDMS device manufacture protocols without any additional processing steps. We have demonstrated that our modified PDMS microfluidic system is effective at blocking the adsorption of proteins while retaining the viability of primary rat hepatocytes and preserving the biocompatibility, oxygen permeability, and transparency of the material. We expect this work will enable the development of fouling-resistant biomedical materials, from microfluidics to hospital surfaces and tubing.
Keywords: cell culture, microfluidics, non-specific protein adsorption, PDMS, smart polymers
Procedia PDF Downloads 294
133 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition
Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can
Abstract:
To effectively combat climate change, many countries around the world have committed to the decarbonisation of their electricity supply, along with promoting a large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity to effectively combat climate change, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the necessary network infrastructure to supply the projected demand in a cost-efficient way, considering the evolution of the new generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty in power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must be considered within the TNEP as well, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially for application to realistic-sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems, such as the TNEP. In particular, the use of AI along with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method. One of the challenges of using Column Generation for solving the TNEP problem is that the subproblems are of a mixed-integer nature, and therefore solving them requires significant time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results obtained from the linearized version. A key feature of the proposal is that we integrate the binary classifier into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% in estimating the values of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than its integer programming counterpart, the integration of the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required for solving the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models.
Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning
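A minimal sketch of the classifier idea described above: a supervised model learns to round the relaxed values of the binary investment variables to {0, 1}, while non-improving columns are re-checked by the exact subproblem to preserve optimality. Everything below (features, data, model choice) is hypothetical and only illustrates the training step, not the authors' actual pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical training set: each row holds features of one candidate line
# in one relaxed subproblem solution (its relaxed investment value, line
# utilisation, ...); the target is the value the binary variable took in
# the exact mixed-integer solution.
rng = np.random.default_rng(1)
relaxed_value = rng.uniform(0, 1, 5000)
utilisation = rng.uniform(0, 1, 5000)
X = np.column_stack([relaxed_value, utilisation])
y = (relaxed_value + 0.2 * utilisation + rng.normal(0, 0.1, 5000) > 0.6).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")

# Inside column generation, clf.predict(...) would fix the binary variables
# of each relaxed subproblem; any column that does not price out with a
# negative reduced cost is re-solved with the exact MIP as a safeguard.
```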
Procedia PDF Downloads 85
132 Effect of Curing Temperature on the Textural and Rheological Properties of Gelatine-SDS Hydrogels
Authors: Virginia Martin Torrejon, Binjie Wu
Abstract:
Gelatine is a protein biopolymer obtained from the partial hydrolysis of animal tissues that contain collagen, the primary structural component of connective tissue. Gelatine hydrogels have attracted considerable research interest in recent years as an alternative to synthetic materials due to their outstanding gelling properties, biocompatibility and compostability. Surfactants, such as sodium dodecyl sulfate (SDS), are often used in hydrogel solutions as surface modifiers or solubility enhancers, and their incorporation can influence the hydrogel's viscoelastic properties and, in turn, its processing and applications. The literature usually focuses on studying the impact of formulation parameters (e.g., gelatine content, gelatine strength, additive incorporation) on gelatine hydrogel properties, but processing parameters, such as curing temperature, are commonly overlooked. For example, some authors have reported a decrease in gel strength at lower curing temperatures, but there is a lack of systematic viscoelastic characterisation of high-strength gelatine and gelatine-SDS systems over a wide range of curing temperatures. This knowledge is essential to meet and adjust the technological requirements of different applications (e.g., viscosity, setting time, gel strength or melting/gelling temperature). This work investigated the effect of curing temperature (10, 15, 20, 23, 25 and 30 °C) on the elastic modulus (G') and melting temperature of high-strength gelatine-SDS hydrogels, at 10 wt% and 20 wt% gelatine contents, by small-amplitude oscillatory shear rheology coupled with Fourier transform infrared spectroscopy. It also correlates the gel strength obtained by rheological measurements with the gel strength measured by texture analysis. The rheological behaviour of gelatine and gelatine-SDS hydrogels strongly depended on the curing temperature, and their gel strength and melting temperature can be slightly modified to adjust them to given processing and application needs. Lower curing temperatures led to gelatine and gelatine-SDS hydrogels with considerably higher storage moduli; however, their melting temperatures were lower than those of the lower-strength gels cured at higher temperatures. This effect was more considerable at longer timescales. The behaviour is attributed to the development of thermally resistant structures in the lower-strength gels cured at higher temperatures.
Keywords: gelatine gelation kinetics, gelatine-SDS interactions, gelatine-surfactant hydrogels, melting and gelling temperature of gelatine gels, rheology of gelatine hydrogels
Procedia PDF Downloads 101
131 Synthesis of Microencapsulated Phase Change Material for Adhesives with Thermoregulating Properties
Authors: Christin Koch, Andreas Winkel, Martin Kahlmeyer, Stefan Böhm
Abstract:
Due to environmental regulations on greenhouse gas emissions and the depletion of fossil fuels, there is an increasing interest in electric vehicles. To maximize their driving range, batteries with high storage capacities are needed. In most electric cars, rechargeable lithium-ion batteries are used because of their high energy density. However, it has to be taken into account that these batteries generate a large amount of heat during the charge and discharge processes. This leads to a decrease in lifetime and damage to the battery cells when the temperature exceeds the defined operating range. To ensure efficient performance of the battery cells, reliable thermal management is required. Currently, cooling is achieved by heat sinks (e.g., cooling plates) bonded to the battery cells with a thermally conductive adhesive (TCA) that directs the heat away from the components. Especially when large amounts of heat have to be dissipated spontaneously due to peak loads, the principle of heat conduction is not sufficient, so attention must be paid to the mechanism of heat storage. An efficient method to store thermal energy is the use of phase change materials (PCM). Through an isothermal phase change, PCM can briefly absorb or release thermal energy at a constant temperature. If the phase change takes place in the transition from solid to liquid, heat is stored during melting and released to the surroundings during the freezing process upon cooling. The presented work displays the great potential of thermally conductive adhesives filled with microencapsulated PCM to limit peak temperatures in battery systems. The encapsulation of the PCM avoids aging effects (e.g., migration) and chemical reactions between the PCM and the adhesive matrix components. In this study, the microencapsulation was carried out by in situ polymerization. The microencapsulated PCM was characterized by FT-IR spectroscopy, and the thermal properties were measured by DSC and the laser flash method. The mechanical properties, electrical and thermal conductivity, and adhesive toughness of the TCA/PCM composite were also investigated.
Keywords: phase change material, microencapsulation, adhesive bonding, thermal management
Procedia PDF Downloads 72
130 A Xenon Mass Gauging through Heat Transfer Modeling for Electric Propulsion Thrusters
Authors: A. Soria-Salinas, M.-P. Zorzano, J. Martín-Torres, J. Sánchez-García-Casarrubios, J.-L. Pérez-Díaz, A. Vakkada-Ramachandran
Abstract:
The current state-of-the-art methods for mass gauging of electric propulsion (EP) propellants in microgravity conditions rely on external measurements taken at the surface of the tank. The tanks are operated under a constant thermal duty cycle to store the propellant within a pre-defined temperature and pressure range. We demonstrate, using computational fluid dynamics (CFD) simulations, that the heat transfer within the pressurized propellant generates temperature and density anisotropies. This challenges the standard mass gauging methods that rely on the use of time-changing skin temperatures and pressures. We observe that the domes of the tanks are prone to overheating and that, a long time after the heaters of the thermal cycle are switched off, the system reaches a quasi-equilibrium state with a more uniform density. We propose a new gauging method, which we call the improved PVT method, based on universal physics and thermodynamics principles, existing TRL-9 technology and telemetry data. This method only uses as inputs the temperature and pressure readings of sensors externally attached to the tank. These sensors can operate during the nominal thermal duty cycle. The improved PVT method shows little sensitivity to the pressure sensor drifts, which are critical towards the end of life of the missions, as well as little sensitivity to systematic temperature errors. The retrieval method has been validated experimentally with CO2 in the gas and fluid states in a chamber that operates up to 82 bar within a nominal thermal cycle of 38 °C to 42 °C. The mass gauging error is shown to be lower than 1% of the mass at the beginning of life, assuming an initial tank load at 100 bar. In particular, for a pressure of about 70 bar, just below the critical pressure of CO2, the error of the mass gauging in the gas phase goes down to 0.1%, and for 77 bar, just above the critical point, the error of the mass gauging of the liquid phase is 0.6% of the initial tank load. This gauging method improves by a factor of 8 the accuracy of the standard PVT retrievals using look-up tables with tabulated data from the National Institute of Standards and Technology.
Keywords: electric propulsion, mass gauging, propellant, PVT, xenon
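For reference, the standard PVT retrieval underlying the method above infers mass from m = p V M / (Z R T), with the compressibility factor Z(p, T) taken from tabulated reference data. A minimal sketch for xenon; the Z table below is a rough illustrative stand-in, not actual NIST values, and real retrievals interpolate in both pressure and temperature:

```python
import numpy as np

R = 8.314462618       # universal gas constant, J/(mol K)
M_XE = 0.131293       # molar mass of xenon, kg/mol

# Hypothetical (P [bar], Z) pairs near 40 degC, for demonstration only.
P_table = np.array([10.0, 30.0, 50.0, 70.0, 90.0])
Z_table = np.array([0.96, 0.88, 0.78, 0.66, 0.52])

def propellant_mass(P_bar, T_kelvin, V_tank_m3):
    """Real-gas PVT mass estimate: m = p V M / (Z R T)."""
    Z = np.interp(P_bar, P_table, Z_table)
    return (P_bar * 1e5) * V_tank_m3 * M_XE / (Z * R * T_kelvin)

# Example: 50 L tank at 70 bar and 40 degC.
print(f"{propellant_mass(70.0, 313.15, 0.05):.2f} kg")
```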
Procedia PDF Downloads 345
129 i2kit: A Tool for Immutable Infrastructure Deployments
Authors: Pablo Chico De Guzman, Cesar Sanchez
Abstract:
Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency and the time to market of the business logic. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs on the control layer, which would affect running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set into other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer implies more important disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35 MB). Also, the system is more secure, since linuxkit installs the minimum set of dependencies required to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
Keywords: container, deployment, immutable infrastructure, microservice
Procedia PDF Downloads 179
128 The Accuracy of an In-House Developed Computer-Assisted Surgery Protocol for Mandibular Micro-Vascular Reconstruction
Authors: Christophe Spaas, Lies Pottel, Joke De Ceulaer, Johan Abeloos, Philippe Lamoral, Tom De Backer, Calix De Clercq
Abstract:
We aimed to evaluate the accuracy of an in-house developed, low-cost computer-assisted surgery (CAS) protocol for osseous free flap mandibular reconstruction. All patients who underwent primary or secondary mandibular reconstruction with a free (solely or composite) osseous flap, either a fibula free flap or an iliac crest free flap, between January 2014 and December 2017 were evaluated. The low-cost protocol consisted of a virtual surgical plan, a pre-bent custom reconstruction plate and an individualized free flap positioning guide. The accuracy of the protocol was evaluated by comparing the postoperative outcome with the 3D virtual planning, based on measurement of the following parameters: intercondylar distance, mandibular angle (axial and sagittal), inner angular distance, anterior-posterior distance, length of the fibular/iliac crest segments, and osteotomy angles. A statistical analysis of the obtained values was performed. Virtual 3D surgical planning and cutting guide design were performed with Proplan CMF® software (Materialise, Leuven, Belgium) and IPS Gate (KLS Martin, Tuttlingen, Germany). Segmentation of the DICOM data as well as outcome analysis were done with BrainLab iPlan® software (Brainlab AG, Feldkirchen, Germany). A cost analysis of the protocol was also performed. Twenty-two patients (11 fibula / 11 iliac crest) were included and analyzed. Based on voxel-based registration on the cranial base, the 3D virtual planning landmark parameters did not differ significantly from those measured on the actual treatment outcome (p-values > 0.05). A cost evaluation of the in-house developed CAS protocol revealed a 1750 euro cost reduction in comparison with a standard CAS protocol with a patient-specific reconstruction plate. Our results indicate that an accurate transfer of the planning with our in-house developed low-cost CAS protocol is feasible at a significantly lower cost.
Keywords: CAD/CAM, computer-assisted surgery, low-cost, mandibular reconstruction
Procedia PDF Downloads 140
127 Reduction Shrinkage of Concrete without Use Reinforcement
Authors: Martin Tazky, Rudolf Hela, Lucia Osuska, Petr Novosad
Abstract:
Concrete's volumetric changes are a natural process caused by the hydration of silicate minerals. These changes can lead to cracking and the subsequent destruction of the cementitious material's matrix. In most cases, cracks can be assessed as a negative effect of hydration, and in all cases, they lead to an acceleration of degradation processes. Preventing the formation of these cracks is, therefore, the main effort. One possibility for eliminating this natural shrinkage process is the use of different types of dispersed reinforcement; steel and polymer reinforcements are preferably used for this purpose. Although reinforcement is ordinarily used in concrete to counteract shrinkage, it is also possible to approach the problem from the beginning, through the concrete mix composition itself. There are many secondary raw materials which help reduce the heat of hydration and also the shrinkage of concrete during curing. Recent research shows that shrinkage can also be reduced by the controlled formation of hydration products whose morphology could act like a traditionally used dispersed reinforcement. This contribution deals with the possibility of the controlled formation of mono- and tri-sulphates, which are usually considered degradation minerals. The controlled formation of mono- and tri-sulphates in a cementitious composite can be classified as a self-healing ability. Their crystal growth acts directly against the shrinking tension, which reduces the risk of crack development. Controlled formation means that these crystals start to grow in the fresh state of the material (e.g., concrete) but stop right before they could cause any damage to the hardened material. Waste materials with a suitable chemical composition are very attractive precursors because of their added value in the form of reduced landscape pollution and, of course, low cost. In this experiment, the possibility of using fly ash from fluidized bed combustion as a mono- and tri-sulphate formation additive was investigated. The experiment itself was conducted on cement paste and concrete, and the specimens were subjected to a thorough analysis of physico-mechanical properties as well as microstructure from the moment of mixing up to 180 days. The processes of hydration and shrinkage were monitored in the cement composites. In the mixture with the fluidized bed combustion fly ash admixture, possible failures were identified by electron microscopy and the dynamic modulus of elasticity. The results of the experiments show the possibility of reducing concrete shrinkage without using traditional dispersed reinforcement.
Keywords: shrinkage, monosulphates, trisulphates, self-healing, fluidized fly ash
Procedia PDF Downloads 186
126 Introducing the Concept of Sustainable Learning: Redesigning the Social Studies and Citizenship Education Curriculum in the Context of Saudi Arabia
Authors: Aiydh Aljeddani, Fran Martin
Abstract:
Sustainable human development is an essential component of sustainable economic, social and environmental development. Addressing sustainable learning only through the addition of new teaching methods, or by embedding certain approaches, is not sufficient on its own to support the goals of sustainable human development. This research project seeks to explore how the process of redesigning the current principles of the curriculum based on the concept of sustainable learning could contribute to preparing a citizen who could later contribute towards sustainable human development. Multiple qualitative methodologies were employed in order to achieve the aim of this study. The main research methods were teachers' field notes, artefacts, informal (unstructured) interviews, passive participant observation, a mini nominal group technique (NGT), a weekly diary, and weekly meetings. The study revealed that the integration of a curriculum for sustainable development, in addition to the use of innovative teaching approaches, was highly valued by students and teachers in social studies sessions. This was due to the fact that it created a positive atmosphere for interaction and aroused both teachers' and students' interest. The content of the new curriculum also contributed to increasing students' sense of shared responsibility by involving them in thinking about solutions to some global issues. This was carried out by addressing these issues through the concept of sustainable development and the theory of Thinking Actively in a Social Context (TASC). Students interacted with the sustainable development sessions intellectually and also applied them practically by designing projects and cut-outs. Ongoing meetings and workshops to develop the work, between the researcher and the teachers and among the teachers themselves, played a vital role in implementing the new curriculum. The participation of teachers in the development of the project, through working papers, the exchange of experiences and the introduction of amendments to the students' environment, was also critical to the process of implementing the new curriculum. Finally, the concept of sustainable learning can contribute to the learning outcomes much better than the current curriculum, and it can better develop the learning objectives in educational institutions.
Keywords: redesigning, social studies and citizenship education curriculum, sustainable learning, thinking actively in a social context
Procedia PDF Downloads 231
125 Task Based Functional Connectivity within Reward Network in Food Image Viewing Paradigm Using Functional MRI
Authors: Preetham Shankapal, Jill King, Kori Murray, Corby Martin, Paula Giselman, Jason Hicks, Owen Carmicheal
Abstract:
Activation of reward and satiety networks in the brain while processing palatable food cues, as well as functional connectivity during rest, has been studied using functional magnetic resonance imaging of the brain in various obesity phenotypes. However, functional connectivity within the reward and satiety networks during food cue processing is understudied. Fourteen obese individuals underwent two fMRI scans while viewing Macronutrient Picture System images. Each scan included two blocks of images of high sugar/high fat (HSHF), high carbohydrate/high fat (HCHF), and low sugar/low fat (LSLF) foods, as well as non-food images. Seed voxels within seven food-reward-relevant ROIs (the insula, the putamen, and the cingulate, precentral, parahippocampal, medial frontal and superior temporal gyri) were isolated based on a prior meta-analysis. Beta series correlation for task-related functional connectivity between these seed voxels and the rest of the brain was computed. Voxel-level differences in functional connectivity were calculated between the first and second scans; between individuals who saw novel (N=7) vs. repeated (N=7) images in the second scan; and between the HCHF and HSHF blocks vs. the LSLF and non-food blocks. The computations and analyses showed that during food image viewing, the reward network ROIs showed significant functional connectivity with each other and with other regions responsible for attentional and motor control, including the inferior parietal lobe and the precentral gyrus. These functional connectivity values were heightened among individuals who viewed novel HSHF images in the second scan. In the second scan session, functional connectivity was reduced within the reward network but increased within attention, memory and recognition regions, suggesting habituation to reward properties and increased recollection of previously viewed images. In conclusion, it can be inferred that functional connectivity within the reward network, and between the reward network and other brain regions, varies with important experimental conditions during food photograph viewing, including habituation to the foods shown.
Keywords: fMRI, functional connectivity, task-based, beta series correlation
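A minimal sketch of the beta series correlation step, assuming per-trial beta estimates have already been obtained from a GLM with one regressor per food-image trial; the dimensions, seed indices and data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_voxels = 60, 5000
betas = rng.normal(size=(n_trials, n_voxels))   # trials x voxels
seed_roi = [10, 11, 12]                         # hypothetical seed voxel indices

# The seed beta series is the mean per-trial beta across the seed voxels.
seed_series = betas[:, seed_roi].mean(axis=1)

# Pearson correlation of the seed beta series with every voxel's series:
# standardize both, then average the products over trials.
z = (betas - betas.mean(axis=0)) / betas.std(axis=0)
s = (seed_series - seed_series.mean()) / seed_series.std()
conn_map = (z * s[:, None]).mean(axis=0)        # one r value per voxel

# Fisher z-transform before group statistics on functional connectivity.
conn_map_z = np.arctanh(np.clip(conn_map, -0.999999, 0.999999))
print(conn_map_z.shape)
```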
Procedia PDF Downloads 270
124 Effective Medium Approximations for Modeling Ellipsometric Responses from Zinc Dialkyldithiophosphates (ZDDP) Tribofilms Formed on Sliding Surfaces
Authors: Maria Miranda-Medina, Sara Salopek, Andras Vernes, Martin Jech
Abstract:
Sliding lubricated surfaces induce the formation of tribofilms that reduce friction and wear and prevent large-scale damage of the contacting parts. Engine oils and lubricants use antiwear and antioxidant additives such as zinc dialkyldithiophosphate (ZDDP), from which protective tribofilms are formed by degradation. The ZDDP tribofilms are described as a two-layer structure composed of inorganic polymer material. On the top surface, the long-chain polyphosphate is a zinc phosphate; in the bulk, the short-chain polyphosphate is a mixed Fe/Zn phosphate with a gradient concentration. The polyphosphate chains are partially adherent to the steel surface through a sulfide and work as anti-wear pads. In this contribution, ZDDP tribofilms formed on gray cast iron surfaces are studied. The tribofilms were generated in a reciprocating sliding tribometer with a piston ring-cylinder liner configuration. Fully formulated oil of SAE grade 5W-30 was used as the lubricant during two tests, at 40 Hz and 50 Hz. For the estimation of the tribofilm thicknesses, spectroscopic ellipsometry was used due to its high accuracy and non-destructive nature. Ellipsometry works on an optical principle whereby the change in polarisation of light reflected by the surface is associated with the refractive index of the surface material or with the thickness of the layer deposited on top. The ellipsometric responses derived from the tribofilms are modelled by effective medium approximation (EMA), which includes the refractive indices of the involved materials, the homogeneity of the film, and its thickness. The material composition was obtained from X-ray photoelectron spectroscopy studies, where the presence of ZDDP, O and C was confirmed. From the EMA models, it was concluded that the tribofilms formed at 40 Hz are thicker and more homogeneous than the ones formed at 50 Hz. In addition, the refractive indices of the constituent materials are mixed to derive an effective refractive index that describes the optical composition of the tribofilm; this index exhibits a maximum response in the UV range, a characteristic of glassy semitransparent films.
Keywords: effective medium approximation, reciprocating sliding tribometer, spectroscopic ellipsometry, zinc dialkyldithiophosphate
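For reference, a widely used EMA in ellipsometric film modelling is the Bruggeman approximation; the abstract does not specify which EMA variant the authors applied, so this is given only as a representative form. For n constituents with volume fractions f_i and complex dielectric functions eps_i, the effective dielectric function solves

```latex
\sum_{i=1}^{n} f_i \,\frac{\varepsilon_i - \varepsilon_{\mathrm{eff}}}{\varepsilon_i + 2\,\varepsilon_{\mathrm{eff}}} = 0,
\qquad \sum_{i=1}^{n} f_i = 1,
\qquad N_{\mathrm{eff}} = \sqrt{\varepsilon_{\mathrm{eff}}}
```

The effective complex refractive index N_eff then enters the layer model whose simulated ellipsometric response is fitted to the measured one, with the film thickness and volume fractions as free parameters.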
Procedia PDF Downloads 251
123 The Sociocultural, Economic, and Environmental Contestations of Agbogbloshie: A Critical Review
Authors: Khiddir Iddris, Martin Oteng-Ababio, Andreas Bürkert, Christoph Scherrer, Katharina Hemmler
Abstract:
Agbogbloshie, an informal settlement and economy where the e-waste sector thrives, has become a global hub of complex urban contestations involving sociocultural, economic, and environmental dimensions, due to the implications that e-waste and informal economic patterns have for livelihoods, urbanisation, development and sustainability. Multi-author collaborations have produced an ever-growing body of literature on Agbogbloshie and the informal e-waste economy. There is, however, a dearth of assessments of Agbogbloshie as an urban informal settlement forming an intricate nexus of socioecological contestations. We address this gap by systematising the contextual knowledge from the literature, navigating the complex terrain of Agbogbloshie's challenges, and employing a multidimensional lens to unravel the sociocultural intricacies, economic dynamics, and environmental complexities shaping its identity. A systematic critical review approach was adopted, pragmatically consolidating content analysis and controversy mapping, grounded in the concept of 'sustainable rurbanism'; this highlighted core themes and identified contrasting viewpoints. An analytical framework is presented. Five categories (geohistorical, sociocultural, economic, environmental, and future trends) are proposed as an approach to systematising the literature. The review finds that the sociocultural dimension unveils a mosaic of cultural amalgamation, communal identity, and tensions impacting community cohesion. The analysis of economic intricacies reveals the prevalence of informal economies sustaining livelihoods yet entrenching economic disparities and marginalisation. Environmental scrutiny exposes the grim realities of e-waste disposal, pollution, and land use conflicts. The findings suggest high resilience within the community and potential for sustainable trajectories. Theoretical and conceptual synergy, however, remains limited. This review provides a comprehensive exploration, offering insights and directions for future research, policy formulation, and community-driven interventions aimed at fostering sustainable transformations in Agbogbloshie and analogous urban contexts.
Keywords: Agbogbloshie, economic complexities, environmental challenges, resilience, sociocultural dynamics, sustainability, urban informal settlement
Procedia PDF Downloads 71
122 The Shrinking of the Pink Wave and the Rise of the Right-Wing in Latin America
Authors: B. M. Moda, L. F. Secco
Abstract:
Through free and fair elections and other, less democratic, processes, Latin America has been gradually turning into a right-wing political region. In order to understand these recent changes, this paper aims to discuss the origin and traits of the pink wave in the subcontinent, the reasons for its current rollback, and future projections for the left-wing in the region. The methodology used in this paper is descriptive and analytical, combined with secondary sources mainly from the social and political sciences. The canons of the Washington Consensus were implemented by the majority of Latin American governments in the 1980s and 1990s under social democratic and right-wing parties. The neoliberal agenda caused political, social and economic dissatisfaction, culminating in a new political configuration for the region. It started in 1998, when Hugo Chávez took office in Venezuela through the Fifth Republic Movement under the socialist flag. From there on, Latin America was swept by the so-called 'pink wave', a term adopted to describe the rise of self-designated left-wing or center-left parties with a progressive agenda. After Venezuela, countries like Chile, Brazil, Argentina, Uruguay, Bolivia, Ecuador, Nicaragua, Paraguay, El Salvador and Peru joined the pink wave. The success of these governments was due to a post-neoliberal agenda focused on cash transfer programs, increases in public spending, and the strengthening of national markets. The discontinuation of the preference for the left-wing started in 2012 with the coup against Fernando Lugo in Paraguay. In 2015, chavismo in Venezuela lost the majority of the legislative seats. In 2016, an impeachment removed the Brazilian president Dilma Rousseff from office; she was replaced by the center-right vice-president Michel Temer. In the same year, Mauricio Macri, representing the right-wing party Propuesta Republicana, was elected in Argentina. In 2016, the center-right liberal Pedro Pablo Kuczynski was elected in Peru. In 2017, Sebastián Piñera was elected in Chile through the center-right party Renovación Nacional. The current rollback of the pink wave points towards findings that can be arranged in two fields. Economically, the 2008 financial crisis affected the majority of Latin American countries, and the left-wing economic policies, along with the end of the raw materials boom and the subsequent shrinking of economic performance, opened a flank for popular dissatisfaction. In Venezuela, the 2014 oil crisis reduced state revenues by more than 50%, cutting social spending, creating an inflationary spiral and, consequently, a loss of popular support. Politically, the death of Hugo Chávez in 2013 weakened the ideal of the 'socialism of the twenty-first century', and it was followed by the death of Fidel Castro, the last bastion of communism in the subcontinent. In addition, several cases of corruption revealed during the pink wave governments made traditional politics unpopular. These issues challenge the left-wing to develop a future agenda based on innovation of its economic program, to improve its legal and political compliance practices, and to regroup its electoral forces amid the social movements that supported its ascension back in the early 2000s.
Keywords: Latin America, political parties, left-wing, right-wing, pink wave
Procedia PDF Downloads 240
121 Human Rights in the United States: Challenges and Lessons from the Period 1948-2018
Authors: Mary Carmen Peloche Barrera
Abstract:
Since its early years as an independent nation, the United States has been one of the main promoters of the recognition, legislation, and protection of human rights. In the matter of freedom, the founding father Thomas Jefferson envisioned the role of the U.S. as a defender of freedom and equality throughout the world. This founding ideal shaped America's domestic and foreign policy in the 19th and 20th centuries and fueled the country's aspiration to expand its values and institutions. The history of the emergence of human rights cannot be studied without reference to leaders such as Woodrow Wilson, Franklin and Eleanor Roosevelt, and Martin Luther King. Throughout its history, this country has proclaimed that the protection of the freedoms of men, both inside and outside its borders, is practically the reason for its existence. Although the United States was one of the first countries to recognize the existence of inalienable individual rights, as well as the main promoter of the Universal Declaration of Human Rights of 1948, the country has gone through critical moments that have led to questions about its commitment to the issue. Racial segregation, international military interventions, the national security strategy, and national legislation on immigration are some of the most controversial issues related to decisions and actions driven by the United States, which at the same time clash with its role as an advocate of human rights, both in the Americas and in the rest of the world. The aim of this paper is to study the swings in the efforts and commitments of the United States towards human rights. The paper will analyze the history and evolution of human rights in the United States in order to study the greatest challenges for the country in this matter. It will focus on both domestic policy (related to demographic issues) and foreign policy (concerning its role in a post-war world). Currently, more countries are joining the multilateral efforts for the promotion and protection of human rights. At the same time, the United States is one of the least committed countries in this respect, having ratified only 5 of the 18 treaties emanating from the United Nations. The last ratification was carried out in 2002 and, since then, the country has been losing ground, in an increasingly vertiginous way, in its credibility and, even worse, in its role as leader of 'the free world'. With or without the United States, the protection of human rights should remain the main goal of the international community.
Keywords: United States, human rights, foreign policy, domestic policy
Procedia PDF Downloads 118120 Development and Validation of an Instrument Measuring the Coping Strategies in Situations of Stress
Authors: Lucie Côté, Martin Lauzier, Guy Beauchamp, France Guertin
Abstract:
Stress causes deleterious effects at the physical, psychological and organizational levels, which highlights the need to use effective coping strategies to deal with it. Several coping models exist, but they do not integrate the different strategies in a coherent way, nor do they take into account recent research on emotional coping and acceptance of the stressful situation. To fill these gaps, an integrative model incorporating the main coping strategies was developed. This model arises from a review of the scientific literature on coping, from a qualitative study carried out among workers with low or high levels of stress, and from an analysis of clinical cases. The model allows one to understand under what circumstances the strategies are effective or ineffective and to learn how one might use them more wisely. It includes Specific Strategies in controllable situations (Modification of the Situation and Resignation-Disempowerment), Specific Strategies in non-controllable situations (Acceptance and Stubborn Relentlessness), as well as so-called General Strategies (Wellbeing and Avoidance). This study presents the process of development and validation of an instrument to measure coping strategies based on this model. An initial pool of items was generated from the conceptual definitions, and three expert judges validated the content. Of these, 18 items were selected for a short-form questionnaire. A sample of 300 students and employees from a Quebec university was used for the validation of the questionnaire. Concerning the reliability of the instrument, the indices observed for inter-rater agreement (Krippendorff’s alpha) and internal consistency (Cronbach's alpha) are satisfactory. To evaluate the construct validity, a confirmatory factor analysis using MPlus supports the existence of a model with six factors. The results of this analysis also suggest that this configuration is superior to alternative models. The correlations show that the factors are only loosely related to each other. Overall, the analyses carried out suggest that the instrument has good psychometric qualities and demonstrate the relevance of further work to establish predictive validity and reconfirm its structure. This instrument will help researchers and clinicians better understand and assess the coping strategies used to deal with stress and thus prevent mental health issues.Keywords: acceptance, coping strategies, stress, validation process
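As an illustration of the internal-consistency check mentioned in this abstract, the sketch below computes Cronbach's alpha for a set of questionnaire items. This is not the authors' code; the simulated Likert responses, item counts and random seed are assumptions made purely for demonstration.

```python
# Minimal sketch (not the study's code): Cronbach's alpha for a questionnaire subscale.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Example: 300 simulated respondents answering 3 items of one subscale (1-5 Likert)
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))              # shared trait driving the items
scores = np.clip(np.rint(3 + latent + rng.normal(scale=0.8, size=(300, 3))), 1, 5)
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```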
Procedia PDF Downloads 339119 Sound Source Localisation and Augmented Reality for On-Site Inspection of Prefabricated Building Components
Authors: Jacques Cuenca, Claudio Colangeli, Agnieszka Mroz, Karl Janssens, Gunther Riexinger, Antonio D'Antuono, Giuseppe Pandarese, Milena Martarelli, Gian Marco Revel, Carlos Barcena Martin
Abstract:
This study presents an on-site acoustic inspection methodology for the quality and performance evaluation of building components. The work focuses on global and detailed sound source localisation, achieved by successively performing acoustic beamforming and sound intensity measurements. A portable experimental setup is developed, consisting of an omnidirectional broadband acoustic source, a microphone array and a sound intensity probe. Three main acoustic indicators are of interest, namely the sound pressure distribution on the surface of components such as walls, windows and junctions, the three-dimensional sound intensity field in the vicinity of junctions, and the sound transmission loss of partitions. The measurement data is post-processed and converted into a three-dimensional numerical model of the acoustic indicators with the help of simultaneously acquired geolocation information. The three-dimensional acoustic indicators are then integrated into an augmented reality platform that superimposes them onto a real-time visualisation of the spatial environment. The methodology thus enables a measurement-supported inspection process of buildings and the correction of errors during construction and refurbishment. Two experimental validation cases are shown. The first consists of a laboratory measurement on a full-scale mockup of a room featuring a prefabricated panel. The panel is installed with controlled defects such as missing insulation and joint sealing material. It is demonstrated that the combined acoustic and augmented reality tool is capable of identifying acoustic leakages arising from the building defects and of assisting in correcting them. The second validation case is performed on a prefabricated room at a near-completion stage in the factory. With the help of the measurement and visualisation tools, the homogeneity of the partition installation is evaluated and leakages from junctions and doors are identified. Furthermore, the integration of acoustic indicators together with thermal and geometrical indicators via the augmented reality platform is shown.Keywords: acoustic inspection, prefabricated building components, augmented reality, sound source localization
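To illustrate the global localisation step referred to in this abstract, the following sketch implements a basic time-domain delay-and-sum beamformer that scans candidate source positions in front of a small microphone array. It is not the project's software; the array geometry, sampling rate and synthetic source are assumptions for demonstration only.

```python
# Minimal sketch (not the project's software): delay-and-sum beamforming over a scan line.
import numpy as np

c = 343.0      # speed of sound in air, m/s
fs = 48_000    # sampling rate, Hz

# 8-microphone linear array along x with 5 cm spacing, lying on the line y = 0
mics = np.array([[i * 0.05, 0.0] for i in range(8)])

# Synthetic 2 kHz source at (0.20 m, 1.00 m): each mic receives a delayed, attenuated copy
src = np.array([0.20, 1.00])
t = np.arange(0, 0.05, 1.0 / fs)
dists = np.linalg.norm(mics - src, axis=1)
signals = np.array([np.sin(2 * np.pi * 2000.0 * (t - d / c)) / d for d in dists])

# Steer the array across candidate points 1 m away and sum the time-aligned channels
scan_x = np.linspace(-0.5, 0.7, 121)
power = []
for x in scan_x:
    delays = np.linalg.norm(mics - np.array([x, 1.0]), axis=1) / c
    shifts = np.rint((delays - delays.min()) * fs).astype(int)
    aligned = [np.roll(sig, -s) for sig, s in zip(signals, shifts)]  # advance each channel
    power.append(np.mean(np.sum(aligned, axis=0) ** 2))

peak = scan_x[int(np.argmax(power))]
print(f"Estimated source x-position: {peak:.2f} m (true value 0.20 m)")
```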
Procedia PDF Downloads 384118 Influence of Smoking on Fine and Ultrafine Air Pollution PM in Their Pulmonary Genetic and Epigenetic Toxicity
Authors: Y. Landkocz, C. Lepers, P.J. Martin, B. Fougère, F. Roy Saint-Georges, A. Verdin, F. Cazier, F. Ledoux, D. Courcot, F. Sichel, P. Gosset, P. Shirali, S. Billet
Abstract:
In 2013, the International Agency for Research on Cancer (IARC) classified air pollution and fine particles as carcinogenic to humans. Causal relationships exist between elevated ambient levels of airborne particles and increases in mortality and morbidity, including pulmonary diseases like lung cancer. However, due to the twofold complexity of the physicochemical properties of Particulate Matter (PM) and of tumor mechanistic processes, the mechanisms of action remain not fully elucidated. Furthermore, because of several properties common to air pollution PM and tobacco smoke, such as the same route of exposure and similar chemical composition, potential mechanisms of synergy could exist. Therefore, smoking could be an aggravating factor of particle toxicity. In order to identify some mechanisms of action of particles according to their size, two PM samples were collected in the urban-industrial area of Dunkerque: PM0.03-2.5 and PM0.33-2.5. The overall cytotoxicity of the fine particles was determined on human bronchial cells (BEAS-2B). The toxicological study then focused on the metabolic activation of the organic compounds coated onto PM and on some genetic and epigenetic changes induced in a co-culture model of BEAS-2B cells and alveolar macrophages isolated from bronchoalveolar lavages performed in smokers and non-smokers. The results showed (i) the contribution of the ultrafine fraction of atmospheric particles to genotoxic (e.g., DNA double-strand breaks) and epigenetic mechanisms (e.g., promoter methylation) involved in tumor processes, and (ii) the influence of smoking on the cellular response. Three main conclusions can be discussed. First, our results showed the ability of the particles to induce deleterious effects potentially involved in the initiation and promotion stages of carcinogenesis. The second conclusion is that smoking affects the nature of the induced genotoxic effects. Finally, the in vitro cell model developed, using bronchial epithelial cells and alveolar macrophages, can take into account quite realistically some of the cell interactions existing in the lung.Keywords: air pollution, fine and ultrafine particles, genotoxic and epigenetic alterations, smoking
Procedia PDF Downloads 347117 Low Energy Technology for Leachate Valorisation
Authors: Jesús M. Martín, Francisco Corona, Dolores Hidalgo
Abstract:
Landfills present long-term threats to soil, air, groundwater and surface water due to the formation of greenhouse gases (methane and carbon dioxide) and leachate from decomposing garbage. The composition of leachate differs from site to site and also within the landfill. Leachates alter with time (from weeks to years), since the landfilled waste is biologically highly active and its composition varies. Mainly, the composition of the leachate depends on factors such as the characteristics of the waste, the moisture content, climatic conditions, the degree of compaction and the age of the landfill. Therefore, the leachate composition cannot be generalized, and traditional treatment models should be adapted in each case. Although leachate composition is highly variable, what different leachates have in common is hazardous constituents and their potential eco-toxicological effects on human health and on terrestrial ecosystems. Since leachates have distinct compositions, each landfill or dumping site poses a different type of risk to its environment. Nevertheless, leachates always present high organic concentration, conductivity, heavy metals and ammonia nitrogen. Leachate could affect the current and future quality of water bodies due to uncontrolled infiltrations. Therefore, the control and treatment of leachate is one of the biggest issues in the design and management of urban solid waste treatment plants and landfills. This work presents a treatment model that will be carried out "in situ" using a novel, cost-effective technology that combines solar evaporation/condensation with forward osmosis. The plant is powered by renewable energies (solar energy, biomass and residual heat), which will minimize the carbon footprint of the process. The final effluent quality is very high, allowing reuse (preferred) or discharge into watercourses. In the particular case of this work, the final effluents will be reused for cleaning and gardening purposes. A minor semi-solid residual stream is also generated in the process. Due to its special composition (rich in metals and inorganic elements), this stream will be valorized in ceramic industries to improve the characteristics of the final products.Keywords: forward osmosis, landfills, leachate valorization, solar evaporation
Procedia PDF Downloads 202116 Ionic Liquids as Substrates for Metal-Organic Framework Synthesis
Authors: Julian Mehler, Marcus Fischer, Martin Hartmann, Peter S. Schulz
Abstract:
During the last two decades, the synthesis of metal-organic frameworks (MOFs) has gained ever-increasing attention. Based on their pore size and shape as well as host-guest interactions, they are of interest for numerous fields related to porous materials, like catalysis and gas separation. Usually, MOF synthesis takes place in an organic solvent between room temperature and approximately 220 °C, with mixtures of polyfunctional organic linker molecules and metal precursors as substrates. Reactions at temperatures above the boiling point of the solvent, i.e. solvothermal reactions, are run in autoclaves or sealed glass vessels under autogenous pressure. A relatively new approach for the synthesis of MOFs is the so-called ionothermal synthesis route. It applies an ionic liquid as a solvent, which can serve as a structure-directing template and/or a charge-compensating agent in the final coordination polymer structure. Furthermore, this method often allows for less harsh reaction conditions than the solvothermal route. Here, a variation of the ionothermal approach is reported, in which the ionic liquid also serves as the organic linker source. By using 1-ethyl-3-methylimidazolium terephthalates ([EMIM][Hbdc] and [EMIM]₂[bdc]), the one-step synthesis of MIL-53(Al)/boehmite composites with interesting features is possible. The resulting material is already formed at moderate temperatures (90-130 °C) and is stabilized in the usually unfavored ht-phase. Additionally, in contrast to already published procedures for MIL-53(Al) synthesis, no further activation at high temperatures is mandatory. A full characterization of this novel composite material is provided, including XRD, SS-NMR, elemental analysis and SEM as well as sorption measurements, and its interesting features are compared to MIL-53(Al) samples produced by the classical solvothermal route. Furthermore, the syntheses of the applied ionic liquids and salts are discussed. The influence of the degree of ionicity of the linker source [EMIM]x[H(2-x)bdc] on the crystal structure and the achievable synthesis temperature is investigated and gives insight into the role of the IL during synthesis. Aside from the synthesis of MIL-53 from EMIM terephthalates, the use of the phosphonium cation in this approach is discussed as well. Additionally, the employment of ILs in the preparation of other MOFs is presented briefly. This includes the ZIF-4 framework obtained from the respective imidazolate ILs and chiral camphorate-based frameworks from their imidazolium precursors.Keywords: ionic liquids, ionothermal synthesis, material synthesis, MIL-53, MOFs
Procedia PDF Downloads 208115 Waste Management Option for Bioplastics Alongside Conventional Plastics
Authors: Dan Akesson, Gauthaman Kuzhanthaivelu, Martin Bohlen, Sunil K. Ramamoorthy
Abstract:
Bioplastics can be defined as polymers derived partly or completely from biomass. Bioplastics can be biodegradable, such as polylactic acid (PLA) and polyhydroxyalkanoates (PHA), or non-biodegradable, such as biobased polyethylene (bio-PE), polypropylene (bio-PP) and polyethylene terephthalate (bio-PET). The usage of such bioplastics is expected to increase in the future due to newfound interest in sustainable materials. At the same time, these plastics become a new type of waste in the recycling stream. Most countries do not have a separate bioplastics collection that would allow them to be recycled or composted. After a brief introduction of bioplastics such as PLA in the UK, these plastics were once again replaced by conventional plastics in many establishments due to the lack of commercial composting. Recycling companies fear the contamination of conventional plastics in the recycling stream and state that they would have to invest in expensive new equipment to separate bioplastics and recycle them separately. This project studies what happens when bioplastics contaminate conventional plastics. Three commonly used conventional plastics were selected for this study: polyethylene (PE), polypropylene (PP) and polyethylene terephthalate (PET). In order to simulate contamination, two biopolymers, either polyhydroxyalkanoate (PHA) or thermoplastic starch (TPS), were blended with the conventional polymers. The amount of bioplastic in the conventional plastics was either 1% or 5%. The blended plastics were processed again to see the effect of degradation. The results showed that the tensile strength and the modulus of PE were almost unaffected by contamination, whereas the elongation was clearly reduced, indicating an increase in the brittleness of the plastic. Generally, it can be said that PP is slightly more sensitive to the contamination than PE. This can be explained by the fact that the melting point of PP is higher than that of PE and, as a consequence, the biopolymer degrades more quickly. However, the reduction of the tensile properties of PP is relatively modest. Impact strength is generally a more sensitive test method towards contamination. Again, PE is relatively unaffected by the contamination, but for PP there is a relatively large reduction of the impact properties already at 1% contamination. PET is a polyester and is, by its very nature, more sensitive to degradation than PE and PP. PET also has a much higher melting point than PE and PP, and as a consequence, the biopolymer degrades quickly at the processing temperature of PET. PET can tolerate 1% contamination without any reduction of the tensile strength. However, when the impact strength is examined, it is clear that already at 1% contamination there is a strong reduction of the properties. The thermal properties show the change in crystallinity. The blends were also characterized by SEM. A biphasic morphology can be seen, as the two polymers are not truly blendable, which also contributes to the reduced mechanical properties. The study shows that PE is relatively robust against contamination, while polypropylene (PP) is sensitive and polyethylene terephthalate (PET) can be quite sensitive towards contamination.Keywords: bioplastics, contamination, recycling, waste management
Procedia PDF Downloads 225114 ScRNA-Seq RNA Sequencing-Based Program-Polygenic Risk Scores Associated with Pancreatic Cancer Risks in the UK Biobank Cohort
Authors: Yelin Zhao, Xinxiu Li, Martin Smelik, Oleg Sysoev, Firoj Mahmud, Dina Mansour Aly, Mikael Benson
Abstract:
Background: Early diagnosis of pancreatic cancer is clinically challenging due to vague or absent symptoms and a lack of biomarkers. Polygenic risk scores (PRS) may provide a valuable tool to assess increased or decreased risk of PC. This study aimed to develop such PRS by filtering genetic variants identified by GWAS using transcriptional programs identified by single-cell RNA sequencing (scRNA-seq). Methods: ScRNA-seq data from 24 pancreatic ductal adenocarcinoma (PDAC) tumor samples and 11 normal pancreases were analyzed to identify differentially expressed genes (DEGs) in tumor and microenvironment cell types compared to healthy tissues. Pathway analysis showed that the DEGs were enriched for hundreds of significant pathways. These were clustered into 40 “programs” based on gene similarity, using the Jaccard index. Published genetic variants associated with PDAC were mapped to each program to generate program PRSs (pPRSs). These pPRSs, along with five previously published PRSs (PGS000083, PGS000725, PGS000663, PGS000159, and PGS002264), were evaluated in a European-origin population from the UK Biobank, consisting of 1,310 PDAC participants and 407,473 non-pancreatic cancer participants. Stepwise Cox regression analysis was performed to determine associations between the pPRSs and the development of PC, adjusting for sex and principal components of genetic ancestry. Results: The PDAC genetic variants were mapped to 23 programs and were used to generate pPRSs for these programs. Four distinct pPRSs (P1, P6, P11, and P16) and two published PRSs (PGS000663 and PGS002264) were significantly associated with an increased risk of developing PC. Among these, P6 exhibited the greatest hazard ratio (adjusted HR[95% CI] = 1.67[1.14-2.45], p = 0.008). In contrast, P10 and P4 were associated with a lower risk of developing PC (adjusted HR[95% CI] = 0.58[0.42-0.81], p = 0.001, and adjusted HR[95% CI] = 0.75[0.59-0.96], p = 0.019). By comparison, two of the five published PRSs exhibited an association with PDAC onset (PGS000663: adjusted HR[95% CI] = 1.24[1.14-1.35], p < 0.001; PGS002264: adjusted HR[95% CI] = 1.14[1.07-1.22], p < 0.001). Conclusion: Compared to published PRSs, scRNA-seq-based pPRSs may be used to assess not only increased but also decreased risk of PDAC.Keywords: cox regression, pancreatic cancer, polygenic risk score, scRNA-seq, UK biobank
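The sketch below illustrates, on fully simulated data, the kind of analysis this abstract describes in its final step: building a program-level PRS as a weighted sum of risk-allele dosages and relating it to disease onset with a Cox proportional hazards model (here via the lifelines package). The effect sizes, genotypes, follow-up times and covariates are all assumptions; this is not the study's pipeline.

```python
# Minimal sketch (not the study's pipeline): program PRS + Cox regression on simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n_people, n_variants = 5_000, 30

# Simulated risk-allele dosages (0/1/2) and published per-variant effect sizes (log odds ratios)
dosages = rng.integers(0, 3, size=(n_people, n_variants))
betas = rng.normal(0.0, 0.1, size=n_variants)
pprs = (dosages * betas).sum(axis=1)            # program PRS: effect-weighted allele count
pprs = (pprs - pprs.mean()) / pprs.std()        # standardise per SD, as is common practice

# Simulated follow-up time and event indicator influenced by the score
hazard = 0.02 * np.exp(0.3 * pprs)
time = rng.exponential(1.0 / hazard)
event = (time < 10).astype(int)                 # 10-year administrative censoring
df = pd.DataFrame({"time": np.minimum(time, 10), "event": event,
                   "pprs": pprs, "sex": rng.integers(0, 2, n_people)})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")   # extra columns (pprs, sex) enter as covariates
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```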
Procedia PDF Downloads 101113 Anajaa-Visual Substitution System: A Navigation Assistive Device for the Visually Impaired
Authors: Juan Pablo Botero Torres, Alba Avila, Luis Felipe Giraldo
Abstract:
Independent navigation and mobility through unknown spaces pose a challenge for the autonomy of visually impaired people (VIP), who have relied on traditional assistive tools like the white cane and trained dogs. However, emerging visually assistive technologies (VAT) have proposed several human-machine interfaces (HMIs) that could improve VIP’s ability for self-guidance. Here, we introduce the design and implementation of a visually assistive device, Anajaa – Visual Substitution System (AVSS). This system integrates ultrasonic sensors with custom electronics and computer vision models (convolutional neural networks) in order to achieve a robust system that acquires information about the surrounding space and transmits it to the user in an intuitive and efficient manner. AVSS consists of two modules: the sensing module and the actuation module, which are fitted to a chest mount and a belt and communicate via Bluetooth. The sensing module was designed for the acquisition and processing of proximity signals provided by an array of ultrasonic sensors. The distribution of these within the chest mount allows an accurate representation of the surrounding space, discretized into three levels of proximity ranging from 0 to 6 meters. Additionally, this module is fitted with an RGB-D camera used to detect potentially threatening obstacles, like staircases, using a convolutional neural network specifically trained for this purpose. The depth data is then used to estimate the distance between the stairs and the user. The information gathered by this module is sent to the actuation module, which creates an HMI by means of a 3x2 array of vibration motors that make up the tactile display and allow the system to deliver haptic feedback. The actuation module uses vibrational messages (tactones), changing both in amplitude and frequency, to deliver different awareness levels according to the proximity of the obstacle. This enables the system to deliver an intuitive interface. Both modules were tested under lab conditions, and the HMI was additionally tested with a focus group of VIP. The lab testing was conducted in order to establish the processing speed of the computer vision algorithms. This experimentation determined that the model can process 0.59 frames per second (FPS); this is considered an adequate processing speed, taking into account that the walking speed of VIP is 1.439 m/s. In order to test the HMI, we conducted a focus group composed of two females and two males between the ages of 35 and 65 years. The subject selection was aided by the Colombian Cooperative of Work and Services for the Sightless (COOTRASIN). We analyzed the learning process of the haptic messages throughout five experimentation sessions using two metrics: message discrimination and localization success. These correspond to the ability of the subjects to recognize different tactones and to locate them within the tactile display. Both were calculated as the mean across all subjects. Results show that the focus group achieved a message discrimination of 70% and a localization success of 80%, demonstrating how the proposed HMI leads to the appropriation and understanding of the feedback messages, enabling the users' awareness of their surrounding space.Keywords: computer vision on embedded systems, electronic travel aids, human-machine interface, haptic feedback, visual assistive technologies, vision substitution systems
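As a rough illustration of the haptic mapping described in this abstract, the sketch below discretises an ultrasonic distance reading (0-6 m) into three proximity levels and selects a tactone whose amplitude and pulsing frequency increase as the obstacle gets closer. The thresholds, duty cycles and pulse frequencies are illustrative assumptions and do not come from the device firmware.

```python
# Minimal sketch (not the AVSS firmware): distance -> tactone mapping for a 3x2 vibration display.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Tactone:
    duty_cycle: float   # vibration amplitude expressed as PWM duty cycle, 0.0-1.0
    pulse_hz: float     # on/off pulsing frequency of the motor, Hz

# Closer obstacle -> stronger, faster vibration (assumed thresholds)
LEVELS = [
    (2.0, Tactone(duty_cycle=1.0, pulse_hz=8.0)),   # 0-2 m: imminent
    (4.0, Tactone(duty_cycle=0.6, pulse_hz=4.0)),   # 2-4 m: near
    (6.0, Tactone(duty_cycle=0.3, pulse_hz=2.0)),   # 4-6 m: far
]

def tactone_for_distance(distance_m: float) -> Optional[Tactone]:
    """Return the tactone for a sensor reading, or None beyond 6 m (no feedback)."""
    for upper_bound, tactone in LEVELS:
        if distance_m <= upper_bound:
            return tactone
    return None

# One update cycle: six sensor readings map onto the 3x2 tactile display
readings = [1.2, 3.5, 6.8, 2.9, 5.1, 0.7]   # metres, one per display motor
for motor, distance in enumerate(readings):
    cmd = tactone_for_distance(distance)
    print(f"motor {motor}: {'off' if cmd is None else cmd}")
```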
Procedia PDF Downloads 81112 Enhancing Residential Architecture through Generative Design: Balancing Aesthetics, Legal Constraints, and Environmental Considerations
Authors: Milena Nanova, Radul Shishkov, Damyan Damov, Martin Georgiev
Abstract:
This research paper presents an in-depth exploration of the use of generative design in urban residential architecture, with a dual focus on aligning aesthetic values with legal and environmental constraints. The study aims to demonstrate how generative design methodologies can produce residential building designs that are not only legally compliant and environmentally conscious but also aesthetically compelling. At the core of our research is a specially developed generative design framework tailored to urban residential settings. This framework employs computational algorithms to produce diverse design solutions, meticulously balancing aesthetic appeal with practical considerations. By integrating site-specific features, urban legal restrictions, and environmental factors, our approach generates designs that resonate with the unique character of urban landscapes while adhering to regulatory frameworks. The paper places emphasis on the algorithmic implementation of logical constraints and the intricacies of residential architecture, exploring the potential of generative design to create visually engaging and contextually harmonious structures. This exploration also contains an analysis of how these designs align with legal building parameters, showcasing the potential for creative solutions within the confines of urban building regulations. Concurrently, our methodology integrates functional, economic, and environmental factors. We investigate how generative design can be utilized to optimize building performance with respect to these factors, aiming to achieve a symbiotic relationship between the built environment and its natural surroundings. Through a blend of theoretical research and practical case studies, this research highlights the multifaceted capabilities of generative design and demonstrates practical applications of our framework. Our findings illustrate the rich possibilities that arise from an algorithmic design approach in the context of a vibrant urban landscape. This study contributes an alternative perspective to residential architecture, suggesting that the future of urban development lies in embracing the complex interplay between computational design innovation, regulatory adherence, and environmental responsibility.Keywords: generative design, computational design, parametric design, algorithmic modeling
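To make the kind of generative loop described in this abstract concrete, the sketch below samples candidate building envelopes, rejects those that violate simple urban regulations (height limit, setbacks, floor-area ratio) and ranks the remainder by a toy objective that trades usable floor area against envelope surface as an energy proxy. All limits, site dimensions and weights are illustrative assumptions and do not reflect the authors' framework.

```python
# Minimal sketch (not the authors' framework): sample, filter by code, rank by objective.
import random

SITE_W, SITE_D = 30.0, 40.0      # buildable plot, metres (assumed)
MAX_HEIGHT = 21.0                # legal height limit, metres (assumed)
MIN_SETBACK = 3.0                # required setback on every side, metres (assumed)
MAX_FAR = 2.5                    # floor-area ratio limit (assumed)

def legal(w, d, floors, storey_h=3.0):
    """Check height, setback and floor-area-ratio rules for one candidate massing."""
    footprint_ok = w <= SITE_W - 2 * MIN_SETBACK and d <= SITE_D - 2 * MIN_SETBACK
    height_ok = floors * storey_h <= MAX_HEIGHT
    far_ok = (w * d * floors) / (SITE_W * SITE_D) <= MAX_FAR
    return footprint_ok and height_ok and far_ok

def score(w, d, floors):
    """Toy objective: reward usable floor area, penalise envelope surface (energy proxy)."""
    floor_area = w * d * floors
    envelope = 2 * (w + d) * floors * 3.0 + w * d
    return floor_area - 0.3 * envelope

random.seed(1)
candidates = [(random.uniform(8, 26), random.uniform(8, 36), random.randint(2, 8))
              for _ in range(500)]
feasible = [c for c in candidates if legal(*c)]
best = max(feasible, key=lambda c: score(*c))
print(f"{len(feasible)} of {len(candidates)} candidates are code-compliant")
print(f"best massing: {best[0]:.1f} m x {best[1]:.1f} m x {best[2]} floors")
```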
Procedia PDF Downloads 65