Search results for: N. V. David
99 Optimization of Operational Water Quality Parameters in a Drinking Water Distribution System Using Response Surface Methodology
Authors: Sina Moradi, Christopher W. K. Chow, John Van Leeuwen, David Cook, Mary Drikas, Patrick Hayde, Rose Amal
Abstract:
Chloramine is commonly used as a disinfectant in drinking water distribution systems (DWDSs), particularly in Australia and the USA. Maintaining a chloramine residual throughout the DWDS is important in ensuring that microbiologically safe water is supplied at the customer’s tap. To simulate how chloramine behaves as it moves through the distribution system, a water quality network model (WQNM) can be applied. In this work, the WQNM was based on mono-chloramine decomposition reactions, which enabled prediction of the mono-chloramine residual at different locations through a DWDS in Australia, using the Bentley commercial hydraulic package (WaterGEMS). The accuracy of WQNM predictions is influenced by a number of water quality parameters. Optimizing these parameters so that predictions match measured data from a real DWDS as closely as possible would reduce costs as well as the consumption of valuable resources such as energy and materials. In this work, the optimum operating conditions of the water quality parameters (i.e., temperature, pH, and initial mono-chloramine concentration) that maximize the accuracy of mono-chloramine residual predictions for two water supply scenarios across an entire network were determined using response surface methodology (RSM). To obtain feasible and economical water quality parameters for the highest model predictability, Design Expert 8.0 software (Stat-Ease, Inc.) was applied to optimize the three independent water quality parameters. High and low levels of the water quality parameters were imposed as explicit constraints in order to avoid extrapolation. The independent variables were pH, temperature, and initial mono-chloramine concentration. The lower and upper limits of each variable for the two water supply scenarios were defined, and the experimental levels for each variable were selected based on the actual conditions in the studied DWDS.
It was found that at a pH of 7.75, a temperature of 34.16 °C, and an initial mono-chloramine concentration of 3.89 mg/L during peak water supply patterns, the root mean square error (RMSE) of the WQNM for the whole network is minimized at 0.189; the optimum conditions for averaged water supply occurred at a pH of 7.71, a temperature of 18.12 °C, and an initial mono-chloramine concentration of 4.60 mg/L. The proposed methodology for predicting the mono-chloramine residual has great potential to help water treatment plant operators accurately estimate the mono-chloramine residual through a water distribution network. Additional studies from other water distribution systems are warranted to confirm the applicability of the proposed methodology for other water samples.
Keywords: chloramine decay, modelling, response surface methodology, water quality parameters
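The RSM step described above can be sketched in a few lines: fit a quadratic response surface to RMSE values observed at design points inside the parameter bounds, then minimize the fitted surface subject to those bounds. Everything below (the synthetic RMSE function, the design points, and the bounds) is hypothetical; in the actual study the response values come from running the WQNM at each design point.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for the WQNM: RMSE as a function of (pH, T, C0).
def simulated_rmse(x):
    ph, temp, c0 = x
    return 0.19 + 0.02 * (ph - 7.75) ** 2 + 0.0005 * (temp - 34.0) ** 2 \
           + 0.01 * (c0 - 3.9) ** 2

def quad_features(X):
    # Full quadratic model: intercept, linear, interaction, and square terms.
    ph, t, c = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), ph, t, c,
                            ph * t, ph * c, t * c, ph ** 2, t ** 2, c ** 2])

# Explicit high/low constraints on each variable (illustrative values).
bounds = [(7.0, 8.5), (15.0, 40.0), (2.0, 5.0)]
rng = np.random.default_rng(0)
X = rng.uniform([b[0] for b in bounds], [b[1] for b in bounds], size=(30, 3))
y = np.array([simulated_rmse(x) for x in X])

# Least-squares fit of the response surface, then bounded minimization.
beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
surface = lambda x: (quad_features(np.atleast_2d(x)) @ beta)[0]
res = minimize(surface, x0=[7.5, 25.0, 3.5], bounds=bounds)
print(res.x)  # optimum stays inside the stated bounds
```

Because the surrogate here is exactly quadratic, the fitted surface recovers it and the optimizer lands on the true minimum; with real WQNM runs the surface is only an approximation, which is why the design points must span the operating ranges.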
98 Evolutionary Advantages of Loneliness with an Agent-Based Model
Authors: David Gottlieb, Jason Yoder
Abstract:
The feeling of loneliness is not uncommon in modern society, and yet there is a fundamental lack of understanding of its origins and purpose in nature. One interpretation of loneliness is that it is a subjective experience that punishes a lack of social behavior, and thus its emergence in human evolution is seemingly tied to the survival of early human tribes. Still, a common counterintuitive response to loneliness is a state of hypervigilance resulting in social withdrawal, which may appear maladaptive in modern society. So far, no computational model of the effect of loneliness during evolution exists; however, agent-based models (ABMs) can be used to investigate social behavior, and applying evolution to agents’ behaviors can demonstrate selective advantages for particular behaviors. We propose an ABM where each agent contains four social behaviors and one goal-seeking behavior, letting evolution select the best behavioral patterns for resource allocation. In our paper, we use an algorithm similar to the boid model to guide the behavior of agents, but expand the set of rules that govern their behavior. While we use cohesion, separation, and alignment for simple social movement, our expanded model adds goal-oriented behavior, inspired by particle swarm optimization, such that agents move relative to their personal best position. Since agents are given the ability to form connections by interacting with each other, our final behavior guides agents’ movement toward their social connections. Finally, we introduce a mechanism to represent a state of loneliness, which engages when an agent's perceived social involvement does not meet its expected social involvement. This enables us to investigate a minimal model of loneliness, and using evolution we attempt to elucidate its value in human survival. Agents are placed in an environment in which they must acquire resources, as their fitness is based on the total resources collected.
With these rules in place, we are able to run evolution under various conditions, including resource-rich environments and conditions in which disease is present. Our simulations indicate that there is strong selection pressure for social behavior under circumstances where there is a clear discrepancy between initial resource locations, and against social behavior when disease is present, mirroring hypervigilance. This not only provides an explanation for the emergence of loneliness but also reflects the diversity of responses to loneliness in the real world. In addition, there is evidence of a richness of social behavior when loneliness was present. By introducing just two resource locations, we observed a divergence in social motivation after agents became lonely, where one agent learned to move toward the other, which was in a better resource position. The results and ongoing work from this project show that it is possible to glean insight into the evolutionary advantages of even simple mechanisms of loneliness. The model we developed has produced unexpected results and has led to more questions, such as the impact loneliness would have at a larger scale, or the effect of creating a set of rules governing interaction beyond adjacency.
Keywords: agent-based, behavior, evolution, loneliness, social
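A minimal sketch of the agent update rule described above, under illustrative assumptions (fixed weights, a static personal-best position, and neighbor count as the proxy for perceived social involvement; in the actual model, evolution tunes the behavioral weights): cohesion, separation, and alignment from the boid model, a PSO-style pull toward a personal best position, and a loneliness state that boosts social steering when perceived involvement falls below the expected level.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20
pos = rng.uniform(0, 10, (N, 2))   # agent positions
vel = rng.normal(0, 0.1, (N, 2))   # agent velocities
personal_best = pos.copy()         # PSO-style best resource position (static here)
EXPECTED_SOCIAL = 3                # hypothetical expected-involvement parameter

def step(pos, vel, radius=2.0):
    new_vel = vel.copy()
    for i in range(N):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbr = (d > 0) & (d < radius)
        # Loneliness engages when perceived involvement < expected involvement;
        # here it simply boosts the weight on steering.
        w = 2.0 if nbr.sum() < EXPECTED_SOCIAL else 1.0
        steer = np.zeros(2)
        if nbr.any():
            steer += pos[nbr].mean(0) - pos[i]                         # cohesion
            steer -= ((pos[nbr] - pos[i]) / d[nbr, None] ** 2).sum(0)  # separation
            steer += vel[nbr].mean(0) - vel[i]                         # alignment
        steer += personal_best[i] - pos[i]          # goal-seeking (PSO-inspired)
        new_vel[i] = np.clip(0.9 * vel[i] + 0.1 * w * steer, -1.0, 1.0)
    return pos + new_vel, new_vel

for _ in range(50):
    pos, vel = step(pos, vel)
```

The speed clip is a standard boid-style stabilizer, added here so the separation term cannot blow up when two agents nearly coincide.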
97 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data
Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau
Abstract:
Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium, visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within each contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consists of three Python scripts, each of which can be easily accessed through a Python Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We then compared our automated pipeline outputs with the outputs of manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that our automated pipeline efficiently pinpoints neuronal cell body locations and contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using binary thresholding and grayscale image conversion to allow computer vision to better distinguish between cells and non-cells.
Its results were also comparable to manually analyzed results, but with significantly reduced acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline’s cell body and contour detection in order to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer vision-based analysis of calcium imaging recordings of neuronal cell bodies in neuronal cell cultures. Our next goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
Keywords: calcium imaging, computer vision, neural activity, neural networks
96 Violent, Psychological, Sexual and Abuse-Related Emergency Department Usage amongst Pediatric Victims of Physical Assault and Gun Violence: A Case-Control Study
Authors: Mary Elizabeth Bernardin, Margie Batek, Joseph Moen, David Schnadower
Abstract:
Background: Injuries due to interpersonal violence are a common reason for emergency department (ED) visits amongst the American pediatric population. Gun violence, in particular, is associated with high morbidity and mortality as well as financial costs. Patterns of pediatric ED usage may be an indicator of risk for future violence, but very little data on the topic exist. Objective: The aims of this study were to assess the frequency of ED usage for previous interpersonal violence, mental/behavioral issues, sexual/reproductive issues, and concerns for abuse in youths presenting to EDs with physical assault injuries (PAIs) compared to firearm injuries (FIs). Methods: In this retrospective case-control study, ED charts of children ages 8-19 years who presented with injuries due to interpersonal violent encounters from 2014-2017 were reviewed. Data were collected regarding all previous ED visits for injuries due to interpersonal violence (including physical assaults and firearm injuries), mental/behavioral health visits (including depression, suicidal ideation, suicide attempt, homicidal ideation, and violent behavior), sexual/reproductive health visits (including sexually transmitted infections and pregnancy-related issues), and concerns for abuse (including physical abuse or domestic violence, neglect, sexual abuse, sexual assault, and intimate partner violence). Logistic regression was used to identify predictors of gun violence based on previous ED visits amongst physical assault-injured versus firearm-injured youths. Results: A total of 407 patients presenting to the ED for an interpersonal violent encounter were analyzed, 251 (62%) of whom presented with physical assault injuries (PAIs) and 156 (38%) with firearm injuries (FIs). The majority of both PAI and FI patients had no previous history of ED visits for violence, mental/behavioral health, sexual/reproductive health, or concern for abuse (60.8% PAI, 76.3% FI).
19.2% of PAI and 13.5% of FI youths had previous ED visits for physical assault injuries (OR 0.68, P=0.24, 95% CI 0.36 to 1.29). 1.6% of PAI and 3.2% of FI youths had a history of ED visits for previous firearm injuries (OR 3.6, P=0.34, 95% CI 0.04 to 2.95). 10% of PAI and 3.8% of FI youths had previous ED visits for mental/behavioral health issues (OR 0.91, P=0.80, 95% CI 0.43 to 1.93). 10% of PAI and 2.6% of FI youths had previous ED visits due to concerns for abuse (OR 0.76, P=0.55, 95% CI 0.31 to 1.86). Conclusions: There are no statistically significant differences between physical assault-injured and firearm-injured youths in terms of ED usage for previous violent injuries, mental/behavioral health visits, sexual/reproductive health visits, or concerns for abuse. However, violently injured youths in this study had more than twice the number of previous ED visits for physical assaults and mental health issues than previous literature indicates. Data comparing ED usage of victims of interpersonal violence to nonviolent ED patients are needed, but this study supports the notion that EDs may be a useful setting for identifying youths most at risk for future violence and enrolling them in interventions.
Keywords: child abuse, emergency department usage, pediatric gun violence, pediatric interpersonal violence, pediatric mental health, pediatric reproductive health
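Odds ratios of the kind reported above can be computed from a 2x2 exposure table with a short helper (point estimate plus Wald confidence interval). The counts below are hypothetical illustrations, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI for a 2x2 table:
       exposed cases=a, exposed controls=b,
       unexposed cases=c, unexposed controls=d."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: prior assault-related visits among FI vs PAI youths.
print(odds_ratio_ci(21, 135, 48, 203))
```

Note that a Wald interval always contains the point estimate, which is one quick sanity check when transcribing OR/CI triplets into a manuscript.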
95 Metacognitive Processing in Early Readers: The Role of Metacognition in Monitoring Linguistic and Non-Linguistic Performance and Regulating Students' Learning
Authors: Ioanna Taouki, Marie Lallier, David Soto
Abstract:
Metacognition refers to the capacity to reflect upon our own cognitive processes. Although there is an ongoing discussion in the literature on the role of metacognition in learning and academic achievement, little is known about its neurodevelopmental trajectories in early childhood, when children begin to receive formal education in reading. Here, we evaluate the metacognitive ability, estimated under a recently developed Signal Detection Theory model, of a cohort of children aged between 6 and 7 (N=60), who performed three two-alternative forced-choice tasks (two linguistic: a lexical decision task and a visual attention span task; one non-linguistic: an emotion recognition task) including trial-by-trial confidence judgements. Our study had three aims. First, we investigated how metacognitive ability (i.e., how well confidence ratings track accuracy in the task) relates to performance in general standardized tasks related to students' reading and general cognitive abilities, using Spearman's and Bayesian correlation analyses. Second, we assessed whether young children recruit common mechanisms supporting metacognition across the different task domains or whether there is evidence for domain-specific metacognition at this early stage of development. This was done by examining correlations in metacognitive measures across different task domains and evaluating cross-task covariance by applying a hierarchical Bayesian model. Third, using robust linear regression and Bayesian regression models, we assessed whether metacognitive ability at this early stage is related to the longitudinal learning of children in a linguistic and a non-linguistic task. Notably, we did not observe any association between students’ reading skills and metacognitive processing at this early stage of reading acquisition.
Some evidence consistent with domain-general metacognition was found, with significant positive correlations in metacognitive efficiency between the lexical and emotion recognition tasks and substantial covariance indicated by the Bayesian model. However, no reliable correlations were found between metacognitive performance in the visual attention span task and the remaining tasks. Remarkably, metacognitive ability significantly predicted children's learning in linguistic and non-linguistic domains a year later. These results suggest that metacognitive skill may be dissociated to some extent from general (i.e., language and attention) abilities, and they further stress the importance of creating educational programs that foster students’ metacognitive ability as a tool for long-term learning. More research is crucial to understand whether such programs can enhance metacognitive ability as a transferable skill across distinct domains or whether individual domains should be targeted separately.
Keywords: confidence ratings, development, metacognitive efficiency, reading acquisition
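One simple, model-free way to quantify "how well confidence ratings track accuracy" is the type-2 AUROC: the probability that a randomly chosen correct trial received a higher confidence rating than a randomly chosen incorrect one. To be clear, the study itself uses a Signal Detection Theory model of metacognitive efficiency, not this metric; the sketch below, with made-up ratings, only illustrates the underlying idea of confidence-accuracy tracking.

```python
import numpy as np

def type2_auroc(confidence, correct):
    """P(confidence on a random correct trial > confidence on a random
    incorrect trial), with ties counted as 0.5. A value of 0.5 indicates
    no metacognitive sensitivity; 1.0 indicates perfect tracking."""
    conf = np.asarray(confidence, dtype=float)
    acc = np.asarray(correct)
    conf_hit = conf[acc == 1]
    conf_miss = conf[acc == 0]
    greater = (conf_hit[:, None] > conf_miss[None, :]).mean()
    ties = (conf_hit[:, None] == conf_miss[None, :]).mean()
    return greater + 0.5 * ties

# Hypothetical trial-by-trial data: 1-4 confidence ratings and accuracy.
conf = [4, 3, 4, 2, 1, 2, 3, 1]
acc = [1, 1, 1, 0, 0, 1, 0, 0]
print(type2_auroc(conf, acc))  # -> 0.875
```

This kind of index is attractive for developmental samples because it makes no distributional assumptions, though unlike the SDT-based measures it does not correct for first-order task performance.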
94 Honneth, Feenberg, and the Redemption of Critical Theory of Technology
Authors: David Schafer
Abstract:
Critical Theory is in sore need of a workable account of technology. It had one in the writings of Herbert Marcuse, or so it seemed until Jürgen Habermas mounted a critique in 'Technology and Science as Ideology' (Habermas, 1970) that decisively put it away. Ever since, Marcuse’s work has been regarded as outdated, a 'philosophy of consciousness' no longer seriously tenable. But with Marcuse’s view has gone the important insight that technology is no norm-free system (as Habermas portrays it) but can be laden with social bias. Andrew Feenberg is among the few serious scholars who have perceived this problem in post-Habermasian critical theory, and he has sought to revive a basically Marcusean account of technology. On his view, while the so-called 'technical elements' that physically make up technologies are neutral with regard to social interests, there is a sense in which we may speak of a normative grammar or 'technical code' built into technology that can be socially biased in favor of certain groups over others (Feenberg, 2002). According to Feenberg, perspectives on technology are reified when they consider technologies only in terms of their technical elements, to the neglect of their technical codes. Nevertheless, Feenberg’s account fails to explain what is normatively problematic with such reified views of technology. His plausible claim that they represent false perspectives on technology by itself does not explain how such views may be oppressive, even though Feenberg surely wants to be doing that stronger level of normative theorizing. Perceiving this deficit in his own account of reification, he tries to adopt Habermas’s version of systems theory to ground his own critical theory of technology (Feenberg, 1999). But this is a curious move in light of Feenberg’s own legitimate critiques of Habermas’s portrayals of technology as reified or 'norm-free.' This paper argues that a better foundation may be found in Axel Honneth’s recent text, Freedom’s Right (Honneth, 2014).
Though Honneth says little explicitly about technology there, he offers an implicit account of reification formulated in opposition to Habermas’s systems-theoretic approach. On this 'normative functionalist' account of reification, social spheres are reified when participants prioritize individualist ideals of freedom (moral and legal freedom) to the neglect of an intersubjective form of freedom-through-recognition that Honneth calls 'social freedom.' Such misprioritization is ultimately problematic because it is unsustainable: individual freedom is philosophically and institutionally dependent upon social freedom. The main difficulty in adopting Honneth’s social theory for the purposes of a theory of technology, however, is that the notion of social freedom is predicable only of social institutions, whereas it appears difficult to conceive of technology as an institution. Nevertheless, in light of Feenberg’s work, the idea that technology includes within itself a normative grammar (technical code) takes on much plausibility. To the extent that this normative grammar may be understood through the category of social freedom, Honneth’s dialectical account of the relationship between individual and social forms of freedom provides a more solid basis from which to ground the normative claims of Feenberg’s sociological account of technology than Habermas’s systems theory.
Keywords: Habermas, Honneth, technology, Feenberg
93 Development of an Automatic Computational Machine Learning Pipeline to Process Confocal Fluorescence Images for Virtual Cell Generation
Authors: Miguel Contreras, David Long, Will Bachman
Abstract:
Background: Microscopy plays a central role in cell and developmental biology. In particular, fluorescence microscopy can be used to visualize specific cellular components and subsequently quantify their morphology through the development of virtual-cell models for studying the effects of mechanical forces on cells. However, these imaging experiments present challenges that can make it difficult to quantify cell morphology: inconsistent results, time-consuming and potentially costly protocols, and limitations on the number of labels due to spectral overlap. To address these challenges, the objective of this project is to develop an automatic computational machine learning pipeline to predict cellular component morphology for virtual-cell generation based on fluorescence cell membrane confocal z-stacks. Methods: Registered confocal z-stacks of the nuclei and cell membranes of endothelial cells, consisting of 20 images each, were obtained from fluorescence confocal microscopy and normalized through a software pipeline so that each image had a mean pixel intensity value of 0.5. An open-source machine learning algorithm, originally developed to predict fluorescence labels on unlabeled transmitted light microscopy cell images, was trained using this set of normalized z-stacks on a single-CPU machine. Through transfer learning, the algorithm used knowledge acquired from its previous training sessions to learn the new task. Once trained, the algorithm was used to predict the morphology of nuclei using normalized cell membrane fluorescence images as input. Predictions were compared to the ground truth fluorescence nuclei images. Results: After one week of training, using one cell membrane z-stack (20 images) and the corresponding nuclei label, results showed qualitatively good predictions on the training set. The algorithm was able to accurately predict nuclei locations as well as shapes when fed only fluorescence membrane images.
Similar training sessions with improved membrane image quality (clear lining and shape of the membrane, clearly showing the boundaries of each cell) proportionally improved nuclei predictions, reducing errors relative to ground truth. Discussion: These results show the potential of pre-trained machine learning algorithms to predict cell morphology using relatively small amounts of data and training time, eliminating the need to use multiple labels in immunofluorescence experiments. With further training, the algorithm is expected to predict different labels (e.g., focal-adhesion sites, cytoskeleton), which can be added to the automatic machine learning pipeline for direct input into Principal Component Analysis (PCA) for the generation of virtual-cell mechanical models.
Keywords: cell morphology prediction, computational machine learning, fluorescence microscopy, virtual-cell models
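The normalization step mentioned in the Methods (each image scaled to a mean pixel intensity of 0.5) can be sketched as follows. Multiplicative scaling is assumed here, and the z-stack is synthetic; the actual pipeline's normalization recipe may differ in detail.

```python
import numpy as np

def normalize_stack(stack):
    """Scale each image in a z-stack so its mean pixel intensity is 0.5,
    mirroring the pre-processing step described above (assumes nonzero
    mean intensity per image)."""
    stack = stack.astype(np.float64)
    means = stack.mean(axis=(1, 2), keepdims=True)
    return 0.5 * stack / means

# Hypothetical 20-image z-stack of 64x64 pixels with varying brightness.
rng = np.random.default_rng(0)
zstack = rng.uniform(0.0, 255.0, size=(20, 64, 64))
norm = normalize_stack(zstack)
print(norm.mean(axis=(1, 2)))  # each entry is 0.5
```

Per-image normalization of this kind removes brightness drift across the stack so that the network learns morphology rather than illumination differences.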
92 The Confluence between Autism Spectrum Disorder and the Schizoid Personality
Authors: Murray David Schane
Abstract:
Through years of clinical encounters with patients with autism spectrum disorders and those with a schizoid personality, the many defining diagnostic features shared between these conditions have been explored, current neurobiological differences have been reviewed, and critical, distinct treatment strategies for each have been devised. The paper compares and contrasts the apparent similarities between autism spectrum disorders and the schizoid personality as found in these DSM descriptive categories: restricted range of social-emotional reciprocity; poor non-verbal communicative behavior in social interactions; difficulty developing and maintaining relationships; detachment from social relationships; lack of desire for or enjoyment of close relationships; and preference for solitary activities. In this paper, autism, fundamentally a communicative disorder, is revealed to present clinically as a pervasive aversive response to efforts to engage with or be engaged by others. Autists with the Asperger presentation typically have language but have difficulty understanding humor, irony, sarcasm, metaphoric speech, and even narratives about social relationships. They also tend to seek sameness, possibly to avoid problems of social interpretation. Repetitive behaviors engage many autists as a screen against ambient noise, social activity, and challenging interactions. Also in this paper, the schizoid personality is revealed as a pattern of social avoidance, self-sufficiency, and apparent indifference to others that serves as a complex psychological defense against a deep, long-abiding fear of appropriation and perverse manipulation. Neither genetic nor MRI studies have yet located the explanatory data that identify the cause or the neurobiology of autism. Similarly, studies of the schizoid have yet to group that condition with those found in schizophrenia.
Through presentations of clinical examples, the treatment of autists of the Asperger type is shown to address the autist’s extreme social aversion, which also precludes the experience of empathy. Autists will be revealed as forming social attachments but without the capacity to interact with mutual concern. Empathy will be shown to be teachable and, as social avoidance relents, autists can come to recognize and acknowledge the meaning and signs of empathic needs. Treatment of schizoids will be shown to revolve around joining empathically with the schizoid’s apprehensions about interpersonal, interactive proximity. Models of both autism and schizoid personality traits have yet to be replicated in animals, thereby eliminating the role of translational research in providing the kinds of clues to behavioral patterns that can be related to genetic, epigenetic, and neurobiological measures. But as these clinical examples will attest, treatment strategies have significant impact.
Keywords: autism spectrum, schizoid personality traits, neurobiological implications, critical diagnostic distinctions
91 Assessment of Surface Water Quality near Landfill Sites Using a Water Pollution Index
Authors: Alejandro Cittadino, David Allende
Abstract:
Landfilling of municipal solid waste is a common waste management practice in Argentina, as in many parts of the world. There is extensive scientific literature on the potential negative effects of landfill leachates on the environment, so it is necessary to be rigorous with control and monitoring systems. Due to the specific municipal solid waste composition in Argentina, local landfill leachates contain large amounts of organic matter (biodegradable, but also refractory to biodegradation), as well as ammonia-nitrogen, small traces of some heavy metals, and inorganic salts. In order to investigate the surface water quality in the Reconquista river adjacent to the Norte III landfill, water samples both upstream and downstream of the dumpsite are collected quarterly and analyzed for 43 parameters, including organic matter, heavy metals, and inorganic salts, as required by local standards. The objective of this study is to apply a water quality index that considers the leachate characteristics in order to determine the quality status of the watercourse where it passes the landfill. The water pollution index method has been widely used in water quality assessments, particularly of rivers, and it has played an increasingly important role in water resource management, since it provides a number simple enough for the public to understand that states the overall water quality at a certain location and time. The chosen water quality index (ICA) is based on the values of six parameters: dissolved oxygen (in mg/l and percent saturation), temperature, biochemical oxygen demand (BOD5), ammonia-nitrogen, and chloride (Cl-) concentration. The ICA index was determined both upstream and downstream of the landfill along the Reconquista river, with a rating scale between 0 (very poor water quality) and 10 (excellent water quality).
The monitoring results indicated that the water quality was unaffected by possible leachate runoff, since the index scores upstream and downstream were ranked in the same category, although in general most of the samples were classified as having poor water quality according to the index’s scale. The annual averaged ICA index scores (computed quarterly) were 4.9, 3.9, 4.4, and 5.0 upstream and 3.9, 5.0, 5.1, and 5.0 downstream during the study period between 2014 and 2017. Additionally, the water quality seemed to exhibit distinct seasonal variations, probably due to annual precipitation patterns in the study area. The ICA water quality index appears to be appropriate for evaluating landfill impacts, since it accounts mainly for organic pollution and inorganic salts and reflects the absence of heavy metals in the local leachate composition; however, the inclusion of other parameters could be more decisive in discerning the stream reaches affected by landfill activities. Future work may consider adding other parameters to the index, such as total organic carbon (TOC) and total suspended solids (TSS), since they are present in the leachate in high concentrations.
Keywords: landfill, leachate, surface water, water quality index
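The abstract does not give the ICA aggregation formula, so the sketch below only illustrates the general shape of a weighted-average water quality index on a 0-10 scale: each measured parameter is first mapped to a 0-10 subindex, and the subindices are combined with weights summing to one. The subindex scores and weights shown are purely hypothetical, not the ICA's.

```python
def wqi(subindices, weights):
    """Generic weighted-average water quality index on a 0-10 scale.
    `subindices` are per-parameter 0-10 ratings (already mapped from raw
    measurements via rating curves); `weights` must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(subindices, weights))

# Hypothetical 0-10 ratings for the six ICA parameters: DO (mg/l),
# DO (% saturation), temperature, BOD5, ammonia-nitrogen, chloride --
# with illustrative weights.
scores = [5.0, 5.5, 7.0, 4.0, 3.5, 6.0]
weights = [0.2, 0.15, 0.1, 0.25, 0.2, 0.1]
print(wqi(scores, weights))  # -> 4.825, "poor" on a 0-10 scale
```

Comparing such a score upstream and downstream of a discharge point, as done in the study, reduces a 43-parameter monitoring campaign to a single category comparison the public can follow.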
90 Characterization of Thin Woven Composites Used in Printed Circuit Boards by Combining Numerical and Experimental Approaches
Authors: Gautier Girard, Marion Martiny, Sebastien Mercier, Mohamad Jrad, Mohamed-Slim Bahi, Laurent Bodin, Francois Lechleiter, David Nevo, Sophie Dareys
Abstract:
Reliability of electronic devices has always been of the highest interest for Aero-MIL and space applications. In any electronic device, the Printed Circuit Board (PCB), providing interconnection between components, is key to reliability. During the last decades, PCB technologies evolved to sustain and/or fulfill increased original equipment manufacturer (OEM) requirements and specifications: higher densities and better performance, faster time to market and longer lifetime, newer materials and mixed buildups. From the very beginning of the PCB industry until recently, qualification, experiments, and trial and error were the most popular methods to assess system (PCB) reliability. Nowadays, OEMs, PCB manufacturers, and scientists are working together in a close relationship in order to develop predictive models for PCB reliability and lifetime. To achieve that goal, it is fundamental to characterize precisely the base materials (laminates, electrolytic copper, ...) in order to understand failure mechanisms and simulate PCB aging under environmental constraints, for example by means of the finite element method. The laminates are woven composites and thus have an orthotropic behaviour. The in-plane properties can be measured by combining classical uniaxial testing and digital image correlation. Nevertheless, the out-of-plane properties cannot be evaluated in this way due to the thickness of the laminate (a few hundred microns). It has to be noted that knowledge of the out-of-plane properties is fundamental to investigate the lifetime of high density printed circuit boards. A homogenization method combining analytical and numerical approaches has been developed in order to obtain the complete elastic orthotropic behaviour of a woven composite from its precise 3D internal structure and its experimentally measured in-plane elastic properties. Since the mechanical properties of the resin surrounding the fibres are unknown, an inverse method is proposed to estimate them.
The methodology has been applied to one laminate used in hyperfrequency space applications in order to obtain its elastic orthotropic behaviour at different temperatures in the range [-55°C; +125°C]. Next, numerical simulations of a plated through hole in a double-sided PCB were performed. Results show the major importance of the out-of-plane properties, and of their temperature dependency, for the lifetime of a printed circuit board. Acknowledgements: The support of the French ANR agency through the Labcom program ANR-14-LAB7-0003-01 and the support of CNES, Thales Alenia Space, and Cimulec are acknowledged.
Keywords: homogenization, orthotropic behaviour, printed circuit board, woven composites
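The inverse method mentioned above can be illustrated with a toy surrogate: given a measured in-plane laminate modulus and assumed fibre properties, solve for the resin modulus that reproduces the measurement. The inverse rule-of-mixtures surrogate and all numbers below are hypothetical; the actual work inverts a full analytical/numerical homogenization of the 3D woven structure, not this one-line model.

```python
from scipy.optimize import brentq

E_FIBRE = 73.0      # GPa, hypothetical glass-fibre modulus
V_FIBRE = 0.45      # hypothetical fibre volume fraction
E_MEASURED = 24.0   # GPa, hypothetical measured in-plane laminate modulus

def predicted_modulus(e_resin):
    # Toy stand-in for the homogenization model: inverse rule of mixtures
    # (series mixing) as a crude surrogate for the composite response.
    return 1.0 / (V_FIBRE / E_FIBRE + (1.0 - V_FIBRE) / e_resin)

# Inverse step: root-find the resin modulus that reproduces the measurement.
e_resin = brentq(lambda e: predicted_modulus(e) - E_MEASURED, 0.1, 50.0)
print(e_resin)  # resin modulus (GPa) consistent with the measurement
```

The same pattern (wrap the forward model in a residual, root-find or least-squares on the unknown constituent property) carries over when the forward model is a finite element homogenization instead of a closed-form mixing rule.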
Procedia PDF Downloads 205
89 Development of Bilayer Coating System for Mitigating Corrosion of Offshore Wind Turbines
Authors: Adamantini Loukodimou, David Weston, Shiladitya Paul
Abstract:
Offshore structures are subjected to harsh environments. It is well documented that carbon steel needs protection from corrosion. The combined effect of UV radiation, seawater splash, and fluctuating temperatures diminishes the integrity of these structures. In addition, the possibility of damage caused by floating ice, seaborne debris, and maintenance boats makes them even more vulnerable. Their inspection and maintenance far out at sea are difficult, risky, and expensive. The best-known method of mitigating corrosion of offshore structures is cathodic protection. There are several zones in an offshore wind turbine. In the atmospheric zone, due to the lack of a continuous electrolyte (seawater) layer between the structure and the anode, this method proves inefficient. Thus, the use of protective coatings becomes indispensable. This research focuses on the atmospheric zone. The conversion of a commercially available, conventional paint (epoxy) system into an autonomous self-healing paint system via the addition of suitably encapsulated healing agents and a catalyst is investigated in this work. These coating systems, which can self-heal when damaged, can provide a cost-effective engineering solution to corrosion and related problems. When damage to the paint coating occurs, the microcapsules are designed to rupture and release the self-healing liquid (monomer), which will then react in the presence of the catalyst and solidify (polymerization), resulting in healing. The catalyst should be compatible with the system because otherwise the self-healing process will not occur. The carbon steel substrate will be exposed to a corrosive environment, so the use of a sacrificial layer of Zn is also investigated.
More specifically, the first layer of this new coating system will be TSZA (Thermally Sprayed Zn85/Al15) and will be applied to carbon steel samples with dimensions of 100 x 150 mm after blasting with alumina (size F24) as part of the surface preparation. Based on the literature, this layer corrodes readily, so one additional paint layer enriched with microcapsules will be added. The reaction and curing times are also of high importance for this bilayer coating system to work successfully. For the first experiments, polystyrene microcapsules loaded with 3-octanoylthio-1-propyltriethoxysilane were prepared. Electrochemical experiments such as Electrochemical Impedance Spectroscopy (EIS) confirmed the corrosion-inhibiting properties of the silane. The diameter of the microcapsules was about 150-200 microns. Further experiments were conducted with different reagents and methods in order to obtain diameters of about 50 microns, and their self-healing properties were tested in synthetic seawater using electrochemical techniques. The use of combined paint/electrodeposited coatings allows for further novel development of composite coating systems. The potential for the application of these coatings in offshore structures will be discussed. Keywords: corrosion mitigation, microcapsules, offshore wind turbines, self-healing
Procedia PDF Downloads 115
88 Finite Element Analysis of the Anaconda Device: Efficiently Predicting the Location and Shape of a Deployed Stent
Authors: Faidon Kyriakou, William Dempster, David Nash
Abstract:
Abdominal Aortic Aneurysm (AAA) is a major life-threatening pathology for which modern approaches reduce the need for open surgery through the use of stenting. The success of stenting, though, is sometimes jeopardized by the final position of the stent graft inside the human artery, which may result in migration, endoleaks or blood flow occlusion. Herein, a finite element (FE) model of the commercial medical device AnacondaTM (Vascutek, Terumo) has been developed and validated in order to create a numerical tool able to provide useful clinical insight before the surgical procedure takes place. The AnacondaTM device consists of a series of NiTi rings sewn onto woven polyester fabric, a structure that, despite its column stiffness, is flexible enough to be used in very tortuous geometries. For the purposes of this study, an FE model of the device was built in Abaqus® (version 6.13-2) with a combination of beam, shell and surface elements; this choice of building blocks was made to keep the computational cost to a minimum. The validation of the numerical model was performed by comparing the deployed position of a full stent graft device inside a constructed AAA with a duplicate set-up in Abaqus®. Specifically, an AAA geometry was built in CAD software and included regions of both high and low tortuosity. Subsequently, the CAD model was 3D printed into a transparent aneurysm, and a stent was deployed in the lab following the steps of the clinical procedure. Images on the frontal and sagittal planes of the experiment allowed comparison with the results of the numerical model. By overlapping the experimental and computational images, the mean and maximum distances between the rings of the two models were measured in the longitudinal and transverse directions, and a 5 mm upper bound was set as a limit commonly used by clinicians when working with simulations.
The two models showed very good agreement in their spatial positioning, especially in the less tortuous regions. As a result, and despite the inherent uncertainties of a surgical procedure, the FE model allows confidence that the final position of the stent graft, when deployed in vivo, can be predicted with significant accuracy. Moreover, the numerical model ran in just a few hours, an encouraging result for applications in the clinical routine. In conclusion, the efficient modelling of a complicated structure which combines thin scaffolding and fabric has been demonstrated to be feasible. Furthermore, the capability to predict the location of each stent ring, as well as the global shape of the graft, has been shown. This can allow surgeons to better plan their procedures and medical device manufacturers to optimize their designs. The current model can further be used as a starting point for patient-specific CFD analysis. Keywords: AAA, efficiency, finite element analysis, stent deployment
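The validation step reduces to measuring, for each NiTi ring, the distance between its position in the experimental images and in the FE prediction, and checking the mean and maximum against the 5 mm clinical bound. A minimal sketch with hypothetical 2D ring-centre coordinates (the study measured distances on frontal and sagittal image planes; the function name is illustrative):

```python
import numpy as np

def ring_deviation(exp_pts, fem_pts):
    """Per-ring Euclidean distance (mm) between experimental and FE ring
    centres; returns (mean, max) to compare against a clinical bound."""
    d = np.linalg.norm(np.asarray(exp_pts, float) - np.asarray(fem_pts, float), axis=1)
    return float(d.mean()), float(d.max())
```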
Procedia PDF Downloads 193
87 Ecological and Historical Components of the Cultural Code of the City of Florence as Part of the Edutainment Project Velonotte International
Authors: Natalia Zhabo, Sergey Nikitin, Marina Avdonina, Mariya Nikitina
Abstract:
This paper analyses one event of the international educational and entertainment project Velonotte: an evening bicycle tour with children around Florence. The aim of the project is to develop methods and techniques for increasing the sensitivity of the cycling participants, and of listeners to the radio broadcasts, to the treasures of the national heritage, in this case the historical layers of the city and the ecology of the Renaissance epoch. The block of educational tasks is considered, and the issues of preserving the identity of the city are discussed. Methods: The Florentine event was prepared over more than a year. First, the creative team selected events in the city's history that seemed important for revealing the specifics of the city and its spirit, from antiquity to our days, drawing also on internet forums reflecting broad public opinion. Then a seven-kilometre route was developed and proposed to the authorities and organizations of the city. Speakers were selected according to several criteria: they should be authors of books, famous scientists, or connoisseurs in a certain sphere (toponymy, history of urban gardens, art history), able and willing to talk with participants directly at the stopping points, so that a dialogue could take place and performances could be organized with their participation. Music was chosen for each part of the itinerary to prepare the audience emotionally. Colouring cards with images of the main content of each stop were created for children. A website was created to inform the participants and to preserve photos, videos and audio files of the speakers' talks afterward. Results: Held in April 2017, the event was dedicated to the 640th anniversary of Filippo Brunelleschi, the Florentine architect, and to the 190th anniversary of the publication of Stendhal's guide to Florence. It was supported by the City of Florence and the Florence Bike Festival.
Florence was explored to transmit traditional elements of culture, some unfairly forgotten, from ancient times through Brunelleschi and Michelangelo to Tchaikovsky and David Bowie, with lectures by university professors. Memorable art boards were installed in public spaces. Elements of the cultural code are deeply internalized in the minds of the townspeople; the perception of the city in everyday life and human communication is comparable to such fundamental concepts of the townspeople's self-awareness as mental comfort and the level of happiness. The format of a fun and playful walk with ICT support offers new opportunities for enriching each citizen's cultural code of the city with new components, associations and connotations. Keywords: edutainment, cultural code, cycling, sensitization, Florence
Procedia PDF Downloads 221
86 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression
Authors: Anne M. Denton, Rahul Gomes, David W. Franzen
Abstract:
High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic to the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. 
The relevant length scale is taken to be half of the size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data that was constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than with existing techniques. Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression
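The scale-selection step can be sketched as follows. This simplified version refits a plane per window at each scale rather than using the additive, single-pass aggregation of regression sufficient statistics that makes the real algorithm efficient, but it shows the core idea: compute slope and residual variance at window sizes 2, 4, 8, …, and report the slope from the minimum-variance scale. Function names are illustrative.

```python
import numpy as np

def plane_fit_stats(z):
    """Least-squares plane fit z ~ a*x + b*y + c over a square window.
    Returns (slope magnitude, residual variance)."""
    n = z.shape[0]
    y, x = np.mgrid[0:n, 0:n].astype(float)
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(n * n)])
    coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    resid = z.ravel() - A @ coef
    return float(np.hypot(coef[0], coef[1])), float(resid.var())

def scale_adaptive_slope(dem, i, j, max_exp=4):
    """Report slope at the window size (2, 4, 8, ...) that minimizes
    residual variance around raster point (i, j)."""
    best = None
    for e in range(1, max_exp + 1):
        half = 2 ** e // 2
        if min(i, j) - half < 0 or i + half > dem.shape[0] or j + half > dem.shape[1]:
            break  # window would fall outside the DEM
        slope, var = plane_fit_stats(dem[i - half:i + half, j - half:j + half])
        if best is None or var < best[1]:
            best = (slope, var, 2 * half)
    return best  # (slope, variance, chosen window size)
```

On a noiseless tilted plane every scale fits perfectly; with added noise, larger windows win until the window outgrows the local landform, which is exactly the adaptive behaviour described above.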
Procedia PDF Downloads 129
85 Interacting with Multi-Scale Structures of Online Political Debates by Visualizing Phylomemies
Authors: Quentin Lobbe, David Chavalarias, Alexandre Delanoe
Abstract:
The ICT revolution has given birth to an unprecedented world of digital traces and has impacted a wide range of knowledge-driven domains such as science, education and policy making. Nowadays, we are daily fueled by unlimited flows of articles, blogs, messages, tweets, etc. The internet itself can thus be considered an unsteady hyper-textual environment where websites emerge and expand every day. But there are structures inside knowledge: a given text can always be studied in relation to others or in light of a specific socio-cultural context. By way of their textual traces, human beings call out to each other: hypertext citations, retweets, vocabulary similarity, etc. We are in fact the architects of a giant web of elements of knowledge whose structures and shapes convey their own information. The global shapes of these digital traces represent a source of collective knowledge, and the question of their visualization remains an open challenge. How can we explore, browse and interact with such shapes? In order to navigate across these growing constellations of words and texts, interdisciplinary innovations are emerging at the crossroads between the social and computational sciences. In particular, complex systems approaches now make it possible to reconstruct the hidden structures of textual knowledge by means of multi-scale objects of research such as semantic maps and phylomemies. Phylomemy reconstruction is a generic method related to the co-word analysis framework. Phylomemies aim to reveal the temporal dynamics of large corpora of textual content by performing inter-temporal matching on extracted knowledge domains in order to identify their conceptual lineages. This study addresses the question of visualizing the global shapes of online political discussions related to the French presidential and legislative elections of 2017.
We aim to build phylomemies on top of a dedicated collection of thousands of French political tweets enriched with archived contemporary news web articles. Our goal is to reconstruct the temporal evolution of the online debates fueled by each political community during the elections. To that end, we introduce an iterative data exploration methodology implemented and tested within the free software Gargantext. There we combine synchronic and diachronic axes of visualization to reveal the dynamics of our corpora of tweets and web pages as well as their inner syntagmatic and paradigmatic relationships. In doing so, we aim to provide researchers with innovative methodological means to explore online semantic landscapes in a collaborative and reflective way. Keywords: online political debate, French election, hyper-text, phylomemy
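The inter-temporal matching at the heart of phylomemy reconstruction can be sketched minimally: represent each extracted knowledge domain as a set of terms per time slice, then link domains in consecutive slices whose term overlap (here, Jaccard similarity, one common choice) exceeds a threshold; chains of such links form the conceptual lineages. This is a toy sketch of the idea, not the Gargantext implementation.

```python
def jaccard(a, b):
    """Set overlap: |a ∩ b| / |a ∪ b|."""
    return len(a & b) / len(a | b)

def link_periods(periods, threshold=0.3):
    """periods: one list of term sets per time slice.
    Returns links (t, i, t+1, j): domain i at slice t continues as
    domain j at slice t+1 when their term overlap passes the threshold."""
    links = []
    for t in range(len(periods) - 1):
        for i, dom in enumerate(periods[t]):
            for j, nxt in enumerate(periods[t + 1]):
                if jaccard(dom, nxt) >= threshold:
                    links.append((t, i, t + 1, j))
    return links
```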
Procedia PDF Downloads 186
84 Sustainable Production of Pharmaceutical Compounds Using Plant Cell Culture
Authors: David A. Ullisch, Yantree D. Sankar-Thomas, Stefan Wilke, Thomas Selge, Matthias Pump, Thomas Leibold, Kai Schütte, Gilbert Gorr
Abstract:
Plants have long been considered a source of natural substances. Secondary metabolites from plants are utilized especially in medical applications but are of growing interest as cosmetic ingredients and in the field of nutraceuticals. However, supply from natural harvest can be limited by numerous factors, e.g., endangered species, low product content, climate impacts and cost-intensive extraction. Especially in the pharmaceutical industry, the ability to provide sufficient amounts of product at high quality is an additional requirement which in some cases is difficult to fulfill by plant harvest. Whereas in many cases the complexity of secondary metabolites precludes chemical synthesis on a reasonable commercial basis, plant cells contain the biosynthetic pathway, a natural chemical factory, for a given compound. A promising approach for the sustainable production of natural products is plant cell fermentation (PCF®). A thorough development process comprises the identification of a high-producing cell line, optimization of growth and production conditions, and the development of a robust and reliable production process and its scale-up. In order to ensure persistent, long-lasting production, the development of cryopreservation protocols and the generation of working cell banks is another important requirement to be considered. So far, the most prominent example of a PCF® process is the production of the anticancer compound paclitaxel. To demonstrate the power of plant suspension cultures, we present three case studies: 1) For more than 17 years, Phyton has produced paclitaxel at industrial scale, i.e., up to 75,000 L. With 60 g/kg dw, this fully controlled process, which is run according to GMP, delivers outstandingly high yields. 2) Thapsigargin is another anticancer compound, currently isolated from seeds of Thapsia garganica.
Thapsigargin is a powerful cytotoxin, a SERCA inhibitor, and the precursor of the derivative ADT, the key ingredient of the investigational prodrug Mipsagargin (G-202), which is in several clinical trials. Phyton has successfully generated plant cell lines capable of expressing this compound. Here we present data on the screening for high-producing cell lines. 3) The third case study covers ingenol-3-mebutate. This compound is found at very low concentrations in the milky sap of intact plants of the Euphorbiaceae family. Ingenol-3-mebutate is used in Picato®, which is approved against actinic keratosis. The generation of cell lines expressing significant amounts of ingenol-3-mebutate is another example underlining the strength of plant cell culture. The authors gratefully acknowledge Inspyr Therapeutics for funding. Keywords: ingenol-3-mebutate, plant cell culture, sustainability, thapsigargin
Procedia PDF Downloads 251
83 Possibilities of Psychodiagnostics in the Context of Highly Challenging Situations in Military Leadership
Authors: Markéta Chmelíková, David Ullrich, Iva Burešová
Abstract:
The paper maps the possibilities and limits of diagnosing selected personality and performance characteristics of military leadership and psychology students in the context of coping with challenging situations. Individuals vary greatly in their ability to manage extreme situations effectively, yet existing diagnostic tools are often criticized mainly for their low predictive power. Nowadays, every modern army focuses primarily on the systematic minimization of potential risks, including the prediction of desirable forms of behavior and of the performance of military commanders. The context of military leadership is well known for its life-threatening nature. It is therefore crucial to research stress load in the specific context of military leadership in order to anticipate human failure in managing extreme situations. The aim of this pilot study, using an experiment of 24 hours' duration, is to verify whether a specific combination of psychodiagnostic methods can identify people who are suitably equipped to cope with increased stress load. In our pilot study, we conducted a 24-hour experiment with an experimental group (N=13) in a bomb shelter and a control group (N=11) in a classroom. Both groups comprised military leadership students (N=11) and psychology students (N=13) and were equalized in terms of study type and gender. Participants were administered the following test battery of personality characteristics: Big Five Inventory 2 (BFI-2), Short Dark Triad (SD-3), Emotion Regulation Questionnaire (ERQ), Fatigue Severity Scale (FSS), and Impulsive Behavior Scale (UPPS-P). This battery was administered only once, at the beginning of the experiment. Along with this, participants were administered a test battery consisting of the Test of Attention (d2) and the Bourdon test, four times in total at 6-hour intervals.
To better simulate an extreme situation, we tried to induce sleep deprivation: participants were required to try not to fall asleep throughout the experiment. Despite the assumption that a stay in an underground bomb shelter would manifest in impaired cognitive performance, this expectation was significantly confirmed in only one measurement, which can be interpreted as marginal in the context of multiple testing. This finding is a fundamental insight into the issue of stress management in extreme situations, which is crucial for effective military leadership. The results suggest that a 24-hour stay in a shelter, together with sleep deprivation, does not simulate sufficient stress for an individual to be reflected in the level of cognitive performance. In the context of these findings, it would be interesting in future to extend the diagnostic battery with physiological indicators of stress, such as heart rate, stress score, physical stress, and mental stress. Keywords: bomb shelter, extreme situation, military leadership, psychodiagnostic
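The multiple-testing caveat mentioned above is conventionally handled with a correction such as Bonferroni: when cognitive performance is tested at several time points, a single nominally significant p-value should be judged against a stricter per-test threshold. A minimal sketch (the abstract does not state which correction, if any, was applied; Bonferroni is shown only as the standard conservative choice):

```python
def bonferroni(p_values, alpha=0.05):
    """Flag which of m tests survive the Bonferroni-adjusted
    per-test threshold alpha / m."""
    m = len(p_values)
    return [p < alpha / m for p in p_values]
```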
Procedia PDF Downloads 91
82 Pentosan Polysulfate Sodium: A Potential Treatment to Improve Bone and Joint Manifestations of Mucopolysaccharidosis I
Authors: Drago Bratkovic, Curtis Gravance, David Ketteridge, Ravi Krishnan, Michael Imperiale
Abstract:
The mucopolysaccharidoses (MPSs) are a group of lysosomal storage diseases that share a common defect in the catabolism of glycosaminoglycans (GAGs). MPS I is the most common of the MPS diseases. Manifestations of MPS I include coarsening of facial features, corneal clouding, developmental delay, short stature, skeletal manifestations, hearing loss, cardiac valve disease, hepatosplenomegaly, and umbilical and inguinal hernias. Treatments for MPS I restore the missing or deficient enzyme, via enzyme replacement therapy (ERT) or haematopoietic stem cell transplantation (HSCT). Pentosan polysulfate sodium (PPS) is a potential treatment to improve the bone and joint manifestations of MPS I. The mechanisms of action of PPS relevant to the treatment of MPS I are the ability to: (i) reduce systemic and accumulated GAG; (ii) reduce inflammatory effects via the inhibition of NF-kB, resulting in a reduction in pro-inflammatory mediators; (iii) reduce the expression of the pain mediator nerve growth factor in osteocytes from degenerating joints; and (iv) inhibit the cartilage-degrading enzymes related to joint dysfunction in MPS I. PPS is being evaluated as an adjunctive therapy to ERT and/or HSCT in an open-label, single-centre, phase 2 study. Patients are ≥ 5 years of age with a diagnosis of MPS I and previously received HSCT and/or ERT. Three white female patients with MPS I-Hurler, ages 14, 15, and 19 years, and one white male patient aged 15 years are enrolled. All were diagnosed at ≤ 2 years of age. All patients received HSCT ≤ 6 months after diagnosis. Two of the patients were treated with ERT prior to HSCT, and one patient received ERT commencing 3 months prior to HSCT. Two patients received 0.75 mg/kg and two patients received 1.5 mg/kg of PPS. PPS was well tolerated at both doses through 47 weeks of continuous dosing. Of the 19 adverse events (AEs), two were related to PPS.
One AE was moderate (pre-syncope) and one was mild (injection site bruising); both occurred in the same patient. All AEs were reported as mild or moderate. There have been no SAEs. One subject experienced a COVID-19 infection, and PPS was interrupted. The MPS I signature GAG fragments, sulfated disaccharide and UA-HNAc S, tended to decrease in 3 patients from baseline through Week 25. Week 25 GAG data are pending for the 4th patient. Overall, most biomarkers (inflammatory, cartilage degeneration, and bone turnover) evaluated in the 3 patients with 25-week assessments indicated either no change or a reduction in levels compared to baseline. In 3 patients, there was a trend toward improvement in the 2MWT from baseline to Week 48, with a > 100% increase in 1 patient (01-201). In the 3 patients with Week 48 assessments, patients and proxies reported improvement in PGIC, including a “worthwhile difference” (n=1) or “made all the difference” (n=2). Keywords: MPS I, pentosan polysulfate sodium, clinical study, 2MWT, QoL
Procedia PDF Downloads 112
81 Transcription Skills and Written Composition in Chinese
Authors: Pui-sze Yeung, Connie Suk-han Ho, David Wai-ock Chan, Kevin Kien-hoa Chung
Abstract:
Background: Recent findings have shown that transcription skills play a unique and significant role in Chinese word reading, spelling (i.e., word dictation), and written composition development. The interrelationships among the component skills of transcription, word reading, word spelling, and written composition in Chinese have rarely been examined in the literature. Is the contribution of the component skills of transcription to Chinese written composition mediated by word-level skills (i.e., word reading and spelling)? Methods: The participants were 249 Chinese children in Grade 1, Grade 3, and Grade 5 in Hong Kong. They were administered measures of general reasoning ability, orthographic knowledge, stroke sequence knowledge, word spelling, handwriting fluency, word reading, and Chinese narrative writing. Orthographic knowledge: assessed by a task modeled after the lexical decision subtest of the Hong Kong Test of Specific Learning Difficulties in Reading and Writing (HKT-SpLD). Stroke sequence knowledge: the participants’ performance in producing legitimate stroke sequences was measured by a stroke sequence knowledge task. Handwriting fluency: assessed by a task modeled after the Chinese Handwriting Speed Test. Word spelling: the stimuli consist of fourteen two-character Chinese words. Word reading: the stimuli consist of 120 two-character Chinese words. Written composition: a narrative writing task was used to assess the participants’ text writing skills. Results: Analysis of covariance results showed significant between-grade differences in the performance of word reading, word spelling, handwriting fluency, and written composition.
Preliminary hierarchical multiple regression analysis results showed that orthographic knowledge, word spelling, and handwriting fluency were unique predictors of Chinese written composition even after controlling for age, IQ, and word reading. The interaction effects between grade and each of these three skills (orthographic knowledge, word spelling, and handwriting fluency) were not significant. Path analysis results showed that orthographic knowledge contributed to written composition both directly and indirectly through word spelling, while handwriting fluency contributed to written composition directly and indirectly through both word reading and spelling. Stroke sequence knowledge only contributed to written composition indirectly through word spelling. Conclusions: Preliminary hierarchical regression results were consistent with previous findings about the significant role of transcription skills in Chinese word reading, spelling and written composition development. The fact that orthographic knowledge contributed both directly and indirectly to written composition through word reading and spelling may reflect the impact of the script-sound-meaning convergence of Chinese characters on the composing process. The significant contribution of word spelling and handwriting fluency to Chinese written composition across elementary grades highlighted the difficulty in attaining automaticity of transcription skills in Chinese, which limits the working memory resources available for other composing processes. Keywords: orthographic knowledge, transcription skills, word reading, writing
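The hierarchical regression logic, entering control variables first and transcription skills second, then reading off the gain in explained variance, can be sketched with ordinary least squares. This is an illustrative reimplementation on synthetic data, not the study's analysis code, and the variable names are assumptions:

```python
import numpy as np

def r_squared(X, y):
    """OLS R^2, fitting with an intercept column added."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def hierarchical_r2(blocks, y):
    """Enter predictor blocks step by step (e.g., controls first, then
    transcription skills); return the cumulative R^2 after each step.
    Differences between steps are the incremental variance explained."""
    r2s, X = [], np.empty((len(y), 0))
    for block in blocks:
        X = np.column_stack([X, block])
        r2s.append(r_squared(X, y))
    return r2s
```

The unique contribution of a block (e.g., word spelling after age and IQ) is the difference between consecutive entries of the returned list.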
Procedia PDF Downloads 425
80 Functional Analysis of Variants Implicated in Hearing Loss in a Cohort from Argentina: From Molecular Diagnosis to Pre-Clinical Research
Authors: Paula I. Buonfiglio, Carlos David Bruque, Lucia Salatino, Vanesa Lotersztein, Sebastián Menazzi, Paola Plazas, Ana Belén Elgoyhen, Viviana Dalamón
Abstract:
Hearing loss (HL) is the most prevalent sensorineural disorder, affecting about 10% of the global population, with more than half of cases due to genetic causes. About 1 in 500-1000 newborns present congenital HL. Most patients are non-syndromic, with an autosomal recessive mode of inheritance. To date, more than 100 genes have been related to HL. Whole-exome sequencing (WES) has therefore become a cost-effective approach for molecular diagnosis. Nevertheless, new challenges arise from the detection of novel variants, in particular missense changes, which can lead to a spectrum of genotype-to-phenotype correlations that is not always straightforward. In this work, we aimed to identify the genetic causes of HL in isolated and familial cases by designing a multistep approach to analyze target genes related to hearing impairment. Moreover, we performed in silico and in vivo analyses to further study the effect of some of the novel variants identified on hair cell function using the zebrafish model. A total of 650 patients were studied by Sanger sequencing and Gap-PCR in the GJB2 and GJB6 genes, respectively, diagnosing 15.5% of sporadic cases and 36% of familial ones. Overall, 50 different sequence variants were detected. Fifty of the undiagnosed patients with moderate HL were tested for deletions in the STRC gene by the Multiplex Ligation-dependent Probe Amplification (MLPA) technique, leading to diagnosis in 6%. After this initial screening, 50 families were selected to be analyzed by WES, achieving diagnosis in 44% of them. Half of the identified variants were novel. A missense variant in the MYO6 gene detected in a family with postlingual HL was selected for further analysis. Protein modeling with the AlphaFold2 software was performed, supporting its pathogenic effect. To functionally validate this novel variant, a knockdown phenotype rescue assay in zebrafish was carried out.
Injection of wild-type MYO6 mRNA into embryos rescued the phenotype, whereas using the mutant MYO6 mRNA (carrying the c.2782C>A variant) had no effect. These results strongly suggest a deleterious effect of this variant on the mobility of stereocilia in zebrafish neuromasts, and hence on the auditory system. In the present work, we demonstrated that our algorithm is suitable for a sequential multigenic approach to HL in our cohort. These results highlight the importance of a combined strategy for identifying candidate variants, as well as of in silico and in vivo studies to analyze and prove their pathogenicity and achieve a better understanding of the mechanisms underlying the pathophysiology of hearing impairment. Keywords: diagnosis, genetics, hearing loss, in silico analysis, in vivo analysis, WES, zebrafish
Procedia PDF Downloads 95
79 Dys-Regulation of Immune and Inflammatory Response in in vitro Fertilization Implantation Failure Patients under Ovarian Stimulation
Authors: Amruta D. S. Pathare, Indira Hinduja, Kusum Zaveri
Abstract:
Implantation failure (IF), even after good-quality embryo transfer (ET) into a physiologically normal endometrium, is the main obstacle in in vitro fertilization (IVF). Various microarray studies have been performed worldwide to elucidate the genes requisite for endometrial receptivity. These studies have examined populations at different phases of the menstrual cycle, in both natural and stimulated cycles, in normal fertile women. Additionally, literature is also available on recurrent implantation failure patients versus oocyte donors in natural cycles. However, for the first time, we aim to study the genomics of endometrial receptivity in IF patients under controlled ovarian stimulation (COS), during which ET is generally practised in IVF. Endometrial gene expression profiles of IF patients (n=10) and oocyte donors (n=8) were compared during the window of implantation under COS by whole-genome microarray (using the Illumina platform). Enrichment analysis of the microarray data was performed to determine dysregulated biological functions and pathways using the Database for Annotation, Visualization and Integrated Discovery, v6.8 (DAVID). Enrichment mapping was performed with the help of Cytoscape software. Microarray results were validated by real-time PCR. Localization of genes related to immune response (Progestagen-Associated Endometrial Protein (PAEP), Leukaemia Inhibitory Factor (LIF), Interleukin-6 Signal Transducer (IL6ST)) was detected by immunohistochemistry. The study revealed 418 genes downregulated and 519 genes upregulated in IF patients compared to healthy fertile controls. The gene ontology, pathway analysis and enrichment mapping revealed significant downregulation of the activation and regulation of immune and inflammation response in IF patients under COS. 
The lower expression of Progestagen-Associated Endometrial Protein (PAEP), Leukaemia Inhibitory Factor (LIF) and Interleukin-6 Signal Transducer (IL6ST) in cases compared to controls, shown by real-time PCR and immunohistochemistry, suggests the functional importance of these genes. The study proved useful in uncovering a probable reason for implantation failure, namely an imbalance of immune and inflammatory regulation, in our group of subjects. Based on the present study's findings, a panel of significantly dysregulated genes related to immune and inflammatory pathways needs to be further substantiated in a larger cohort, in natural as well as stimulated cycles. These genes could then be screened in IF patients during the window of implantation (WOI) before embryo transfer or any other immunological treatment. This would help to estimate the regulation of the specific immune response during the WOI in a patient. The appropriate treatment, either activation or suppression of the immune response, could then be attempted in IF patients to enhance the receptivity of the endometrium.
Keywords: endometrial receptivity, immune and inflammatory response, gene expression microarray, window of implantation
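The over-representation analysis that tools such as DAVID perform on a dysregulated gene list is, at its core, a one-sided hypergeometric test. The sketch below is illustrative only: the gene counts are hypothetical and not taken from the study, and the function name is our own.

```python
from math import comb

def enrichment_p_value(N, K, n, k):
    """One-sided hypergeometric (over-representation) test.

    N: total genes profiled, K: genes annotated to the pathway,
    n: dysregulated genes, k: dysregulated genes in the pathway.
    Returns P(X >= k), the probability of seeing at least k pathway
    genes among n draws without replacement.
    """
    total = comb(N, n)
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(k, min(K, n) + 1)) / total

# Hypothetical numbers: 20,000 genes profiled, a 100-gene immune
# pathway, 937 dysregulated genes (418 down + 519 up), 25 of which
# fall in the pathway.
p = enrichment_p_value(20000, 100, 937, 25)
```

A small p-value indicates that the dysregulated list contains more pathway members than chance sampling would predict, which is how categories such as immune and inflammatory response emerge as significantly enriched.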
Procedia PDF Downloads 156
78 HyDUS Project; Seeking a Wonder Material for Hydrogen Storage
Authors: Monica Jong, Antonios Banos, Tom Scott, Chris Webster, David Fletcher
Abstract:
Hydrogen, as a clean alternative to methane, is relatively easy to make, either from water using electrolysis or from methane using steam reformation. However, hydrogen is much trickier to store than methane, and without effective storage, it simply won't pass muster as a suitable methane substitute. Physical storage of hydrogen is quite inefficient. Storing hydrogen as a compressed gas at pressures up to 900 times atmospheric is volumetrically inefficient and carries safety implications, whilst storing it as a liquid requires costly and constant cryogenic cooling to minus 253°C. This is where depleted uranium (DU) steps in as a possible solution. Across the periodic table, there are many different metallic elements that will react with hydrogen to form a chemical compound known as a hydride (or metal hydride). From a chemical perspective, the 'king' of the hydride-forming metals is palladium because it offers the highest hydrogen storage volumetric capacity. However, this material is simply too expensive and scarce to be used in a scaled-up bulk hydrogen storage solution. DU is the second most volumetrically efficient hydride-forming metal after palladium. The UK has accrued a significant amount of DU from manufacturing nuclear fuel over many decades, and this stockpile is currently without real commercial use. Uranium trihydride (UH3) contains three hydrogen atoms for every uranium atom and can chemically store hydrogen at ambient pressure and temperature at more than twice the density of pure liquid hydrogen for the same volume. To release the hydrogen from the hydride, all you do is heat it up. At temperatures above 250°C, the hydride starts to thermally decompose, releasing hydrogen as a gas and leaving the uranium as a metal again. 
The reversible nature of this reaction allows the hydride to be formed and unformed again and again, enabling its use as a high-density hydrogen storage material that is already available in large quantities because of its stockpiling as a 'waste' by-product. Whilst the tritium storage credentials of uranium have been rigorously proven at the laboratory scale and at the fusion demonstrator JET for over 30 years, there is a need to prove the concept of depleted uranium hydrogen storage (HyDUS) at scales approaching those needed to flexibly supply our national power grid with energy. This is exactly the purpose of the HyDUS project, a collaborative venture involving EDF as the interested energy vendor, Urenco as the owner of the waste DU, and the University of Bristol with the UKAEA as the architects of the technology. The team will embark on building and proving the world's first pilot-scale demonstrator of bulk chemical hydrogen storage using depleted uranium. Within 24 months, the team will attempt to prove both the technical and commercial viability of this technology as a longer-duration energy storage solution for the UK. The HyDUS project seeks to enable a true by-product-to-wonder-material story for depleted uranium, demonstrating that we can think sustainably about unlocking the potential value trapped inside nuclear waste materials.
Keywords: hydrogen, long duration storage, storage, depleted uranium, HyDUS
Procedia PDF Downloads 160
77 Lactic Acid Solution and Aromatic Vinegar Nebulization to Improve Hunted Wild Boar Carcass Hygiene at Game-Handling Establishment: Preliminary Results
Authors: Rossana Roila, Raffaella Branciari, Lorenzo Cardinali, David Ranucci
Abstract:
The wild boar (Sus scrofa) population has increased strongly across Europe in recent decades, causing severe fauna management issues. In central Italy, wild boar is the main hunted wild game species, with approximately 40,000 animals killed per year in the Umbria region alone. Game meat is characterized by high nutritional value as well as a peculiar taste and aroma, largely appreciated by consumers. This type of meat and products thereof can meet the current consumer demand for higher-quality foodstuffs, not only from a nutritional and sensory point of view but also in relation to environmental sustainability, the non-use of chemicals, and animal welfare. The game meat production chain has some gaps from a hygienic point of view: the harvest process is usually conducted in a wild environment where animals can more easily be contaminated during hunting and subsequent practices. The definition and implementation of a certified and controlled supply chain could ensure quality, traceability and safety for the final consumer and therefore promote game meat products. European legislation envisages the use of weak acid solutions for carcass decontamination in some animal species, such as bovines, in order to ensure the maintenance of optimal hygienic characteristics. A preliminary study was carried out to evaluate the applicability of similar strategies to control the hygienic level of wild boar carcasses. The carcasses, harvested according to the selective method and processed in the game-handling establishment, were treated by nebulization with two different solutions: a 2% food-grade lactic acid solution and aromatic vinegar. Swab samples were taken from the carcass surfaces before treatment and at different times after treatment and subsequently tested for Total Aerobic Mesophilic Load, Total Aerobic Psychrophilic Load, Enterobacteriaceae, Staphylococcus spp. and lactic acid bacteria. 
The results obtained for the targeted microbial populations showed a positive effect of the lactic acid solution on all the populations investigated, while the aromatic vinegar showed a lower effect on bacterial growth. This study could lay the foundations for optimizing the use of a lactic acid solution to treat wild boar carcasses, aiming to guarantee a good hygienic level and the safety of the meat.
Keywords: game meat, food safety, process hygiene criteria, microbial population, microbial growth, food control
Procedia PDF Downloads 159
76 Emotion Regulation and Executive Functioning Scale for Children and Adolescents (REMEX): Scale Development
Authors: Cristina Costescu, Carmen David, Adrian Roșan
Abstract:
Executive functions (EF) and emotion regulation strategies are processes that allow individuals to function in an adaptive way and to be goal-oriented, which is essential for success in daily living activities, at school, or in social contexts. The Emotion Regulation and Executive Functioning Scale for Children and Adolescents (REMEX) represents an empirically based tool (based on the model of EF developed by Diamond) for evaluating significant dimensions of child and adolescent EF and emotion regulation strategies, mainly in school contexts. The instrument measures the following dimensions: working memory, inhibition, cognitive flexibility, executive attention, planning, emotional control, and emotion regulation strategies. Building the instrument involved not only a top-down process, as we selected the content in accordance with prominent models of EF, but also a bottom-up one, as we were able to identify valid contexts in which EF and emotion regulation are put to use. For the construction of the instrument, we conducted three focus groups with teachers and other professionals, since the aim was to develop an accurate, objective, and ecological instrument. We used the focus group method in order to address each dimension and to yield a bank of items to be further tested. Each dimension is addressed through a task that the examiner administers and through several items derived from the main task. For the validation of the instrument, we plan to use item response theory (IRT), also known as latent trait theory, which attempts to explain the relationship between latent traits (unobservable cognitive processes) and their manifestations (i.e., observed outcomes, responses, or performance). REMEX represents an ecological scale that integrates a current scientific understanding of emotion regulation and EF, is directly applicable to school contexts, and can be very useful for developing intervention protocols. 
We plan to test its convergent validity with the Childhood Executive Functioning Inventory (CHEXI) and the Emotion Dysregulation Inventory (EDI), and its divergent validity between a group of typically developing children and children with neurodevelopmental disorders, aged between 6 and 9 years old. In a previous pilot study, we enrolled a sample of 40 children with autism spectrum disorder and attention-deficit/hyperactivity disorder aged 6 to 12 years old, and we applied the above-mentioned scales (CHEXI and EDI). Our results showed that deficits in planning, behavior regulation, inhibition, and working memory predict high levels of emotional reactivity, leading to emotional and behavioural problems. Considering previous results, we expect our findings to provide support for the validity and reliability of REMEX as an ecological instrument for assessing emotion regulation and EF in children, and for key features of its use in intervention protocols.
Keywords: executive functions, emotion regulation, children, item response theory, focus group
Procedia PDF Downloads 101
75 Exploratory Tests on Structures Resistance during Forest Fires
Authors: Luis M. Ribeiro, Jorge Raposo, Ricardo Oliveira, David Caballero, Domingos X. Viegas
Abstract:
Under the scope of the European project WUIWATCH, a set of experimental tests on house vulnerability was performed in order to assess the resistance of selected house components during the passage of a forest fire. Among the individual elements most affected by the passage of a wildfire, windows are the ones with the greatest exposure. In this sense, a set of exploratory experimental tests was designed to assess some particular aspects related to the vulnerability of windows and blinds. At the same time, the importance of leaving them closed (as well as the doors inside a house) during a wildfire was explored in order to give some scientific background to guidelines for homeowners. Three sets of tests were performed: 1. Window and blind resistance to heat. Three types of protective blinds were tested (aluminium, PVC and wood) on two types of windows (single and double pane). The objective was to assess the structures' resistance. 2. The influence of air flow on the transport of burning embers inside a house. A room was built to scale and placed inside a wind tunnel, with one window and one door on opposite sides. The objective was to assess the importance of leaving an inside door open for the probability of burning embers entering the room. 3. The influence of the dimension of openings in a window or door on the probability of ignition inside a house. The objective was to assess the influence of different window openings in relation to the amount of burning particles that can enter a house. The main results were: 1. The purely radiative heat source provides 1.5 kW/m2 of heat impact on the structure, while the real fire generates 10 kW/m2. When protected by the blind, the single-pane window reaches 30ºC on both sides, and the double-pane window has a differential of 10ºC between the side facing the heat (30ºC) and the opposite side (40ºC). The unprotected window's temperature increases constantly until the end of the test. 
Window blinds reach considerably higher temperatures. PVC loses its consistency above 150ºC and melts. 2. Leaving the inside door closed results in a positive pressure differential of +1 Pa from the outside to the inside, inhibiting the air flow. Opening the door halfway or fully reverses the pressure differential to -6 and -8 times this value, respectively, favouring air flow from the outside to the inside. The number of particles entering the house follows the same tendency. 3. As the bottom opening in a window increases from 0.5 cm to 4 cm, the number of particles entering the house per second also increases greatly. From 5 cm to 80 cm there is no substantial increase in the number of entering particles. This set of exploratory tests proved to be of added value in supporting guidelines for homeowners regarding self-protection in WUI areas.
Keywords: forest fire, wildland urban interface, house vulnerability, house protective elements
Procedia PDF Downloads 285
74 A Fast Multi-Scale Finite Element Method for Geophysical Resistivity Measurements
Authors: Mostafa Shahriari, Sergio Rojas, David Pardo, Angel Rodriguez-Rozas, Shaaban A. Bakr, Victor M. Calo, Ignacio Muga
Abstract:
Logging-While-Drilling (LWD) is a technique to record down-hole logging measurements while drilling the well. Nowadays, LWD devices (e.g., nuclear, sonic, resistivity) are mostly used commercially for geo-steering applications. Modern borehole resistivity tools are able to measure all components of the magnetic field by incorporating tilted coils. The depth of investigation of LWD tools is limited compared to the thickness of the geological layers. Thus, it is a common practice to approximate the Earth's subsurface with a sequence of 1D models. For a 1D model, we can reduce the dimensionality of the problem using a Hankel transform. We can solve the resulting system of ordinary differential equations (ODEs) either (a) analytically, which results in a so-called semi-analytic method after performing a numerical inverse Hankel transform, or (b) numerically. Semi-analytic methods are used by the industry due to their high performance. However, they have two major limitations. First, the analytical solution of the aforementioned system of ODEs exists only for piecewise constant resistivity distributions; for arbitrary resistivity distributions, the solution of the system of ODEs is unknown to date. Second, in geo-steering, we need to solve inverse problems with respect to the inversion variables (e.g., the constant resistivity value of each layer and the bed boundary positions) using a gradient-based inversion method, and thus we need to compute the corresponding derivatives. However, the analytical derivatives for cross-bedded formations and the analytical derivatives with respect to the bed boundary positions have not been published, to the best of our knowledge. The main contribution of this work is to overcome the aforementioned limitations of semi-analytic methods by solving each 1D model (associated with each Hankel mode) using an efficient multi-scale finite element method. 
The main idea is to divide our computations into two parts: (a) offline computations, which are independent of the tool positions, are precomputed only once and are reused for all logging positions, and (b) online computations, which depend upon the logging position. With the above method, (a) we can consider arbitrary resistivity distributions along the 1D model, and (b) we can easily and rapidly compute the derivatives with respect to any inversion variable at negligible additional cost by using an adjoint-state formulation. Although the proposed method is slower than semi-analytic methods, its computational efficiency is still high. In the presentation, we shall derive the mathematical variational formulation, describe the proposed multi-scale finite element method, and verify the accuracy and efficiency of our method by performing a wide range of numerical experiments and comparing the numerical solutions to semi-analytic ones when the latter are available.
Keywords: logging-while-drilling, resistivity measurements, multi-scale finite elements, Hankel transform
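The offline/online split described in the abstract can be illustrated with a toy analogue. The sketch below (plain Python, with a hypothetical 1D tridiagonal system, not the authors' actual formulation) factorizes a finite-element-style matrix once offline and then reuses the factorization for many source positions, mimicking one cheap solve per logging position:

```python
def factorize_tridiag(a, b, c):
    """Offline step: LU-factorize a tridiagonal system once (Thomas algorithm).
    a: sub-diagonal (a[0] unused), b: diagonal, c: super-diagonal (c[-1] unused)."""
    n = len(b)
    bp = [0.0] * n   # modified pivots
    cp = [0.0] * n   # modified super-diagonal
    bp[0] = b[0]
    cp[0] = c[0] / b[0]
    for i in range(1, n):
        bp[i] = b[i] - a[i] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / bp[i]
    return bp, cp

def solve_tridiag(a, bp, cp, rhs):
    """Online step: reuse the factorization for a new right-hand side."""
    n = len(bp)
    y = [0.0] * n
    y[0] = rhs[0] / bp[0]
    for i in range(1, n):            # forward substitution
        y[i] = (rhs[i] - a[i] * y[i - 1]) / bp[i]
    x = [0.0] * n
    x[-1] = y[-1]
    for i in range(n - 2, -1, -1):   # backward substitution
        x[i] = y[i] - cp[i] * x[i + 1]
    return x

# Toy 1D Laplacian (2 on the diagonal, -1 off-diagonal), size 5.
n = 5
sub = [0.0] + [-1.0] * (n - 1)
diag = [2.0] * n
sup = [-1.0] * (n - 1) + [0.0]
bp, cp = factorize_tridiag(sub, diag, sup)   # offline: once
solutions = [solve_tridiag(sub, bp, cp,
                           [1.0 if j == i else 0.0 for j in range(n)])
             for i in range(n)]              # online: one solve per "tool position"
```

In the actual method, the offline part would additionally precompute multi-scale basis functions per Hankel mode; the point here is only that the expensive factorization cost is paid once, while each logging position costs a cheap forward/backward substitution.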
Procedia PDF Downloads 387
73 Patterns of Change in Specific Behaviors of Autism Symptoms for Boys and for Girls Across Childhood
Authors: Einat Waizbard, Emilio Ferrer, Meghan Miller, Brianna Heath, Derek S. Andrews, Sally J. Rogers, Christine Wu Nordahl, Marjorie Solomon, David G. Amaral
Abstract:
Background: Autism symptoms comprise social-communication deficits and restricted/repetitive behaviors (RRB). The severity of these symptoms can change during childhood, with differences between boys and girls. The literature indicates that young autistic girls show a stronger tendency to decrease, and a weaker tendency to increase, their overall autism symptom severity levels compared to young autistic boys. It is not clear, however, which symptoms drive these sex differences across childhood. In the current study, we evaluated the trajectories of independent autism symptoms across childhood and compared the patterns of change in these symptoms between boys and girls. Method: The study included 183 children diagnosed with autism (55 girls) evaluated three times across childhood, at ages 3, 6 and 11. We analyzed 22 independent items from the Autism Diagnostic Observation Schedule-2 (ADOS-2), the gold-standard assessment tool for autism symptoms, each item representing a specific autism symptom. First, we used latent growth curve models to estimate the trajectories of the 22 ADOS-2 items for each child in the study. Second, we extracted the factor scores representing the individual slopes for each ADOS-2 item (i.e., the slope representing that child's change in that specific item). Third, we used factor analysis to identify common patterns of change among the ADOS-2 items, separately for boys and girls, i.e., which autism symptoms tend to change together and which change independently across childhood. Results: The best-emerging patterns for both boys and girls identified four common factors: three factors representing changes in social-communication symptoms and one factor describing changes in RRB. Boys and girls showed the same pattern of change in RRB, with four items (e.g., speech abnormalities) changing together across childhood and three items (e.g., mannerisms) changing independently of other items. 
For social-communication deficits in boys, three factors were identified: the first factor included six items representing initiating and engaging in social communication (e.g., quality of social overtures, conversation), the second factor included five items describing responsive social communication (e.g., response to name), and the third factor included three items related to different aspects of social communication (e.g., level of language). Girls' social-communication deficits also loaded onto three factors: the first factor included five items (e.g., unusual eye contact), the second factor included six items (e.g., quality of social response), and the third factor included four items (e.g., showing). Some items showed similar patterns of change for both sexes (e.g., responsive joint attention), while other items showed differences (e.g., shared enjoyment). Conclusions: Girls and boys had different patterns of change in autism symptom severity across childhood. For RRB, both sexes showed similar patterns. For social-communication symptoms, however, there were both similarities and differences between boys and girls in the way symptoms changed over time. The strongest patterns of change were identified for initiating and engaging in social communication for boys and for responsive social communication for girls.
Keywords: autism spectrum disorder, autism symptom severity, symptom trajectories, sex differences
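The slope-then-factor logic of the second and third analysis steps can be sketched in a few lines. The sketch below is illustrative only: the item scores are invented, a least-squares slope over the three visits stands in for the latent-growth-curve factor score, and a pairwise slope correlation stands in for the full factor analysis:

```python
from statistics import mean

AGES = [3.0, 6.0, 11.0]  # the three assessment waves

def slope(scores):
    """Least-squares slope of one ADOS-2 item across the three visits."""
    xbar, ybar = mean(AGES), mean(scores)
    num = sum((x - xbar) * (y - ybar) for x, y in zip(AGES, scores))
    den = sum((x - xbar) ** 2 for x in AGES)
    return num / den

def corr(u, v):
    """Pearson correlation between two vectors of per-child slopes."""
    ub, vb = mean(u), mean(v)
    num = sum((a - ub) * (b - vb) for a, b in zip(u, v))
    den = (sum((a - ub) ** 2 for a in u)
           * sum((b - vb) ** 2 for b in v)) ** 0.5
    return num / den

# Hypothetical per-child item scores (one row per child, scores at ages 3, 6, 11):
item_a = [[2, 2, 1], [3, 2, 1], [2, 1, 1]]
item_b = [[2, 1, 1], [3, 2, 2], [2, 2, 1]]
slopes_a = [slope(s) for s in item_a]
slopes_b = [slope(s) for s in item_b]
r = corr(slopes_a, slopes_b)  # items with high |r| load on a common factor
```

Items whose per-child slopes correlate strongly would load on a common factor (changing together), while items with near-zero correlations to all others change independently.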
Procedia PDF Downloads 52
72 Petrology, Geochemistry and Formation Conditions of Metaophiolites of the Loki Crystalline Massif (the Caucasus)
Authors: Irakli Gamkrelidze, David Shengelia, Tamara Tsutsunava, Giorgi Chichinadze, Giorgi Beridze, Ketevan Tedliashvili, Tamara Tsamalashvili
Abstract:
The Loki crystalline massif crops out in the Caucasus region and in geological retrospect represents the northern marginal part of the Baiburt-Sevanian terrane (island arc), bordering the Paleotethys oceanic basin to the north. The pre-Alpine basement of the massif is built up of a Lower-Middle Paleozoic metamorphic complex (metasedimentary and metabasite rocks), Upper Devonian quartz-diorites and Late Variscan granites. Earlier, the metamorphic complex was considered an indivisible set including suites with different degrees of metamorphism. Systematic geologic, petrologic and geochemical investigations of the massif's rocks suggest a different conception of the composition, structure and formation conditions of the massif. In particular, there are two main rock types in the Loki massif: the oldest autochthonous series of gneissic quartz-diorites and the granites cutting them. The massif is flanked on its western side by a volcano-sedimentary sequence metamorphosed to low-T facies. Petrologic, metamorphic and structural differences in this sequence prove the existence of a number of discrete units (overthrust sheets). One of them, the metabasic sheet, represents a fragment of an ophiolite complex. It comprises transitional types between the second and third layers of the paleo-oceanic crust: the upper non-cumulate gabbro part of the third layer and the overlying lowest part of the parallel diabase dykes of the second layer. The ophiolites are represented by metagabbros, metagabbro-diabases, metadiabases and amphibolite schists. According to the content of major components and trace elements in the metabasites, it is stated that their protolith belongs to the petrochemical type of the tholeiitic basalt series. The parental magma of the metaophiolites is of E-MORB composition, and by petrochemical parameters it is very close to the composition of intraplate basalts. 
The dykes of hypabyssal leucocratic siliceous and intermediate magmatic rocks associated with the metaophiolite sheet form a separate complex. They are granitoids with an extremely low content of CaO, and quartz-diorite porphyries. According to various petrochemical parameters, these rocks have mixed characteristics. Their formation took place under spreading conditions or in areas of plume manifestation, most likely of island-arc type. The metamorphism degree of the metaophiolites corresponds to a very low stage of greenschist facies. The rocks of the metaophiolite complex were obducted from the Paleotethys Ocean. Geological and paleomagnetic data show that the primary location of the ocean is supposed to be to the north of the Loki crystalline massif.
Keywords: the Caucasus, crystalline massif, ophiolites, tectonic sheet
Procedia PDF Downloads 275
71 Linguistic and Cultural Human Rights for Indigenous Peoples in Education
Authors: David Hough
Abstract:
Indigenous peoples can generally be described as the original or first peoples of a land prior to colonization. While there is no single definition of indigenous peoples, the United Nations has developed a general understanding based on self-identification and historical continuity with pre-colonial societies. Indigenous peoples are often traditional holders of unique languages, knowledge systems and beliefs who possess valuable knowledge and practices which support the sustainable management of natural resources. They often have social, economic and political systems, languages and cultures which are distinct from those of the dominant groups in the society or state where they live. They generally resist attempts by the dominant culture at assimilation and endeavour to maintain and reproduce their ancestral environments and systems as distinctive peoples and communities. In 2007, the United Nations General Assembly passed a declaration on the rights of indigenous peoples, known as UNDRIP. It, in addition to other international instruments such as ILO 169, sets out far-reaching guidelines which, among other things, attempt to protect and promote indigenous languages and cultures. Articles 13 and 14 of the declaration state the following regarding language, culture and education: Article 13, Paragraph 1: Indigenous peoples have the right to revitalize, use, develop and transmit to future generations their histories, languages, oral traditions, philosophies, writing systems, and literatures, and to designate and retain their own names for communities, places and persons. Article 14, Paragraph 1: Indigenous peoples have the right to establish and control their educational systems and institutions providing education in their own languages, in a manner appropriate to their cultural methods of teaching and learning. These two articles call for the right of self-determination in education. 
Article 13 gives indigenous peoples the right to control the content of their teaching, while Article 14 states that the teaching of this content should be based on methods of teaching and learning which are appropriate to indigenous peoples. This paper reviews an approach to furthering linguistic and cultural human rights for indigenous peoples in education which supports UNDRIP. It has been employed in countries in Asia and the Pacific, including the Republic of the Marshall Islands, the Federated States of Micronesia, Far East Russia and Nepal. It is based on bottom-up, community-based initiatives where students, teachers and local knowledge holders come together to produce classroom materials in their own languages that reflect their traditional beliefs and value systems. These may include such things as knowledge about herbal medicines and traditional healing practices, local history, numerical systems, weights and measures, astronomy and navigation, canoe building, weaving and mat making, life rituals, feasts, festivals, songs, poems, etc. Many of these materials can then be mainstreamed into math, science, language arts and social studies classes.
Keywords: Indigenous peoples, linguistic and cultural human rights, materials development, teacher training, traditional knowledge
Procedia PDF Downloads 250
70 Breaching Treaty Obligations of the Rome Statute of the International Criminal Court: The Case of South Africa
Authors: David Abrahams
Abstract:
In October 2016, South Africa deposited its 'instrument of withdrawal' from the Rome Statute of the International Criminal Court with the Secretary-General of the United Nations. The Rome Statute is the founding document of the treaty-based International Criminal Court (ICC). The ICC has jurisdiction to hear cases where crimes against humanity, war crimes and genocide have been committed, on the basis of individual criminal responsibility. It is therefore not surprising that one of the ICC's mandates is to ensure that the suffering caused by gross human rights violations against the civilian population is, in principle, brought to an end by punishing the individuals responsible, thus providing justice to the victims. The ICC is unable to fulfill its mandate effectively on its own and thus depends, in part, on the willingness of states to assist the Court in its functions. This requires states to ratify the Statute and to domesticate its provisions, depending on whether the state is monist or dualist. South Africa ratified the Statute in November 2000 and domesticated it in 2002 by virtue of the Implementation of the Rome Statute of the International Criminal Court Act 27 of 2002. South Africa thus remains under an obligation to cooperate with the ICC until the final date of withdrawal, which is October 2017. An AU Summit was hosted by South Africa during June 2015. Omar Al-Bashir, whom the prosecutor of the ICC has indicted on two separate occasions, was invited to the summit. South Africa made an agreement with the AU that it would honour its obligations in terms of its Diplomatic Immunities and Privileges Act of 2001 by granting immunity to all heads of state, including that of Sudan. This decision by South Africa has raised a plethora of questions regarding the status and hierarchy of international law versus regional law versus domestic law. 
In particular, this paper explores whether a state's international law treaty obligations may be suspended in favour of, firstly, regional peace (thus safeguarding the security of the civilian population against further atrocities and other gross violations of human rights) and, secondly, head-of-state immunity. This paper also reflects on the effectiveness of the trias politica in South Africa in relation to the manner in which South African courts have confirmed South Africa's failure to fulfill its obligations in terms of the Rome Statute. A secondary question which will also be explored is whether the Rome Statute is currently an effective tool for dealing with gross violations of human rights, particularly in a regional African context, given the desire of a number of African states currently party to the Statute to engage in a mass exodus from it. Finally, the paper concludes with a proposal that there can be no justice for victims of gross human rights violations unless states are serious about playing an instrumental role in bringing an end to impunity in Africa, and that withdrawing from the ICC without an alternative, effective system in place will simply perpetuate impunity.
Keywords: African Union, diplomatic immunity, impunity, international criminal court, South Africa
Procedia PDF Downloads 530