Search results for: James Paul Menina
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 796

286 A Wearable Device to Overcome Post-Stroke Learned Non-Use: The Rehabilitation Gaming System for Wearables: Methodology, Design and Usability

Authors: Javier De La Torre Costa, Belen Rubio Ballester, Martina Maier, Paul F. M. J. Verschure

Abstract:

After a stroke, a great number of patients experience persistent motor impairments such as hemiparesis, or weakness on one entire side of the body. As a result, the lack of use of the paretic limb might be one of the main contributors to functional loss after clinical discharge. We aim to reverse this cycle by promoting the use of the paretic limb during activities of daily living (ADLs). To do so, we describe the key components of a system that is composed of a wearable bracelet (i.e., a smartwatch) and a mobile phone, designed to bring a set of neurorehabilitation principles that promote acquisition, retention and generalization of skills to the home of the patient. A fundamental question is whether the loss in motor function derived from learned non-use may emerge as a consequence of decision-making processes for motor optimization. Our system is based on well-established rehabilitation strategies that aim to reverse this behaviour by increasing the reward associated with action execution as well as implicitly reducing the expected cost associated with the use of the paretic limb, following the notion of reinforcement-induced movement therapy (RIMT). Here we validate an accelerometer-based measure of arm use, and its capacity to discriminate different activities that require increasing movement of the arm. We also show how the system can act as a personalized assistant by providing specific goals and adjusting them depending on the performance of the patients. The usability and acceptance of the device as a rehabilitation tool are tested using a battery of self-reported and objective measurements obtained from acute/subacute patients and healthy controls. We believe that an extension of these technologies will allow for the deployment of unsupervised rehabilitation paradigms during and beyond the hospitalization period.
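
As an illustration of the kind of accelerometer-based arm-use measure described, a minimal sketch is given below; the sampling rate, epoch length, threshold and function name are assumptions for illustration, not values from the study.

```python
import numpy as np

def arm_use_epochs(acc_xyz, fs=50.0, epoch_s=5.0, threshold=0.05):
    """Hypothetical arm-use metric from wrist accelerometer samples (units of g)."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)      # total acceleration per sample
    dynamic = np.abs(magnitude - 1.0)                # remove the static 1 g gravity component
    samples_per_epoch = int(fs * epoch_s)
    n_epochs = len(dynamic) // samples_per_epoch
    epochs = dynamic[: n_epochs * samples_per_epoch].reshape(n_epochs, samples_per_epoch)
    active = epochs.mean(axis=1) > threshold         # epoch counts as "arm use" above the threshold
    return int(active.sum()), n_epochs

# Example: 10 minutes of simulated 50 Hz wrist data
rng = np.random.default_rng(0)
acc = rng.normal([0.0, 0.0, 1.0], 0.1, size=(50 * 600, 3))
print(arm_use_epochs(acc))
```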

Keywords: stroke, wearables, learned non-use, hemiparesis, ADLs

Procedia PDF Downloads 185
285 Use of Locally Effective Microorganisms in Conjunction with Biochar to Remediate Mine-Impacted Soils

Authors: Thomas F. Ducey, Kristin M. Trippe, James A. Ippolito, Jeffrey M. Novak, Mark G. Johnson, Gilbert C. Sigua

Abstract:

The Oronogo-Duenweg mining belt (approximately 20 square miles around the Joplin, Missouri area) is a designated United States Environmental Protection Agency Superfund site due to soil and groundwater contaminated with lead by former mining and smelting operations. Over almost a century of mining (from 1848 to the late 1960s), an estimated ten million tons of cadmium-, lead-, and zinc-containing material have been deposited on approximately 9,000 acres. At sites that have undergone remediation, in which the O, A, and B horizons have been removed along with the lead contamination, the exposed C horizon remains recalcitrant to revegetation efforts. These sites also suffer from poor soil microbial activity, as measured by soil extracellular enzymatic assays, though 16S ribosomal ribonucleic acid (rRNA) analysis indicates that microbial diversity is equal to that of sites that have avoided mine-related contamination. Soil analysis reveals low soil organic carbon along with high levels of bio-available zinc, reflecting the poor soil fertility conditions and low microbial activity. Our study looked at the use of several materials to restore and remediate these sites, with the goal of improving soil health. The materials, and the purposes for their incorporation into the study, were as follows: manure-based biochar for the binding of zinc and other heavy metals responsible for phytotoxicity; locally sourced biosolids and compost to incorporate organic carbon into the depleted soils; and effective microorganisms harvested from nearby pristine sites to provide a stable community for nutrient cycling in the newly composited 'soil material'. Our results indicate that all four materials used in conjunction result in the greatest benefit to these mine-impacted soils, based on above-ground biomass, microbial biomass, and soil enzymatic activities.

Keywords: locally effective microorganisms, biochar, remediation, reclamation

Procedia PDF Downloads 190
284 Water Re-Use Optimization in a Sugar Platform Biorefinery Using Municipal Solid Waste

Authors: Leo Paul Vaurs, Sonia Heaven, Charles Banks

Abstract:

Municipal solid waste (MSW) is a virtually unlimited source of lignocellulosic material in the form of a waste paper/cardboard mixture which can be converted into fermentable sugars via cellulolytic enzyme hydrolysis in a biorefinery. The extraction of the lignocellulosic fraction and its preparation, however, are energy- and water-demanding processes. The wastewater generated is a rich organic liquor with a high Chemical Oxygen Demand that can be partially cleaned, while generating biogas, in an Upflow Anaerobic Sludge Blanket bioreactor and then re-used in the process. In this work, an experiment was designed to determine the critical contaminant concentrations in water affecting either anaerobic digestion or enzymatic hydrolysis by simulating multiple water re-circulations. It was found that re-using the same water more than 16.5 times could decrease the hydrolysis yield by up to 65% and led to complete disaggregation of the granules. Due to the complexity of the water stream, the contaminant(s) responsible for the performance decrease could not be identified, but the decrease was suspected to be caused by sodium, potassium, and lipid accumulation for the anaerobic digestion (AD) process and by heavy metal build-up for enzymatic hydrolysis. The experimental data were incorporated into a Water Pinch technology-based model that was used to optimize water re-utilization in the modelled system to reduce the fresh water requirement and wastewater generation while ensuring that all processes performed at an optimal level. Multiple scenarios were modelled in which sub-process requirements were evaluated in terms of importance, operational costs, and impact on the CAPEX. The best compromise between water usage and the AD and enzymatic hydrolysis yields was determined for each assumed level of contaminant degradation by the anaerobic granules. Results from the model will be used to build the first MSW-based biorefinery in the USA.

Keywords: anaerobic digestion, enzymatic hydrolysis, municipal solid waste, water optimization

Procedia PDF Downloads 292
283 Hypertensive Response to Maximal Exercise Test in Young and Middle-Aged Hypertensives on Blood Pressure-Lowering Medication: Monotherapy vs. Combination Therapy

Authors: James Patrick A. Diaz, Raul E. Ramboyong

Abstract:

Background: Hypertensive response during maximal exercise testing provides important information on the level of blood pressure control and the evaluation of treatment. Method: A single-center retrospective descriptive study was conducted among 117 young (aged 20 to 40) and middle-aged (aged 40 to 65) hypertensive patients who underwent a treadmill stress test (TMST) on the Bruce and Modified Bruce protocols while on frontline maintenance medication, either monotherapy (Angiotensin-converting enzyme inhibitor/Angiotensin receptor blocker [ACEi/ARB], Calcium channel blocker [CCB], or the diuretic Hydrochlorothiazide [HCTZ]) or combination therapy (ARB+CCB, ARB+HCTZ), and who attained maximal exercise with a hypertensive response (systolic blood pressure: male >210 mm Hg, female >190 mm Hg; diastolic blood pressure >100 mm Hg, or an increase of >10 mm Hg at any time during the test). Exaggerated blood pressure responses during exercise (systolic [SBP] and diastolic [DBP]), peak exercise blood pressure (SBP and DBP), the recovery period (SBP and DBP), tests for ischemia, and the antihypertensive medication/s were investigated. Analysis of variance and the chi-square test were used for statistical analysis. Results: Hypertensive responses on maximal exercise testing were seen mostly among female (P < 0.000) and middle-aged (P < 0.000) patients. Exaggerated diastolic blood pressure responses were significantly lower in patients who were taking a CCB (P < 0.004). A longer recovery period, with a delayed decline in SBP, was observed in patients taking ARB+HCTZ (P < 0.036). There were no significant differences in the level of exaggerated systolic blood pressure response or in peak exercise blood pressure (both systolic and diastolic) between patients using monotherapy and those using combination antihypertensives. Conclusion: Calcium channel blockers provided a lower exaggerated diastolic BP response during maximal exercise testing in middle-aged hypertensive patients. Patients on combination therapy using ARB+HCTZ exhibited a longer recovery period of systolic blood pressure.
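
For illustration, the hypertensive-response criterion quoted above can be expressed as a small check; the sketch below uses only the thresholds stated in the abstract, and the function name and argument layout are assumptions.

```python
def hypertensive_response(sex, peak_sbp, peak_dbp, dbp_rise):
    """Flag an exaggerated blood pressure response on a maximal treadmill test,
    using the thresholds quoted in the abstract (all values in mm Hg)."""
    sbp_limit = 210 if sex == "male" else 190
    return peak_sbp > sbp_limit or peak_dbp > 100 or dbp_rise > 10

# Example: a female patient peaking at 195/95 with a 12 mm Hg diastolic rise
print(hypertensive_response("female", peak_sbp=195, peak_dbp=95, dbp_rise=12))  # True
```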

Keywords: antihypertensive, exercise test, hypertension, hypertensive response

Procedia PDF Downloads 253
282 The Effect of Reaction Time on the Morphology and Phase of Quaternary Ferrite Nanoparticles (FeCoCrO₄) Synthesised from a Single Source Precursor

Authors: Khadijat Olabisi Abdulwahab, Mohammad Azad Malik, Paul O'Brien, Grigore Timco, Floriana Tuna

Abstract:

The synthesis of spinel ferrite nanoparticles with a narrow size distribution is very crucial in their numerous applications, including information storage, hyperthermia treatment, drug delivery, contrast agents in magnetic resonance imaging, catalysis, sensors, and environmental remediation. Ferrites have the general formula MFe₂O₄ (M = Fe, Co, Mn, Ni, Zn, etc.) and possess remarkable electrical and magnetic properties which depend on the cations, the method of preparation, the size, and the site occupancies. To the best of our knowledge, there are no reports on the use of a single source precursor to synthesise quaternary ferrite nanoparticles. Herein, we demonstrate the use of the trimetallic iron pivalate cluster [CrCoFeO(O₂CᵗBu)₆(HO₂CᵗBu)₃] as a single source precursor to synthesise monodisperse cobalt chromium ferrite (FeCoCrO₄) nanoparticles by the hot injection thermolysis method. The precursor was thermolysed in oleylamine and oleic acid, with diphenyl ether as solvent, at 260 °C. The effect of reaction time on the stoichiometry, phases, and morphology of the nanoparticles was studied. The p-XRD patterns of the nanoparticles obtained after one hour showed a pure phase of cubic iron cobalt chromium ferrite (FeCoCrO₄). TEM showed that more monodisperse spherical ferrite nanoparticles were obtained after one hour. Magnetic measurements revealed that the ferrite particles are superparamagnetic at room temperature. The nanoparticles were characterised by Powder X-ray Diffraction (p-XRD), Transmission Electron Microscopy (TEM), Energy Dispersive Spectroscopy (EDS) and Superconducting Quantum Interference Device (SQUID).

Keywords: cobalt chromium ferrite, colloidal, hot injection thermolysis, monodisperse, reaction time, single source precursor, quaternary ferrite nanoparticles

Procedia PDF Downloads 280
281 Historical Analysis of the Evolution of Swiss Identity and the Successful Integration of Multilingualism into the Swiss Concept of Nationhood

Authors: James Beringer

Abstract:

Switzerland’s ability to forge a strong national identity across linguistic barriers has long been of interest to nationalism scholars. This raises the question of how this has been achieved, given that traditional explanations of luck or exceptionalism appear highly reductionist. This paper evaluates the theory that successful Swiss management of linguistic diversity stems from the strong integration of multilingualism into Swiss national identity. Using archival analysis of Swiss government records, historical accounts of prominent Swiss citizens, as well as secondary literature concerning the fundamental aspects of Swiss national identity, this paper charts the historical evolution of Swiss national identity. It explains how multilingualism was deliberately and successfully integrated into Swiss national identity as a response to political fragmentation along linguistic lines during the First World War. Its primary conclusions are the following. Firstly, the earliest foundations of Swiss national identity were purposefully removed from any association with a single national language. This produced symbols, myths, and values (such as a strong commitment to communalism, the imagery of the Swiss natural landscape, and the use of Latin expressions) which can be adopted across Swiss linguistic groups. Secondly, the First World War triggered a turning point in the evolution of Swiss national identity. The fundamental building blocks proved insufficient in preventing political fractures along linguistic lines, as each Swiss linguistic group gravitated towards its linguistic neighbours within Europe. To avoid a repeat of such fragmentation, a deliberate effort was made to fully integrate multilingualism as a fundamental aspect of Swiss national identity. Existing natural symbols, such as the St Gotthard Mountains, were recontextualized in order to become associated with multilingualism. The education system was similarly reformed to reflect the unique multilingual nature of the Swiss nation. The successful result of this process can be readily observed in polls and surveys, with large segments of the Swiss population highlighting multilingualism as a uniquely Swiss characteristic, indicating the symbiotic connection between multilingualism and the Swiss nation.

Keywords: language's role in identity formation, multilingualism in nationalism, national identity formation, Swiss national identity history

Procedia PDF Downloads 156
280 Symmetry of Performance across Lower Limb Tests between the Dominant and Non-Dominant Legs

Authors: Ghulam Hussain, Herrington Lee, Comfort Paul, Jones Richard

Abstract:

Background: To determine the functional limitations of the lower limbs or readiness to return to sport, most rehabilitation programs use some form of testing; however, the pass criteria are still unknown. This study aims to investigate the differences between the dominant and non-dominant leg performances across several lower limb tasks: hop tests, two-dimensional (2D) frontal plane projection angle (FPPA) tests, and isokinetic muscle tests. This study also provides reference values for the limb symmetry index (LSI) for the hop and isokinetic muscle strength tests. Twenty recreationally active participants were recruited, 11 males and 9 females (age 23.65±2.79 years; height 169.9±3.74 cm; body mass 74.72±5.81 kg). All tests were undertaken on the dominant and non-dominant legs. These tests were (1) hop tests, which include the horizontal hop for distance and crossover hop tests; (2) frontal plane projection angle (FPPA): 2D capture from two different tasks, forward hop landing and squatting; and (3) isokinetic muscle strength tests of four different muscle groups: the quadriceps, hamstrings, ankle plantar flexors, and hip extensors. The main outcome measurements were, for (1) the hop tests: the maximum distance, taken when undertaking the single/crossover hop for distance using a standard tape measure; (2) the FPPA: the knee valgus angle, measured from the maximum knee flexion position using a single 2D camera; and (3) the isokinetic muscle strength tests: three variables, peak torque, peak torque to body weight, and total work to body weight. All the muscle strength tests were applied in both concentric and eccentric muscle actions at a speed of 60°/sec. This study revealed no differences between the dominant and non-dominant leg performance, and an LSI of 85% was achieved by the majority of the subjects in both the hop and isokinetic muscle tests; therefore, one leg's performance can define the other.
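
The limb symmetry index referred to above is conventionally computed as the non-dominant limb's score relative to the dominant limb's, expressed as a percentage; a minimal sketch is given below (the hop distances in the example are hypothetical, not study data).

```python
def limb_symmetry_index(non_dominant, dominant):
    """LSI as a percentage of the non-dominant relative to the dominant limb."""
    return 100.0 * non_dominant / dominant

# Example: single hop distances of 148 cm (non-dominant) vs. 160 cm (dominant)
lsi = limb_symmetry_index(148, 160)
print(f"LSI = {lsi:.1f}% -> {'meets' if lsi >= 85 else 'fails'} the 85% criterion")
```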

Keywords: 2D FPPA, hop tests, isokinetic testing, LSI

Procedia PDF Downloads 38
279 Monodisperse Quaternary Cobalt Chromium Ferrite Nanoparticles Synthesised from a Single Source Precursor

Authors: Khadijat O. Abdulwahab, Mohammad A. Malik, Paul O’Brien, Grigore A. Timco, Floriana Tuna

Abstract:

The synthesis of spinel ferrite nanoparticles with a narrow size distribution is very crucial in their numerous applications, including information storage, hyperthermia treatment, drug delivery, contrast agents in magnetic resonance imaging, catalysis, sensors, and environmental remediation. Ferrites have the general formula MFe2O4 (M = Fe, Co, Mn, Ni, Zn, etc.) and possess remarkable electrical and magnetic properties which depend on the cations, the method of preparation, the size, and the site occupancies. To the best of our knowledge, there are no reports on the use of a single source precursor to synthesise quaternary ferrite nanoparticles. Herein, we demonstrated the use of the trimetallic iron pivalate cluster [CrCoFeO(O2CtBu)6(HO2CtBu)3] as a single source precursor to synthesise monodisperse cobalt chromium ferrite (FeCoCrO4) nanoparticles by the hot injection thermolysis method. The precursor was thermolysed in oleylamine and oleic acid, with diphenyl ether as solvent, at its boiling point (260°C). The effect of concentration on the stoichiometry, phases, and morphology of the nanoparticles was studied. The p-XRD patterns of the nanoparticles obtained at both concentrations matched cubic iron cobalt chromium ferrite (FeCoCrO4). TEM showed that more monodisperse spherical ferrite nanoparticles with an average diameter of 4.0 ± 0.4 nm were obtained at the higher precursor concentration. Magnetic measurements revealed that all the ferrite particles are superparamagnetic at room temperature. The nanoparticles were characterised by Powder X-ray Diffraction (p-XRD), Transmission Electron Microscopy (TEM), Inductively Coupled Plasma (ICP), Electron Probe Microanalysis (EPMA), Energy Dispersive Spectroscopy (EDS) and Superconducting Quantum Interference Device (SQUID).

Keywords: quaternary ferrite nanoparticles, single source precursor, monodisperse, cobalt chromium ferrite, colloidal, hot injection thermolysis

Procedia PDF Downloads 249
278 Sentiment Mapping through Social Media and Its Implications

Authors: G. C. Joshi, M. Paul, B. K. Kalita, V. Ranga, J. S. Rawat, P. S. Rawat

Abstract:

As a habitat of the global village, every place has established connections through the strength and power of social media, piercing through political boundaries. Social media is a digital platform where people across the world can interact, with the advantages of being universal, anonymous, easily accessible, and indirect, and of enabling the gathering and sharing of information. The power of social media lies in the intensity with which extreme opinions or feelings are shared, in contrast to personal interactions, and these can be easily mapped in the form of sentiment mapping. Easy access to social networking sites such as Facebook, Twitter, and blogs has created unprecedented opportunities for citizens to voice their opinions, loaded with the dynamics of emotions. These further influence human thoughts, in which social media plays a very active role. A recent incident of public importance was selected as a case study to map the sentiments of people through Twitter. Understanding those dynamics through the eyes of ordinary people can be challenging. With the help of the R programming language and GIS techniques, sentiment maps were produced. The emotions flowing worldwide in the form of tweets were extracted and analyzed. The number of tweets diminished by 91% from 25/08/2017 to 31/08/2017. A surge of sentiment emerged near the origin of the case, i.e., Delhi, Haryana, and Punjab, and the capital showed maximum influence, resulting in a spillover effect near Delhi. After calculating the sentiment scores of the tweets, the prevailing sentiment was neutral (45.37%), followed by negative (28.6%) and positive (21.6%). The results can be used to determine the spatial distribution of digital penetration in India, where the highest concentration lies in Mumbai and the lowest in North East India and Jammu and Kashmir.
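
As a rough illustration of lexicon-based sentiment scoring of tweets like that described above: the study used the R language, while the sketch below is in Python, and the mini-lexicon and example tweets are hypothetical, not taken from the study.

```python
from collections import Counter

# Hypothetical mini-lexicon; a real analysis would use a full sentiment dictionary.
LEXICON = {"good": 1, "great": 2, "support": 1, "bad": -1, "angry": -2, "sad": -1}

def tweet_sentiment(text):
    """Sum word scores and bucket the tweet as positive, negative or neutral."""
    score = sum(LEXICON.get(w.strip(".,!?#").lower(), 0) for w in text.split())
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tweets = ["Great support from the volunteers", "This situation makes me angry and sad",
          "Roads closed near the venue this morning"]
print(Counter(tweet_sentiment(t) for t in tweets))
```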

Keywords: sentiment mapping, digital literacy, GIS, R statistical language, spatio-temporal

Procedia PDF Downloads 125
277 Assessment of Students' Skills in Error Detection in SQL Classes Using a Rubric Framework: An Empirical Study

Authors: Dirson Santos De Campos, Deller James Ferreira, Anderson Cavalcante Gonçalves, Uyara Ferreira Silva

Abstract:

Rubrics in learning research provide many evaluation criteria and expected performance standards linked to defined student activities, learning objectives, and pedagogical objectives. Although rubrics are used in education at all levels, academic literature on rubrics as a tool to support research in SQL education is quite rare. There is a large class of SQL queries that is syntactically correct, but certainly not all of them are semantically correct. Detecting and correcting errors is a recurring problem in SQL education. In this paper, we use the Rubric Abstract Framework (RAF), which consists of steps that allow us to map the information used to measure student performance, guided by didactic objectives defined by the teacher, as long as the domain modeling is contextualized by the rubric. An empirical study was done that demonstrates how rubrics can mitigate student difficulties in finding logical errors and ease teacher workload in SQL education. Detecting and correcting logical errors is an important skill for students. Researchers have proposed several ways to improve SQL education because these skills are crucial in software engineering and computer science. The RAF was instantiated in an empirical study conducted during the COVID-19 pandemic in a database course. The pandemic transformed face-to-face education into remote education, without in-person classes. The lab activities were conducted remotely, which hindered the teaching-learning process and, in particular for this research, the verification of evidence or statements of the knowledge, skills, and abilities (KSAs) of students. A wide range of research in academia and industry involves databases. The innovation proposed in this paper is the approach used, in which the results obtained when using rubrics to map logical errors in query formulation were analyzed and the gains obtained by students were empirically verified. The research approach can be used in the post-pandemic period in both classroom and distance learning.

Keywords: rubric, logical error, structured query language (SQL), empirical study, SQL education

Procedia PDF Downloads 160
276 Assessment of Biotic and Abiotic Water Factors of Antiao and Jiabong Rivers for Benthic Algae

Authors: Geno Paul S. Cumla, Jan Mariel M. Gentiles, M. Brenda Gajelan-Samson

Abstract:

Eutrophication is a process wherein there is a surplus of nutrients in a lake or river. Because of the excess nutrients, harmful cyanobacteria and, primarily, algae, which can contain toxins, grow, and hypoxia develops. Algal blooms can cause fish kills, limit light penetration (which reduces the growth of aquatic organisms and causes die-offs of plants), and produce conditions that are dangerous to aquatic and human life. The main cause of eutrophication is the presence of excessive amounts of phosphorus (P) and nitrogen (N). Nitrogen is necessary for the production of plant tissues and is usually used to synthesize proteins. Nitrate is a compound that contains nitrogen, and at elevated levels it can cause harmful effects. Excessive phosphorus, displaced through human activity, is the major cause of algal growth as well as of degraded water quality. To accomplish this study, assessment of soluble inorganic nitrogen (SIN), assessment of soluble reactive phosphate (SRP), determination of chlorophyll a (Chl-a) concentration, and determination of the dominant taxa were carried out. The study addresses the high probability of algal blooms in Maqueda Bay by assessing the biotic and abiotic factors of the Antiao and Jiabong rivers. The data are used to predict the overgrowth of algae and to create awareness in order to prevent such an event from taking place. The study assesses the adverse effects that could be prevented by understanding and controlling algae. This should help predict future cases of algal blooms and provide government agencies, which require data, with the means to create programs to prevent and assess these issues.

Keywords: eutrophication, chlorophyll a, nitrogen, phosphorus, red tide, Kjeldahl method, spectrophotometer, assessment of soluble inorganic nitrogen, SIN, assessment of soluble reactive phosphate, SRP

Procedia PDF Downloads 116
275 Investigating Elements That Influence Higher Education Institutions’ Digital Maturity

Authors: Zarah M. Bello, Nathan Baddoo, Mariana Lilley, Paul Wernick

Abstract:

In this paper, we present findings from a multi-part study to evaluate candidate elements reflecting the level of digital capability maturity (DCM) in higher education and the relationship between these elements. We will use these findings to propose a model of DCM for educational institutions. We suggest that the success of learning in higher education is dependent in part on the level of maturity of digital capabilities of institutions as well as the abilities of learners and those who support the learning process. It is therefore important to have a good understanding of the elements that underpin this maturity as well as their impact and interactions in order to better exploit the benefits that technology presents to the modern learning environment and support its continued improvement. Having identified ten candidate elements of digital capability that we believe support the level of a University’s maturity in this area as well as a number of relevant stakeholder roles, we conducted two studies utilizing both quantitative and qualitative research methods. In the first of these studies, 85 electronic questionnaires were completed by various stakeholders in a UK university, with a 100% response rate. We also undertook five in-depth interviews with management stakeholders in the same university. We then utilized statistical analysis to process the survey data and conducted a textual analysis of the interview transcripts. Our findings support our initial identification of candidate elements and support our contention that these elements interact in a multidimensional manner. This multidimensional dynamic suggests that any proposal for improvement in digital capability must reflect the interdependency and cross-sectional relationship of the elements that contribute to DCM. Our results also indicate that the notion of DCM is strongly data-centric and that any proposed maturity model must reflect the role of data in driving maturity and improvement. We present these findings as a key step towards the design of an operationalisable DCM maturity model for universities.

Keywords: digital capability, elements, maturity, maturity framework, university

Procedia PDF Downloads 118
274 Speech Anxiety in Higher Education Students - Retention of an Ancestral Trait: A Study into the Students' Perspective of Communication Anxiety with Suggestions on How to Minimise Student Distress

Authors: Paul D. Facey, Claire Morgan

Abstract:

Speech anxiety is thought to be deep-seated within the human evolutionary lineage. As a result, almost all people display high levels of anxiety when asked to communicate in front of an audience. However, proficiency in oral communication is considered an essential skill for a graduate career, and significant emphasis is placed on developing these skills in many degree programs. Because of this, many degree schemes incorporate some form of assessed dialogic presentation. Yet a student's anxiety over public speaking, especially if severe, can be so great that at worst it can cause the student to withdraw from their study. This study investigated how students perceive their own levels of anxiety when faced with public speaking, using the Personal Report of Public Speaking Anxiety (PRPSA) questionnaire developed by McCroskey. Additionally, students were asked to provide examples of adjustments that could be implemented that they felt would alleviate some or all of their anxiety. The results of the study indicated that the majority of the students experienced a moderate level of anxiety. However, further analysis showed that of those who were in the 'moderate anxiety' group, 43% fell into the higher range, suggesting that overall more students experience high levels of anxiety when faced with public speaking than may at first be envisaged. Thus, it is essential that steps are taken to address student anxiety in order that students engage with presentations, are motivated and encouraged, and do not avoid such assignments. The feedback from our students indicated a need to implement systematic desensitization programs where students learn to overcome their anxiety through a series of sessions that gradually increase their anxiety levels. Furthermore, these sessions should be run in parallel with skills sessions in order for students to be better prepared and to allow self-reflection and self-analysis. This study highlights the paucity of these sessions on many degree schemes and suggests that they should form an integral part of a student's early academic learning.

Keywords: student anxiety, communication anxiety, public speaking, higher education, desensitisation

Procedia PDF Downloads 216
272 Synthesis and Two-Photon Polymerization of a Cytocompatible Tyramine-Functionalized Hyaluronic Acid Hydrogel That Mimics the Chemical, Mechanical, and Structural Characteristics of Spinal Cord Tissue

Authors: James Britton, Vijaya Krishna, Manus Biggs, Abhay Pandit

Abstract:

Regeneration of the spinal cord after injury remains a great challenge due to the complexity of this organ. Inflammation and gliosis at the injury site hinder the outgrowth of axons and hence prevent synaptic reconnection and reinnervation. Hyaluronic acid (HA) is the main component of the spinal cord extracellular matrix and plays a vital role in cell proliferation and axonal guidance. In this study, we have synthesized and characterized a photo-cross-linkable HA-tyramine (tyr) hydrogel from a chemical, mechanical, electrical, biological and structural perspective. From our experimentation, we have found that HA-tyr can be synthesized with controllable degrees of tyramine substitution using click chemistry. The complex modulus (G*) of HA-tyr can be tuned to mimic the mechanical properties of the native spinal cord via optimization of the photo-initiator concentration and UV exposure. We have examined the degree of tyramine-tyramine covalent bonding (polymerization) as a function of UV exposure and photo-initiator use via photo- and nuclear magnetic resonance spectroscopy. Both swelling and enzymatic degradation assays were conducted to examine the resilience of our 3D printed hydrogel constructs in vitro. Using a femtosecond 780 nm laser, the two-photon polymerization of the HA-tyr hydrogel in the presence of riboflavin photoinitiator was optimized. A laser power of 50 mW and a scan speed of 30,000 μm/s produced high-resolution spatial patterning within the hydrogel with sustained mechanical integrity. Using dorsal root ganglion explants, the cytocompatibility of photo-crosslinked HA-tyr was assessed. Using potentiometry, the electrical conductivity of photo-crosslinked HA-tyr was assessed and compared to that of native spinal cord tissue as a function of frequency. In conclusion, we have developed a biocompatible hydrogel that can be used for photolithographic 3D printing to fabricate tissue engineered constructs for neural tissue regeneration applications.

Keywords: 3D printing, hyaluronic acid, photolithography, spinal cord injury

Procedia PDF Downloads 132
272 Screening of the Genes FOLH1 and MTHFR among the Mothers of Congenital Neural Tube Defected Babies in West Bengal, India

Authors: Silpita Paul, Susanta Sadhukhan, Biswanath Maity, Madhusudan Das

Abstract:

Neural tube defects (NTDs) are one of the most common forms of birth defect and affect ~300,000 newborns worldwide each year. The prevalence is higher in Northern India (11 per 1,000 births) compared to Southern India (5 per 1,000 births). NTDs are among the common birth defects related to low blood folate and homocysteine (Hcy) concentrations. Though the mechanism is still unknown, it is now established that NTDs in humans are polygenic in nature and follow a heterogeneous trait. In spite of this heterogeneity, polymorphisms in a few genes significantly affect the trait of NTDs. Polymorphisms in the genes FOLH1 and MTHFR play an important role in NTDs. In this study, the polymorphisms of these genes were screened by bi-directional sequencing from 30 mothers with NTD babies as cases. The results revealed that 26.67% of patients had a bi-allelic FOLH1 polymorphism. The polymorphism has been identified as p.Y60H and is frequently implicated in NTDs. The study of the MTHFR gene showed 2 different SNPs, rs1801131 (at exon 4) and rs1801131 (at exon 7). The study showed 6.67% of patients with both mono- and bi-allelic MTHFR rs1801131 polymorphism and 6.67% of patients with bi-allelic MTHFR rs1801131 polymorphism. These polymorphisms are responsible for the p.A222V and p.E429A changes, respectively, and are frequently involved in NTD formation. These polymorphisms mainly affect the absorption of dietary folate from the intestine and the formation of 5-methyltetrahydrofolate (5-MTHF) from 5,10-methylenetetrahydrofolate (5,10-MTHF), which is the functional folate form in our system. Though the study is not complete yet, these polymorphisms play crucial roles in the formation of NTDs in other world populations. Based on the results to date, it can be concluded that they also play a significant role in our population, as no changes were found in the control samples.

Keywords: neural tube defects, polymorphism, FOLH1, MTHFR

Procedia PDF Downloads 285
271 Four-Electron Auger Process for Hollow Ions

Authors: Shahin A. Abdel-Naby, James P. Colgan, Michael S. Pindzola

Abstract:

A time-dependent close-coupling method is developed to calculate total, double, and triple autoionization rates for hollow atomic ions of four-electron systems. This work was motivated by recent observations of the four-electron Auger process in near K-edge photoionization of C+ ions. The time-dependent close-coupled equations are solved using lattice techniques to obtain a discrete representation of the radial wave functions and all operators on a four-dimensional grid with uniform spacing. Initial excited states are obtained by relaxation of the Schrödinger equation in imaginary time using a Schmidt orthogonalization method involving interior subshells. The radial wave function grids are partitioned over the cores of a massively parallel computer, which is essential due to the large memory requirements needed to store the coupled wave functions and the long run times needed to reach convergence of the ionization process. Total, double, and triple autoionization rates are obtained by propagation of the time-dependent close-coupled equations in real time using integration over bound and continuum single-particle states. These states are generated by matrix diagonalization of one-electron Hamiltonians. The total autoionization rate for each L excited state is found to be slightly above the single autoionization rate for the excited configuration obtained using configuration-average distorted-wave theory. As expected, we find the double and triple autoionization rates to be much smaller than the total autoionization rates. Future work can be extended to study electron-impact triple ionization of atoms or ions. The work was supported in part by grants from the American University of Sharjah and the US Department of Energy. Computational work was carried out at the National Energy Research Scientific Computing Center (NERSC) in Berkeley, California, USA.
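
As a much simplified illustration of the imaginary-time relaxation step mentioned above: the actual work relaxes coupled wave functions on a four-dimensional lattice with Schmidt orthogonalization, whereas the sketch below is a one-dimensional toy with an assumed harmonic potential, grid, and time step.

```python
import numpy as np

# Toy 1-D relaxation in imaginary time: psi <- psi - dtau * H psi, renormalised each
# step, so the state decays towards the lowest eigenstate of H.
n, dx, dtau = 400, 0.05, 1e-4
x = (np.arange(n) - n / 2) * dx
V = 0.5 * x**2                              # harmonic potential, atomic units (assumed)
psi = np.exp(-((x - 1.0) ** 2))             # arbitrary starting guess

for _ in range(50000):
    lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / dx**2
    psi += dtau * (0.5 * lap - V * psi)
    psi /= np.sqrt(np.sum(psi**2) * dx)     # keep the state normalised

lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / dx**2
energy = np.sum(psi * (-0.5 * lap + V * psi)) * dx
print(f"Relaxed energy ~ {energy:.3f} (exact harmonic-oscillator ground state: 0.5)")
```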

Keywords: hollow atoms, autoionization, auger rates, time-dependent close-coupling method

Procedia PDF Downloads 131
270 Organisational Culture and the Role of the Mental Health Nurse: An Ethnography of the New Graduate Nurse Experience

Authors: Mary-Ellen Hooper, Graeme Browne, Anthony Paul O'Brien

Abstract:

Background: It has been reported that the experience of the organisational workplace culture for new graduate mental health nurses plays an important role in their attraction and retention to the discipline. Additionally, other research indicates that a negative workplace culture contributes to their dissatisfaction and attrition rate. Method: An ethnographic research design was applied to explore the subcultural experiences of new graduate nurses as they encounter mental health nursing. Data were collected between April and September 2017 across 6 separate Australian (NSW) mental health units. Data comprised semi-structured interviews (n=24) and 31 episodes of field observation (62 hours). A total of 26 new graduate and recent graduate nurses participated in the study: 14 new graduate nurses and 12 recently graduated nurses. Results: A key finding from this study was the new graduates' difficulty in articulating the role of the mental health nurse. Participants described a dichotomy between their ideological view of the mental health nurse and the reality of clinical practice. The participants' ideological view of the mental health nurse involved providing holistic and individualised care within a flexible framework. Participants, however, described feeling powerless to change the recovery practices within the mental health service(s) because of their low status within the hierarchy, resulting in participants choosing to fit into the existing culture or considering leaving the field altogether. Conclusion: An incongruence between the values and ideals of an organisational culture and the reality shock of practice is shown to contribute to role ambiguity among its members. New graduate nurses entering the culture of mental health nursing describe role ambiguity resulting in dissatisfaction with practice. The culture and philosophy inherent to a service are posited to be crucial in creating positive experiences for graduate nurses.

Keywords: culture, mental health nurse, mental health nursing role, new graduate nurse

Procedia PDF Downloads 131
269 A Systematic Review of Pedometer- or Accelerometer-Based Interventions for Increasing Physical Activity in Low Socioeconomic Groups

Authors: Shaun G. Abbott, Rebecca C. Reynolds, James B. Etter, John B. F. de Wit

Abstract:

The benefits of physical activity (PA) on health are well documented. Low socioeconomic status (SES) is associated with poor health, with PA a suggested mediator. Pedometers and accelerometers offer an effective behavior change tool to increase PA levels. While the role of pedometer and accelerometer use in increasing PA is recognized in many populations, little is known about low-SES groups. We are aiming to assess the effectiveness of pedometer- and accelerometer-based interventions for increasing PA step count and improving subsequent health outcomes among low-SES groups of high-income countries. Medline, Embase, PsycINFO, CENTRAL and SportDiscus databases were searched to identify articles published before 10th July 2015, using search terms developed from previous systematic reviews. Inclusion criteria are: low-SES participants classified by income, geography, education, occupation or ethnicity; study duration of a minimum of 4 weeks; an intervention and control group; and wearing of an unsealed pedometer or accelerometer to objectively measure PA as step counts per day for the duration of the study. We retrieved 2,142 articles from our database searches after removal of duplicates. Two investigators independently reviewed the titles and abstracts of these articles (50% each), and a combined 20% sample was reviewed to account for inter-assessor variation. We are currently verifying the full texts of 430 articles. Included studies will be critically appraised for risk of bias using guidelines suggested by the Cochrane Public Health Group. Two investigators will extract data concerning the intervention; study design; comparators; steps per day; participants; context; and presence or absence of obesity and/or chronic disease. Heterogeneity amongst studies is anticipated; thus, a narrative synthesis of the data will be conducted, with the simplification of selected results into percentage increases from baseline to allow for between-study comparison. Results will be presented at the conference in December if selected.

Keywords: accelerometer, pedometer, physical activity, socioeconomic, step count

Procedia PDF Downloads 305
268 Agent-Based Modelling to Improve Dairy-origin Beef Production: Model Description and Evaluation

Authors: Addisu H. Addis, Hugh T. Blair, Paul R. Kenyon, Stephen T. Morris, Nicola M. Schreurs, Dorian J. Garrick

Abstract:

Agent-based modeling (ABM) enables an in silico representation of complex systems and captures agent behavior resulting from interaction with other agents and their environment. This study developed an ABM to represent pasture-based beef cattle finishing systems in New Zealand (NZ) using attributes of the rearer, finisher, and processor, as well as specific attributes of dairy-origin beef cattle. The model was parameterized using values representing 1% of NZ dairy-origin cattle and 10% of rearers and finishers in NZ. The cattle agent consisted of 32% Holstein-Friesian, 50% Holstein-Friesian–Jersey crossbred, and 8% Jersey, with the remainder being other breeds. Rearers and finishers repetitively and simultaneously interacted to determine the type and number of cattle populating the finishing system. Rearers brought in four-day-old spring-born calves and reared them until 60 calves (representing a full truck load) had, on average, a live weight of 100 kg before selling them on to finishers. Finishers mainly obtained weaners from rearers, or directly from dairy farmers when weaner demand was higher than the supply from rearers. Fast-growing cattle were sent for slaughter before the second winter, and the remainder were sent before their third winter. The model finished a higher number of bulls than heifers and steers, although it was 4% lower than the industry-reported value. Holstein-Friesian and Holstein-Friesian–Jersey-crossbred cattle dominated the dairy-origin beef finishing system. Jersey cattle accounted for less than 5% of total processed beef cattle. Further studies to include retailer and consumer perspectives and other decision alternatives for finishing farms would improve the applicability of the model for decision-making processes.
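
A heavily simplified sketch of the rearer-finisher interaction described above is given below; the weekly calf intake, birth weight, and growth rate are assumptions for illustration only, while the 60-head truck load and 100 kg sale weight follow the abstract.

```python
import random

class Rearer:
    """Toy rearer agent: buys four-day-old calves and sells a 60-head truck load
    to a finisher once that load averages 100 kg live weight."""
    def __init__(self):
        self.calves = []                                              # live weights in kg

    def step(self):
        self.calves += [random.gauss(40, 3) for _ in range(6)]       # weekly calf intake (assumed)
        self.calves = [w + random.gauss(7, 1) for w in self.calves]  # weekly growth (assumed)
        if len(self.calves) >= 60 and sum(self.calves[:60]) / 60 >= 100:
            load, self.calves = self.calves[:60], self.calves[60:]
            return load
        return []

class Finisher:
    """Toy finisher agent: accumulates weaners bought from the rearer."""
    def __init__(self):
        self.herd = []

    def buy(self, weaners):
        self.herd += weaners

rearer, finisher = Rearer(), Finisher()
for week in range(30):
    finisher.buy(rearer.step())
print(f"Weaners transferred to the finisher after 30 weeks: {len(finisher.herd)}")
```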

Keywords: agent-based modelling, dairy cattle, beef finishing, rearers, finishers

Procedia PDF Downloads 65
267 The Needs of People with a Diagnosis of Dementia and Their Carers and Families

Authors: James Boag

Abstract:

The needs of people with a diagnosis of dementia and of their carers and families are physical, psychosocial, and psychological, and they begin at the time of diagnosis. There is frequently a lack of emotional support and counselling. Care-giving support is required from the presentation of the first symptoms of dementia until death. Alzheimer's disease begins decades before the clinical symptoms appear, and in many cases it remains undiagnosed, or is diagnosed too late for any possible interventions to have an effect. However, if an incorrect diagnosis is given, it may result in a person being treated, without effect, for a type of dementia they do not have, delaying the interventions they should have received. Being diagnosed with dementia can cause emotional distress to the person, and physical and emotional support is needed, which becomes more important as the disease progresses. The severity of the patient's dementia and their symptoms have a bearing on the impact on the carer and the support needed. A lack of insight and/or a denial of the diagnosis, grief, reactions to anticipated future losses, and coping methods to maximise the disease outcome are things that should be addressed. Because of the stigma, it is important for carers not to lose contact with family and others, because social isolation leads to depression and burnout. The impact on a carer's well-being and quality of life can be influenced by the severity of the illness, the type of dementia, its symptoms, healthcare support, financial and social status, career, age, health, residential setting, and relationship to the patient. Carer burnout due to lack of support leads to people diagnosed with dementia being put into residential care prematurely. Often dementia is not recognised as a terminal illness, limiting the ability of the person diagnosed with dementia and their carers to work on advance care planning and to get access to palliative and other support. Many carers have been satisfied with the physical support they were given in their everyday life; however, it was agreed that there is an immense unmet need for psychosocial support, especially after diagnosis and approaching the end of life. Providing continuity and coordination of care is important. Training is necessary for providers to understand that every case is different, and they should understand the complexities. Grief, the emotional response to loss, is suffered during the progression of the disease and long afterwards, and carers should continue to be supported after the death of the person they were caring for.

Keywords: dementia, caring, challenges, needs

Procedia PDF Downloads 65
266 Development of a Humanized Anti-CEA Antibody for the Near Infrared Optical Imaging of Cancer

Authors: Paul J Yazaki, Michael Bouvet, John Shively

Abstract:

Surgery for solid gastrointestinal (GI) cancers such as pancreatic, colorectal, and gastric adenocarcinoma remains the mainstay of curative therapy. Complete resection of the primary tumor with negative margins (R0 resection), its draining lymph nodes, and distant metastases offers the optimal surgical benefit. Real-time fluorescence guided surgery (FGS) promises to improve GI cancer outcomes and is rapidly advancing with tumor-specific antibody-conjugated fluorophores that can be imaged using near infrared (NIR) technology. Carcinoembryonic Antigen (CEA) is a non-internalizing tumor antigen validated as a surface tumor marker expressed in >95% of colorectal, 80% of gastric, and 60% of pancreatic adenocarcinomas. Our humanized anti-CEA hT84.66-M5A (M5A) monoclonal antibody (mAb) was conjugated with the NHS-IRDye800CW fluorophore and shown to rapidly and effectively produce NIR optical images of orthotopically implanted human colon and pancreatic cancer in mouse models. A limitation observed is that these NIR-800 dye-conjugated mAbs have a rapid clearance from the blood, leading to a narrow timeframe for FGS and requiring high doses for effective optical imaging. We developed a novel antibody-fluorophore conjugate by incorporating a PEGylated sidearm linker to shield or mask the IR800 dye's hydrophobicity, which effectively extended the agent's blood circulation half-life, leading to increased tumor sensitivity and lowered normal hepatic uptake. We hypothesized that our unique anti-CEA mAb, linked to the IR800 fluorophore by a PEGylated sidewinder as M5A-SW-IR800, will become the next-generation optical imaging agent: safe, effective, and widely applicable for intraoperative image-guided surgery in CEA-expressing GI cancers.

Keywords: optical imaging, anti-CEA, cancer, fluorescence-guided surgery

Procedia PDF Downloads 119
265 The MoEDAL-MAPP* Experiment - Expanding the Discovery Horizon of the Large Hadron Collider

Authors: James Pinfold

Abstract:

The MoEDAL (Monopole and Exotics Detector at the LHC) experiment, deployed at IP8 on the Large Hadron Collider ring, was the first dedicated search experiment to take data at the Large Hadron Collider (LHC), in 2010. It was designed to search for Highly Ionizing Particle (HIP) avatars of new physics such as magnetic monopoles, dyons, Q-balls, multiply charged particles, massive, slowly moving charged particles, and long-lived massive charged SUSY particles. We shall report on our search at the LHC's Run-2 for magnetic monopoles and dyons produced in p-p collisions and photon fusion. In more detail, we will report our most recent result in this arena: the search for magnetic monopoles produced via the Schwinger mechanism in Pb-Pb collisions. The MoEDAL detector, originally the first dedicated search detector at the LHC, is being reinstalled for the LHC's Run-3 to continue the search for electrically and magnetically charged HIPs with enhanced instantaneous luminosity, detector efficiency, and a factor of ten lower thresholds for HIPs. As part of this effort, we will search for massive, long-lived, singly and multiply charged particles from various scenarios for which MoEDAL has a competitive sensitivity. An upgrade to MoEDAL, the MoEDAL Apparatus for Penetrating Particles (MAPP), is now the LHC's newest detector. The MAPP detector, positioned in UA83, expands the physics reach of MoEDAL to include sensitivity to feebly charged particles with charge, or effective charge, as low as 10⁻³ e (where e is the electron charge). Also, in conjunction with MoEDAL's trapping detector, the MAPP detector gives us a unique sensitivity to extremely long-lived charged particles. MAPP also has some sensitivity to long-lived neutral particles. The addition of an Outrigger detector for MAPP-1, to increase its acceptance for more massive milli-charged particles, is currently at the Technical Proposal stage. Additionally, we will briefly report on the plans for the MAPP-2 upgrade to the MoEDAL-MAPP experiment for the High Luminosity LHC (HL-LHC). This experiment phase is designed to maximize MoEDAL-MAPP's sensitivity to very long-lived neutral messengers of physics beyond the Standard Model. We envisage this detector being deployed in the UGC1 gallery near IP8.

Keywords: LHC, beyond the standard model, dedicated search experiment, highly ionizing particles, long-lived particles, milli-charged particles

Procedia PDF Downloads 45
264 Harnessing Environmental DNA to Assess the Environmental Sustainability of Commercial Shellfish Aquaculture in the Pacific Northwest United States

Authors: James Kralj

Abstract:

Commercial shellfish aquaculture makes significant contributions to the economy and culture of the Pacific Northwest United States. The industry faces intense pressure to minimize environmental impacts as a result of Federal policies like the Magnuson-Stevens Fisheries Conservation and Management Act and the Endangered Species Act. These policies demand the protection of essential fish habitat and declare several salmon species as endangered. Consequently, numerous projects related to the protection and rehabilitation of eelgrass beds, a crucial ecosystem for countless fish species, have been proposed at both state and federal levels. Both eelgrass beds and commercial shellfish farms occupy the same physical space, and therefore understanding the effects of shellfish aquaculture on eelgrass ecosystems has become a top ecological and economic priority of both government and industry. This study evaluates the organismal communities that eelgrass and oyster aquaculture habitats support. Water samples were collected from Willapa Bay, Washington; Tillamook Bay, Oregon; Humboldt Bay, California; and Sammish Bay, Washington to compare species diversity in eelgrass beds, oyster aquaculture plots, and boundary edges between these two habitats. Diversity was assessed using a novel technique: environmental DNA (eDNA). All organisms constantly shed small pieces of DNA into their surrounding environment through the loss of skin, hair, tissues, and waste. In the marine environment, this DNA becomes suspended in the water column allowing it to be easily collected. Once extracted and sequenced, this eDNA can be used to paint a picture of all the organisms that live in a particular habitat making it a powerful technology for environmental monitoring. Industry professionals and government officials should consider these findings to better inform future policies regulating eelgrass beds and oyster aquaculture. Furthermore, the information collected in this study may be used to improve the environmental sustainability of commercial shellfish aquaculture while simultaneously enhancing its growth and profitability in the face of ever-changing political and ecological landscapes.

Keywords: aquaculture, environmental DNA, shellfish, sustainability

Procedia PDF Downloads 226
263 Laboratory-Based Monitoring of Hepatitis B Virus Vaccination Status in North Central Nigeria

Authors: Nwadioha Samuel Iheanacho, Abah Paul, Odimayo Simidele Michael

Abstract:

Background: The World Health Assembly, through the Global Health Sector Strategy on viral hepatitis, calls for the elimination of viral hepatitis as a public health threat by 2030. All hands are on deck to actualize this goal through effective and active vaccination and monitoring tools. Aim: To combine epidemiologic with laboratory Hepatitis B Virus vaccination monitoring tools. Method: Laboratory results of subjects recruited during World Hepatitis Week from July 2020 to July 2021 were analysed, after obtaining their epidemiologic data on Hepatitis B virus risk factors, in the Medical Microbiology Laboratory of Benue State University Teaching Hospital, Nigeria. Results: A total of 500 subjects, comprising males, 60.0% (n=300/500), and females, 40.0% (n=200/500), were recruited. A fifty-three percent majority was in the age range of 26 to 36 years. Serologic profiles were as follows: 15.0% (n=75/500) HBsAg; 7.0% (n=35/500) HBeAg; 8.0% (n=40/500) Anti-HBe; 20.0% (n=100/500) Anti-HBc; and 38.0% (n=190/500) Anti-HBs. Immune responses to vaccination were as follows: 47.0% (n=235/500) immune naïve {no serologic marker + normal ALT}; 33% (n=165/500) immunity by vaccination {Anti-HBs + normal ALT}; 5% (n=25/500) immunity due to previous infection {Anti-HBs, Anti-HBc, +/- Anti-HBe + normal ALT}; 8% (n=40/500) carriers {HBsAg, Anti-HBc, Anti-HBe + normal ALT}; and 7% (n=35/500) Anti-HBe serum-negative infections {HBsAg, HBeAg, Anti-HBc + elevated ALT}. Conclusion: The present 33.0% immunity-by-vaccination coverage in Central Nigeria is much lower than the 41.0% national peak in 2013, and a far cry from the global expectation of eliminating viral hepatitis as a public health threat by 2030 under the Global Health Sector Strategy. Therefore, more creative ideas and collective effort are needed to attain this goal of the World Health Assembly.
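
For illustration, the marker combinations listed above can be expressed as a simple classification rule; the sketch below is a simplified rendering of those combinations, and the function name and the ordering of the checks are assumptions.

```python
def hbv_status(hbsag, hbeag, anti_hbe, anti_hbc, anti_hbs, alt_normal):
    """Classify HBV immune/vaccination status from serologic markers, following
    the (simplified) marker combinations listed in the abstract."""
    if anti_hbs and anti_hbc and alt_normal:
        return "Immunity due to previous infection"
    if anti_hbs and alt_normal:
        return "Immunity by vaccination"
    if hbsag and hbeag and anti_hbc and not alt_normal:
        return "Anti-HBe-negative infection (elevated ALT)"
    if hbsag and anti_hbc and anti_hbe and alt_normal:
        return "Carrier"
    if not any([hbsag, hbeag, anti_hbe, anti_hbc, anti_hbs]) and alt_normal:
        return "Immune naive (candidate for vaccination)"
    return "Indeterminate - review the full profile"

print(hbv_status(hbsag=False, hbeag=False, anti_hbe=False,
                 anti_hbc=False, anti_hbs=True, alt_normal=True))   # Immunity by vaccination
```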

Keywords: Hepatitis B, vaccination status, laboratory tools, resource-limited settings

Procedia PDF Downloads 47
262 Theoretical Study of Substitutional Phosphorus and Nitrogen Pairs in Diamond

Authors: Tahani Amutairi, Paul May, Neil Allan

Abstract:

Many properties of semiconductor materials (mechanical, electronic, magnetic, and optical) can be significantly modified by introducing a point defect. Diamond offers extraordinary properties as a semiconductor, and doping seems to be a viable method of solving the problems associated with the fabrication of diamond-based electronic devices in order to exploit those properties. The dopants are believed to play a significant role in reducing the energy barrier to conduction and in controlling the mobility of the carriers and the resistivity of the film. Although it has been proven that an n-type diamond semiconductor can be obtained with phosphorus doping, the resulting ionisation energy and mobility are still inadequate for practical application. Theoretical studies have revealed that this is partly because the effects of the many phosphorus atoms incorporated in the diamond lattice are compensated by acceptor states. Using spin-polarised hybrid density functional theory and a supercell approach, we explored the effects of bonding one N atom to a P atom in adjacent substitutional sites in diamond. A range of hybrid functionals, including HSE06, B3LYP, PBE0, PBEsol0, and PBE0-13, were used to calculate the formation, binding, and ionisation energies, in order to explore the solubility and stability of the point defect. The equilibrium geometry and the magnetic and electronic structures were analysed and are presented in detail. The defect introduces a unique reconstruction in diamond, in which one of the C atoms coordinated with the N atom is involved in the elongated C-N bond and creates a new bond with the P atom. The simulated infrared spectra of phosphorus-nitrogen defects were investigated with different supercell sizes and found to contain two sharp peaks at the edges of the spectrum, one at a high frequency of 1,379 cm⁻¹ and the second appearing at the low-frequency end, 234 cm⁻¹, as obtained with the largest supercell (216 atoms).

Keywords: DFT, HSE06, B3LYP, PBE0, PBEsol0, PBE0-13

Procedia PDF Downloads 45
261 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as substantial computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more distinct as the k-mer size increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
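A minimal sketch of the kind of k-mer classification pipeline described above is given below, assuming plain sequence strings and using scikit-learn's character n-gram vectorizer as the k-mer counter. The toy sequences, labels, and choice of logistic regression are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical k-mer classification sketch (not the authors' pipeline):
# count k-mers of a fixed size in each genome string and fit a simple classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, matthews_corrcoef

K = 10  # k-mer size; the abstract reports the best model at k = 10

# Toy stand-ins for whole-genome sequences and their phenotype labels (assumptions).
sequences = ["ACGTACGTGGCATCGATCGA", "TTGACGTACCGATCGATCGG",
             "ACGTTTGGCATCGATCGATA", "GGCATCGATCGACGTACGTA"]
labels = [0, 1, 0, 1]

# Character n-grams of length K act as overlapping k-mer counts.
vectorizer = CountVectorizer(analyzer="char", ngram_range=(K, K), lowercase=False)
X = vectorizer.fit_transform(sequences)

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.5, random_state=0, stratify=labels)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = model.predict(X_test)

print("accuracy:", accuracy_score(y_test, pred))
print("MCC:", matthews_corrcoef(y_test, pred))
```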

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 137
260 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as substantial computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more distinct as the k-mer size increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 126
259 Sociological Enquiry into Occupational Risks and Its Consequences among Informal Automobile Artisans in Osun State, Nigeria

Authors: Funmilayo Juliana Afolabi, Joke Haafkens, Paul De Beer

Abstract:

Globally, there is growing concern about reducing workplace accidents in the informal sector. However, there is a dearth of studies on how informal workers perceive the occupational risks to which they are exposed. The way a worker perceives workplace risk influences his or her risk tolerance and risk behavior. The aim of this paper, therefore, is to gain an in-depth understanding of how the artisans perceive the risks at their workplace and how this influences their risk tolerance and risk behavior. This will help in designing meaningful interventions for the artisans and will assist policymakers in formulating policies that support them. Methods: Forty-three artisans were purposively selected for the study; data were generated through observation of the workplace and work practices of the artisans and through in-depth interviews with automobile artisans (panel beaters, mechanics, vulcanizers, and painters) in Osun State, Nigeria. The transcriptions were coded and analyzed using MAXQDA software. Results: The perceived occupational risks among the study groups were the danger of being run over by oncoming vehicles while working by the roadside, the risk of a vehicle falling on workers while they work underneath it, cuts and burns, fire explosions, falls from height, and injuries from bursting tires. The identified risk factors were carelessness of the workers, pressure from customers, inadequate tools, preternatural forces, God's will, and the lack of apprentices to assist them in the workplace. Furthermore, the study revealed that artisans engage in risky behaviors such as siphoning fuel by mouth, owing to the perception that fuel is good for expelling worms and will keep them free from stomach upset. Conclusions: The study concluded that risky behaviors are influenced by the culture, beliefs, and perceptions of the artisans. The study therefore suggests proper health and safety education for the artisans.

Keywords: automobile artisans, informal, occupational risks, Nigeria, sociological enquiry

Procedia PDF Downloads 162
258 Development of Innovative Nuclear Fuel Pellets Using Additive Manufacturing

Authors: Paul Lemarignier, Olivier Fiquet, Vincent Pateloup

Abstract:

In line with the strong desire of nuclear energy players to have ever more effective products in terms of safety, research programs on E-ATF (Enhanced Accident Tolerant Fuels) that are more resilient, particularly to the loss of coolant, have been launched in all countries with nuclear power plants. Among the multitude of solutions being developed internationally, the French Alternative Energies and Atomic Energy Commission (CEA) and its partners are investigating a promising one: CERMET (CERamic-METal) fuel pellets made of a matrix of fissile material, uranium dioxide (UO₂), which has a low thermal conductivity, and a metallic phase with a high thermal conductivity to improve heat evacuation. Work has focused on the development by powder metallurgy of micro-structured CERMETs, characterized by networks of metallic phase embedded in the UO₂ matrix. Other types of macro-structured CERMETs, based on concepts proposed by thermal simulation studies, have been developed with a metallic phase of specific geometry to optimize heat evacuation. These cannot be produced using traditional processes, so additive manufacturing, which revolutionizes traditional design principles, is used to produce these innovative prototype concepts. At CEA Cadarache, work is first carried out on a non-radioactive surrogate material, alumina, in order to build up skills and to develop the equipment, in particular the robocasting machine, an additive manufacturing technique selected for its simplicity and for the possibility of optimizing the paste formulations. A manufacturing chain was set up, comprising paste production, 3D printing of the pellets, and the associated thermal post-treatment. The work leading to the first macro-structured alumina/molybdenum CERMETs will be presented. This work was carried out with the support of Framatome and EDF.
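To make concrete why the geometry of the metallic phase matters for heat evacuation, the classical series/parallel bounds on the effective thermal conductivity of a two-phase composite are recalled below. These are standard textbook bounds added purely for illustration; they are not results from the authors' thermal simulations, and the symbols are assumptions introduced here.

```latex
% Classical bounds on the effective thermal conductivity k_eff of a two-phase composite
% with metal volume fraction v_m and ceramic fraction 1 - v_m (illustration only).
\[
\underbrace{\left(\frac{v_m}{k_m} + \frac{1 - v_m}{k_c}\right)^{-1}}_{\text{series: layers across the heat flux}}
\;\le\; k_{\mathrm{eff}} \;\le\;
\underbrace{v_m\, k_m + (1 - v_m)\, k_c}_{\text{parallel: continuous metal paths along the flux}}
\]
```

A continuous, well-oriented metallic network pushes the composite towards the upper (parallel) bound, which is the intuition behind preferring a specific macro-structured geometry over a dispersed metallic phase.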

Keywords: additive manufacturing, alumina, CERMET, molybdenum, nuclear safety

Procedia PDF Downloads 53
257 Epstein-Barr Virus Alters ATM-Dependent DNA Damage Responses in Germinal Centre B-Cells during Early Infection

Authors: Esther N. Maina, Anna Skowronska, Sridhar Chaganti, Malcolm A. Taylor, Paul G. Murray, Tatjana Stankovic

Abstract:

Epstein-Barr virus (EBV) has been implicated in the pathogenesis of human tumours of B-cell origin. The demonstration that a proportion of Hodgkin lymphomas and all Burkitt's lymphomas harbour EBV suggests that the virus contributes to the development of these malignancies. However, the mechanisms of lymphomagenesis remain largely unknown. To determine whether EBV causes DNA damage and alters the DNA damage response in cells of EBV-driven lymphoma origin, germinal centre (GC) B cells were infected with EBV, and DNA damage responses to gamma ionising radiation (IR) were assessed at early time points (12-72 h) after infection, prior to the establishment of lymphoblastoid cell lines (LCLs). In the presence of EBV, we observed induction of spontaneous DNA double-strand breaks (DSBs) and downregulation of ATM-dependent phosphorylation in response to IR. This downregulation coincided with a reduced ability of infected cells to repair IR-induced DNA double-strand breaks, as measured by the kinetics of gamma-H2AX, a marker of double-strand breaks, and by the tail moment of the comet assay. Furthermore, we found that the alteration of DNA damage responses coincided with the expression of the LMP-1 protein. The presence of the virus did not affect the localization of ATM-dependent DNA repair proteins to sites of damage but instead led to increased expression of PP5, a phosphatase that regulates ATM function. The impact of the virus on DNA repair was most prominent 24 h after infection, suggesting that this time point is crucial for viral establishment in B cells. Our results suggest that during early infection, EBV dampens crucial cellular responses to DNA double-strand breaks; this facilitates successful viral infection but might at the same time provide a mechanism for tumour development.
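For readers unfamiliar with the comet-assay endpoint mentioned above, one commonly used definition of the tail moment is given below. This is an illustrative, textbook-style formula; the abstract does not specify which variant the authors used.

```latex
% A commonly used definition of the comet-assay tail moment (illustrative assumption):
\[
\text{Tail moment} \;=\; \text{tail length} \times \frac{\text{DNA in tail}\,(\%)}{100}
\]
% The "Olive tail moment" variant instead uses the distance between the intensity
% centroids of the head and tail in place of the tail length.
```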

Keywords: EBV, ATM, DNA damage, germinal center cells

Procedia PDF Downloads 324