Search results for: specific methane production
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14522

2942 Mitigating Self-Regulation Issues in the Online Instruction of Math

Authors: Robert Vanderburg, Michael Cowling, Nicholas Gibson

Abstract:

Mathematics is one of the core subjects taught in the Australian K-12 education system and is considered an important component for future studies in areas such as engineering and technology. In addition to this, Australia has been a world leader in distance education due to the vastness of its geographic landscape. Despite this, research is still needed on distance math instruction. Even though delivery of curriculum has given way to online studies, and there is a resultant push for computer-based (PC, tablet, smartphone) math instruction, much instruction still involves practice problems similar to those in the original curriculum packs, without the ability for students to self-regulate their learning using the full interactive capabilities of these devices. Given this need, this paper addresses issues students have during online instruction. This study consists of 32 students struggling with mathematics enrolled in a math tutorial conducted in an online setting. The study used a case study design to understand some of the barriers hindering the students’ success. Data were collected by tracking students’ practice and quizzes, tracking engagement with the site, recording one-on-one tutorials, and interviewing the students. Results revealed that when students face cognitively straining tasks in an online instructional setting, the first thing to dissipate is their ability to self-regulate. The results also revealed that instructors could ameliorate the situation and provided useful data on strategies that could be used for designing future online tasks. Specifically, instructors could utilize cognitive dissonance strategies to reduce the cognitive drain of the online tasks. They could segment the instruction process to reduce the cognitive demands of the tasks and provide in-depth self-regulatory training, freeing mental capacity for the mathematics content. Finally, instructors could make specific scheduling and assignment-structure changes to reduce the amount of student-centered self-regulatory tasks in the class. These findings are discussed in more detail and summarized in a framework that can be used for future work.

Keywords: digital education, distance education, mathematics education, self-regulation

Procedia PDF Downloads 136
2941 Comparison of Zinc Amino Acid Complex and Zinc Sulfate in Diet for Asian Seabass (Lates calcarifer)

Authors: Kanokwan Sansuwan, Orapint Jintasataporn, Srinoy Chumkam

Abstract:

Asian seabass is one of the economically important fish of Thailand and other countries in Southeast Asia. Zinc is an essential trace metal for fish, vital to various biological processes and functions. It is required for normal growth and is indispensable in the diet. Therefore, the artificial diets offered to intensively cultivated fish must contain the zinc content required by the animal's metabolism for health maintenance and high weight gain rates. However, essential elements must also be in an available form to be utilized by the organism. Thus, this study was designed to evaluate the application of different zinc forms, organic zinc (zinc amino acid complex) and inorganic zinc (zinc sulfate), as feed additives in diets for Asian seabass. Three groups with five replicates of fish (mean weight 22.54 ± 0.80 g) were given a basal diet either unsupplemented (control) or supplemented with 50 mg Zn kg−¹ as zinc sulfate (ZnSO₄) or zinc amino acid complex (ZnAA). The feeding regimen was initially set at 3% of body weight per day, and the feed amount was then adjusted weekly according to the actual feeding performance. The experiment was conducted for 10 weeks. Fish supplemented with ZnAA had the highest values in all studied growth indicators (weight gain, average daily growth and specific growth rate), followed by fish fed the ZnSO₄ diet, with the lowest values in fish fed the control diet. Lysozyme and superoxide dismutase (SOD) activities of fish supplemented with ZnAA were significantly higher compared with all other groups (P < 0.05). Fish supplemented with ZnSO₄ exhibited a significant increase in digestive enzyme activities (protease, pepsin and trypsin) compared with ZnAA and the control (P < 0.05). However, no significant differences were observed for RNA and protein in muscle (P > 0.05). The results of the present work suggest that ZnAA is a better source of trace elements for Asian seabass, based on the growth performance and immunity indices examined in this study.

Keywords: Asian seabass, growth performance, zinc amino acid complex (ZnAA), zinc sulfate (ZnSO₄)

Procedia PDF Downloads 182
2940 Thermodynamic Analysis of Surface Seawater under Ocean Warming: An Integrated Approach Combining Experimental Measurements, Theoretical Modeling, Machine Learning Techniques, and Molecular Dynamics Simulation for Climate Change Assessment

Authors: Nishaben Desai Dholakiya, Anirban Roy, Ranjan Dey

Abstract:

Understanding ocean thermodynamics has become increasingly critical as Earth's oceans serve as the primary planetary heat regulator, absorbing approximately 93% of excess heat energy from anthropogenic greenhouse gas emissions. This investigation presents a comprehensive analysis of Arabian Sea surface seawater thermodynamics, focusing specifically on heat capacity (Cp) and the thermal expansion coefficient (α), parameters fundamental to global heat distribution patterns. Through high-precision experimental measurements of ultrasonic velocity and density across varying temperature (293.15-318.15 K) and salinity (0.5-35 ppt) conditions, we characterize critical thermophysical parameters including specific heat capacity, thermal expansion, and isobaric and isothermal compressibility coefficients in natural seawater systems. The study employs advanced machine learning frameworks, namely Random Forest, Gradient Boosting, Stacked Ensemble Machine Learning (SEML), and AdaBoost, with SEML achieving exceptional accuracy (R² > 0.99) in heat capacity predictions. The findings reveal significant temperature-dependent molecular restructuring: enhanced thermal energy disrupts hydrogen-bonded networks and ion-water interactions, manifesting as decreased heat capacity with increasing temperature (negative ∂Cp/∂T). This mechanism creates a positive feedback loop where reduced heat absorption capacity potentially accelerates oceanic warming cycles. These quantitative insights into seawater thermodynamics provide crucial parametric inputs for climate models and evidence-based environmental policy formulation, particularly addressing the critical knowledge gap in the thermal expansion behavior of seawater under varying temperature-salinity conditions.
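
As a rough, illustrative sketch of the stacked-ensemble idea named above (not the authors' pipeline; the synthetic Cp relationship, data ranges and model settings below are placeholder assumptions), a minimal scikit-learn version could look like this:

    # Illustrative sketch only: a stacked ensemble combining the base learners
    # named in the abstract to predict heat capacity Cp from temperature and
    # salinity. The data here are synthetic placeholders, not measurements.
    import numpy as np
    from sklearn.ensemble import (RandomForestRegressor, GradientBoostingRegressor,
                                  AdaBoostRegressor, StackingRegressor)
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    T = rng.uniform(293.15, 318.15, 500)         # temperature, K
    S = rng.uniform(0.5, 35.0, 500)              # salinity, ppt
    cp = 4180.0 - 4.5 * S - 0.5 * (T - 293.15) + rng.normal(0, 2, 500)  # mock Cp, J/(kg K)

    X = np.column_stack([T, S])
    X_tr, X_te, y_tr, y_te = train_test_split(X, cp, random_state=0)

    seml = StackingRegressor(
        estimators=[("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
                    ("gb", GradientBoostingRegressor(random_state=0)),
                    ("ada", AdaBoostRegressor(random_state=0))],
        final_estimator=Ridge())
    seml.fit(X_tr, y_tr)
    print("R2 on held-out data:", r2_score(y_te, seml.predict(X_te)))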

Keywords: climate change, Arabian Sea, thermodynamics, machine learning

Procedia PDF Downloads 7
2939 Effect of Collection Technique of Blood on Clinical Pathology

Authors: Marwa Elkalla, E. Ali Abdelfadil, Ali Mohamed M. Sami, Ali M. Abdel-Monem

Abstract:

To assess the impact of the blood collection technique on clinical pathology markers and to establish reference intervals, a study was performed using normal, healthy C57BL/6 mice. Both sexes were employed, and the animals were randomly assigned to different groups depending on the phlebotomy technique used. Blood was drawn in one of four ways: intracardiac (IC), caudal vena cava (VC), caudal vena cava plus a peritoneal collection of any extravasated blood, or retroorbital phlebotomy (RO). Several serum biochemistry markers, including liver function tests, as well as a complete blood count with differentials and a platelet count, were analysed from the blood and serum samples. Red blood cell count, haemoglobin (p > 0.002), hematocrit, alkaline phosphatase, albumin, total protein, and creatinine were all significantly greater in female mice. Platelet counts, specific white blood cell numbers (total, neutrophil, lymphocyte, and eosinophil counts), globulin, amylase, and the BUN/creatinine ratio were all greater in males. The VC approach seemed marginally superior to the IC approach for the characteristics under consideration and was linked to the least variation in both sexes. Transaminase levels showed the greatest variation between study groups. The aspartate aminotransferase (AST) values were associated with less fluctuation for the VC approach, while the alanine aminotransferase (ALT) values were similar between the IC and VC groups. There was considerable variability and range in transaminase levels between the MC and RO groups. We found that the RO approach, the only one tested that allowed for repeated sample collection, yielded acceptable ALT readings. The findings show that the test results are significantly affected by the phlebotomy technique and that the VC or IC techniques provide the most reliable data. When organising a study and comparing data to reference ranges, the ranges supplied here by collection method and sex can be utilised to determine the best approach to data collection. The authors suggest establishing norms based on the procedures used by each individual researcher in his or her own lab.

Keywords: clinical, pathology, blood, effect

Procedia PDF Downloads 96
2938 A Computerized Tool for Predicting Future Reading Abilities in Pre-Readers Children

Authors: Stephanie Ducrot, Marie Vernet, Eve Meiss, Yves Chaix

Abstract:

Learning to read is a key topic of debate today, both in terms of its implications for school failure and illiteracy and regarding which teaching methods are best to develop. It is estimated today that four to six percent of school-age children suffer from specific developmental disorders that impair learning. The findings from people with dyslexia and typically developing readers suggest that the problems children experience in learning to read are related to the preliteracy skills that they bring with them from kindergarten. Most tools available to professionals are designed for the evaluation of child language problems. In comparison, there are very few tools for assessing the relations between visual skills and the process of learning to read. Recent literature reports that visual-motor skills and visual-spatial attention in preschoolers are important predictors of reading development. The main goal of this study was therefore to improve screening for future reading difficulties in preschool children. We used a prospective, longitudinal approach in which oculomotor processes (assessed with the DiagLECT test) were measured in pre-readers, and the impact of these skills on future reading development was explored. The DiagLECT test specifically measures the online time taken to name numbers arranged irregularly in horizontal rows (horizontal time, HT) and the time taken to name numbers arranged in vertical columns (vertical time, VT). A total of 131 preschoolers took part in this study. At Time 0 (kindergarten), the mean VT, HT, and errors were recorded. One year later, at Time 1, the reading level of the same children was evaluated. Firstly, this study allowed us to provide normative data for a standardized evaluation of oculomotor skills in 5- and 6-year-old children. The data also revealed that 25% of our sample of preschoolers showed oculomotor impairments (without any clinical complaints). Finally, the results of this study assessed the validity of the DiagLECT test for predicting reading outcomes; the better a child's oculomotor skills are, the better his/her reading abilities will be.
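
A hedged, purely illustrative sketch of the kind of predictive-validity analysis the abstract implies (synthetic data; the naming-time/reading-score relationship below is invented, not the study's data) might look like:

    # Illustrative only: correlating kindergarten naming times (e.g. VT) with
    # reading scores measured one year later, on synthetic placeholder data.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    n = 131
    vt = rng.normal(60, 10, n)                       # mock vertical naming time, seconds
    reading = 120 - 0.8 * vt + rng.normal(0, 6, n)   # mock grade-1 reading score

    r, p = pearsonr(vt, reading)
    print(f"r = {r:.2f}, p = {p:.3g}")   # slower naming -> lower reading score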

Keywords: vision, attention, oculomotor processes, reading, preschoolers

Procedia PDF Downloads 147
2937 Organ Dose Calculator for Fetus Undergoing Computed Tomography

Authors: Choonsik Lee, Les Folio

Abstract:

Pregnant patients may undergo CT in emergencies unrelated to pregnancy, and the potential risk to the developing fetus is of concern. It is critical to accurately estimate fetal organ doses in CT scans. We developed a fetal organ dose calculation tool using pregnancy-specific computational phantoms combined with Monte Carlo radiation transport techniques. We adopted a series of pregnancy computational phantoms developed at the University of Florida at the gestational ages of 8, 10, 15, 20, 25, 30, 35, and 38 weeks (Maynard et al. 2011). More than 30 organs and tissues and 20 skeletal sites are defined in each fetus model. We calculated fetal organ doses normalized by CTDIvol to derive organ dose conversion coefficients (mGy/mGy) for the eight fetuses for consecutive slice locations ranging from the top to the bottom of the pregnancy phantoms with 1 cm slice thickness. Organ dose from helical scans was approximated by the summation of doses from the multiple axial slices included in the given scan range of interest. We then compared dose conversion coefficients for major fetal organs in the abdominal-pelvis CT scan of the pregnancy phantoms with the uterine dose of a non-pregnant adult female computational phantom. A comprehensive library of organ conversion coefficients was established for the eight developing fetuses undergoing CT. The coefficients were implemented into an in-house graphical user interface-based computer program for convenient estimation of fetal organ doses by inputting CT technical parameters as well as the age of the fetus. We found that the esophagus received the least dose, whereas the kidneys received the greatest dose in all fetuses in AP scans of the pregnancy phantoms. We also found that when the uterine dose of a non-pregnant adult female phantom is used as a surrogate for fetal organ doses, the root-mean-square error ranged from 0.08 mGy (8 weeks) to 0.38 mGy (38 weeks). The uterine dose was up to 1.7-fold greater than the esophagus dose of the 38-week fetus model. The calculation tool should be useful in cases requiring fetal organ dose estimates in emergency CT scans as well as in patient dose monitoring.
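
A hypothetical sketch of the dose-estimation logic described above, in which per-slice conversion coefficients are scaled by CTDIvol and summed over the scan range; the coefficient values, slice indices and CTDIvol used below are placeholders, not the published library:

    # Placeholder sketch: organ dose for a scan range approximated by summing
    # per-slice dose conversion coefficients (mGy/mGy) and scaling by CTDIvol.
    def fetal_organ_dose(coefficients, start_slice, end_slice, ctdi_vol_mgy):
        """coefficients: dict mapping 1 cm slice index -> conversion coefficient (mGy/mGy)."""
        total_coeff = sum(coefficients.get(s, 0.0) for s in range(start_slice, end_slice + 1))
        return total_coeff * ctdi_vol_mgy

    # Invented per-slice coefficients for one organ of one phantom, for illustration only
    kidney_coeff = {s: 0.05 for s in range(45, 55)}   # slices covering the organ
    print(fetal_organ_dose(kidney_coeff, 40, 60, ctdi_vol_mgy=10.0), "mGy")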

Keywords: computed tomography, fetal dose, pregnant women, radiation dose

Procedia PDF Downloads 140
2936 Optimizing the Insertion of Renewables in the Colombian Power Sector

Authors: Felipe Henao, Yeny Rodriguez, Juan P. Viteri, Isaac Dyner

Abstract:

Colombia is rich in natural resources and focuses heavily on the exploitation of water for hydroelectricity purposes. Alternative cleaner energy sources, such as solar and wind power, have been largely neglected despite: a) their abundance, b) the complementarities between hydro, solar and wind power, and c) the cost competitiveness of renewable technologies. The current limited mix of energy sources creates considerable weaknesses for the system, particularly when facing extreme dry weather conditions, such as El Niño events. In the past, El Niño has exposed the true consequences of a system heavily dependent on hydropower, i.e., loss of power supply, high energy production costs, and loss of overall competitiveness for the country. Nonetheless, it is expected that the share of hydroelectricity will increase in the near future. In this context, this paper proposes a stochastic linear programming model to optimize the insertion of renewable energy systems (RES) into the Colombian electricity sector. The model considers cost-based generation competition between traditional energy technologies and alternative RES. This work evaluates the financial, environmental, and technical implications of different combinations of technologies. Various scenarios regarding the future evolution of technology costs are considered to conduct sensitivity analysis of the solutions and to assess the extent of the participation of RES in the Colombian power sector. Optimization results indicate that, even in the worst-case scenario, where costs remain constant, the Colombian power sector should diversify its portfolio of technologies and invest strongly in solar and wind power. Diversification through RES will contribute to making the system less vulnerable to extreme weather conditions, reduce overall system costs, cut CO2 emissions, and decrease the chances of national blackout events in the future. In contrast, the business-as-usual scenario indicates that the system will become more costly and less reliable.
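
As a simplified, deterministic illustration of a least-cost generation-mix formulation (the paper's model is stochastic and far more detailed; the costs, demand figure and resource bounds below are invented placeholders, not Colombian data), one could write:

    # Toy least-cost generation-mix LP, for illustration only.
    from scipy.optimize import linprog

    # decision variables: annual energy from [hydro, solar, wind, thermal] in TWh
    cost = [30.0, 45.0, 40.0, 80.0]          # USD per MWh, illustrative
    demand = 75.0                             # TWh to be supplied

    # equality constraint: total generation meets demand
    A_eq = [[1.0, 1.0, 1.0, 1.0]]
    b_eq = [demand]

    # upper bounds reflecting resource/capacity limits (illustrative),
    # with hydro capped to mimic a dry (El Nino) year
    bounds = [(0, 45.0), (0, 20.0), (0, 25.0), (0, 30.0)]

    res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    print("optimal mix (TWh):", res.x, " total cost (million USD):", res.fun)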

Keywords: energy policy and planning, stochastic programming, sustainable development, water management

Procedia PDF Downloads 296
2935 Poverty Alleviation and Agricultural Management Policies in Nasarawa State of Nigeria: Lessons from the Roots and Tuber Crops Expansion for Increased Food Production (1996-2011)

Authors: Yahaya Abdullahi Adadu, Canice Erunke Esidene

Abstract:

The problems of socio-economic development have been a major challenge bedeviling the Nigerian post-colonial state since its political independence from Britain on October 1, 1960. Critics have argued that the dilemma of Nigeria’s economic survival started in the early 1970s, when the agricultural sector, which was supposedly the economic mainstay, was effectively displaced by the gains of oil petro-dollars coming from foreign exchange earnings. Agriculture, therefore, which used to be a major player in human and national upliftment in Nigeria, has been given a back seat, while oil and gas have taken over the front burner in virtually every aspect of Nigeria’s national life. This study is therefore an exposition of the efforts of the Nasarawa state government in reversing the dangerous trend in which over-reliance on oil wealth has affected persons, individuals and groups in terms of the prevailing levels of poverty and other attendant vices. The study focuses on the management policies of the various regimes in the state since its inception in 1996, with particular reference to the regime types, military and civilian alike, in propelling the needed policy change that could transform the economy in line with international best practices. Particular emphasis will be paid to the BADA-KOSHI agricultural scheme, whose interest was to recover the lost glory of rural agriculture through a series of roots and tuber expansion efforts, particularly such crops as yam minisetts, cassava, sweet potatoes and cocoyam. The paper covers the period between 1996 and 2011, a period considered to be critical in the agricultural revolution of the state. The study adopts a theoretical approach via secondary methods of analysis for the efficient explanation of the burning issues under consideration. The paper sums up with policy recommendations and a conclusion.

Keywords: poverty, agriculture, Badakoshi, rural policy management

Procedia PDF Downloads 445
2934 Effect of Anionic Lipid on Zeta Potential Values and Physical Stability of Liposomal Amikacin

Authors: Yulistiani, Muhammad Amin, Fasich

Abstract:

The surface charge of a nanoparticle is a very important consideration in pulmonary drug delivery systems. The zeta potential (ZP) is related to the surface charge and can predict the stability of nanoparticles such as nebules of liposomal amikacin. Anionic lipids such as 1,2-dipalmitoyl-sn-glycero-3-phosphatidylglycerol (DPPG) are expected to contribute to the physical stability of liposomal amikacin and an optimal ZP value. A suitable ZP can improve drug release profiles at specific sites in the alveoli as well as stability in the dosage form. This study aimed to analyze the effect of DPPG on ZP values and the physical stability of liposomal amikacin. Liposomes were prepared using the reverse phase evaporation method. Liposomes consisting of DPPG, 1,2-dipalmitoyl-sn-glycero-3-phosphatidylcholine (DPPC), cholesterol and amikacin were formulated in five different compositions, 0/150/5/100, 10/150/5/100, 20/150/5/100, 30/150/5/100 and 40/150/5/100 (w/v), respectively. A chloroform/methanol mixture in the ratio of 1 : 1 (v/v) was used as the solvent to dissolve the lipids. These systems were adjusted in phosphate buffer at pH 7.4. Nebules of liposomal amikacin were produced using a vibrating nebulizer and then characterized by X-ray diffraction, differential scanning calorimetry, particle size and zeta potential analysis, and scanning electron microscopy. The amikacin concentration from liposome leakage was determined by an immunoassay method. The study revealed that the presence of DPPG could increase the ZP value. The addition of 10 mg DPPG to the composition resulted in an increase of the ZP value to 3.70 mV (negatively charged). The optimum ZP value was reached at -28.780 ± 0.70 mV with a nebule particle size of 461.70 ± 21.79 nm. The nebulizing process altered parameters such as particle size, the conformation of lipid components and the amount of surface charge of the nanoparticles, which could influence the ZP value. These parameters might have profound effects on the application of nebules in the alveoli; however, negatively charged nanoparticles are not expected to have a high ZP value in this system due to increased macrophage uptake and pulmonary clearance. Therefore, the liposome ratio of 20/150/5/100 (w/v) resulted in the most stable colloidal system and might be applicable to pulmonary drug delivery.

Keywords: anionic lipid, dipalmitoylphosphatidylglycerol, liposomal amikacin, stability, zeta potential

Procedia PDF Downloads 339
2933 Effect of Mistranslating tRNA Alanine on Polyglutamine Aggregation

Authors: Sunidhi Syal, Rasangi Tennakoon, Patrick O'Donoghue

Abstract:

Polyglutamine (polyQ) diseases are a group of neurodegenerative diseases caused by expanded repeats of the glutamine (Q) codon in the DNA, which translate into an elongated polyQ tract in the protein. The pathological explanation is that the polyQ tract forms cytotoxic aggregates in the neurons, leading to their degeneration. There are no cures or preventative treatments established for these diseases as of today, although their symptoms can be relieved. This study specifically focuses on Huntington's disease, a polyQ disease in which aggregation is caused by extended cytosine, adenine, guanine (CAG) codon repeats in the huntingtin (HTT) gene, which encodes the huntingtin protein. Using this principle, we attempted to create six models, which included mutating the wildtype tRNA alanine variant tRNA-AGC-8-1 to carry the glutamine anticodons CUG and UUG so that alanine is incorporated at glutamine sites in polyQ tracts. In the process, we were successful in obtaining tAla-8-1 CUG mutant clones in the HTTexon1 plasmids with a polyQ tract of 23Q (non-pathogenic model) and 74Q (disease model). These plasmids were transfected into mouse neuroblastoma cells to characterize protein synthesis and aggregation in normal and mistranslating cells and to investigate the effects of glutamines replaced with alanines on the disease phenotype. Notably, we observed no noteworthy differences in mean fluorescence between the CUG mutants for 23Q or 74Q; however, the Triton X-100 assay revealed a significant reduction in insoluble 74Q aggregates. We were unable to create a tAla-8-1 UUG mutant clone, and determining the difference in the effects of the two glutamine anticodons may enrich our understanding of the disease phenotype. In conclusion, by generating structural disruption with the amino acid alanine, it may be possible to find ways to minimize the toxicity of Huntington's disease caused by these polyQ aggregates. Further research is needed to advance knowledge in this field by identifying the cellular and biochemical impact of specific tRNA variants found naturally in human genomes.

Keywords: Huntington's disease, polyQ, tRNA, anticodon, clone, overlap PCR

Procedia PDF Downloads 41
2932 An Advanced Automated Brain Tumor Diagnostics Approach

Authors: Berkan Ural, Arif Eser, Sinan Apaydin

Abstract:

Medical image processing has generally become a challenging task nowadays. Indeed, the processing of brain MRI images is one of the difficult parts of this area. This study proposes a well-defined hybrid approach consisting of tumor detection, extraction and analysis steps. The approach is mainly built around a computer-aided diagnostics system for identifying and detecting tumor formation in any region of the brain, and this system is used for early prediction of brain tumors using advanced image processing and probabilistic neural network methods. For this approach, some advanced noise removal functions and image processing methods such as automatic segmentation and morphological operations are used to detect the brain tumor boundaries and to obtain the important feature parameters of the tumor region. All stages of the approach are implemented in MATLAB. Firstly, the tumor is detected and the tumor area is contoured with a colored circle by the computer-aided diagnostics program. Then, the tumor is segmented and some morphological processes are applied to increase the visibility of the tumor area. While this process continues, the tumor area and important shape-based features are also calculated. Finally, using the probabilistic neural network method and some advanced classification steps, the tumor area and the type of the tumor are obtained. The future aim of this study is to detect the severity of lesions through brain tumor classes, achieved through advanced multi-class classification and neural network stages, and to create a user-friendly environment using a GUI in MATLAB. In the experimental part of the study, 100 images are used to train the diagnostics system and 100 out-of-sample images are used to test and check the whole set of results. The preliminary results demonstrate high classification accuracy for the neural network structure. Finally, these results also motivate us to extend this framework to detect and localize tumors in other organs.
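
A rough sketch of the detection, segmentation and feature-extraction steps described above, written in Python with scikit-image rather than the authors' MATLAB code; the threshold choice and structuring-element sizes are illustrative assumptions:

    # Illustrative pipeline: noise removal, automatic segmentation, morphological
    # cleanup, and simple shape-based features of the largest segmented region.
    import numpy as np
    from skimage import filters, morphology, measure

    def segment_tumor(mri_slice):
        """mri_slice: 2-D grayscale numpy array of an MRI slice."""
        smoothed = filters.gaussian(mri_slice, sigma=1.0)            # noise removal
        mask = smoothed > filters.threshold_otsu(smoothed)           # automatic segmentation
        mask = morphology.binary_opening(mask, morphology.disk(3))   # morphological cleanup
        mask = morphology.remove_small_objects(mask, min_size=100)
        return mask

    def shape_features(mask):
        """Return simple shape-based features of the largest segmented region."""
        regions = measure.regionprops(measure.label(mask))
        if not regions:
            return None
        r = max(regions, key=lambda p: p.area)
        return {"area": r.area, "perimeter": r.perimeter,
                "eccentricity": r.eccentricity, "solidity": r.solidity}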

Keywords: image processing algorithms, magnetic resonance imaging, neural network, pattern recognition

Procedia PDF Downloads 418
2931 Anticancer Effect of Doxorubicin Using Injectable Hydrogel

Authors: Prasamsha Panta, Da Yeon Kim, Ja Yong Jang, Min Jae Kim, Jae Ho Kim, Moon Suk Kim

Abstract:

Introduction: Among the many anticancer drugs used clinically, doxorubicin (Dox) is one of the most widely used drugs to treat many types of solid tumors such as liver, colon, breast, or lung. Intratumoral injection of chemotherapeutic agents is a potentially more effective alternative to systemic administration because direct delivery of the anticancer drug to the target may improve both the stability and efficacy of anticancer drugs. Injectable in situ-forming gels have attracted considerable attention because they can achieve site-specific drug delivery, long-term action periods, and improved patient compliance. Objective: The objective of the present study is to confirm the clinical benefit of intratumoral chemotherapy using an injectable in situ-forming poly(ethylene glycol)-b-polycaprolactone diblock copolymer (MP) and Dox, increasing efficacy and reducing toxicity in patients with cancer. Methods and methodology: We prepared a biodegradable MP hydrogel and measured its viscosity to evaluate its thermo-sensitive properties. In vivo antitumor activity was assessed with normal saline, MP only, single free Dox, repeated free Dox, and Dox-loaded MP gel. The remaining amount of Dox was measured using HPLC after the mice were sacrificed. For cytotoxicity studies, the WST-1 assay was performed. Histological analyses were done with H&E and TUNEL staining, respectively. Results: The work in this experiment showed that Dox-loaded MP has biodegradable drug-depot properties. Dox-loaded MP gels showed remarkable in vitro cytotoxic activities against cancer cells. Finally, this work indicates that injection of Dox-loaded MP allowed Dox to act effectively in the tumor and induced long-lasting suppression of tumor growth. Conclusion: This work has examined the potential clinical utility of intratumorally injected Dox-loaded MP gel, which shows a significant effect of higher local Dox retention compared with systemically administered Dox.

Keywords: injectable in-situ forming hydrogel, anticancer, doxorubicin, intratumoral injection

Procedia PDF Downloads 408
2930 Pattern of External Injuries Sustained during Bomb Blast Attacks in Karachi, Pakistan from 2000 to 2007

Authors: Arif Anwar Surani, Salman Ali, Asif Surani, Sohaib Zahid, Akbar Shoukat Ali, Zeeshan-Ul-Hassan Usmani, Joseph Varon, Salim Surani

Abstract:

Objective: Terrorism and suicide bomb blast attacks are commonplace in Karachi, Pakistan. During the years 2000 to 2007, there were over 60 bomb explosions resulting in more than 1500 casualties. These explosions produce a wide variety of external injuries. We undertook this study to evaluate the pattern of external injury produced after bomb blast attacks and to compare the injury profile resulting from explosions in open versus semi-confined blast environments. Method: A retrospective, cross-sectional study was conducted to review injuries sustained after bomb blast attacks in Karachi, Pakistan, from January 2000 to October 2007. Emergency medical records and medico-legal certificates of patients presented to three major public sector hospitals of Karachi were evaluated using a self-designed proforma. Results: Data of 481 victims met the inclusion criteria and were incorporated for final analysis. Of these, 63.6% were injured in open spaces and 36.4% were injured in semi-confined blast environments. Lacerations were the most commonly encountered external injury (47.7%), followed by penetrating wounds (15.3%). The lower and upper extremities were most commonly affected (38.6% and 19%, respectively). Open and semi-confined blast environments produced specific injury patterns and profiles (p < 0.001). Conclusions: Bomb blast attacks in Karachi produce an external injury pattern consistent with other studies, with the exception of an increased frequency of penetrating wounds. Semi-confined blast environments were associated with severe injuries. Further studies are required to better classify injuries and their severity based on standardized scoring systems. Effective emergency response systems must be designed to cope with mass casualties following bomb explosions.
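
A hedged illustration of the kind of chi-square test of independence behind the reported association between blast environment and injury pattern; the contingency counts below are invented placeholders, not the study's data:

    # Illustrative only: chi-square test of independence between blast
    # environment (rows) and external injury type (columns).
    from scipy.stats import chi2_contingency

    # rows: open space, semi-confined; columns: laceration, penetrating, burn, fracture
    observed = [[160, 40, 30, 76],
                [ 69, 33, 28, 45]]

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")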

Keywords: bomb blast attacks, injury pattern, external injury, open space, semi-confined space, blast environment

Procedia PDF Downloads 397
2929 Automatic Processing of Trauma-Related Visual Stimuli in Female Patients Suffering From Post-Traumatic Stress Disorder after Interpersonal Traumatization

Authors: Theresa Slump, Paula Neumeister, Katharina Feldker, Carina Y. Heitmann, Thomas Straube

Abstract:

A characteristic feature of post-traumatic stress disorder (PTSD) is the automatic processing of disorder-specific stimuli, which expresses itself in intrusive symptoms such as intense physical and psychological reactions to trauma-associated stimuli. This automatic processing plays an essential role in the development and maintenance of symptoms. The aim of our study was, therefore, to investigate the behavioral and neural correlates of the automatic processing of trauma-related stimuli in PTSD. Although interpersonal traumatization is a form of traumatization that occurs frequently, it has not yet been sufficiently studied. That is why, in our study, we focused on patients suffering from interpersonal traumatization. While previous imaging studies on PTSD mainly used faces, words, or generally negative visual stimuli, our study presented complex trauma-related and neutral visual scenes. We examined 19 female patients suffering from PTSD and 19 healthy women as a control group. All subjects performed a geometric comparison task while lying in a functional magnetic resonance imaging (fMRI) scanner. Trauma-related and neutral visual scenes that were not relevant to the task were presented while the subjects were doing the task. At the behavioral level, there were no significant differences in task performance between the two groups. At the neural level, the PTSD patients showed significant hyperactivation of the hippocampus for task-irrelevant trauma-related versus neutral stimuli when compared with healthy control subjects. Connectivity analyses revealed altered connectivity between the hippocampus and other anxiety-related areas in PTSD patients as well. Overall, these findings suggest that fear-related areas are involved in PTSD patients' processing of trauma-related stimuli even when the stimuli are task-irrelevant.

Keywords: post-traumatic stress disorder, automatic processing, hippocampus, functional magnetic resonance imaging

Procedia PDF Downloads 199
2928 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes

Authors: Misra Ayse Adsiz, Selim Selvi

Abstract:

In today's world, technology is competing with time. In order to keep up with the world's companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of change of technology. Missile system design processes handled with classical methods lag behind in this race, because customer requirements are not clear and demands change repeatedly during the design process. Therefore, a methodology suitable for missile system design dynamics has been investigated, and the processes used to keep up with the era have been examined. When commonly used design processes are analyzed, it is seen that none of them is dynamic enough for today's conditions, so a hybrid design process was established. After a detailed review of the existing processes, it was decided to focus on the Scrum framework and agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods. In addition, agile philosophy is intended to respond quickly to changes. In this study, the aim is to integrate the Scrum framework and agile philosophy, which are the most appropriate approaches for rapid production and change adaptation, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and provides an iterative approach to change management. These methods, which are currently used in the software industry, have been integrated into the product design process. A team is created for the system design process, and the Scrum Team roles are realized, with the customer included. A Scrum team consists of the product owner, the development team and the Scrum master. Scrum events, which are short, purposeful and time-limited, are organized to serve coordination rather than long meetings. Instead of the classic system design methods used in product development studies, a missile design is carried out with this blended method. With the help of this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions to demands and combat uncertainties in the product development process. With the feedback of the customer included in the process, the work moves towards marketing, design and financial optimization.

Keywords: agile, design, missile, scrum

Procedia PDF Downloads 168
2927 Sizing of Drying Processes to Optimize Conservation of the Nuclear Power Plants on Stationary

Authors: Assabo Mohamed, Bile Mohamed, Ali Farah, Isman Souleiman, Olga Alos Ramos, Marie Cadet

Abstract:

The life of a nuclear power plant is regularly punctuated by short or long outages to carry out maintenance operations and/or nuclear fuel reloading. During these outage periods, it is essential to conserve all the secondary circuit equipment to avoid the initiation of corrosion. This circuit is one of the main components of a nuclear reactor. Indeed, the conservation of materials during the shutdown of a nuclear unit improves circuit performance and considerably reduces maintenance costs. This study is part of the optimization of the dry preservation of equipment in the water station of the nuclear reactor. The main objective is to provide tools to guide the Electricity Production Nuclear Centre (EPNC) in achieving the criteria required by the chemical specifications for the conservation of materials. A theoretical model of the drying of the water-station exchangers is developed with the Engineering Equation Solver (EES) software. It is used to size the requirements and the air quality needed for the dry conservation of equipment. This model is based on the heat transfer and mass transfer governing the drying operation. A parametric study is conducted to determine the influence of the aerothermal factors involved in the drying operation. The results show that the success of the dry conservation of the equipment of the secondary circuit of a nuclear reactor depends strongly on the draining, the quality of the drying air and the flow of air injected into the secondary circuit. Finally, a theoretical case study performed in EES highlights the importance of mastering the entire system in order to balance the air system and provide each exchanger with the optimum flow depending on its characteristics. From these results, recommendations can be formulated for nuclear power plants to optimize drying practices and achieve good performance in the conservation of water-station equipment during shutdown.
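
A back-of-the-envelope sketch of the drying mass balance underlying such a sizing exercise, assuming the required dry-air flow follows from the humidity-ratio rise across the exchanger; all numbers below are illustrative, not plant data:

    # Simple moisture mass balance: water removed = air mass flow x (w_out - w_in).
    def required_air_flow(water_mass_kg, drying_time_h, w_in, w_out):
        """w_in, w_out: humidity ratio of injected / exhaust air (kg water per kg dry air)."""
        if w_out <= w_in:
            raise ValueError("exhaust air must be moister than injected air")
        return water_mass_kg / (drying_time_h * 3600.0 * (w_out - w_in))  # kg dry air / s

    # e.g. 50 kg of residual water, 24 h target, dry air at w = 0.002 leaving at w = 0.010
    print(f"{required_air_flow(50.0, 24.0, 0.002, 0.010):.3f} kg/s of dry air")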

Keywords: dry conservation, optimization, sizing, water station

Procedia PDF Downloads 262
2926 European Standardization in Nanotechnologies and Relation with International Work: The Standardization Can Help Industry and Regulators in Developing Safe Products

Authors: Patrice Conner

Abstract:

Nanotechnologies have enormous potential to contribute to human flourishing in responsible and sustainable ways. They are a rapidly developing field of science, technology and innovation. As enabling technologies, their full scope of applications is potentially very wide. Major implications are expected in many areas, e.g. healthcare, information and communication technologies, energy production and storage, materials science/chemical engineering, manufacturing, environmental protection, consumer products, etc. However, nanotechnologies are unlikely to realize their full potential unless their associated societal and ethical issues are adequately addressed. In particular, nanotechnologies and nanoparticles may expose humans and the environment to new health risks, possibly involving quite different mechanisms of interference with the physiology of human and environmental species. One of the building blocks of the ‘safe, integrated and responsible’ approach is standardization. Both the Economic and Social Committee and the European Parliament have highlighted the importance to be attached to standardization as a means to accompany the introduction of nanotechnologies and nanomaterials on the market and to facilitate the implementation of regulation. ISO and CEN started in 2005 and 2006, respectively, to deal with selected topics related to this emerging and enabling technology. At the beginning of 2010, the EC DG ‘Enterprise and Industry’ addressed mandate M/461 to CEN, CENELEC and ETSI for standardization activities regarding nanotechnologies and nanomaterials. Thus CEN/TC 352 ‘Nanotechnologies’ has been asked to take the lead in coordinating the execution of M/461 (46 topics to be standardized) and to contact relevant European and international technical committees and interested stakeholders as appropriate (56 structures have been identified). The priority requests from M/461 deal with the characterization and exposure of nanomaterials and any matters related to health, safety and the environment. Answers will be given to the following questions: What are the structures and how do they work? Where are we right now, and how will the work proceed from now onwards? How do CEN's work and targets deal with and interact with global matters in this field?

Keywords: characterization, environmental protection, exposure, health risks, nanotechnologies, responsible and sustainable ways, safety

Procedia PDF Downloads 188
2925 Utilization of Agro-wastes for Biotechnological Production of Edible Mushroom

Authors: Salami Abiodun Olusola, Bankole Faith Ayobami

Abstract:

Agro-wastes are wastes produced from various agricultural activities and include manures, corncobs, plant stalks, hulls, leaves, sugarcane bagasse, oil-palm spadix, and rice bran. In a farming situation, agro-waste is often useless and is thus discarded. Huge quantities of waste resources generated from Nigerian agriculture could be converted to more useful forms of energy, which could contribute to the country’s primary energy needs and reduce problems associated with waste management. The accumulation of agro-wastes may cause health, safety, and environmental concerns. However, the biotechnological use of agro-waste could enhance food security through its bioconversion to useful renewable energy. Mushrooms are saprophytes which feed by secreting extracellular enzymes, digesting food externally, and absorbing the nutrients through net-like hyphae. Therefore, mushrooms could be exploited for the bioconversion of cheap and abundant agro-wastes, providing nutritious food for animals and humans and recycling carbon. The study investigated the bioconversion potential of Pleurotus florida on agro-wastes using a simple and cost-effective biotechnological method. Four agro-wastes (corncobs, oil-palm spadix, corn straw, and sawdust) were composted and used as substrates, while the biological efficiency (BE) and the nutritional composition of P. florida grown on the substrates were determined. Pleurotus florida contained 26.28-29.91% protein, 86.90-89.60% moisture, 0.48-0.91% fat, 19.64-22.82% fibre, 31.37-38.17% carbohydrate and 5.18-6.39% ash. The mineral contents ranged from 342-410 mg/100g calcium, 1009-1133 mg/100g phosphorus, 17-21 mg/100g iron, 277-359 mg/100g sodium, and 2088-2281 mg/100g potassium. The highest yield and BE were obtained on corncobs (110 g, 55%), followed by oil-palm spadix (76.05 g, 38%), while the lowest BE was recorded on the corn straw substrate (63.12 g, 31.56%). Utilization of the composted substrates yielded nutritious, edible mushrooms. The study presents a biotechnological procedure for the bioconversion of agro-wastes into edible and nutritious mushrooms for efficient agro-waste management, utilization, and recycling.
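
Biological efficiency is commonly computed as the fresh weight of harvested mushrooms divided by the dry weight of substrate, times 100; the sketch below assumes, purely for illustration, 200 g of dry substrate per bag, a figure inferred from (and consistent with) the yields and BE values reported above:

    # Common BE definition in mushroom cultivation studies; the 200 g dry-substrate
    # weight is an assumption chosen only so the numbers match the figures above.
    def biological_efficiency(fresh_yield_g, dry_substrate_g):
        return 100.0 * fresh_yield_g / dry_substrate_g

    print(biological_efficiency(110.0, 200.0))    # 55.0 %, cf. corncob substrate
    print(biological_efficiency(76.05, 200.0))    # ~38 %, cf. oil-palm spadix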

Keywords: agrowaste, bioconversion, biotechnology, utilization, recycling

Procedia PDF Downloads 78
2924 Beyond Information Failure and Misleading Beliefs in Conditional Cash Transfer Programs: A Qualitative Account of Structural Barriers Explaining Why the Poor Do Not Invest in Human Capital in Northern Mexico

Authors: Francisco Fernandez de Castro

Abstract:

The Conditional Cash Transfer (CCT) model gives monetary transfers to beneficiary families on the condition that they take specific education and health actions. According to the economic rationale of CCTs, the poor need incentives to invest in their human capital because they are trapped by a lack of information and misleading beliefs. If left to decide on their own, the poor will not be able to choose what is in their best interests. The basic assumption of the CCT model is that the poor need incentives to take care of their own education and health-nutrition. Due to the incentives (income cash transfers and conditionalities), beneficiary families are supposed to attend doctor visits and health talks, and children are expected to stay in school. These incentivized behaviors would produce outcomes such as better health and higher levels of education, which in turn would reduce poverty. Based on a grounded theory approach and a two-year period of qualitative data collection in northern Mexico, this study shows that this explanation is incomplete. In addition to the information failure and inadequate beliefs, there are structural barriers in the everyday life of households that make health-nutrition and education investments difficult. In-depth interviews and observation work showed that the program takes for granted the local conditions in which beneficiary families are expected to fulfill their co-responsibilities. The data challenged the program’s assumptions and unveiled local obstacles not contemplated in the program’s design. These findings have policy and research implications for the CCT agenda. They provide elements for future programming, given the gap between the CCT strategy as envisioned by policy designers and the program that beneficiary families experience on the ground. As for research consequences, these findings suggest new avenues for scholarly work regarding the causal mechanisms and social processes explaining CCT outcomes.

Keywords: conditional cash transfers, incentives, poverty, structural barriers

Procedia PDF Downloads 113
2923 The Role of Group Dynamics in Creativity: A Study Case from Italy

Authors: Sofya Komarova, Frashia Ndungu, Alessia Gavazzoli, Roberta Mineo

Abstract:

Modern society requires people to be flexible and to develop innovative solutions to unexpected situations. Creativity refers to the “interaction among aptitude, process, and the environment by which an individual or group produces a perceptible product that is both novel and useful as defined within a social context”. It allows humans to produce novel ideas, generate new solutions, and express themselves uniquely. Only a few scientific studies have examined the influence of group dynamics on individuals' creativity. There are still gaps in the research on creative thinking, such as the fact that collaborative effort frequently results in the enhanced production of new information and knowledge. Therefore, it is critical to evaluate creativity in social settings. The study aimed at exploring the group dynamics of young adults in small group settings and the influence of these dynamics on their creativity. The study included 30 participants aged 20 to 25 who were attending university after completing a bachelor's degree. The participants were divided into groups of three, in gender-homogeneous and gender-heterogeneous groups. The groups’ creative task was tied to the Lego mosaic created for the Scintillae laboratory at the Reggio Children Foundation. Group dynamics were operationalized into patterns of behaviors classified into three major categories: 1) Social Interactions, 2) Play, and 3) Distraction. Data were collected through audio and video recording and observation. The qualitative data were converted into quantitative data using the observational coding system; they were then analyzed, revealing correlations between behaviors using median points and averages. For each participant and group, the percentages of represented behavior signals were computed. The findings revealed a link between social interaction, creative thinking, and creative activities. Other findings revealed that the more intense the social interaction, the lower the amount of creativity demonstrated. This study bridges the research gap between group dynamics and creativity. The approach calls for further research on the relationship between creativity and social interaction.

Keywords: group dynamics, creative thinking, creative action, social interactions, group play

Procedia PDF Downloads 127
2922 Proposals of Exposure Limits for Infrasound From Wind Turbines

Authors: M. Pawlaczyk-Łuszczyńska, T. Wszołek, A. Dudarewicz, P. Małecki, M. Kłaczyński, A. Bortkiewicz

Abstract:

Human tolerance to infrasound is defined by the hearing threshold. Infrasound that cannot be heard (or felt) is not annoying and is not thought to have any other adverse health effects. Recent research has largely confirmed earlier findings. ISO 7196:1995 recommends the use of the G-weighting characteristic for the assessment of infrasound. There is a strong correlation between G-weighted SPL and annoyance perception. The aim of this study was to propose exposure limits for infrasound from wind turbines. However, only a few countries have set limits for infrasound. These limits are usually no higher than 85-92 dBG, and none of them are specific to wind turbines. Over the years, a number of studies have been carried out to determine hearing thresholds below 20 Hz. It has been recognized that 10% of young people would be able to perceive 10 Hz at around 90 dB, and it has also been found that the difference in median hearing thresholds between young adults aged around 20 years and older adults aged over 60 years is around 10 dB, irrespective of frequency. This shows that older people (up to about 60 years of age) retain good hearing in the low-frequency range, while their sensitivity to higher frequencies is often significantly reduced. In terms of exposure limits for infrasound, the average hearing threshold corresponds to a tone with a G-weighted SPL of about 96 dBG. In contrast, infrasound at Lp,G levels below 85-90 dBG is usually inaudible. The individual hearing threshold can, therefore, be 10-15 dB lower than the average threshold, so the recommended limits for environmental infrasound could be 75 dBG or 80 dBG. It is worth noting that the G86 curve has been taken as the threshold of auditory perception of infrasound reached by 90-95% of the population, so the G75 and G80 curves can be taken as criterion curves for wind turbine infrasound. Finally, two assessment methods and corresponding exposure limit values have been proposed for wind turbine infrasound, i.e. Method I, based on G-weighted sound pressure level measurements, and Method II, based on frequency analysis in 1/3-octave bands in the frequency range 4-20 Hz. Separate limit values have been set for outdoor living areas in the open countryside (Area A) and for noise-sensitive areas (Area B). In the case of Method I, infrasound limit values of 80 dBG (for Area A) and 75 dBG (for Area B) have been proposed, while in the case of Method II, criterion curves G80 and G75 have been chosen (for Areas A and B, respectively).
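
A minimal sketch of how a Method II style G-weighted level can be obtained from 1/3-octave band levels: apply a G-weighting correction to each band and sum the bands on an energy basis. The correction values below are rough approximations only; the exact figures must be taken from ISO 7196, and the example spectrum is invented:

    # Energy summation of G-weighted 1/3-octave band levels, Lp,G in dBG.
    import math

    G_WEIGHTING_DB = {4: -16, 5: -12, 6.3: -8, 8: -4, 10: 0,
                      12.5: 4, 16: 7.5, 20: 9}   # approximate values, see ISO 7196

    def g_weighted_level(band_levels_db):
        """band_levels_db: dict {centre frequency in Hz: unweighted SPL in dB}."""
        energy = sum(10 ** ((lp + G_WEIGHTING_DB[f]) / 10.0)
                     for f, lp in band_levels_db.items())
        return 10.0 * math.log10(energy)

    # example: hypothetical wind-turbine spectrum compared with a 75 dBG limit
    spectrum = {4: 72, 5: 70, 6.3: 68, 8: 66, 10: 64, 12.5: 62, 16: 60, 20: 58}
    print(f"Lp,G = {g_weighted_level(spectrum):.1f} dBG")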

Keywords: infrasound, exposure limit, hearing thresholds, wind turbines

Procedia PDF Downloads 83
2921 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies

Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan

Abstract:

The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, specifically if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor that collects spatial, descriptive, as well as multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges in developing countries for tourists trying to make the best decision in terms of the time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist guide in which tourists can search for places of interest based on their requested time of travel. To design the service, a three-tier architecture, comprising data, logical processing, and presentation tiers, has been utilized. For implementing the service, open-source software programs, client- and server-side technologies (such as OpenLayers2, AJAX, and PHP), GeoServer as a map server, and Web Feature Service (WFS) standards have been used. The result is two distinct browser-based services: one for submitting spatial, descriptive, and multimedia volunteer data, and another for tourists and local officials. The local officials confirm the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine has been designed to enable tourists to find a tourist place based on province, city, and location at a specific time of interest. Implementing the tourist-guide service with this methodology means that current tourists participate in a free data collection and sharing process for future tourists, data are shared and accessed in real time by all, a blind selection of the travel destination is avoided, and, significantly, the cost of providing such services decreases.
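
A hypothetical example of the kind of spatiotemporal WFS query the tourist interface could send to GeoServer; the layer name, attribute names and province/city values below are assumptions for illustration, not the project's actual schema:

    # Illustrative WFS GetFeature request with an attribute/temporal CQL filter
    # (cql_filter is a GeoServer vendor parameter).
    import requests

    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": "vgi:tourist_places",                # assumed layer name
        "outputFormat": "application/json",
        "cql_filter": "province='Gilan' AND city='Rasht' AND visit_month=4",  # assumed attributes
    }

    resp = requests.get("http://localhost:8080/geoserver/wfs", params=params, timeout=30)
    resp.raise_for_status()
    for feature in resp.json().get("features", []):
        print(feature["properties"].get("name"), feature["geometry"]["coordinates"])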

Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping

Procedia PDF Downloads 98
2920 Safety Profile of Anti-Retroviral Medicine in South Africa Based on Reported Adverse Drug Reactions

Authors: Sarah Gounden, Mukesh Dheda, Boikhutso Tlou, Elizabeth Ojewole, Frasia Oosthuizen

Abstract:

Background: Antiretroviral therapy (ART) has been effective in the reduction of mortality and has resulted in an improvement in the prognosis of HIV-infected patients. However, treatment with antiretrovirals (ARVs) has led to the development of many adverse drug reactions (ADRs). It is, therefore, necessary to determine the safety profile of these medicines in a South African population in order to ensure safe and optimal medicine use. Objectives: The aim of this study was to quantify ADRs experienced with the different ARVs currently used in South Africa, to determine the safety profile of ARV medicine in South Africa based on reported ADRs, and to determine the ARVs with the lowest risk profile based on specific patient populations. Methodology: This was a quantitative study. Individual case safety reports for the period January 2010 – December 2013 were obtained from the National Pharmacovigilance Center; these reports contained information on ADRs, ARV medicines, and patient demographics. Data were analysed to find associations that may exist between the ADRs experienced, the ARV medicines used, and patient demographics. Results: A total of 1916 patient reports were received, of which 1534 met the inclusion criteria for the study. The ARVs with the lowest risk of ADRs were found to be lamivudine (0.51%, n=12), followed by the lopinavir/ritonavir combination (0.8%, n=19) and abacavir (0.64%, n=15). A higher incidence of ADRs was observed in females compared to males. The age group 31–50 years and the weight group 61–80 kg had the highest incidence of reported ADRs. Conclusion: This study found that the safest ARVs to be used in a South African population are lamivudine, abacavir, and the lopinavir/ritonavir combination. Gender differences play a significant role in the occurrence of ADRs, and both anatomical and physiological differences account for this. An increased BMI (body mass index) in both men and women was associated with an increase in the incidence of ADRs associated with ARV therapy.

Keywords: adverse drug reaction, antiretrovirals, HIV/AIDS, pharmacovigilance, South Africa

Procedia PDF Downloads 351
2919 Alternative Ways of Knowing and the Construction of a Department Around a Common Critical Lens

Authors: Natalie Delia

Abstract:

This academic paper investigates the transformative potential of incorporating alternative ways of knowing within the framework of Critical Studies departments. Traditional academic paradigms often prioritize empirical evidence and established methodologies, potentially limiting the scope of critical inquiry. In response to this, our research seeks to illuminate the benefits and challenges associated with integrating alternative epistemologies, such as indigenous knowledge systems, artistic expressions, and experiential narratives. Drawing upon a comprehensive review of literature and case studies, we examine how alternative ways of knowing can enrich and diversify the intellectual landscape of Critical Studies departments. By embracing perspectives that extend beyond conventional boundaries, departments may foster a more inclusive and holistic understanding of critical issues. Additionally, we explore the potential impact on pedagogical approaches, suggesting that alternative ways of knowing can stimulate alternative teaching methods and enhance student engagement. Our investigation also delves into the institutional and cultural shifts necessary to support the integration of alternative epistemologies within academic settings. We address concerns related to validation, legitimacy, and the potential clash with established norms, offering insights into fostering an environment that encourages intellectual pluralism. Furthermore, the paper considers the implications for interdisciplinary collaboration and the potential for cultivating a more responsive and socially engaged scholarship. By encouraging a synthesis of diverse perspectives, Critical Studies departments may be better equipped to address the complexities of contemporary issues, encouraging a dynamic and evolving field of study. In conclusion, this paper advocates for a paradigm shift within Critical Studies departments towards a more inclusive and expansive approach to knowledge production. By embracing alternative ways of knowing, departments have the opportunity not only to diversify their intellectual landscape but also to contribute meaningfully to broader societal dialogues, addressing pressing issues with renewed depth and insight.

Keywords: critical studies, alternative ways of knowing, academic department, Wallerstein

Procedia PDF Downloads 72
2918 Contribution of Hydrogen Peroxide in the Selective Aspect of Prostate Cancer Treatment by Cold Atmospheric Plasma

Authors: Maxime Moreau, Silvère Baron, Jean-Marc Lobaccaro, Karine Charlet, Sébastien Menecier, Frédéric Perisse

Abstract:

Cold Atmospheric Plasma (CAP) is an ionized gas generated at atmospheric pressure, with the temperature of the heavy particles (molecules, ions, atoms) close to room temperature. Recent studies have shown that both in-vitro and in-vivo plasma exposure of many cancer cell lines is effective at inducing the apoptotic pathway of cell death. In other works, normal cell lines appear to be less affected by plasma than cancer cell lines; this is referred to as the selectivity of plasma. It is highly likely that the RNOS (reactive nitrogen and oxygen species) generated in the plasma jet, but also in the medium, play a key role in this selectivity. In this study, two CAP devices will be compared in terms of electrical power, chemical species composition and efficiency in killing cancer cells, with a particular focus on the action of hydrogen peroxide. The experiments will proceed as follows for both devices: electrical and spectroscopic characterization at different voltages, then plasma treatment of normal and cancer cells to compare CAP efficiency between cell lines and to show that death is induced by oxidative stress. To highlight the importance of hydrogen peroxide, an inhibitor of H2O2 will be added to the cell culture medium before treatment, and the resulting cell viability will be compared with that obtained after plasma exposure alone. In addition, H2O2 production will be measured by treating the medium alone with plasma. Cell lines will also be exposed to different concentrations of hydrogen peroxide in order to characterize the cytotoxic threshold for each cell line and to compare it with the quantity of H2O2 produced by the CAP devices. Finally, the catalase activity of the different cell lines will be quantified; this enzyme is an important antioxidant agent against hydrogen peroxide. A correlation between the cells' response to plasma exposure and this activity would be a strong argument in favor of a predominant role of H2O2 in the selectivity of cancer treatment by cold atmospheric plasma.
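
To make the planned characterization of the cytotoxic threshold concrete, here is a minimal sketch assuming a four-parameter logistic dose-response model fitted with SciPy; the viability data, concentrations and parameter bounds are hypothetical and do not come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cell viability (fraction surviving) versus H2O2 concentration (micromolar).
conc = np.array([5, 10, 25, 50, 100, 200, 400], dtype=float)
viability = np.array([0.99, 0.97, 0.90, 0.72, 0.45, 0.20, 0.08])

def four_pl(c, top, bottom, ic50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1 + (c / ic50) ** hill)

popt, _ = curve_fit(four_pl, conc, viability,
                    p0=[1.0, 0.05, 80.0, 1.5],
                    bounds=([0.8, 0.0, 1.0, 0.5], [1.2, 0.3, 1000.0, 5.0]))
top, bottom, ic50, hill = popt
print(f"Estimated cytotoxic threshold (IC50): {ic50:.1f} uM")
```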

Keywords: cold atmospheric plasma, hydrogen peroxide, prostate cancer, selectivity

Procedia PDF Downloads 148
2917 In Silico Study of Cell Surface Structures of Parabacteroides distasonis Involved in Its Maintenance Within the Gut Microbiota and Its Potential Pathogenicity

Authors: Jordan Chamarande, Lisiane Cunat, Corentine Alauzet, Catherine Cailliez-Grimal

Abstract:

Gut microbiota (GM) is now considered a new organ, mainly due to the specific biochemical interactions of its microorganisms with their host. Although the mechanisms underlying host-microbiota interactions are not fully described, it is now well established that the cell surface molecules and structures of the GM play a key role in this relationship. The study of the surface structures of GM members is also fundamental because of their role in the establishment of species in the versatile and competitive environment of the digestive tract and because they can act as virulence factors. Among these structures are capsular polysaccharides (CPS), fimbriae, pili and lipopolysaccharides (LPS), all well described for their central role in microorganism colonization and communication with the host epithelium. The health-promoting Parabacteroides distasonis, which is part of the core microbiome, has recently received a lot of attention, showing beneficial properties for its host and potential as a new biotherapeutic product. However, to the best of the authors' knowledge, the cell surface molecules and structures of P. distasonis that allow its maintenance within the GM have not been identified. Moreover, although P. distasonis is widely recognized as an intestinal commensal species with benefits for its host, it has also been reported as an opportunistic pathogen. In this study, we report gene clusters potentially involved in the synthesis of capsular, fimbriae-like and pili-like cell surface structures in 26 P. distasonis genomes and apply the new RfbA-typing classification in order to better understand and characterize the beneficial/pathogenic behaviour of P. distasonis strains. In total, 2 different types of fimbriae, 3 types of pili and up to 14 capsular polysaccharide loci were identified across the 26 genomes studied. Moreover, the addition of these data to the rfbA-type classification modified the outcome by rearranging rfbA genes and adding a fifth group to the classification. In conclusion, strain variability in terms of external proteinaceous structures could explain the inter-strain differences previously observed in P. distasonis adhesion capacities and its potential pathogenicity.
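
As a schematic of how such genomic screening results can be compared across strains, the sketch below (pandas; hypothetical strain names and locus labels, not the 26 genomes analysed here) builds a presence/absence matrix of predicted surface-structure loci and summarises locus prevalence and the per-strain repertoire.

```python
import pandas as pd

# Hypothetical presence (1) / absence (0) of predicted surface-structure loci per strain.
loci = pd.DataFrame(
    {
        "CPS_locus_1": [1, 1, 0, 1],
        "CPS_locus_2": [0, 1, 1, 0],
        "fimbriae_type_A": [1, 0, 1, 1],
        "pilus_type_I": [0, 1, 1, 0],
    },
    index=["strain_1", "strain_2", "strain_3", "strain_4"],
)

# How widespread is each locus, and how many loci does each strain carry?
locus_prevalence = loci.mean().sort_values(ascending=False)
loci_per_strain = loci.sum(axis=1)

print(locus_prevalence)
print(loci_per_strain)
```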

Keywords: gut microbiota, Parabacteroides distasonis, capsular polysaccharide, fimbriae, pilus, O-antigen, pathogenicity, probiotic, comparative genomics

Procedia PDF Downloads 103
2916 Clinical Empathy: The Opportunity to Offer Optimal Treatment to People with Serious Illness

Authors: Leonore Robieux, Franck Zenasni, Marc Pocard, Clarisse Eveno

Abstract:

Empirical data from health psychology studies show the necessity of considering doctor-patient communication and its positive impact on outcomes such as patients' satisfaction, treatment adherence, and physical and psychological wellbeing. In this line, the present research aims to define the role and determinants of effective doctor-patient communication during the treatment of patients with a serious illness (peritoneal carcinomatosis). We carried out a prospective longitudinal study including patients treated for peritoneal carcinomatosis of various origins. From November 2016 to date, data were collected using validated questionnaires at two evaluation times: one month before surgery (T0) and one month after (T1). Patients reported (a) their anxiety and depression levels, (b) their standardized and individualized quality of life, and (c) how they perceived the surgeon's communication, attitude and empathy. 105 volunteer patients (mean age = 58.18 years, SD = 10.24, 62.2% female) participated in the study. Peritoneal carcinomatosis arose from rare diseases (14%) and from colorectal (38%), eso-gastric (24%) and ovarian (8%) cancers. Three groups were defined according to the severity of the pathology and the treatment offered: (1) major surgical treatment with curative intent (53%), (2) repeated palliative surgical treatment (17%), and (3) patients for whom surgery was ruled out and only a palliative approach was offered (30%). Results are presented according to Baron and Kenny's recommendations. The regression analyses show that only depression and anxiety are sensitive to the surgeon's communication and empathy. The main results show that good communication and a high level of empathy at T0 and T1 limit patients' depression and anxiety at T1. Results also indicate that the severity of the disease modulates this positive impact of communication: the better the communication, the lower the patients' levels of depression and anxiety, and this effect is stronger for patients treated for the most severe disease. These results confirm that, even in the case of severe disease, good communication between patient and physician remains a significant factor in promoting patients' well-being. More specific training needs to be developed to promote empathic care.
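
The moderation analysis described above (the effect of communication on anxiety, modulated by disease severity) could be expressed, as a rough sketch on simulated data with illustrative variable names rather than the study's dataset, as an ordinary least squares model with an interaction term in statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 105

# Simulated patient-level data; variable names and effect sizes are illustrative only.
df = pd.DataFrame({
    "communication": rng.normal(3.5, 0.8, n),  # perceived surgeon communication/empathy
    "severity": rng.integers(1, 4, n),         # 1 = curative surgery ... 3 = palliative only
})
df["anxiety"] = (6 - 0.8 * df["communication"] + 0.5 * df["severity"]
                 - 0.3 * df["communication"] * df["severity"] + rng.normal(0, 1, n))

# Moderated regression: does disease severity modulate the effect of communication?
model = smf.ols("anxiety ~ communication * C(severity)", data=df).fit()
print(model.summary())
```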

Keywords: clinical empathy, determinants, healthcare, psychological wellbeing

Procedia PDF Downloads 122
2915 Optimization of Alkali-Assisted Microwave Pretreatment of Sorghum Straw for Efficient Bioethanol Production

Authors: Bahiru Tsegaye, Chandrajit Balomajumder, Partha Roy

Abstract:

The limited supply of fossil fuels and their negative environmental consequences are driving researchers to find sustainable sources of energy. Lignocellulosic biomass such as sorghum straw is considered a cheap, renewable and abundantly available source of energy. However, the conversion of lignocellulosic biomass to bioenergy such as bioethanol is hindered by the recalcitrant nature of the lignin in the biomass. Therefore, removal of lignin is a vital step in lignocellulose conversion to renewable energy. The aim of this study is to optimize microwave pretreatment conditions using Design-Expert software in order to remove lignin and release the maximum possible polysaccharides from sorghum straw for efficient hydrolysis and fermentation. Sodium hydroxide concentrations of 0.5-1.5% (v/v), pretreatment times of 5-25 minutes and pretreatment temperatures of 120-200 °C were considered for depolymerizing sorghum straw. The effect of pretreatment was studied by analyzing the compositional changes before and after pretreatment, following the National Renewable Energy Laboratory (NREL) procedure. Analysis of variance (ANOVA) was used to test the significance of the model used for optimization. Hemicellulose solubilization of 32.8%-48.27%, cellulose release of 53%-82.62% and lignin solubilization of 49.25%-78.29% were observed during microwave pretreatment. Pretreatment for 10 minutes with an alkali concentration of 1.5% and a temperature of 140 °C released the most cellulose and removed the most lignin; at this optimal condition, a maximum of 82.62% cellulose release and 78.29% lignin removal was achieved. Sorghum straw pretreated under the optimal condition was subjected to enzymatic hydrolysis and fermentation. The efficiency of hydrolysis was measured by analyzing reducing sugars with the 3,5-dinitrosalicylic acid (DNS) method; reducing sugars of about 619 mg/g of sorghum straw were obtained after enzymatic hydrolysis. This study showed substantial lignin removal and cellulose release at the optimal condition, which enhances the yield of reducing sugars and, in turn, of ethanol. The study demonstrates the potential of microwave pretreatment for enhancing bioethanol yield from sorghum straw.
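
A minimal sketch of the response-surface-type modelling behind such an optimization is given below; it fits a second-order model with interactions to a synthetic three-factor grid (NaOH concentration, time, temperature) using statsmodels. The factor levels and the lignin-removal response are fabricated for illustration and do not reproduce the study's Design-Expert runs.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic three-factor grid standing in for the pretreatment runs (values are illustrative).
naoh, time, temp = np.meshgrid([0.5, 1.0, 1.5], [5, 15, 25], [120, 160, 200])
runs = pd.DataFrame({"naoh": naoh.ravel(), "time": time.ravel(), "temp": temp.ravel()})

# Fabricated lignin-removal response with some curvature and noise.
rng = np.random.default_rng(1)
runs["lignin_removal"] = (30 + 20 * runs["naoh"] + 0.6 * runs["time"] + 0.25 * runs["temp"]
                          - 0.02 * runs["time"] ** 2 - 0.001 * (runs["temp"] - 150) ** 2
                          + rng.normal(0, 2, len(runs)))

# Second-order (response-surface) model; term significance would normally be judged
# from the ANOVA table, as in the abstract.
formula = ("lignin_removal ~ naoh + time + temp + I(naoh**2) + I(time**2) + I(temp**2)"
           " + naoh:time + naoh:temp + time:temp")
model = smf.ols(formula, data=runs).fit()
print(model.summary())
```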

Keywords: cellulose, hydrolysis, lignocellulose, optimization

Procedia PDF Downloads 271
2914 The Emerging Multi-Species Trap Fishery in the Red Sea Waters of Saudi Arabia

Authors: Nabeel M. Alikunhi, Zenon B. Batang, Aymen Charef, Abdulaziz M. Al-Suwailem

Abstract:

Saudi Arabia has a long history of using traps as traditional fishing gear for catching commercially important demersal, mainly coral-reef-associated fish species. Fish traps constitute the dominant small-scale fishery in the Saudi waters of the Arabian Gulf (the country's eastern seaboard). Recently, however, traps have been increasingly used along the Saudi Red Sea coast (western seaboard), which has a coastline of 1800 km (71%) compared with only 720 km (29%) in the Saudi Gulf region. The production trend for traps indicates a recent increase in catches and in their percentage contribution to traditional fishery landings, confirming the rapid proliferation of trap fishing along the Saudi Red Sea coast. Reef-associated fish species, mainly groupers (Serranidae), emperors (Lethrinidae), parrotfishes (Scaridae), scads and trevallies (Carangidae), and snappers (Lutjanidae), dominate the trap catches, reflecting the reef-dominated shelf zone in the Red Sea. This ongoing investigation covers the following major objectives: (i) baseline studies to characterize the trap fishery through landing-site visits and interview surveys; (ii) stock assessment based on fisheries and biological data obtained through monthly landing-site monitoring, using the FLBEIA fishery operational model; (iii) assessment of operational impacts, derelict traps and by-catch through bottom-mounted video cameras and onboard monitoring; (iv) elucidation of fishing grounds and of derelict-trap impacts through onboard monitoring and remotely operated and autonomous underwater vehicle surveys; and (v) analysis of gear design and operations, covering colonization and deterioration experiments. The progress of this investigation into the impacts of the trap fishery on fish stocks and the marine environment in the Saudi Red Sea region is presented.
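
For readers unfamiliar with stock-assessment reasoning, the sketch below runs a generic Schaefer surplus-production projection under an assumed catch series. It is only a stand-in for the kind of modelling referred to in objective (ii) (the study itself relies on the FLBEIA framework), and every parameter value is hypothetical.

```python
import numpy as np

# Generic Schaefer surplus-production projection; all values are hypothetical and the
# model is illustrative only (the study itself uses the FLBEIA framework).
r, K = 0.4, 10_000.0           # intrinsic growth rate and carrying capacity (tonnes)
B = 6_000.0                    # starting biomass (tonnes)
catches = [400, 450, 500, 600, 700, 800, 900, 1000]  # assumed annual trap landings (tonnes)

biomass = [B]
for C in catches:
    B = max(B + r * B * (1 - B / K) - C, 0.0)  # surplus production minus removals
    biomass.append(B)

msy = r * K / 4                # maximum sustainable yield under the Schaefer model
print(f"MSY ~ {msy:.0f} t; projected biomass trajectory: {np.round(biomass, 0)}")
```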

Keywords: red sea, Saudi Arabia, fish trap, stock assessment, environmental impacts

Procedia PDF Downloads 350
2913 Aerodynamic Design Optimization Technique for a Tube Capsule That Uses an Axial Flow Air Compressor and an Aerostatic Bearing

Authors: Ahmed E. Hodaib, Muhammed A. Hashem

Abstract:

High-speed transportation has become a growing area of interest. To increase high-speed efficiency and minimize the power consumption of a vehicle, we need to eliminate friction with the ground and minimize the aerodynamic drag acting on the vehicle. Due to the complexity and high power requirements of electromagnetic levitation, we make use of the air in front of the capsule, which produces the majority of the drag: it is compressed in two stages, and a proportion of it is injected through small nozzles to form a high-pressure air cushion that levitates the capsule. The tube is partially evacuated so that the air pressure is optimized for maximum compressor effectiveness, optimum tube size, and minimum vacuum pump power consumption. The total relative mass flow rate of the tube air is divided into two fractions. One is bypassed to flow over the capsule body, ensuring that no choked flow takes place. The other fraction is ingested by the compressor after being diffused to reduce the Mach number (from around 0.8) to a value suitable for the compressor inlet. The air is then compressed and intercooled, then split: one fraction is expanded through a tail nozzle to contribute to generating thrust, and the other is compressed again. Bleed from the two compressors is used to maintain a constant air pressure in an air tank, which supplies air for levitation. Dividing the total mass flow rate raises the achievable speed (the Kantrowitz limit), and compressing part of it decreases the effective blockage of the capsule; as a result, the aerodynamic drag on the capsule decreases. As the tube pressure decreases, the drag and the capsule power requirements decrease; however, the vacuum pump consumes more power. Design optimization techniques are therefore to be used to obtain the optimum values of all the design variables given specific design inputs. Aerodynamic shape optimization, capsule and tube sizing, compressor design, diffuser and nozzle expander design, and the effect of the air bearing on the aerodynamics of the capsule are to be considered. The variation of these variables with capsule velocity and tube air pressure is to be studied.
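
The role of the Kantrowitz limit mentioned above can be illustrated with a simplified, purely isentropic estimate: the sketch below computes, for an assumed capsule Mach number, the minimum bypass-to-tube area fraction that avoids choking of the flow around the capsule, which shows why an onboard compressor is needed to allow a reasonably sized capsule. The Mach number and gas properties are assumptions, and the calculation ignores the normal-shock and viscous effects a full design study would include.

```python
def isentropic_area_ratio(M, gamma=1.4):
    """A/A* from the isentropic area-Mach relation for a calorically perfect gas."""
    return (1.0 / M) * ((2.0 / (gamma + 1)) * (1.0 + (gamma - 1.0) / 2.0 * M ** 2)) \
        ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

# Minimum bypass-to-tube area fraction that keeps the flow around the capsule unchoked
# (a simplified, isentropic reading of the Kantrowitz limit).
M_pod = 0.8  # assumed capsule Mach number in the tube frame
min_bypass_fraction = 1.0 / isentropic_area_ratio(M_pod)
max_blockage = 1.0 - min_bypass_fraction

print(f"At M = {M_pod}, the bypass must occupy at least {min_bypass_fraction:.1%} of the "
      f"tube area, so the capsule may block only about {max_blockage:.1%} of it unless a "
      f"compressor swallows part of the incoming flow.")
```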

Keywords: tube-capsule, hyperloop, aerodynamic design optimization, air compressor, air bearing

Procedia PDF Downloads 330