Search results for: clinical quality and safety
999 Attributable Mortality of Nosocomial Infection: A Nested Case Control Study in Tunisia
Authors: S. Ben Fredj, H. Ghali, M. Ben Rejeb, S. Layouni, S. Khefacha, L. Dhidah, H. Said
Abstract:
Background: The Intensive Care Unit (ICU) provides continuous care and uses a high level of treatment technology. Although hospitals in developed countries allocate only 5–10% of beds to critical care areas, approximately 20% of nosocomial infections (NI) occur among patients treated in ICUs; in developing countries, the situation is less well documented. The aim of our study was to assess mortality rates in ICUs and to determine their predictive factors. Methods: We carried out a nested case-control study in a 630-bed public tertiary care hospital in Eastern Tunisia. We included all patients hospitalized for more than two days in the surgical or medical ICU during the entire surveillance period. Cases were patients who died before ICU discharge, whereas controls were patients who survived to discharge. NIs were diagnosed according to the definitions of the ‘Comité Technique des Infections Nosocomiales et les Infections Liées aux Soins’ (CTINLIS, France). Data collection was based on the protocol of Rea-RAISIN 2009 of the National Institute for Health Watch (InVS, France). Results: Overall, 301 patients were enrolled from the medical and surgical ICUs. The mean age was 44.8 ± 21.3 years. The crude ICU mortality rate was 20.6% (62/301): 35.8% for patients who acquired at least one NI during their ICU stay and 16.2% for those without any NI, yielding an overall crude excess mortality rate of 19.6% (OR = 2.9; 95% CI, 1.6 to 5.3). The population-attributable fraction of ICU mortality due to NI was 23.46% (95% CI, 13.43%–29.04%). Overall, 62 case patients were compared to 239 control patients in the final analysis.
Case patients and control patients differed by age (p = 0.003), Simplified Acute Physiology Score II (p < 0.001), NI (p < 0.001), nosocomial pneumonia (p = 0.008), infection upon admission (p = 0.002), immunosuppression (p = 0.006), days of intubation (p < 0.001), tracheostomy (p = 0.004), days with urinary catheterization (p < 0.001), days with a central venous catheter (CVC) (p = 0.03), and length of ICU stay (p = 0.003). Multivariate analysis identified three independent factors: age older than 65 years (OR, 5.78 [95% CI, 2.03–16.05]; p = 0.001), duration of intubation of 1–10 days (OR, 6.82 [95% CI, 1.90–24.45]; p = 0.003), duration of intubation > 10 days (OR, 11.11 [95% CI, 2.85–43.28]; p = 0.001), duration of CVC of 1–7 days (OR, 6.85 [95% CI, 1.71–27.45]; p = 0.007), and duration of CVC > 7 days (OR, 5.55 [95% CI, 1.70–18.04]; p = 0.004). Conclusion: While surveillance provides important baseline data, successful trials of more active, multimodal intervention protocols for the prevention of nosocomial infection suggest that a similar trial would be feasible in our context. The implementation of an efficient infection control strategy is therefore a crucial step toward improving the quality of care.
Keywords: intensive care unit, mortality, nosocomial infection, risk factors
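The reported odds ratio and attributable fraction can be reproduced approximately from the abstract's percentages. The 2x2 cell counts below are a reconstruction (35.8% mortality among roughly 67 patients with NI, 16.2% among roughly 234 without), not values reported by the authors, so treat them as an illustrative assumption:

```python
# 2x2 counts reconstructed from the reported rates; the exact cells are an
# assumption inferred from the abstract, not reported values.
died_ni, survived_ni = 24, 43      # patients with at least one NI
died_no, survived_no = 38, 196     # patients without NI

odds_ratio = (died_ni * survived_no) / (survived_ni * died_no)
print(round(odds_ratio, 1))  # 2.9, matching the reported OR

# Miettinen's case-based population-attributable fraction
cases_exposed = died_ni / (died_ni + died_no)
paf = cases_exposed * (odds_ratio - 1) / odds_ratio
print(round(100 * paf, 1))   # ~25%, in the vicinity of the reported 23.46%
```

The small gap to the reported 23.46% is expected, since the reconstruction rounds the cell counts and the authors' exact PAF estimator is not stated.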
Procedia PDF Downloads 406
998 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated a strong ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to analysis. Such data contain millions of short sequence reads from fragmented DNA, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time consuming, and rely on a large number of parameters that often introduce variability and affect the estimation of the microbiome's elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use word and sentence embeddings to create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. The approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which each sequence most likely originates; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction.
Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine
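A minimal sketch of step (i), k-mer tokenization, with a normalized bag-of-k-mers standing in for the read representation of step (ii). metagenome2vec learns dense embeddings instead, so this only illustrates the tokenize-then-vectorize structure; the toy reads are invented:

```python
from collections import Counter

def kmers(read, k=4):
    """Split a DNA read into overlapping k-mers (step i of the pipeline)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

# Toy reads; real inputs are millions of short reads parsed from fastq files.
reads = ["ACGTACGTGA", "TTGACGTACG", "GGGTTTACGT"]

# Step (i): build the k-mer vocabulary over all reads.
vocab = sorted({km for r in reads for km in kmers(r)})

# Stand-in for step (ii): one normalized bag-of-k-mers vector per read.
def read_vector(read):
    counts = Counter(kmers(read))
    total = sum(counts.values())
    return [counts.get(km, 0) / total for km in vocab]

vec = read_vector(reads[0])
print(len(vec) == len(vocab))  # one dimension per vocabulary k-mer
```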
Procedia PDF Downloads 125
997 Design and Assessment of Base Isolated Structures under Spectrum-Compatible Bidirectional Earthquakes
Authors: Marco Furinghetti, Alberto Pavese, Michele Rinaldi
Abstract:
Concave Surface Slider devices are increasingly used in real applications for the seismic protection of both bridge and building structures. Several research activities have been carried out to investigate the lateral response of this typology of devices, and a reasonably high level of knowledge has been reached. If a radial analysis is performed, the frictional force is always aligned with the restoring force, whereas under bidirectional seismic events a biaxial interaction of the directions of motion occurs, due to the step-wise projection of the main frictional force, which is assumed to be aligned with the trajectory of the isolator. Moreover, if non-linear time history analyses are to be performed, standard codes provide precise rules for defining an averagely spectrum-compatible set of accelerograms in radial conditions, whereas for bidirectional motions different combinations of the single-component spectra can be found. In addition, software is now available for adjusting natural accelerograms, leading to higher-quality spectrum-compatibility and smaller dispersion of results for radial motions. In this work a simplified design procedure is defined for building structures base-isolated by means of Concave Surface Slider devices. Different case study structures have been analyzed. In a first stage, the capacity curve was computed by means of non-linear static analyses on the fixed-base structures: inelastic fiber elements were adopted and different direction angles of the lateral forces were studied. From these results, a linear elastic Finite Element Model was defined, characterized by the same global stiffness as the linear elastic branch of the non-linear capacity curve. Then, non-linear time history analyses were performed on the base-isolated structures by applying seven bidirectional seismic events.
The spectrum-compatibility of the bidirectional earthquakes was studied by considering different combinations of the single components and by adjusting single records: with the proposed procedure, results showed small dispersion and good agreement with the assumed design values.
Keywords: concave surface slider, spectrum-compatibility, bidirectional earthquake, base isolation
Procedia PDF Downloads 292
996 Standardizing and Achieving Protocol Objectives for Chest Wall Radiotherapy Treatment Planning Process Using an O-ring Linac in High-, Low- and Middle-Income Countries
Authors: Milton Ixquiac, Erick Montenegro, Francisco Reynoso, Matthew Schmidt, Thomas Mazur, Tianyu Zhao, Hiram Gay, Geoffrey Hugo, Lauren Henke, Jeff Michael Michalski, Angel Velarde, Vicky de Falla, Franky Reyes, Osmar Hernandez, Edgar Aparicio Ruiz, Baozhou Sun
Abstract:
Purpose: Radiotherapy departments in low- and middle-income countries (LMICs) like Guatemala have recently introduced intensity-modulated radiotherapy (IMRT). IMRT has become the standard of care in high-income countries (HICs) due to reduced toxicity and improved outcomes in some cancers. The purpose of this work is to show the agreement between the dosimetric results in the Dose Volume Histograms (DVHs) and the objectives proposed in the adopted protocol. This is our initial experience with an O-ring Linac. Methods and Materials: An O-ring Linac was installed at our clinic in Guatemala in 2019 and has been used to treat approximately 90 patients daily with IMRT. This Linac is a fully image-guided device, since each radiotherapy session requires a Mega-Voltage Cone Beam Computed Tomography (MVCBCT) scan before delivery. For each MVCBCT the Linac delivers 9 MU, which are taken into account during planning. To start the standardization, the TG-263 nomenclature was employed and a hypofractionated protocol was adopted to treat the chest wall, including the supraclavicular nodes, delivering 40.05 Gy in 15 fractions. Planning used 4 semi-arcs from 179 to 305 degrees. The planner had to create optimization volumes for targets and Organs at Risk (OARs); the difficulty for the planner was the base dose due to the MVCBCT. To evaluate this planning modality, we used 30 chest wall cases. Results: The manually created plans achieve the protocol objectives. The protocol objectives are the same as those of RTOG 1005, and the DVH curves are clinically acceptable. Conclusions: Although the O-ring Linac cannot acquire kV images and the cone beam CT is created using MV energy, the dose delivered by the daily imaging setup process does not affect the dosimetric quality of the plans, and the dose distribution is acceptable, achieving the protocol objectives.
Keywords: hypofractionation, VMAT, chest wall, radiotherapy planning
Procedia PDF Downloads 118
995 Effect of Cutting Tools and Working Conditions on the Machinability of Ti-6Al-4V Using Vegetable Oil-Based Cutting Fluids
Authors: S. Gariani, I. Shyha
Abstract:
Cutting titanium alloys is usually accompanied by low productivity, poor surface quality, short tool life, and high machining costs. This is due to the excessive generation of heat at the cutting zone and difficulty dissipating that heat, owing to the relatively low thermal conductivity of this metal. Cooling is crucial in machining processes, as many operations cannot be performed efficiently without it. Improving machinability, increasing productivity, and enhancing surface integrity and part accuracy are the main advantages of cutting fluids. Conventional fluids such as mineral oil-based, synthetic, and semi-synthetic fluids are the most common cutting fluids in the machining industry. Although these cutting fluids are beneficial to industry, they pose a great threat to human health and the ecosystem. Vegetable oils (VOs) are being investigated as a potential source of environmentally favourable lubricants, due to a combination of biodegradability, good lubricity, low toxicity, high flash points, low volatility, high viscosity indices, and thermal stability. The fatty acids of vegetable oils are known to provide thick, strong, and durable lubricant films, which give the vegetable oil base stock a greater capability to absorb pressure and a high load-carrying capacity. This paper details preliminary experimental results from turning Ti-6Al-4V. The impact of various VO-based cutting fluids, cutting tool materials, and working conditions was investigated. A full factorial experimental design involving 24 tests was employed to evaluate the influence of the process variables on average surface roughness (Ra), tool wear, and chip formation. In general, Ra varied between 0.5 and 1.56 µm; the Vasco1000 cutting fluid performed comparably to the other fluids in terms of surface roughness, while the uncoated coarse-grain WC carbide tool achieved lower flank wear at all cutting speeds.
On the other hand, all tool tips exhibited uniform flank wear throughout the cutting trials. Additionally, the formed chip thickness ranged between 0.1 and 0.14 mm, with a noticeable decrease in chip size at the higher cutting speed.
Keywords: cutting fluids, turning, Ti-6Al-4V, vegetable oils, working conditions
Procedia PDF Downloads 279
994 Flood Mapping Using Height above the Nearest Drainage Model: A Case Study in Fredericton, NB, Canada
Authors: Morteza Esfandiari, Shabnam Jabari, Heather MacGrath, David Coleman
Abstract:
Flooding is a severe issue in many parts of the world, including the city of Fredericton, New Brunswick, Canada. The downtown area of Fredericton is close to the Saint John River, which is susceptible to flooding around May every year. Recently, the frequency of flooding appears to have increased, especially given that the downtown area and the surrounding urban/agricultural lands were flooded in two consecutive years, 2018 and 2019. In order to have a clear picture of flood extent and damage in affected areas, it is necessary to use either flood inundation modelling or satellite data. Because optical satellites are weather dependent and only contingently available, and because hydrodynamic models are costly and existing data are limited, it is not always feasible to rely on these data sources to generate quality flood maps during or after a catastrophe. Height Above the Nearest Drainage (HAND), a state-of-the-art topo-hydrological index, normalizes the height of a basin relative to the elevation along the stream network and quantifies the gravitational, or relative, drainage potential of an area. HAND is the relative height difference between the stream network and each cell of a Digital Terrain Model (DTM). The stream layer is produced through a multi-step, time-consuming process which does not always yield an optimal representation of the river centerline, depending on the topographic complexity of the region. HAND has been used in numerous case studies with quite acceptable, and sometimes unexpected, results, owing to natural and human-made features on the surface of the earth. Some of these features might disturb the generated model, and consequently the model might not predict the flow simulation accurately. We propose to include a previously existing stream layer generated by the Province of New Brunswick and to benefit from culvert maps to improve the water flow simulation and, accordingly, the accuracy of the HAND model.
By considering these parameters in our processing, we were able to increase the accuracy of the model from nearly 74% to almost 92%. The improved model can be used for generating highly accurate flood maps, which are necessary for future urban planning and flood damage estimation, without any need for satellite imagery or hydrodynamic computations.
Keywords: HAND, DTM, rapid floodplain, simplified conceptual models
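Once each cell is associated with a drainage cell, HAND itself reduces to a per-cell elevation subtraction. The sketch below associates cells with the nearest stream cell by grid distance, a simplification of the flow-path tracing used in practice, and the toy DTM values are invented:

```python
# Toy 3x3 DTM (elevations in metres) and a single stream cell; both invented.
dtm = [
    [12.0, 11.0, 10.0],
    [11.0,  9.0,  8.0],
    [10.0,  8.0,  5.0],
]
streams = {(2, 2)}  # drainage cells, e.g. from the provincial stream layer

def hand(dtm, streams):
    """Relative height of each cell above its (here: nearest) drainage cell."""
    rows, cols = len(dtm), len(dtm[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            sr, sc = min(streams, key=lambda s: abs(s[0] - r) + abs(s[1] - c))
            out[r][c] = dtm[r][c] - dtm[sr][sc]
    return out

h = hand(dtm, streams)
print(h[0][0], h[2][2])  # 7.0 on the ridge, 0.0 on the stream itself
```

Thresholding such a grid (cells with HAND below the flood stage) is what turns the index into a flood map.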
Procedia PDF Downloads 151
993 Development of Oral Biphasic Drug Delivery System Using a Natural Resourced Polymer, Terminalia catappa
Authors: Venkata Srikanth Meka, Nur Arthirah Binti Ahmad Tarmizi Tan, Muhammad Syahmi Bin Md Nazir, Adinarayana Gorajana, Senthil Rajan Dharmalingam
Abstract:
Biphasic drug delivery systems are designed to release drug at two different rates, either fast/prolonged or prolonged/fast. A fast/prolonged release system provides a burst of drug release in the initial stage, followed by slow release over a prolonged period of time; in a prolonged/fast release system, the release pattern is reversed. Terminalia catappa gum (TCG) is a natural polymer that has been successfully demonstrated to be a novel pharmaceutical excipient. The main objective of the present research is to investigate the applicability of the natural polymer Terminalia catappa gum in the design of an oral biphasic drug delivery system in the form of mini tablets, using buspirone HCl as a model drug. This investigation aims to produce a biphasic release drug delivery system of buspirone by combining immediate release and prolonged release mini tablets in a capsule. For the immediate release mini tablets, a dose of 4.5 mg buspirone was prepared while varying the concentration of the superdisintegrant, crospovidone. The prolonged release mini tablets were produced using different concentrations of the natural polymer TCG with a buspirone dose of 3 mg. All mini tablets were characterized by weight variation, hardness, friability, disintegration, content uniformity, and dissolution studies. The optimized formulations of immediate and prolonged release mini tablets were finally combined in a capsule, which was evaluated in release studies. FTIR and DSC studies were conducted to study drug-polymer interaction. All formulations of immediate release and prolonged release mini tablets passed all the in-process quality control tests according to the US Pharmacopeia. The disintegration time of the immediate release mini tablets varied from 2 to 6 min across formulations, and maximum drug release was achieved in less than 60 min, whereas the prolonged release mini tablets made with TCG showed good drug-retarding properties.
Release was sustained for about 4-10 hrs with varying concentrations of TCG; as the concentration of TCG increased, the drug release retarding property also increased. The optimized mini tablets were packed in capsules and evaluated for their release mechanism. The capsule dosage form clearly exhibited biphasic release of buspirone, indicating that TCG is a suitable natural polymer for this study. FTIR and DSC studies proved that there was no interaction between the drug and the polymer. Based on the above positive results, it can be concluded that TCG is a suitable polymer for biphasic drug delivery systems.
Keywords: Terminalia catappa gum, biphasic release, mini tablets, tablet in capsule, natural polymers
Procedia PDF Downloads 393
992 Evaluation of Air Movement, Humidity and Temperature Perceptions with the Occupant Satisfaction in Office Buildings in Hot and Humid Climate Regions by Means of Field Surveys
Authors: Diego S. Caetano, Doreen E. Kalz, Louise L. B. Lomardo, Luiz P. Rosa
Abstract:
The energy consumption of non-residential buildings in Brazil has a great impact on the national infrastructure. The growth in energy consumption is driven in particular by building cooling systems, supported by people's increased requirements for hygrothermal comfort. This paper presents how the occupants of office buildings perceive and evaluate hygrothermal comfort with regard to temperature, humidity, and air movement, considering the cooling systems present in the buildings studied, as assessed by real occupants in areas of hot and humid climate. The paper presents results collected over an extended period from 3 office buildings in the cities of Rio de Janeiro and Niteroi (Brazil) in 2015 and 2016, using daily questionnaires of eight questions answered by 114 people, twice a day (10 a.m. and 3 p.m.), for 3 to 5 weeks per building. The paper analyses 6 of the 8 questions, with emphasis on the perception of temperature, humidity, and air movement. Statistical analyses cross-referenced participants' answers with high-time-resolution humidity and temperature data. Regressions compared internal and external temperatures, which were then compared with the participants' answers. The results are presented as graphics combining statistical plots of temperature and air humidity with the answers of the real occupants. Participants' perceptions of humidity and air movement were also analyzed. The hygrothermal comfort statistical models of the European standard DIN EN 15251 and of the Brazilian standard NBR 16401 were compared against the participants' perceptions of hygrothermal comfort, with emphasis on air humidity, building on prior studies published from this same research. The studies point to a relative tolerance of temperatures higher than those specified by the standards, as well as variation in the participants' perception of air humidity.
The paper presents a set of detailed information that permits improving the quality of buildings based on the perceptions of office building occupants, contributing to energy reduction, without harming health or neglecting hygrothermal comfort requirements, by reducing the electricity consumed for cooling.
Keywords: thermal comfort, energy consumption, energy standards, comfort models
Procedia PDF Downloads 323
991 Carbon Based Wearable Patch Devices for Real-Time Electrocardiography Monitoring
Authors: Hachul Jung, Ahee Kim, Sanghoon Lee, Dahye Kwon, Songwoo Yoon, Jinhee Moon
Abstract:
We fabricated a wearable patch device, including a novel patch-type flexible dry electrode based on carbon nanofibers (CNFs) and a silicone-based elastomer (MED 6215), for real-time ECG monitoring. There are many ways to make a flexible conductive polymer by mixing in metal or carbon-based nanoparticles. In this study, CNFs were selected as the conductive nanoparticles because carbon nanotubes (CNTs) are difficult to disperse uniformly in elastomer compared with CNFs, and silver nanowires are relatively expensive and easily oxidized in air. The wearable patch is composed of 2 parts: a dry electrode part for recording biosignals and a sticky patch part for mounting on the skin. The dry electrode part was made with a vortexer and baked in a prepared mold. To optimize electrical performance and dispersion uniformity, we developed a unique mixing and baking process. The sticky patch part was made by patterning and detaching from a smooth-surface substrate after spin-coating a soft skin adhesive; in this process, the attachment and detachment strengths of the sticky patch were measured and optimized using a monitoring system. The assembled patch is flexible, stretchable, easily mountable on skin, and directly connectable to the system. To evaluate its electrical characteristics and ECG (electrocardiography) recording performance, the wearable patch was tested with varying CNF concentrations and dry electrode thicknesses. The results show that CNF concentration and dry electrode thickness are important variables for obtaining high-quality ECG signals without incidental distractions. A cytotoxicity test was conducted to prove biocompatibility, and a long-term wearing test showed no skin reactions such as itching or erythema. To minimize noise from motion artifacts and line noise, we built a customized wireless, lightweight data acquisition system. ECG signals measured with this system are stable and were successfully monitored in real time.
In summary, the fabricated wearable patch devices can readily be used for real-time ECG monitoring.
Keywords: carbon nanofibers, ECG monitoring, flexible dry electrode, wearable patch
Procedia PDF Downloads 185
990 Effects of Soaking of Maize on the Viscosity of Masa and Tortilla Physical Properties at Different Nixtamalization Times
Authors: Jorge Martínez-Rodríguez, Esther Pérez-Carrillo, Diana Laura Anchondo Álvarez, Julia Lucía Leal Villarreal, Mariana Juárez Dominguez, Luisa Fernanda Torres Hernández, Daniela Salinas Morales, Erick Heredia-Olea
Abstract:
Maize tortillas are a staple food in Mexico, mostly made by nixtamalization, which includes cooking and steeping maize kernels under alkaline conditions. The cooking step of nixtamalization demands a great deal of energy and also generates nejayote, a water pollutant, at the end of the process. The aim of this study was to reduce the cooking time by adding a maize-soaking step before nixtamalization while maintaining the quality properties of masa and tortillas. Maize kernels were soaked for 36 h to increase moisture up to 36%. Then, the effect of different cooking times (0, 5, 10, 15, 20, 25, 30, 35, 45 (control), and 50 minutes) was evaluated on the viscosity profile (RVA) of masa, to select the treatments with a profile similar or equal to the control. All treatments were steeped overnight and milled under the same conditions. The treatments selected were the 20- and 25-min cooking times, which had values for pasting temperature (79.23 °C and 80.23 °C), maximum viscosity (105.88 cP and 96.25 cP), and final viscosity (188.5 cP and 174 cP) similar to those of the 45-min control (77.65 °C, 110.08 cP, and 186.70 cP, respectively). Afterward, tortillas were produced with the chosen treatments (20 and 25 min) and the control, and were then analyzed for texture, damaged starch, colorimetry, thickness, and average diameter. Colorimetric analysis of the tortillas showed significant differences only for the yellow/blue coordinate (b* parameter) at 20 min (0.885), unlike the 25-minute treatment (1.122). Luminosity (L*) and the red/green coordinate (a*) showed no significant differences between the treatments and the control (69.912 and 1.072, respectively); however, the 25-minute treatment was closer in both parameters (73.390 and 1.122) than the 20-minute treatment (74.08 and 0.884). For the color difference (ΔE), the 25-min value (3.84) was the most similar to the control.
However, for tortilla thickness and diameter, the 20-minute treatment, at 1.57 mm and 13.12 cm respectively, was closer to the control (1.69 mm and 13.86 cm), although smaller. The 25-min treatment tortilla, at 1.51 mm thickness and 13.590 cm diameter, was smaller than both the 20-min treatment and the control. According to the texture analyses, there was no difference in stretchability (8.803-10.308 gf) or distance to break (95.70-126.46 mm) among the treatments. However, for the breaking point, both treatments (317.1 gf and 276.5 gf for the 25- and 20-min treatments, respectively) differed significantly from the control tortilla (392.2 gf). The results suggest that by adding a soaking step and reducing the cooking time by 25 minutes, masa and tortillas with functional and textural properties similar to those of the traditional nixtamalization process can be obtained.
Keywords: tortilla, nixtamalization, corn, lime cooking, RVA, colorimetry, texture, masa rheology
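The ΔE color difference reported above is the CIE76 Euclidean distance in L*a*b* space. The sketch below computes it from the L* and a* values in the abstract; the control's b* is not reported, so the 1.0 used here is a placeholder assumption purely for illustration, and the result is not expected to reproduce the reported 3.84:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triplets."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# L* and a* for the control and the 25-min treatment come from the abstract;
# the control b* of 1.0 is an invented placeholder.
control = (69.912, 1.072, 1.0)
treatment_25min = (73.390, 1.122, 1.122)
print(round(delta_e76(control, treatment_25min), 2))  # 3.48 with this placeholder
```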
Procedia PDF Downloads 177
989 Teaching Kindness as Moral Virtue in Preschool Children: The Effectiveness of Picture-Storybook Reading and Hand-Puppet Storytelling
Authors: Rose Mini Agoes Salim, Shahnaz Safitri
Abstract:
The aim of this study is to test the effectiveness of teaching kindness to preschool children using several techniques. Kindness is a physical act or emotional support aimed at building or maintaining relationships with others. Kindness is known to be essential to the development of the moral reasoning used to distinguish good from bad. In this study, kindness is operationalized as several acts, including helping friends, comforting sad friends, inviting friends to play, protecting others, sharing, saying hello, saying thank you, encouraging others, and apologizing. Kindness is crucial to develop in preschool children because this is when children begin to interact with their social environment through play. Furthermore, preschool children's cognitive development enables them to represent the world with words, which then allows them to interact with others. On the other hand, preschool children's egocentric thinking means they still need to learn to consider another person's perspective. With respect to social interaction, preschool children need to be stimulated and assisted by adults to be able to pay attention to others and act with kindness toward them. In teaching kindness to children, the quality of interaction between children and their significant others is the key factor: preschool children learn about kindness by imitating adults during two-way interaction. Specifically, this study examines two teaching techniques that parents can use to teach kindness, namely picture-storybook reading and hand-puppet storytelling. These techniques were examined because both activities are easy to do, and both provide the child a model of behavior based on the characters in the story. To examine the effectiveness of these techniques in teaching kindness, two studies were conducted.
Study I involved 31 children aged 5-6 years with the picture-storybook reading technique, in which the intervention consisted of reading 8 picture books over 8 days. In Study II, the hand-puppet storytelling technique was examined with 32 children aged 3-5 years. The effectiveness of the treatments was measured using an instrument in the form of nine colored cards depicting kindness behaviors. Data analysis using the Wilcoxon signed-rank test showed a significant difference in the average kindness score (p < 0.05) before and after the intervention. For daily observation, a ‘kindness tree’ and observation sheets were used, filled out by the teacher. Two weeks after the interventions, the improvement in all measured kindness behaviors remained intact; the same result was obtained from both the ‘kindness tree’ and the observation sheets.
Keywords: kindness, moral teaching, storytelling, hand puppet
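The paired comparison behind the Wilcoxon signed-rank test can be sketched as follows. The before/after scores are invented for illustration (not the study's data), and ties in |difference| are ranked ordinally here rather than with the midranks a real test uses:

```python
# Invented pre/post kindness scores for 8 children (not the study's data).
before = [3, 4, 2, 5, 3, 4, 2, 3]
after  = [5, 6, 4, 5, 6, 5, 4, 5]

# Drop zero differences, rank the rest by |difference|, then sum the ranks
# of positive and negative differences separately; the smaller sum is the
# test statistic compared against the Wilcoxon critical value.
diffs = [a - b for a, b in zip(after, before) if a != b]
order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
w_plus = sum(rank + 1 for rank, i in enumerate(order) if diffs[i] > 0)
w_minus = sum(rank + 1 for rank, i in enumerate(order) if diffs[i] < 0)
print(w_plus, w_minus)  # every child improved, so the negative rank sum is 0
```

In practice one would call scipy.stats.wilcoxon on the real card-based scores rather than hand-rolling the statistic.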
Procedia PDF Downloads 252
988 Exploring Factors That May Contribute to the Underdiagnosis of Hereditary Transthyretin Amyloidosis in African American Patients
Authors: Kelsi Hagerty, Ami Rosen, Aaliyah Heyward, Nadia Ali, Emily Brown, Erin Demo, Yue Guan, Modele Ogunniyi, Brianna McDaniels, Alanna Morris, Kunal Bhatt
Abstract:
Hereditary transthyretin amyloidosis (hATTR) is a progressive, multi-systemic, and life-threatening disease caused by disruption of the TTR protein, a liver-produced transporter of thyroxine and retinol. This disruption causes the protein to misfold into amyloid fibrils, which accumulate in the heart, nerves, and GI tract. Over 130 variants in the TTR gene are known to cause hATTR. The Val122Ile variant is the most common in the United States and is seen almost exclusively in people of African descent. TTR variants are inherited in an autosomal dominant fashion and have incomplete penetrance and variable expressivity. Individuals with hATTR may exhibit symptoms from as early as 30 years to as late as 80 years of age. hATTR is characterized by a wide range of clinical symptoms, such as cardiomyopathy, neuropathy, carpal tunnel syndrome, and GI complications. Without treatment, hATTR is progressive and can ultimately lead to heart failure. hATTR disproportionately affects individuals of African descent; its estimated prevalence among Black individuals in the US is 3.4%. Unfortunately, hATTR is often underdiagnosed and misdiagnosed because many of its symptoms overlap with other cardiac conditions. Given the progressive nature of the disease, its multi-systemic manifestations that can shorten lifespan, and the availability of free genetic testing and promising FDA-approved therapies that enhance treatability, early identification of individuals with a pathogenic hATTR variant is important, as it can significantly impact medical management for patients and their relatives. Furthermore, recent literature suggests that TTR genetic testing should be performed in all patients with suspected TTR-related cardiomyopathy, regardless of age, and that follow-up with genetic counseling services is recommended.
Relatives of patients with hATTR benefit from genetic testing because testing can identify carriers early and allow relatives to receive regular screening and management. Despite the striking prevalence of hATTR among Black individuals, hATTR remains underdiagnosed in this patient population, and uptake of germline genetic testing for hATTR among Black individuals appears to be low, though the reasons for this have not yet been established. Historically, Black patients have experienced a number of barriers to seeking healthcare that have been hypothesized to perpetuate the underdiagnosis of hATTR, such as lack of access and mistrust of healthcare professionals. Prior research has described a myriad of factors that shape an individual's decision about whether to pursue presymptomatic genetic testing for a familial pathogenic variant, such as family closeness and communication, family dynamics, and a desire to inform other family members about potential health risks. This study explores, through 10 in-depth interviews with patients with hATTR, what factors may be contributing to the underdiagnosis of hATTR in the Black population. Participants were selected from the Emory University Amyloidosis clinic based on having a molecular diagnosis of hATTR. Interviews were recorded and transcribed verbatim, then coded using MAXQDA software. Thematic analysis was completed to draw commonalities between participants. Upon preliminary analysis, several themes have emerged. Barriers identified include i) misdiagnosis and a prolonged diagnostic odyssey, ii) family communication and dynamics surrounding health issues, iii) perceptions of healthcare and one's own health risks, and iv) the need for more intimate provider-patient relationships and communication. 
Overall, this study gleaned valuable insight from members of the Black community about possible factors contributing to the underdiagnosis of hATTR, as well as potential solutions for addressing this issue.
Keywords: cardiac amyloidosis, heart failure, TTR, genetic testing
Procedia PDF Downloads 97
987 Integrated Coastal Management for the Sustainable Development of Coastal Cities: The Case of El-Mina, Tripoli, Lebanon
Authors: G. Ghamrawi, Y. Abunnasr, M. Fawaz, S. Yazigi
Abstract:
Coastal cities are constantly exposed to environmental degradation and economic regression fueled by rapid and uncontrolled urban growth as well as continuous resource depletion. This is the case of the city of Mina in Tripoli (Lebanon), where a lack of awareness of the need to preserve social, ecological, and historical assets, coupled with increasing development pressures, is threatening the socioeconomic status of the city's residents, their quality of life, and their access to the coast. To address these challenges, a holistic coastal urban design and planning approach was developed to analyze the environmental, political, legal, and socioeconomic context of the city. This approach aims to investigate the potential of balancing urban development with the protection and enhancement of cultural, ecological, and environmental assets under an integrated coastal zone management (ICZM) approach. The analysis of Mina's different sectors adopted several tools that include direct field observation, interviews with stakeholders, analysis of available data, historical maps, and previously proposed projects. The findings from the analysis were mapped and graphically represented, allowing the recognition of character zones that become the design intervention units. Consequently, the thesis proposes an urban, city-scale intervention that identifies 6 different character zones (the historical fishing port, Abdul Wahab island, the abandoned Port Said, Hammam el Makloub, the sand beach, and the new developable area) and proposes context-specific design interventions that capitalize on the main characteristics of each zone. Moreover, the intervention builds on the institutional framework of ICZM as well as other studies previously conducted for the coast and adopts nature-based solutions with hybrid systems to provide better environmental design solutions for developing the coast. 
This enables the realization of an all-inclusive, well-connected shoreline with easy and free access to the sea: a developed shoreline with an active local economy and an improved urban environment.
Keywords: blue green infrastructure, coastal cities, hybrid solutions, integrated coastal zone management, sustainable development, urban planning
Procedia PDF Downloads 156
986 Unlocking Health Insights: Studying Data for Better Care
Authors: Valentina Marutyan
Abstract:
Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding and approach to providing healthcare. It is the process of examining large amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. This field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. To accomplish this, it uses advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for each person. Doctors can tailor a patient's care by looking at their medical history, genetic profile, and current and previous therapies. In this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, data mining improves the efficiency of hospitals, for example by helping them determine the number of beds or doctors they require given the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks were used for predicting diseases and analyzing medical images. Patients were grouped by algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. 
Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and operate more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
Keywords: data mining, healthcare, big data, large amounts of data
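As a rough illustration of the kinds of models the abstract mentions, the sketch below fits a logistic regression risk model and a k-means patient grouping on synthetic data; the features, thresholds, and cohort count are invented for the example and are not taken from the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic patient records: age, BMI, systolic blood pressure (invented features).
n = 500
X = np.column_stack([
    rng.normal(55, 15, n),   # age
    rng.normal(27, 5, n),    # BMI
    rng.normal(130, 20, n),  # systolic blood pressure
])
# Synthetic outcome: risk rises with age and blood pressure.
logit = 0.04 * (X[:, 0] - 55) + 0.03 * (X[:, 2] - 130)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Predictive model: estimate disease risk from patient features.
clf = LogisticRegression(max_iter=1000).fit(X, y)
risk = clf.predict_proba(X)[:, 1]

# Unsupervised grouping: cluster patients into cohorts, e.g. for care planning.
cohorts = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
```

In practice, the same pattern scales from this toy matrix to EHR-derived feature tables, with the risk scores feeding clinical review rather than automated decisions.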
Procedia PDF Downloads 76
985 Land Degradation Vulnerability Modeling: A Study on Selected Micro Watersheds of West Khasi Hills Meghalaya, India
Authors: Amritee Bora, B. S. Mipun
Abstract:
Land degradation is a term often used to describe environmental phenomena that reduce land's original productivity both qualitatively and quantitatively. The study of land degradation vulnerability primarily deals with "Environmentally Sensitive Areas" (ESA) and the amount of topsoil loss due to erosion. In many studies, the assessment of the existing status of land degradation is used to represent vulnerability, and in most studies the primary emphasis of land degradation vulnerability is its sensitivity to soil erosion only. However, the concept of land degradation vulnerability can have different objectives depending upon the perspective of the study. It shows the extent to which changes in land use and land cover can imprint their effect on the land. In other words, it represents the susceptibility of a piece of land to degrade in its productive quality permanently or in the long run. It is also important to mention that land degradation vulnerability is not a single-factor outcome. It is a probability assessment of the status of land degradation and needs to consider both biophysical and human-induced parameters. To avoid the complexity of previous models in this regard, the present study emphasizes generating a simplified model to assess land degradation vulnerability in terms of current human population pressure, land use practices, and existing biophysical conditions. It is a mixed-method approach, termed the land degradation vulnerability index (LDVi). It was originally inspired by the MEDALUS model (Mediterranean Desertification and Land Use), 1999, and Farazadeh's 2007 revised version of it, and it follows the guidelines of the Space Application Center, Ahmedabad / Indian Space Research Organization for land degradation vulnerability. 
The model integrates the climatic index (Ci), vegetation index (Vi), erosion index (Ei), land utilization index (Li), population pressure index (Pi), and cover management index (CMi), giving equal weightage to each parameter. The final result shows that the very high vulnerability zone primarily indicates three prominent circumstances: land under continuous population pressure, a high concentration of human settlement, and a high amount of topsoil loss due to surface runoff within the study sites. As all the parameters of the model are amalgamated with equal weightage and further examined with the help of regression analysis, the LDVi model also provides a strong grasp of each parameter and of how far each is competent to trigger the land degradation process.
Keywords: population pressure, land utilization, soil erosion, land degradation vulnerability
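The abstract does not give the exact aggregation formula, so the sketch below assumes the simplest equal-weight combination, an arithmetic mean of the six normalized indices (the original MEDALUS family of models instead uses a geometric mean); the index values are invented for illustration:

```python
def ldvi(ci, vi, ei, li, pi, cmi):
    """Land degradation vulnerability index as the equal-weight mean of the
    climatic, vegetation, erosion, land utilization, population pressure,
    and cover management indices (each assumed scaled to 0-1)."""
    indices = [ci, vi, ei, li, pi, cmi]
    return sum(indices) / len(indices)

# Hypothetical micro-watershed with invented index values.
score = ldvi(ci=0.7, vi=0.5, ei=0.8, li=0.6, pi=0.9, cmi=0.7)
```

A zone scoring near 1 on such a scale would correspond to the "very high vulnerability" class described in the results.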
Procedia PDF Downloads 166
984 Review of the Model-Based Supply Chain Management Research in the Construction Industry
Authors: Aspasia Koutsokosta, Stefanos Katsavounis
Abstract:
This paper reviews the model-based qualitative and quantitative Operations Management research in the context of Construction Supply Chain Management (CSCM). The construction industry has been traditionally blamed for low productivity, cost and time overruns, waste, high fragmentation, and adversarial relationships, and it has been slower than other industries to employ the Supply Chain Management (SCM) concept and develop models that support decision-making and planning. Over the last decade, however, there has been a distinct shift from a project-based to a supply-based approach to construction management. CSCM has emerged as a promising management tool for construction operations, improving the performance of construction projects in terms of cost, time, and quality. Modeling the Construction Supply Chain (CSC) offers the means to reap the benefits of SCM, make informed decisions, and gain competitive advantage. Different modeling approaches and methodologies have been applied in the multi-disciplinary and heterogeneous research field of CSCM. The literature review reveals that a considerable percentage of CSC modeling accommodates conceptual or process models which discuss general management frameworks and do not relate to acknowledged soft OR methods. We particularly focus on the model-based quantitative research and categorize the CSCM models depending on their scope, mathematical formulation, structure, objectives, solution approach, software used, and decision level. Although there has clearly been an increase in research papers on quantitative CSC models over the last few years, we find that the relevant literature is very fragmented, with limited applications of simulation, mathematical programming, and simulation-based optimization. Most applications are project-specific or study only parts of the supply system. 
Thus, some complex interdependencies within construction are neglected and the implementation of the integrated supply chain management is hindered. We conclude this paper by giving future research directions and emphasizing the need to develop robust mathematical optimization models for the CSC. We stress that CSC modeling needs a multi-dimensional, system-wide and long-term perspective. Finally, prior applications of SCM to other industries have to be taken into account in order to model CSCs, but not without the consequential reform of generic concepts to match the unique characteristics of the construction industry.
Keywords: construction supply chain management, modeling, operations research, optimization, simulation
Procedia PDF Downloads 503
983 Using Nature-Based Solutions to Decarbonize Buildings in Canadian Cities
Authors: Zahra Jandaghian, Mehdi Ghobadi, Michal Bartko, Alex Hayes, Marianne Armstrong, Alexandra Thompson, Michael Lacasse
Abstract:
The Intergovernmental Panel on Climate Change (IPCC) report stated the urgent need to cut greenhouse gas emissions to avoid the adverse impacts of climatic changes. The United Nations has forecast that nearly 70 percent of people will live in urban areas by 2050, resulting in a doubling of the global building stock. Given that buildings are currently recognised as emitting 40 percent of global carbon emissions, there is an urgent incentive to decarbonize existing buildings and to build net-zero carbon buildings. Attaining net zero carbon emissions in communities in the future requires action in two directions: I) reduction of emissions; and II) removal of on-going emissions from the atmosphere once de-carbonization measures have been implemented. Nature-based solutions (NBS) have a significant role to play in achieving net zero carbon communities, spanning both emission reductions and removal of on-going emissions. NBS for the decarbonisation of buildings can be achieved by using green roofs and green walls, increasing vertical and horizontal vegetation on building envelopes, and by using nature-based materials that either emit less heat to the atmosphere, thus decreasing photochemical reaction rates, or store substantial amounts of carbon within their structure over the whole building service life. The NBS approach can also mitigate urban flooding and overheating, improve urban climate and air quality, and provide better living conditions for the urban population. For existing buildings, de-carbonization mostly requires retrofitting existing envelopes efficiently to use NBS techniques, whereas for future construction, de-carbonization involves designing new buildings with low carbon materials as well as with the integrity and system capacity to effectively employ NBS. This paper presents the opportunities and challenges with respect to the de-carbonization of buildings using NBS for both building retrofits and new construction. 
This review documents the effectiveness of NBS in de-carbonizing Canadian buildings, identifies the missing links to implementing these techniques in cold climatic conditions, and determines a road map and immediate approaches to mitigate the adverse impacts of climate change such as urban heat islanding. Recommendations are drafted for possible inclusion in the Canadian building and energy codes.
Keywords: decarbonization, nature-based solutions, GHG emissions, greenery enhancement, buildings
Procedia PDF Downloads 93
982 Factory Communication System for Customer-Based Production Execution: An Empirical Study on the Manufacturing System Entropy
Authors: Nyashadzashe Chiraga, Anthony Walker, Glen Bright
Abstract:
The manufacturing industry is currently experiencing a paradigm shift into the Fourth Industrial Revolution, in which customers are increasingly at the epicentre of production. The high degree of production customization and personalization requires a flexible manufacturing system that will rapidly respond to the dynamic and volatile changes driven by the market. There is a gap in technology that allows for the optimal flow of information and optimal manufacturing operations on the shop floor regardless of rapid changes in fixture and part demands. Information is the reduction of uncertainty; it gives meaning and context on the state of each cell. The amount of information needed to describe cellular manufacturing systems is investigated by two measures: structural entropy and operational entropy. Structural entropy is the expected amount of information needed to describe the scheduled states of a manufacturing system, while operational entropy is the amount of information needed to describe the states that actually occur during the manufacturing operation. Using the AnyLogic simulator, a typical manufacturing job shop was set up with a cellular manufacturing configuration. The cellular make-up of the configuration included a material handling cell, a 3D printer cell, an assembly cell, a manufacturing cell, and a quality control cell. The factory shop provides manufactured parts to a number of clients; there are substantial variations in the part configurations, and new part designs are continually being introduced to the system. Based on the normal expected production schedule, schedule adherence was calculated from the structural and operational entropy while varying the amount of information communicated in simulated runs. The structural entropy denotes a system that is in control; the necessary real-time information is readily available to the decision maker at any point in time. 
For comparative analysis, different out-of-control scenarios were run, in which changes in the manufacturing environment were not effectively communicated, resulting in deviations from the original predetermined schedule. The operational entropy was calculated from the actual operations. From the results obtained in the empirical study, it was seen that increasing the efficiency of a factory communication system increases the degree of adherence of a job to the expected schedule. The performance of downstream production flow, fed from the parallel upstream flow of information on the factory state, was increased.
Keywords: information entropy, communication in manufacturing, mass customisation, scheduling
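The two entropy measures above can be illustrated with Shannon's formula; the state probabilities below are invented for the example (a scheduled-state distribution for structural entropy and a broader observed-state distribution for operational entropy):

```python
import math

def shannon_entropy(probabilities):
    """Expected information (in bits) needed to describe a system whose
    states occur with the given probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical distribution over a cell's scheduled states (structural entropy).
structural = shannon_entropy([0.7, 0.2, 0.1])

# Hypothetical distribution over states observed during actual operation
# (operational entropy); more spread means more information is needed.
operational = shannon_entropy([0.4, 0.3, 0.2, 0.1])
```

When communication degrades and the actual states deviate further from the schedule, the operational entropy rises relative to the structural entropy, which is the gap the empirical study measures.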
Procedia PDF Downloads 245
981 Analyzing the Commentator Network Within the French YouTube Environment
Authors: Kurt Maxwell Kusterer, Sylvain Mignot, Annick Vignes
Abstract:
To the best of our knowledge, YouTube is the largest video hosting platform in the world. A high number of creators, viewers, subscribers, and commentators act in this specific ecosystem, which generates huge sums of money. Views, subscribers, and comments help to increase the popularity of content creators. The most popular creators are sponsored by brands and participate in marketing campaigns; for a few of them, this becomes a financially rewarding profession. This is made possible through the YouTube Partner Program, which shares revenue among creators based on their popularity. We believe that the role of comments in increasing popularity deserves emphasis. In what follows, YouTube is considered as a bilateral network between videos and commentators. Analyzing a detailed data set focused on French YouTubers, we consider each comment as a link between a commentator and a video. Our research question asks which features of a video give it the highest probability of being commented on. Following on from this question, we ask how these features can be used to predict an agent's choice to comment on one video rather than another, considering the characteristics of the commentators, videos, topics, channels, and recommendations. We expect to see that the videos of more popular channels generate higher viewer engagement and thus are commented on more frequently. The interest lies in discovering features which have not classically been considered as markers of popularity on the platform. A quick view of our data set shows that 96% of commentators comment only once on a given video. Thus, we study a non-weighted bipartite network between commentators and videos, built on the sub-sample of 96% of unique comments. A link exists between two nodes when a commentator makes a comment on a video. We run an Exponential Random Graph Model (ERGM) approach to evaluate which characteristics influence the probability of commenting on a video. 
The creation of a link will be explained in terms of common video features, such as duration, quality, number of likes, number of views, etc. Our data are relevant for the period 2020-2021 and focus on the French YouTube environment. From this set of 391,588 videos, we extract the channels which can be monetized according to YouTube regulations (channels with at least 1,000 subscribers and more than 4,000 hours of viewing time during the last twelve months). In the end, we have a data set of 128,462 videos from 4,093 channels. Based on these videos, we have a data set of 1,032,771 unique commentators, with a mean of 2 comments per commentator, a minimum of 1 comment each, and a maximum of 584 comments.
Keywords: YouTube, social networks, economics, consumer behaviour
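ERGM estimation itself is typically done in specialized packages (e.g. the R statnet/ergm suite); the sketch below only illustrates the preceding step, building the unweighted bipartite commentator-video network, in Python with networkx, using invented IDs:

```python
import networkx as nx

# Invented sample of (commentator, video) pairs, one per unique comment.
comments = [
    ("user_a", "vid_1"), ("user_b", "vid_1"),
    ("user_b", "vid_2"), ("user_c", "vid_3"),
]

G = nx.Graph()
G.add_nodes_from({u for u, _ in comments}, bipartite="commentator")
G.add_nodes_from({v for _, v in comments}, bipartite="video")
G.add_edges_from(comments)  # one unweighted link per unique comment

# Simple engagement proxy: number of distinct commentators per video.
video_degree = {v: G.degree(v) for v, d in G.nodes(data=True)
                if d["bipartite"] == "video"}
```

Node-level covariates (duration, likes, channel popularity) would then be attached as node attributes before fitting the ERGM.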
Procedia PDF Downloads 68
980 The Impact of Climate Change on Typical Material Degradation Criteria over Timurid Historical Heritage
Authors: Hamed Hedayatnia, Nathan Van Den Bossche
Abstract:
Understanding the ways in which climate change accelerates or slows down the process of material deterioration is the first step towards assessing adaptive approaches for the conservation of historical heritage. Analysis of the effects of climate change on degradation risk assessment parameters, like freeze-thaw cycles and wind erosion, is also key when considering mitigating actions. Due to the vulnerability of cultural heritage to climate change, the impact of this phenomenon on material degradation criteria, with a focus on brick masonry walls in Timurid heritage located in Iran, was studied. The Timurids were the final great dynasty to emerge from the Central Asian steppe. Through their patronage, the eastern Islamic world, especially Mashhad and Herat, became a prominent cultural center. Goharshad Mosque is a mosque in Mashhad, in the Razavi Khorasan Province of Iran. It was built by order of Empress Goharshad, the wife of Shah Rukh of the Timurid dynasty, in 1418 CE. Choosing an appropriate regional climate model was the first step. The outputs of two different climate models, ALARO-0 and REMO, were analyzed to find out which model is better adapted to the area. To validate the quality of the models, a comparison between model data and observations was carried out in 4 different climate zones in Iran over a period of 30 years. The impacts of the projected climate change were evaluated until 2100. To determine the material specification of Timurid bricks, standard brick samples from a Timurid mosque were studied. Determination of the water absorption coefficient, the diffusion properties, the real density, and the total porosity was performed to characterize the specifications of the brick masonry walls, which are needed for running HAM (heat, air, and moisture) simulations. 
Results from the analysis showed that the threatening factors differ between climate zones, but the most influential factors across Iran are extreme temperature increase and erosion. In the north-western region of Iran, one of the key factors is wind erosion; in the north, rainfall erosion and mold growth risk are the key factors. In the north-eastern part, in which our case study is located, the important parameter is wind erosion.
Keywords: brick, climate change, degradation criteria, heritage, Timurid period
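One of the degradation parameters mentioned, freeze-thaw cycles, can be counted directly from a temperature series produced by a climate model or HAM simulation; the threshold and the hourly values below are invented for illustration:

```python
def count_freeze_thaw_cycles(temps_c, threshold=0.0):
    """Count transitions from above to at-or-below the freezing threshold;
    each such crossing marks the start of one freeze-thaw cycle."""
    cycles = 0
    for prev, curr in zip(temps_c, temps_c[1:]):
        if prev > threshold and curr <= threshold:
            cycles += 1
    return cycles

# Hypothetical hourly masonry surface temperatures (degrees Celsius).
temps = [2.0, 1.0, -1.5, -2.0, 0.5, 3.0, -0.5, 1.0]
n_cycles = count_freeze_thaw_cycles(temps)
```

Comparing such counts between present-day and end-of-century model output is one simple way to express the change in frost damage risk.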
Procedia PDF Downloads 119
979 Conflict around the Brownfield Reconversion of the Canadian Forces Base Rockcliffe in Ottawa: A Clash of Ambitions and Visions in Canadian Urban Sustainability
Authors: Kenza Benali
Abstract:
Over the past decade, a number of remarkable projects in urban brownfield reconversion emerged across Canada, including the reconversion of former military bases owned by the Canada Lands Company (CLC) into sustainable communities. However, unlike other developments, the regeneration project of the former Canadian Forces Base Rockcliffe in Ottawa, announced as one of the most ambitious Smart Growth projects in Canada, faced serious obstacles in terms of social acceptance by the local community, particularly urban minorities composed of Francophones, Indigenous people, and vulnerable groups who live near or on the Base. This turn of events led to the project being postponed and even reconsidered. Through an analysis of its press coverage, this research aims to understand the causes of this urban conflict, which lasted for nearly ten years. The findings reveal that the conflict is not limited to the "standard" issues common to most conflicts related to urban mega-projects in the world, e.g., proximity issues (threats to the quality of the surrounding neighbourhoods: noise, traffic, pollution, new-build gentrification) often associated with NIMBY phenomena. In this case, the local actors questioned the purpose of the project (for whom and for what types of uses is it conceived?), its local implementation (to what extent are the local history and existing environment taken into account?), and the degree of involvement of the local population in the decision-making process (with whom is the project built?). Moreover, the interests of the local actors have "jumped scales" and transcend the micro-territorial level of their daily life to take on a national and even international dimension. They defined an alternative view of how this project, considered strategic because of its location in the nation's capital, should be a reference as well as an international showcase of Canadian ambition and achievement in terms of urban sustainability. 
In effect, this vision promoted a territorial and national identity approach, in which certain cultural values are highly significant (respect for social justice, inclusivity, ethnic diversity, cultural heritage, etc.), as a counterweight to the planners' vision, which is criticized as a normative, universalist logic that ignores territorial peculiarities.
Keywords: smart growth, brownfield reconversion, sustainable neighborhoods, Canada Lands Company, Canadian Forces Base Rockcliffe, urban conflicts
Procedia PDF Downloads 382
978 Examination of Teacher Candidates Attitudes Towards Disabled Individuals Employment in terms of Various Variables
Authors: Tuna Şahsuvaroğlu
Abstract:
The concept of disability has been the subject of many studies in the national and international literature, with its social, sociological, political, anthropological, and economic dimensions as well as its individual and social consequences. A disabled person is defined as a person who has difficulties in adapting to social life and meeting daily needs due to the loss of physical, mental, spiritual, sensory, and social abilities to various degrees, either from birth or for any reason later in life, and who is in need of protection, care, rehabilitation, counseling, and support services. The industrial revolution and the rapid industrialization it brought with it led to an increase in the rate of disabilities resulting from work accidents, in addition to congenital disabilities. This increase resulted in disabled people being included in the employment policies of nations as a disadvantaged group. Although the participation of disabled individuals in the workforce is of great importance, both for increasing their quality of life and for their integration into society, and although disabled individuals are willing to participate in the workforce, they encounter many problems. One of these problems is the negative attitudes and prejudices that develop in society towards the employment of disabled individuals. One of the most powerful ways to turn these negative attitudes and prejudices into positive ones is education. Education is a way of guiding societies and transferring existing social characteristics to future generations. This can be sustained thanks to teachers, who are among the most dynamic parts of society and act as the locomotive of education, driven by the need to give direction, to transfer knowledge, and, fundamentally, to help and teach. For this reason, there is a strong relationship between the teaching profession and the attitudes formed in society towards the employment of disabled individuals, as they can influence each other. 
Therefore, the purpose of this study is to examine teacher candidates' attitudes towards the employment of disabled individuals in terms of various variables. The participants of the study consist of 665 teacher candidates studying in various departments at Marmara University Faculty of Education in the 2022-2023 academic year. The descriptive survey model, a form of the general survey model, was used, as the study intends to determine the attitudes of teacher candidates towards the employment of disabled individuals in terms of different variables. The Attitude Scale Towards Employment of Disabled People was used to collect data. The data were analyzed according to the variables of age, gender, marital status, department, and whether there is a disabled relative in the family, and the findings were discussed in the context of further research.
Keywords: teacher candidates, disabled, attitudes towards the employment of disabled people, attitude scale towards the employment of disabled people
Procedia PDF Downloads 65
977 Conjugated Linoleic Acid Effect on Body Weight and Body Composition in Women: Systematic Review and Meta-Analysis
Authors: Hanady Hamdallah, H. Elyse Ireland, John H. H. Williams
Abstract:
Conjugated linoleic acid (CLA) is a food supplement that is reported to have multiple beneficial health effects, including being anti-carcinogenic, anti-inflammatory, and anti-obesity. Animal studies have shown a significant anti-obesity effect of CLA, but results in humans have been inconsistent: some studies found an anti-obesity effect, while others failed to find any decline in obesity markers after CLA supplementation. This meta-analysis aimed to determine whether oral CLA supplementation has been shown to reduce obesity-related markers in women. PubMed, the Cochrane Library, and Google Scholar were used to identify eligible trials using two main search strategies: the first searched for eligible trials using the keywords 'conjugated linoleic acid', 'CLA', and 'women'; the second extracted eligible trials from previously published systematic reviews and meta-analyses. The eligible trials were placebo-controlled trials in which women were supplemented with a CLA mixture in the form of oral capsules for 6 months or less. These trials also provided information about body composition expressed as body weight (BW), body mass index (BMI), total body fat (TBF), percentage body fat (BF%), and/or lean body mass (LBM). The quality of each included study was assessed using both the JADAD scale and an adapted CONSORT checklist. Meta-analysis of 8 eligible trials showed that CLA supplementation was significantly associated with reduced BW (mean ± SD, 1.2 ± 0.26 kg, p < 0.001), BMI (0.6 ± 0.13 kg/m², p < 0.001), and TBF (0.76 ± 0.26 kg, p = 0.003) in women when supplemented over 6-16 weeks. Subgroup meta-analysis demonstrated a significant reduction in BW (1.29 ± 0.31 kg, p < 0.001), BMI (0.60 ± 0.14 kg/m², p < 0.001), and TBF (0.82 ± 0.28 kg, p = 0.003) in the trials that had recruited overweight-obese women. 
The second subgroup meta-analysis, which considered the menopausal status of the participants, found that CLA was significantly associated with reduced BW (1.35 ± 0.37 kg, p < 0.001; 1.05 ± 0.36 kg, p = 0.003) and BMI (0.50 ± 0.17 kg/m², p = 0.003; 0.75 ± 0.2 kg/m², p < 0.001) in pre- and post-menopausal women, respectively. A reduction in TBF (1.09 ± 0.37 kg, p = 0.003) was only significant in post-menopausal women. Interestingly, CLA supplementation was associated with a significant reduction in BW (1.05 ± 0.35 kg, p < 0.003), BMI (0.73 ± 0.2 kg/m², p < 0.001), and TBF (1.07 ± 0.36 kg, p = 0.003) in the trials without lifestyle monitoring or interventions. No significant effect of CLA on LBM was detected in this meta-analysis. This meta-analysis suggests a moderate anti-obesity effect of CLA on BW, BMI, and TBF in women when supplemented over 6-16 weeks, particularly in overweight-obese women and post-menopausal women. However, this finding requires careful interpretation due to several issues in the designs of the available CLA supplementation trials. More well-designed trials are required to confirm these meta-analysis results.
Keywords: body composition, body mass index, body weight, conjugated linoleic acid
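Pooled estimates like those above are typically obtained by inverse-variance weighting; the sketch below shows a fixed-effect version of that computation on invented per-trial values (the actual meta-analysis may have used a different model, e.g. random effects):

```python
import math

def fixed_effect_pooled(effects, std_errors):
    """Inverse-variance weighted pooled effect and its standard error."""
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Hypothetical per-trial mean body-weight reductions (kg) and standard errors.
effects = [1.4, 0.9, 1.1]
ses = [0.5, 0.4, 0.6]
pooled, se = fixed_effect_pooled(effects, ses)
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
```

Trials with smaller standard errors (larger, more precise studies) dominate the weighted average, which is the rationale for the weighting scheme.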
Procedia PDF Downloads 294
976 CRISPR-Mediated Genome Editing for Yield Enhancement in Tomato
Authors: Aswini M. S.
Abstract:
Tomato (Solanum lycopersicum L.) is one of the most economically significant vegetable crops, consumed both fresh and processed. Tomatoes have a narrow genetic base, which makes breeding extremely challenging. Plant breeding has become much simpler and more effective with the genome editing tools of CRISPR and the CRISPR-associated protein 9 (CRISPR/Cas9), which address the limitations of traditional breeding, chemical/physical mutagenesis, and transgenics. Using CRISPR/Cas9, a number of tomato traits have been functionally characterized and edited. These traits include plant architecture and flower characters (leaf, flower, male sterility, and parthenocarpy), fruit ripening, quality and nutrition (lycopene, carotenoid, GABA, TSS, and shelf-life), disease resistance (late blight, TYLCV, and powdery mildew), tolerance to abiotic stress (heat, drought, and salinity), and resistance to herbicides. This study explores the potential of CRISPR/Cas9 genome editing for enhancing yield in tomato plants, using CRISPR/Cas9 technology to functionally edit various traits in tomatoes. The de novo domestication of elite traits from wild relatives into cultivated tomatoes, and vice versa, has been demonstrated through CRISPR/Cas9-mediated introgression. Cas9-mediated editing of the CycB (lycopene beta-cyclase) gene increased the lycopene content in tomato, and Cas9-mediated editing of the AGL6 (Agamous-like 6) gene resulted in parthenocarpic fruit development under heat-stress conditions. The advent of CRISPR/Cas has made it possible to use digital resources for single guide RNA design and multiplexing, cloning (such as Golden Gate cloning, GoldenBraid, etc.), creating robust CRISPR/Cas constructs, and implementing effective transformation protocols such as Agrobacterium-mediated transformation and the DNA-free protoplast method for Cas9-gRNA ribonucleoprotein (RNP) complexes. 
Additionally, homologous recombination (HR)-based gene knock-in (HKI) via geminivirus replicons and base/prime editing (Target-AID technology) are also possible. Hence, CRISPR/Cas facilitates fast and efficient breeding in the improvement of tomatoes. Keywords: CRISPR-Cas, biotic and abiotic stress, flower and fruit traits, genome editing, polygenic trait, tomato and trait introgression
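Single guide RNA design, mentioned above, begins by locating candidate protospacers adjacent to a PAM. As a minimal sketch of that first step for SpCas9 (which recognizes an 'NGG' PAM), scanning only the forward strand; the sequence fragment below is a hypothetical placeholder, not a real tomato gene:

```python
def find_sgrna_sites(seq, spacer_len=20):
    """Return (start, spacer, PAM) tuples on the forward strand where a
    spacer_len protospacer is immediately followed by an SpCas9 'NGG' PAM."""
    seq = seq.upper()
    sites = []
    for i in range(len(seq) - spacer_len - 2):
        pam = seq[i + spacer_len : i + spacer_len + 3]
        if pam[1:] == "GG":  # 'N' can be any base, so check only the two Gs
            sites.append((i, seq[i : i + spacer_len], pam))
    return sites

# hypothetical 50-bp coding-sequence fragment (illustrative only)
fragment = "ATGGCTTCACCGGTTAGACCTTGAGCTAGGCATTCCGGAAGGTTACCAGG"
sites = find_sgrna_sites(fragment)
```

Real design tools additionally scan the reverse complement, score on-target efficiency, and filter off-target matches genome-wide; this sketch covers only PAM-adjacent site enumeration.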
Procedia PDF Downloads 70
975 Factors Impacting Training and Adult Education Providers’ Business Performance: The Singapore Context
Abstract:
SkillsFuture Singapore’s mission to develop a responsive and forward-looking Training and Adult Education (TAE) and workforce development system is undergirded by how successful TAE providers are in their business performance and in the strategies that strengthen their operational efficiency and processes. Therefore, understanding the factors that drive the business performance of TAE providers is critical to the success of SkillsFuture Singapore’s initiatives. This study aims to investigate how business strategy, work autonomy, work intensity and professional development support impact the business performance of private TAE providers. Specifically, the three research questions are: (1) Are there significant relationships between the above-mentioned four factors and TAE providers’ business performance?; (2) Are there significant differences in the four factors between low and high business performance groups?; and (3) To what extent and in what manner do the four factors predict TAE providers’ business performance? This was part of the first national study on organizations and professionals working in the TAE sector. Data from 265 private TAE providers, whose respondents were Chief Executive Officers or representatives from senior management, were analyzed. The results showed that business strategy (the extent to which the organization leads the way in developing new products and services, uses up-to-date learning technologies, and customizes its products and services to clients’ needs), work autonomy (the extent to which staff personally influence how hard they work, what tasks they do, how they do them, and the quality standards to which they work) and professional development support (both monetary and non-monetary support and incentives) had positive and significant relationships with business performance. 
However, no significant relationship was found between work intensity and business performance. Business strategy, work autonomy and professional development support were significantly higher in the high business performance group than in the low performance group among the TAE providers. Results of hierarchical regression analyses controlling for the size of the TAE providers showed significant impacts of business strategy, work autonomy and professional development support on TAE providers’ business performance. Overall, the model accounted for 27% of the variance in TAE providers’ business performance. This study provides policymakers with insights for improving existing policies, designing new initiatives and implementing targeted interventions to support TAE providers. The findings also have implications for how TAE providers could better formulate their organizational strategies and business models. Finally, limitations of the study, along with directions for future research, are discussed in the paper. Keywords: adult education, business performance, business strategy, training, work autonomy
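The hierarchical regression step described above enters a control variable first and then the predictors of interest, attributing the gain in explained variance to the added factors. A minimal self-contained sketch of that idea, with entirely hypothetical provider data (the study's actual variables, scales and sample are not reproduced here):

```python
def solve_normal_equations(xtx, xty):
    """Solve the k x k system (X'X) b = X'y by Gaussian elimination with pivoting."""
    n = len(xtx)
    a = [row[:] + [xty[i]] for i, row in enumerate(xtx)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    b = [0.0] * n
    for r in range(n - 1, -1, -1):
        b[r] = (a[r][n] - sum(a[r][c] * b[c] for c in range(r + 1, n))) / a[r][r]
    return b

def r_squared(rows, y):
    """R^2 of an OLS fit with intercept; rows are tuples of predictor values."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    xtx = [[sum(xi[p] * xi[q] for xi in X) for q in range(k)] for p in range(k)]
    xty = [sum(xi[p] * yi for xi, yi in zip(X, y)) for p in range(k)]
    b = solve_normal_equations(xtx, xty)
    yhat = [sum(bp * xp for bp, xp in zip(b, xi)) for xi in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# hypothetical rows: (size, strategy, autonomy, pd_support, performance)
data = [
    (10, 3.2, 3.0, 2.8, 3.1), (25, 4.1, 3.8, 3.5, 4.0),
    (40, 2.5, 2.7, 2.4, 2.6), (15, 3.8, 3.4, 3.6, 3.7),
    (60, 4.5, 4.2, 4.0, 4.4), (30, 2.9, 2.6, 2.9, 2.8),
    (20, 3.5, 3.3, 3.1, 3.3), (50, 4.0, 3.9, 3.7, 4.1),
]
y = [row[4] for row in data]
r2_step1 = r_squared([(row[0],) for row in data], y)  # control: size only
r2_step2 = r_squared([row[:4] for row in data], y)    # size + the three factors
delta_r2 = r2_step2 - r2_step1                        # variance attributed to the factors
```

In practice a statistics package (e.g. statsmodels) would also report the F-test for whether the R² increment is significant; this sketch shows only the incremental-variance logic.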
Procedia PDF Downloads 208
974 Thoughts Regarding Interprofessional Work between Nurses and Speech-Language-Hearing Therapists in Cancer Rehabilitation: An Approach for Dysphagia
Authors: Akemi Nasu, Keiko Matsumoto
Abstract:
Rehabilitation for cancer requires setting individual goals for each patient and an approach that properly fits the stage of cancer when put into practice. In order to cope with daily changes in patients' condition, establishing a good cooperative relationship between nurses and the physiotherapists, occupational therapists, and speech-language-hearing therapists (therapists) is essential. This study focuses on the present situation of cooperation between nurses and therapists, especially speech-language-hearing therapists, and aims to elucidate what develops there. A semi-structured interview was conducted with a speech-language-hearing therapist with practical experience of working in collaboration with nurses. The contents of the interview were transcribed and converted to data, and the data were coded and categorized with sequentially increasing degrees of abstraction to conduct a qualitative explorative analysis. When providing ethical explanations, particular care was taken to ensure that the participant would not be subjected to any disadvantage as a result of participating in the study. She was also informed that her privacy would be ensured, that she had the right to decline to participate, and that the results of the study would be announced publicly at an applicable nursing academic conference. This study was approved following application to the ethical committee of the university with which the researchers are affiliated. The survey participant is a female speech-language-hearing therapist in her forties. 
As a result of the analysis, 6 categories were extracted: 'measures to address appetite and aspiration pneumonia prevention', 'limitation of the care a therapist alone could provide', 'the all-inclusive patient-supportive care provided by nurses', 'expand the beneficial cooperation with nurses', 'providing education for nurses on the swallowing function utilizing videofluoroscopic examination of swallowing', and 'enhancement of communication including conferences'. In order to improve team performance, and for the teamwork competency necessary for the provision of safer care, mutual support is essential. As for the cooperation between nurses and therapists, this survey indicates that maturing the cooperation between professionals, improving nursing professionals' knowledge and enhancing communication will lead to an improvement in the quality of rehabilitation for cancer. Keywords: cancer rehabilitation, nurses, speech-language-hearing therapists, interprofessional work
Procedia PDF Downloads 133
973 Assessing In-Country Public Health Training Needs: Workforce Development to Meet Sustainable Development Goals
Authors: Leena Inamdar, David Allen, Sushma Acquilla, James Gore
Abstract:
Health systems globally are facing increasingly complex challenges. Emerging health threats, changing population demographics, increasing health inequalities, globalisation, and economic constraints on government spending are among the most critical. These challenges demand not only innovative funding and cross-sectoral approaches, but also a multidisciplinary public health workforce equipped with the skills and expertise to meet the future challenges of the Sustainable Development Goals (SDGs). We aim to outline an approach to assessing the feasibility of establishing competency-based public health training at a country level. Although the SDGs provide an enabling impetus for change and promote positive developments, public health training and education still lag behind. Large gaps are apparent both in the numbers of trained professionals and in the options for high-quality training. Public health training in most low- and middle-income countries is still largely characterized by a traditional and limited public health focus. There is a pressing need to review and develop core and emerging competences for a well-equipped workforce fit for the future, including the important role of national health and human resource ministries in determining these competences. Public health has long been recognised as a multidisciplinary field, requiring professionals from a wider range of disciplines such as management, health promotion, health economics, and law. Leadership and communication skills are also critical to achieving success in meeting public health outcomes. Such skills and competences need to be translated into competency-based training and education to prepare current public health professionals with the skills required in today’s competitive job market. Integration of academic and service-based public health training, flexible accredited programmes to support existing mid-career professionals, and continuous professional development need to be explored. 
In the current global climate of austerity and increasing demands on health systems, the need to step up public health training and education is more important than ever. Using a case study, we demonstrate the process of assessing the in-country capacity to establish a competency-based public health training programme that will help develop a stronger, more versatile and much-needed public health workforce to meet the SDGs. Keywords: public health training, competency-based, assessment, SDGs
Procedia PDF Downloads 201
972 Double Burden of Malnutrition among Children under Five in Sub-Saharan Africa and Other Least Developed Countries: A Systematic Review
Authors: Getenet Dessie, Jinhu Li, Son Nghiem, Tinh Doan
Abstract:
Background: Concerns regarding malnutrition have evolved from focusing solely on single forms to addressing the simultaneous occurrence of multiple types, commonly referred to as the double or triple burden of malnutrition. Nevertheless, data concerning the concurrent occurrence of various types of malnutrition are scarce. Therefore, this systematic review and meta-analysis aims to assess the pooled prevalence of the double burden of malnutrition among children under five in Sub-Saharan Africa and other least-developed countries (LDCs). Methods: Electronic, web-based searches were conducted from January 15 to June 28, 2023, across several databases, including PubMed, Embase, Google Scholar, and the World Health Organization's Hinari portal, as well as other search engines, to identify primary studies published up to June 28, 2023. Laboratory-based cross-sectional studies on children under the age of five were included. Two independent authors assessed the risk of bias and the quality of the identified articles. The primary outcomes of this study were micronutrient deficiencies and the comorbidity of stunting and anemia, as well as wasting and anemia. The random-effects model was utilized for analysis. The association of identified variables with the various forms of malnutrition was also assessed using adjusted odds ratios (AOR) with a 95% confidence interval (CI). This review was registered in PROSPERO with the reference number CRD42023409483. Findings: The electronic search generated 6,087 articles, 93 of which matched the inclusion criteria for the final meta-analysis. Micronutrient deficiencies were prevalent among children under five in Sub-Saharan Africa and other LDCs, with rates ranging from 16.63% among 25,169 participants for vitamin A deficiency to 50.90% among 3,936 participants for iodine deficiency. Iron deficiency anemia affected 20.56% of the 63,121 participants. 
The combined prevalence of concurrent wasting and anemia was 5.41% among 64,709 participants, and that of concurrent stunting and anemia was 19.98% among 66,016 participants. Both stunting and vitamin A supplementation were associated with vitamin A and iron deficiencies, with adjusted odds ratios (AOR) of 1.54 (95% CI: 1.01, 2.37) and 1.37 (95% CI: 1.21, 1.55), respectively. Interpretation: The prevalence of the double burden of malnutrition among children under the age of five was notably high in Sub-Saharan Africa and other LDCs. These findings indicate a need for increased attention to, and understanding of, the factors influencing this double burden of malnutrition. Keywords: children, Sub-Saharan Africa, least developed countries, double burden of malnutrition, systematic review, meta-analysis
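The adjusted odds ratios above come from the included studies' regression models. As a simplified illustration of the underlying quantity, here is a crude (unadjusted) odds ratio with a Woolf log-normal 95% CI computed from a 2x2 table; the counts are hypothetical, not data from this review:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf (log-normal) CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts: vitamin A deficiency (cases) by stunting status (exposure)
or_, lo, hi = odds_ratio_ci(60, 140, 45, 255)
```

An adjusted OR, as reported in the review, would instead come from a multivariable logistic regression that conditions on confounders; the crude version shown here is only the building block.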
Procedia PDF Downloads 81
971 Monitoring Memories by Using Brain Imaging
Authors: Deniz Erçelen, Özlem Selcuk Bozkurt
Abstract:
The course of daily human life calls for memories and for remembering the time and place of certain events. Recalling memories takes up a substantial amount of an individual's time. Unfortunately, scientists lack the technology to fully understand and observe the different brain regions that interact to form or retrieve memories. The hippocampus, a complex brain structure located in the temporal lobe, plays a crucial role in memory. The hippocampus forms memories and allows the brain to retrieve them by ensuring that neurons fire together, a process called “neural synchronization.” Sadly, the hippocampus often deteriorates with age. Proteins and hormones that repair and protect cells in the brain typically decline as an individual ages. With the deterioration of the hippocampus, an individual becomes more prone to memory loss. Memory loss often starts off as mild but may evolve into serious medical conditions such as dementia and Alzheimer’s disease. In their quest to fully comprehend how memories work, scientists have created many different kinds of technology to examine the brain and its neural pathways. For instance, Magnetic Resonance Imaging (MRI) is used to collect detailed images of an individual's brain anatomy. In order to monitor and analyze brain function, a different version of this machine, called Functional Magnetic Resonance Imaging (fMRI), is used. fMRI is a neuroimaging procedure conducted while the target brain regions are active. It measures brain activity by detecting changes in blood flow associated with neural activity: neurons need more oxygen when they are active, and fMRI measures the difference in magnetization between oxygen-rich and oxygen-poor blood. This way, there is a detectable difference across brain regions, and scientists can monitor them. Electroencephalography (EEG) is also a significant way to monitor the human brain. 
EEG is more versatile and cost-efficient than fMRI. An EEG measures electrical activity generated by the cortical layers of the brain, allowing scientists to record brain processes that occur after external stimuli. EEGs have a very high temporal resolution, which makes it possible to measure synchronized neural activity and almost precisely track the contents of short-term memory. Science has come a long way in monitoring memories using these kinds of devices, and as a result, inspections of neurons and neural pathways have become more intense and detailed. Keywords: brain, EEG, fMRI, hippocampus, memories, neural pathways, neurons
Procedia PDF Downloads 85
970 Experimental Study on Heat and Mass Transfer of Humidifier for Fuel Cell
Authors: You-Kai Jhang, Yang-Cheng Lu
Abstract:
Major contributions of this study are threefold: a new model of planar-membrane humidifier for the Proton Exchange Membrane Fuel Cell (PEMFC), an index to measure the effectiveness (εT) of that humidifier, and an air compressor system for replicating planar-membrane humidifier experiments. As a renewable energy technology, the PEMFC has become increasingly important in recent years due to its reliability and durability. To maintain the efficiency of the fuel cell, the membrane of the PEMFC needs to be kept well hydrated; maintaining proper membrane humidity is one of the key issues in optimizing a PEMFC. We developed a new humidifier to recycle water vapor from the cathode air outlet so as to maintain the moisture content of the cathode inlet air in a PEMFC. By measuring parameters such as the dry-side air outlet dew point temperature, dry-side air inlet temperature and humidity, wet-side air inlet temperature and humidity, and the differential pressure between the dry side and wet side, we calculated indices including the dew point approach temperature (DPAT), water flux (J), water recovery ratio (WRR), effectiveness (εT), and differential pressure (ΔP). Using these indices, we discussed six topics: the sealing effect, flow rate effect, flow direction effect, channel effect, temperature effect, and humidity effect. Gas cylinders are used as the air supply in many humidifier studies, but a cylinder depletes quickly at a 1 kW air flow rate, which makes replication difficult. In order to ensure highly stable air quality and better replication of experimental data, this study designs an air supply system to overcome this difficulty. The experimental results show that the best rate of pressure loss of the humidifier is 0.133×10³ Pa(g)/min at a torque of 25 N·m. Humidifier performance is best at air flow rates of 30-40 LPM. 
The counter-flow humidifier moisturizes the dry-side inlet air more effectively than the parallel-flow humidifier. Performance measurements of the channel plates with the various rib widths studied here show that the narrower the rib width, the better the humidifier performs. Increasing the channel width at the same hydraulic diameter (Dh) yields higher εT and lower ΔP. Moreover, increasing the dry-side air inlet temperature or humidity leads to lower εT; when the dry-side air inlet temperature exceeds 50°C, this effect becomes even more pronounced. Keywords: PEM fuel cell, water management, membrane humidifier, heat and mass transfer, humidifier performance
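The indices above (J, WRR, DPAT) can be sketched under common textbook definitions: water flux from the dry-stream humidity-ratio change, WRR as the fraction of wet-side inlet moisture picked up by the dry stream, and DPAT as the gap between the wet inlet and dry outlet dew points. These definitions and the operating point below are illustrative assumptions, not the authors' exact formulations:

```python
def water_flux(m_dot_dry_air, w_in, w_out):
    """Water vapor gained by the dry-side stream (kg/s), assuming
    J = m_dot * (w_out - w_in) with w the humidity ratio (kg water / kg dry air)."""
    return m_dot_dry_air * (w_out - w_in)

def water_recovery_ratio(w_dry_in, w_dry_out, w_wet_in):
    """Fraction of the wet-side inlet moisture recovered by the dry stream."""
    return (w_dry_out - w_dry_in) / w_wet_in

def dew_point_approach(t_dew_wet_in, t_dew_dry_out):
    """DPAT (°C): how far the dry outlet dew point falls short of the wet
    inlet dew point; smaller values mean better humidification."""
    return t_dew_wet_in - t_dew_dry_out

# hypothetical operating point (humidity ratios and dew points are assumptions)
j = water_flux(0.001, 0.004, 0.016)               # kg water vapor per second
wrr = water_recovery_ratio(0.004, 0.016, 0.020)   # dimensionless, 0..1
dpat = dew_point_approach(60.0, 52.0)             # °C
```

A full humidifier model would also need psychrometric conversions (relative humidity to humidity ratio at a given temperature and pressure), which are omitted here.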
Procedia PDF Downloads 176