Search results for: radio frequency integrated circuit (RFIC) baseband
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7593

933 Assessing Project Performance through Work Sampling and Earned Value Analysis

Authors: Shobha Ramalingam

Abstract:

The majority of infrastructure projects are affected by time overruns, resulting in project delays and subsequently cost overruns. Time overrun may vary from a few months to as much as five or more years, placing project viability at risk. One of the probable reasons noted in the literature for this outcome is poor productivity. Researchers contend that productivity in construction has only marginally increased over the years. While studies in the literature have extensively focused on time and cost parameters in projects, there are limited studies that integrate time and cost with productivity to assess project performance. To this end, a study was conducted to understand the project delay factors concerning cost, time and productivity. A case-study approach was adopted to collect rich data from a nuclear power plant project site for two months through observation, interviews and document review. The data were analyzed using three different approaches for a comprehensive understanding. First, a root-cause analysis was performed on the data using Ishikawa's fish-bone diagram technique to identify the various factors impacting the delay concerning time. Based on it, a questionnaire was designed and circulated to concerned executives, including project engineers and contractors, to determine the frequency of occurrence of the delays, which was then compiled and presented to the management for possible mitigation measures. Second, a productivity analysis was performed on select activities, including rebar bending and concreting, through a time-motion study to analyze productivity performance. Third, three years of construction cost data allowed the cost performance to be analyzed using the earned value management technique. Together, the three techniques allowed the key factors that deter project performance and cause productivity loss in the construction of the nuclear power plant project to be identified systematically and comprehensively. The findings showed that improper planning and coordination between multiple trades, concurrent operations, improper workforce and material management, and fatigue due to overtime were some of the key factors that led to delays and poor productivity. The findings are expected to act as a stepping stone for further research and have implications for practitioners.
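
The abstract does not spell out the underlying calculations, so the following is only a minimal illustrative sketch of the standard earned value management indicators (schedule and cost variance, SPI, CPI). The monthly planned value, earned value and actual cost figures are invented placeholders, not the project's data.

```python
# Illustrative sketch (hypothetical figures, not the study's dataset): computing standard
# earned value management (EVM) indicators for a few reporting periods.

def evm_indicators(pv: float, ev: float, ac: float) -> dict:
    """Return the usual EVM metrics for one reporting period.

    pv: planned value (budgeted cost of work scheduled)
    ev: earned value (budgeted cost of work performed)
    ac: actual cost (actual cost of work performed)
    """
    return {
        "schedule_variance": ev - pv,            # SV > 0 means ahead of schedule
        "cost_variance": ev - ac,                # CV > 0 means under budget
        "spi": ev / pv if pv else float("nan"),  # schedule performance index
        "cpi": ev / ac if ac else float("nan"),  # cost performance index
    }

# Hypothetical figures for three months of concreting work (currency units in millions)
periods = [
    {"pv": 10.0, "ev": 8.5, "ac": 9.8},
    {"pv": 12.0, "ev": 10.2, "ac": 12.5},
    {"pv": 11.0, "ev": 11.0, "ac": 11.4},
]

for i, p in enumerate(periods, start=1):
    m = evm_indicators(**p)
    print(f"Month {i}: SPI={m['spi']:.2f}, CPI={m['cpi']:.2f}, "
          f"SV={m['schedule_variance']:.1f}, CV={m['cost_variance']:.1f}")
```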

Keywords: earned value analysis, time performance, project costs, project delays, construction productivity

Procedia PDF Downloads 90
932 Food Sharing App and the Ubuntu Sharing Economy: Assessing the Impact of Technology on Food Waste Reduction

Authors: Gabriel Sunday Ayayia

Abstract:

Food waste remains a critical global challenge with significant environmental, economic, and ethical implications. In an era where food waste and food insecurity coexist, innovative technology-driven solutions have emerged, aiming to bridge the gap between surplus food and those in need. Simultaneously, disparities in food access persist, exacerbating issues of hunger and malnutrition. Emerging food-sharing apps offer a promising avenue to mitigate these problems but require further examination within the context of the Ubuntu sharing economy. This study seeks to understand the impact of food-sharing apps, guided by the principles of Ubuntu, on reducing food waste and enhancing food access. The study examines how specific food-sharing apps within the Ubuntu sharing economy could contribute to fostering community resilience and reducing food waste. Ubuntu underscores the idea that we are all responsible for the well-being of our community members. In the context of food waste, this means that individuals and businesses have a collective responsibility to ensure that surplus food is shared rather than wasted. Food-sharing apps align with this principle by facilitating the sharing of excess food with those in need, transforming waste into a communal resource. This research employs a mixed-methods approach of both quantitative analysis and qualitative inquiry. Large-scale surveys will be conducted to assess user behavior, attitudes, and experiences with food-sharing apps, focusing on the frequency of use, motivations, and perceived impacts. Qualitative interviews with app users, community organizers, and stakeholders will explore the Ubuntu-inspired aspects of food-sharing apps and their influence on reducing food waste and improving food access. Quantitative data will be analyzed using statistical techniques, while qualitative data will undergo thematic analysis to identify key patterns and insights. This research addresses a critical gap in the literature by examining the role of food-sharing apps in reducing food waste and enhancing food access, particularly within the Ubuntu sharing economy framework. Findings will offer valuable insights for policymakers, technology developers, and communities seeking to leverage technology to create a more just and sustainable food system.

Keywords: sharing economy, food waste reduction, technology, community-based approach

Procedia PDF Downloads 58
931 Human Lens Metabolome: A Combined LC-MS and NMR Study

Authors: Vadim V. Yanshole, Lyudmila V. Yanshole, Alexey S. Kiryutin, Timofey D. Verkhovod, Yuri P. Tsentalovich

Abstract:

Cataract, or clouding of the eye lens, is the leading cause of vision impairment in the world. The lens tissue has a very specific structure: it does not have a vascular system, and the lens proteins – crystallins – do not turn over throughout the lifespan. The protection of lens proteins is provided by metabolites which diffuse into the lens from the aqueous humor or are synthesized in the lens epithelial layer. Therefore, the study of changes in the metabolite composition of a cataractous lens as compared to a normal lens may elucidate the possible mechanisms of cataract formation. Quantitative metabolomic profiles of normal and cataractous human lenses were obtained with the combined use of high-frequency nuclear magnetic resonance (NMR) and ion-pairing high-performance liquid chromatography with high-resolution mass-spectrometric detection (LC-MS) methods. The quantitative content of more than fifty metabolites has been determined in this work for normal aged and cataractous human lenses. The most abundant metabolites in the normal lens are myo-inositol, lactate, creatine, glutathione, glutamate, and glucose. For the majority of metabolites, their levels in the lens cortex and nucleus are similar, with a few exceptions including antioxidants and UV filters: the concentrations of glutathione, ascorbate and NAD in the lens nucleus decrease as compared to the cortex, while the levels of the secondary UV filters formed from primary UV filters in redox processes increase. This confirms that the lens core is metabolically inert, and that the metabolic activity in the lens nucleus is mostly restricted to protection from the oxidative stress caused by UV irradiation, spontaneous UV filter decomposition, or other factors. It was found that the metabolomic compositions of normal and age-matched cataractous human lenses differ significantly. The content of the most important metabolites – antioxidants, UV filters, and osmolytes – in the cataractous nucleus is at least tenfold lower than in the normal nucleus. One may suppose that the majority of these metabolites are synthesized in the lens epithelial layer, and that age-related cataractogenesis might originate from the dysfunction of the lens epithelial cells. Comprehensive quantitative metabolic profiles of the human eye lens have been acquired for the first time. The obtained data can be used for the analysis of changes in the lens chemical composition occurring with age and with cataract development.

Keywords: cataract, lens, NMR, LC-MS, metabolome

Procedia PDF Downloads 304
930 Integrated Geophysical Surveys for Sinkhole and Subsidence Vulnerability Assessment in the West Rand Area of Johannesburg

Authors: Ramoshweu Melvin Sethobya, Emmanuel Chirenje, Mihlali Hobo, Simon Sebothoma

Abstract:

The recent surge in residential infrastructure development around the metropolitan areas of South Africa has necessitated conditions for thorough geotechnical assessments to be conducted prior to site developments to ensure human and infrastructure safety. This paper appraises the success of applying multi-method geophysical techniques to the delineation of sinkhole vulnerability in a residential landscape. Electrical resistivity tomography (ERT), multichannel analysis of surface waves (MASW), vertical electrical sounding (VES), magnetic and gravity surveys were conducted to assist in mapping sinkhole vulnerability, using an existing sinkhole as a constraint at Venterspost town, west of Johannesburg. Combining the different geophysical techniques and integrating their results proved useful in delineating the lithologic succession around the sinkhole locality and in determining the geotechnical characteristics of each layer and its contribution to the development of sinkholes, subsidence and cavities in the vicinity of the site. Study results have also assisted in determining the possible depth extension of the currently existing sinkhole and the location of sites where other similar karstic features and sinkholes could form. Results of the ERT, VES and MASW surveys have uncovered dolomitic bedrock at varying depths around the sites, which exhibits high resistivity values in the range of 2500-8000 ohm.m and corresponding high velocities in the range of 1000-2400 m/s. The dolomite layer was found to be overlain by a weathered chert-poor dolomite layer, which has resistivities in the range of 250-2400 ohm.m and velocities ranging from 500-600 m/s, within which the large sinkhole has been found to collapse/cave in. A compiled 2.5D high-resolution shear wave velocity (Vs) map of the study area was created using 2D profiles of MASW data, offering insights into the prevailing lithological setup conducive to the formation of various types of karstic features around the site. 3D magnetic models of the site highlighted the regions of possible subsurface interconnections between the currently existing large sinkhole and the other subsidence feature at the site. A number of depth slices were used to detail the conditions near the sinkhole as depth increases. Gravity survey results mapped the possible formational pathways for the development of new karstic features around the site. The combination and correlation of different geophysical techniques proved useful in delineating the site's geotechnical characteristics and in mapping the possible depth extent of the currently existing sinkhole.

Keywords: resistivity, magnetics, sinkhole, gravity, karst, delineation, VES

Procedia PDF Downloads 58
929 Analytical Performance of Cobas C 8000 Analyzer Based on Sigma Metrics

Authors: Sairi Satari

Abstract:

Introduction: Six-sigma is a metric that quantifies the performance of processes as a rate of defects per million opportunities. Sigma methodology can be applied in the chemical pathology laboratory for evaluating process performance, providing evidence for process improvement in the quality assurance program. In the laboratory, these methods have been used to improve the timeliness of troubleshooting, reduce the cost and frequency of quality control and minimize pre- and post-analytical errors. Aim: The aim of this study is to evaluate the sigma values of the Cobas 8000 analyzer based on the minimum requirement of the specification. Methodology: Twenty-one analytes were chosen in this study. The analytes were alanine aminotransferase (ALT), albumin, alkaline phosphatase (ALP), amylase, aspartate transaminase (AST), total bilirubin, calcium, chloride, cholesterol, HDL-cholesterol, creatinine, creatine kinase, glucose, lactate dehydrogenase (LDH), magnesium, potassium, protein, sodium, triglyceride, uric acid and urea. Total error was obtained from the Clinical Laboratory Improvement Amendments (CLIA). The bias was calculated from the end-cycle report of the Royal College of Pathologists of Australasia (RCPA) cycle from July to December 2016, and the coefficient of variation (CV) from six months of internal quality control (IQC). The sigma was calculated based on the formula: Sigma = (Total Error - Bias) / CV. The analytical performance was evaluated based on the sigma value: sigma > 6 is world class, sigma > 5 is excellent, sigma > 4 is good, sigma between 3 and 4 is satisfactory, and sigma < 3 is poor performance. Results: Based on the calculation, we found that 76% are world class (ALT, albumin, ALP, amylase, AST, total bilirubin, cholesterol, HDL-cholesterol, creatinine, creatine kinase, glucose, LDH, magnesium, potassium, triglyceride and uric acid), 14% are excellent (calcium, protein and urea), and 10% (chloride and sodium) require more frequent IQC to be performed per day. Conclusion: Based on this study, we found that IQC should be performed more frequently for only chloride and sodium to ensure accurate and reliable analysis for patient management.
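
A minimal sketch of the sigma-metric calculation quoted in the abstract, Sigma = (Total Error - Bias) / CV, applied to a few hypothetical analytes. The allowable total error, bias and CV figures below are placeholders, not the study's CLIA/RCPA/IQC values.

```python
# Minimal sketch of the sigma-metric calculation and performance banding described above.
# All numeric inputs are hypothetical placeholders.

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma = (allowable total error - |bias|) / CV, all inputs in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def performance_band(sigma: float) -> str:
    if sigma >= 6: return "world class"
    if sigma >= 5: return "excellent"
    if sigma >= 4: return "good"
    if sigma >= 3: return "satisfactory"
    return "poor"

# Hypothetical analytes with allowable total error (TEa), bias and IQC CV
analytes = {
    "glucose":   {"tea": 10.0, "bias": 1.2, "cv": 1.3},
    "sodium":    {"tea": 2.9,  "bias": 0.8, "cv": 0.9},
    "potassium": {"tea": 5.8,  "bias": 0.5, "cv": 0.8},
}

for name, v in analytes.items():
    s = sigma_metric(v["tea"], v["bias"], v["cv"])
    print(f"{name}: sigma = {s:.1f} ({performance_band(s)})")
```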

Keywords: sigma metrics, analytical performance, total error, bias

Procedia PDF Downloads 159
928 Expanding the Atelier: Design-Led Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality

Authors: David Sinfield, Thomas Cochrane, Marcos Steagall

Abstract:

While there is much hype around the potential and development of mobile virtual reality (VR), the two key critical success factors are the ease of the user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary new technologies that failed to gain mainstream adoption or were quickly superseded. Examples include 3D television, interactive CD-ROMs, Second Life, and Google Glass. However, we argue that this is the result of curriculum design that substitutes new technologies into pre-existing pedagogical strategies focused upon teacher-delivered content, rather than exploring new pedagogical strategies that enable student-determined learning, or heutagogy. Visual communication design-based learning, such as graphic design, illustration, photography and design process, is heavily based on the traditional forms of the classroom environment, whereby student interaction takes place both at peer level and through teacher-based feedback. This makes for a healthy creative learning environment, but it does raise other issues in terms of student-to-teacher learning ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers, and thus we see a decline in the creative work of the student. Using AR and VR as a means of stimulating students to think beyond the limitations of the studio-based classroom, this paper discusses the outcomes of a student project considering the virtual classroom and the techniques involved. The Atelier learning environment is especially suited to the visual communication model as it deals with the creative processing of ideas that need to be shared in a collaborative manner. This has proven to be a successful model over the years in the traditional form of design education, but it has more recently seen a shift in thinking as we move into a more digital model of learning and away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed augmented reality and virtual reality technologies in order to expand the dimensions of the classroom beyond its physical limits. Augmented reality, when integrated into the learning experience, can improve the learning motivation and engagement of students. This paper will outline some of the processes used and the findings from the semester-long project that took place.

Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications

Procedia PDF Downloads 233
927 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves

Authors: Shengnan Chen, Shuhua Wang

Abstract:

Successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority of society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including reservoir geological data, reservoir geophysical data, well completion data and production data for thousands of wells is first established to discover the valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance the knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery and greater economic return for future wells in unconventional oil reserves.
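
As an illustration of two of the named steps (K-means clustering and principal component analysis), the sketch below runs both on a small synthetic well dataset. The column names, value ranges and cluster count are hypothetical, not taken from the study's field database.

```python
# Illustrative sketch: K-means clustering and PCA on synthetic well completion attributes.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_wells = 500
# Synthetic completion/reservoir attributes for each well (hypothetical units)
X = np.column_stack([
    rng.normal(2500, 400, n_wells),   # lateral length (m)
    rng.normal(30, 8, n_wells),       # number of frac stages
    rng.normal(1200, 250, n_wells),   # proppant per stage (t)
    rng.normal(8, 2, n_wells),        # porosity (%)
])

X_std = StandardScaler().fit_transform(X)

# PCA to emphasize variation and make the data easier to explore and visualize
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)
print("explained variance ratio:", pca.explained_variance_ratio_)

# K-means to partition wells into groups with similar completion designs
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_std)
print("wells per cluster:", np.bincount(kmeans.labels_))
```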

Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves

Procedia PDF Downloads 273
926 Determinants of Budget Performance in an Oil-Based Economy

Authors: Adeola Adenikinju, Olusanya E. Olubusoye, Lateef O. Akinpelu, Dilinna L. Nwobi

Abstract:

Since the enactment of the Fiscal Responsibility Act (2007), the Federal Government of Nigeria (FGN) has made public its fiscal budget and the subsequent implementation report. A critical review of these documents shows significant variations in the five macroeconomic variables that are inputs in each presidential budget: oil production target (mbpd), oil price ($), foreign exchange rate (N/$), gross domestic product growth rate (%) and inflation rate (%). This results in underperformance of the Federal budget's expected output in terms of non-oil and oil revenue aggregates. This paper evaluates, first, the existing variance between budgeted and actual figures; then, the relationship and causality between the determinants of the Federal fiscal budget assumptions; and finally, the determinants of the FGN's gross oil revenue. The paper employed descriptive statistics, the autoregressive distributed lag (ARDL) model, and a profit oil probabilistic model to achieve these objectives. The ARDL model permits both the static and dynamic effect(s) of the independent variable(s) on the dependent variable, unlike a static model that accounts for static or fixed effect(s) only. It offers a technique for checking the existence of a long-run relationship between variables, unlike other tests of cointegration, such as the Engle-Granger and Johansen tests, which consider only non-stationary series that are integrated of the same order. Finally, even with a small sample size, the ARDL model is known to generate valid results. The results showed that there is a long-run relationship between oil revenue, as a proxy for budget performance, and its determinants: oil price, produced oil quantity, and foreign exchange rate. There is a short-run relationship between oil revenue and its determinants: oil price, produced oil quantity, and foreign exchange rate. There is a long-run relationship between non-oil revenue and its determinants: inflation rate, GDP growth rate, and foreign exchange rate. The Granger causality test results show that there is a mono-directional causality between oil revenue and its determinants. The Federal budget assumptions only explain 68% of oil revenue and 62% of non-oil revenue. There is a mono-directional causality between non-oil revenue and its determinants. The profit oil model identifies production sharing contracts, joint ventures, and modified carrying arrangements as the greatest contributors to the FGN's gross oil revenue. This provides empirical justification for the selected macroeconomic variables used in Federal budget design and performance evaluation. The research recommends that other variables, such as debt and money supply, be included in the Federal budget design to further explain Federal budget revenue performance.
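
The abstract does not give the model specification, so the sketch below shows only the general shape of an ARDL-style regression: oil revenue regressed on its own lag and on current and lagged values of oil price, production and exchange rate, estimated by ordinary least squares. The data are simulated and the ARDL(1,1) lag order is purely illustrative.

```python
# Hedged sketch of an ARDL(1,1)-style regression on simulated data (not the FGN budget figures).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60  # e.g., quarterly observations
df = pd.DataFrame({
    "oil_price": 60 + rng.normal(0, 8, n).cumsum() * 0.1,
    "oil_qty": 1.8 + rng.normal(0, 0.05, n),
    "fx_rate": 300 + rng.normal(0, 5, n).cumsum() * 0.2,
})
df["oil_revenue"] = (0.02 * df["oil_price"] * df["oil_qty"] * df["fx_rate"]
                     + rng.normal(0, 5, n))

# ARDL(1,1) design matrix: lagged dependent variable + current and lagged regressors
lagged = pd.concat(
    {
        "rev_lag1": df["oil_revenue"].shift(1),
        "price": df["oil_price"], "price_lag1": df["oil_price"].shift(1),
        "qty": df["oil_qty"], "qty_lag1": df["oil_qty"].shift(1),
        "fx": df["fx_rate"], "fx_lag1": df["fx_rate"].shift(1),
    },
    axis=1,
).dropna()
y = df.loc[lagged.index, "oil_revenue"]

model = sm.OLS(y, sm.add_constant(lagged)).fit()
# Short-run dynamics come from the lagged terms; long-run effects can be derived from the coefficients.
print(model.summary().tables[1])
```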

Keywords: ARDL, budget performance, oil price, oil quantity, oil revenue

Procedia PDF Downloads 158
925 Unveiling Microbial Potential: Investigating Zinc-Solubilizing Fungi in Rhizospheric Soil Through Isolation, Characterization and Selection

Authors: Pukhrambam Helena Chanu, Janardan Yadav

Abstract:

This study investigates the potential of various fungal isolates to solubilize zinc and counteract rice pathogens, with the aim of mitigating zinc deficiency and disease prevalence in rice farming. Soil samples from the rhizosphere were collected, and zinc-solubilizing fungi were isolated and purified. Molecular analysis identified Talaromyces sp., Talaromyces versatilis, Talaromyces pinophilus, and Aspergillus terreus as effective zinc solubilizers. Through qualitative and quantitative assessments, it was observed that solubilization efficiencies varied among the isolates over time, with Talaromyces versatilis displaying the highest capacity for solubilization. This variability in solubilization rates may be attributed to differences in fungal metabolic activity and their ability to produce organic acids that facilitate zinc release from insoluble sources in the soil. In inhibition assays against rice pathogens, the fungal isolates exhibited antagonistic properties, with Talaromyces versatilis demonstrating the most significant inhibition rates. This antagonistic activity may be linked to the production by the fungi of secondary metabolites, such as antibiotics or lytic enzymes, which inhibit the growth of rice pathogens. The ability of Talaromyces versatilis to outperform other isolates in both zinc solubilization and pathogen inhibition highlights its potential as a multifunctional biocontrol agent in rice cultivation systems. These findings emphasize the potential of fungi as natural solutions for enhancing zinc uptake and managing diseases in rice cultivation. Utilizing indigenous zinc-solubilizing fungi offers a sustainable and environmentally friendly approach to addressing zinc deficiency in soils, reducing the need for chemical fertilizers. Moreover, harnessing the antagonistic activity of these fungi can contribute to integrated disease management strategies, minimizing reliance on synthetic pesticides and promoting ecological balance in agroecosystems. Additionally, the study included the evaluation of dipping time under different concentrations, viz., 10 ppm, 20 ppm, and 30 ppm, of biosynthesized nano ZnO on rice seedlings. This investigation aimed to optimize the application of nano ZnO for efficient zinc uptake by rice plants while minimizing potential risks associated with excessive nanoparticle exposure. Evaluating the effects of varying concentrations and dipping durations provides valuable insights into the safe and effective utilization of nano ZnO as a micronutrient supplement in rice farming practices.

Keywords: biosynthesized nano ZnO, rice, root dipping, zinc solubilizing fungi

Procedia PDF Downloads 37
924 Efficacy and Safety of Eucalyptus for Relief of Cough Symptoms: A Systematic Review and Meta-Analysis

Authors: Ladda Her, Juntip Kanjanasilp, Ratree Sawangjit, Nathorn Chaiyakunapruk

Abstract:

Cough is a common symptom of respiratory tract infections and non-infectious respiratory conditions; the duration of cough indicates the classification and severity of the disease. Herbal medicines can be used as an alternative to drugs for relief of cough symptoms in acute and chronic disease. Eucalyptus has been used for reducing cough, with evidence suggesting it has an active role in the reduction of airway inflammation. The present study aims to evaluate the efficacy and safety of eucalyptus for relief of cough symptoms in respiratory disease. Method: The Cochrane Library, MEDLINE (PubMed), Scopus, CINAHL, Springer, ScienceDirect, ProQuest, and THAILIS databases were searched from their inception until 01/02/2019 for randomized controlled trials on the efficacy and safety of eucalyptus for reducing cough. Methodological quality was evaluated using the Cochrane risk of bias tool; two reviewers in our team screened eligibility and extracted data. Result: Six studies were included in the review and five studies were included in the meta-analysis, covering 1,911 persons in one study in children and five studies in adults; participants were between 1 and 80 years old. Eucalyptus was used as a single herb (n = 2) and in combination with other herbs (n = 4). All of the studies compared eucalyptus with placebo or standard treatment for efficacy and safety; eucalyptus dosage forms in the studies included capsules, spray, and syrup. Heterogeneity was 32.44; a random-effects model was used (I² = 1.2%, χ² = 1.01; P-value = 0.314). Eucalyptus showed a statistically significant reduction in cough symptoms (n = 402, RR: 1.40, 95% CI [1.19, 1.65], P-value < 0.0001) when compared with placebo. Adverse events (AEs) were reported to be of mild to moderate intensity, with mostly gastrointestinal symptoms. The methodological quality of the included trials was overall poor. Conclusion: Eucalyptus appears to be beneficial and safe for relieving cough in respiratory diseases, particularly with respect to cough frequency, as monotherapy or in combination with other herbs. However, the evidence is inconclusive due to the limited quality of the trials, and well-designed trials evaluating the effectiveness of eucalyptus for reducing cough symptoms in humans are needed.
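
For readers unfamiliar with how a pooled risk ratio of this kind is obtained, the sketch below shows a DerSimonian-Laird random-effects pooling of risk ratios, the general approach behind such estimates. The per-study event counts are invented placeholders, not the data of the six included trials, so the output will not reproduce the reported RR.

```python
# Minimal DerSimonian-Laird random-effects pooling of risk ratios (hypothetical study counts).
import math

# (events_treatment, n_treatment, events_control, n_control) for each hypothetical study
studies = [(30, 50, 22, 50), (45, 80, 35, 78), (20, 40, 15, 42)]

logs, variances = [], []
for a, n1, c, n2 in studies:
    rr = (a / n1) / (c / n2)
    logs.append(math.log(rr))
    variances.append(1/a - 1/n1 + 1/c - 1/n2)   # variance of log RR

w_fixed = [1/v for v in variances]
mean_fixed = sum(w*y for w, y in zip(w_fixed, logs)) / sum(w_fixed)
q = sum(w * (y - mean_fixed) ** 2 for w, y in zip(w_fixed, logs))
df = len(studies) - 1
c_factor = sum(w_fixed) - sum(w*w for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c_factor)             # between-study variance

w_rand = [1/(v + tau2) for v in variances]
pooled = sum(w*y for w, y in zip(w_rand, logs)) / sum(w_rand)
se = math.sqrt(1 / sum(w_rand))
lo, hi = pooled - 1.96*se, pooled + 1.96*se
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f}), I² = {i2:.1f}%")
```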

Keywords: cough, eucalyptus, cineole, herbal medicine, systematic review, meta-analysis

Procedia PDF Downloads 143
923 Epigenetics and Archeology: A Quest to Re-Read Humanity

Authors: Salma A. Mahmoud

Abstract:

Epigenetics, or alteration in gene expression influenced by extragenetic factors, has emerged as one of the most promising areas that will address some of the gaps in our current knowledge in understanding patterns of human variation. In the last decade, research investigating epigenetic mechanisms in many fields has flourished and witnessed significant progress. It has paved the way for a new era of integrated research, especially between anthropology/archeology and the life sciences. Skeletal remains are considered the most significant source of information for studying human variation across history, and by utilizing these valuable remains, we can interpret past events, cultures and populations. In addition to their archeological, historical and anthropological importance, studying bones has great implications for other fields such as medicine and science. Bones can also hold within them the secrets of the future, as they can act as predictive tools for health, societal characteristics and dietary requirements. Bones in their basic form are composed of cells (osteocytes) that are affected by both genetic and environmental factors, which can only explain a small part of their variability. The primary objective of this project is to examine the epigenetic landscape/signature within the bones of archeological remains as a novel marker that could reveal new ways to conceptualize chronological events, gender differences, social status and ecological variations. We attempt here to address discrepancies in common variants such as the methylome as well as novel epigenetic regulators such as chromatin remodelers, which to the best of our knowledge have not yet been investigated by anthropologists/paleoepigenetists, using a plethora of techniques (biological, computational, and statistical). Moreover, extracting epigenetic information from bones will highlight the importance of osseous material as a vector to study human beings in several contexts (social, cultural and environmental), and strengthen its essential role as a model system that can be used to investigate and reconstruct various cultural, political and economic events. We also address all the steps required to plan and conduct an epigenetic analysis from bone materials (modern and ancient), as well as discussing the key challenges facing researchers aiming to investigate this field. In conclusion, this project will serve as a primer for bioarcheologists/anthropologists and human biologists interested in incorporating epigenetic data into their research programs. Understanding the roles of epigenetic mechanisms in bone structure and function will be very helpful for a better comprehension of bone biology, and it highlights the essentiality of bones as interdisciplinary vectors and a key material in archeological research.

Keywords: epigenetics, archeology, bones, chromatin, methylome

Procedia PDF Downloads 97
922 The Preventive Effect of Metformin on Paclitaxel-Induced Peripheral Neuropathy

Authors: AliAkbar Hafezi, Jamshid Abedi, Jalal Taherian, Behnam Kadkhodaei, Mahsa Elahi

Abstract:

Background: Peripheral neuropathy is a common side effect of the administration of neurotoxic chemotherapy agents. This adverse effect is a major dose-limiting factor for many commonly used chemotherapy drugs. Currently, there are no Food and Drug Administration (FDA) approved medications for the prevention or treatment of chemotherapy-induced peripheral neuropathy. Therefore, this study was performed to investigate the efficacy and safety of metformin against paclitaxel-induced peripheral neuropathy (PIPN). Methods: In this randomized clinical trial, cancer patients who were candidates for chemotherapy with paclitaxel and were referred to radiation oncology departments in Iran from 2022 to 2023 were studied. Patients were randomly divided into two groups: (1) the case group (n = 30) received metformin 500 mg orally twice a day after meals during chemotherapy with paclitaxel, and (2) the control group (n = 30) received chemotherapy without metformin or any additional medication. Patients were assessed for numbness or other neurological symptoms two weeks before chemotherapy, 1-2 days before and weekly during chemotherapy, and at the end of the study. They were also assessed by nerve conduction study (NCS) before the intervention and one week after the end of chemotherapy. The primary outcome was the efficacy in reducing PIPN, and the secondary outcome was adverse effects. Eventually, the outcomes were compared between the two groups of patients. Results: A total of 60 female cancer patients receiving chemotherapy with paclitaxel were evaluated in two groups. The groups were matched in terms of age, body mass index, fasting blood sugar, smoking, pathologic stage, and creatinine levels. The results showed that 18 patients (60.0%) in the case group and 23 patients (76.6%) in the control group had PIPN clinically (P = 0.267), and NCS showed that 11 patients (36.6%) in the case group and 15 patients (50.0%) in the control group suffered from PIPN; no significant difference was observed between the two groups (P = 0.435). Diarrhea (n = 3; 10.0%) and nausea (n = 3; 10.0%) were the most common side effects of metformin in the case group, and no serious side effects (lactic acidosis or anemia) were found in these patients. Conclusion: This study indicated that metformin did not significantly prevent PIPN in cancer patients receiving chemotherapy, although the frequency of peripheral neuropathy in the case group was lower than in the control group. The use of metformin in these patients had acceptable safety, and no serious side effects were reported.
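
The abstract does not state which test produced the quoted p-values, so the sketch below only illustrates the kind of between-group comparison involved, using the clinical PIPN counts quoted above (18/30 vs. 23/30); the resulting p-values need not match the ones reported.

```python
# Illustrative 2x2 comparison of PIPN frequency between case and control groups.
from scipy.stats import chi2_contingency, fisher_exact

table = [[18, 30 - 18],   # case group: PIPN yes / no
         [23, 30 - 23]]   # control group: PIPN yes / no

chi2, p_chi2, dof, _ = chi2_contingency(table)   # Yates-corrected by default for a 2x2 table
odds_ratio, p_fisher = fisher_exact(table)

print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}")
```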

Keywords: peripheral neuropathy, chemotherapy, paclitaxel, metformin

Procedia PDF Downloads 28
921 Trends in All-Cause Mortality and Inpatient and Outpatient Visits for Ambulatory Care Sensitive Conditions during the First Year of the COVID-19 Pandemic: A Population-Based Study

Authors: Tetyana Kendzerska, David T. Zhu, Michael Pugliese, Douglas Manuel, Mohsen Sadatsafavi, Marcus Povitz, Therese A. Stukel, Teresa To, Shawn D. Aaron, Sunita Mulpuru, Melanie Chin, Claire E. Kendall, Kednapa Thavorn, Rebecca Robillard, Andrea S. Gershon

Abstract:

The impact of the COVID-19 pandemic on the management of ambulatory care sensitive conditions (ACSCs) remains unknown. Our objective was to compare observed and expected (projected based on previous years) trends in all-cause mortality and healthcare use for ACSCs in the first year of the pandemic (March 2020 - March 2021). This was a population-based study using provincial health administrative data on the general adult population (Ontario, Canada). Monthly all-cause mortality, hospitalization, emergency department (ED) and outpatient visit rates (per 100,000 people at risk) for seven combined ACSCs (asthma, COPD, angina, congestive heart failure, hypertension, diabetes, and epilepsy) during the first year were compared with similar periods in previous years (2016-2019) by fitting monthly time series autoregressive integrated moving average (ARIMA) models. Compared to previous years, all-cause mortality rates increased at the beginning of the pandemic (observed rate in March-May 2020 of 79.98 vs. projected of 71.24 [66.35-76.50]) and then returned to expected levels in June 2020, except among immigrants and people with mental health conditions, where they remained elevated. Hospitalization and ED visit rates for ACSCs remained lower than projected throughout the first year: observed hospitalization rate of 37.29 vs. projected of 52.07 (47.84-56.68); observed ED visit rate of 92.55 vs. projected of 134.72 (124.89-145.33). ACSC outpatient visit rates decreased initially (observed rate of 4,299.57 vs. projected of 5,060.23 [4,712.64-5,433.46]) and then returned to expected levels in June 2020. Reductions in outpatient visits for ACSCs at the beginning of the pandemic, combined with reduced hospital admissions, may have been associated with temporarily increased mortality, disproportionately experienced by immigrants and those with mental health conditions. Funding: The Ottawa Hospital Academic Medical Organization.
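
The sketch below illustrates the observed-versus-projected approach described above: fit an ARIMA model to pre-pandemic monthly rates and project it forward for comparison with observed pandemic-period rates. The series is simulated and the (1, 0, 1) order is an illustrative choice, not the model order selected in the study.

```python
# Hedged sketch: ARIMA projection of a pre-pandemic monthly rate vs. observed pandemic rates.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
months = pd.date_range("2016-01-01", "2021-03-01", freq="MS")
rate = 130 + 10*np.sin(2*np.pi*np.arange(len(months))/12) + rng.normal(0, 4, len(months))
series = pd.Series(rate, index=months, name="ed_visits_per_100k")

train = series[:"2020-02"]            # pre-pandemic months used to build the projection
observed = series["2020-03":]         # pandemic-period months

model = ARIMA(train, order=(1, 0, 1)).fit()
forecast = model.get_forecast(steps=len(observed))
projected = forecast.predicted_mean
ci = forecast.conf_int(alpha=0.05)

comparison = pd.DataFrame({
    "observed": observed,
    "projected": projected,
    "lower": ci.iloc[:, 0],
    "upper": ci.iloc[:, 1],
})
print(comparison.head())
```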

Keywords: COVID-19, chronic disease, all-cause mortality, hospitalizations, emergency department visits, outpatient visits, modelling, population-based study, asthma, COPD, angina, heart failure, hypertension, diabetes, epilepsy

Procedia PDF Downloads 81
920 Assessing the Competitiveness of Green Charcoal Energy as an Alternative Source of Cooking Fuel in Uganda

Authors: Judith Awacorach, Quentin Gausset

Abstract:

Wood charcoal and firewood are the primary sources of energy for cooking in most Sub-Saharan African countries, including Uganda. This leads to unsustainable forest use and to rapid deforestation. Green charcoal (made out of agricultural residues that are carbonized, reduced to char powder, and bound into briquettes using a binder such as sugar molasses, cassava flour or clay) is a promising and sustainable alternative to wood charcoal and firewood. It is considered a renewable energy because the carbon emissions released by the combustion of green charcoal are immediately captured again in the next agricultural cycle. If practiced on a large scale, this has the potential to replace wood charcoal and stop deforestation. However, the uptake of green charcoal for cooking remains low in Uganda despite the introduction of the technology 15 years ago. The present paper reviews the barriers to the production and commercialization of green charcoal. The paper is based on the study of 13 production sites, recording the raw materials used, the production techniques, the quantity produced, the frequency of production, and the business model. Observations were made at each site, and interviews were conducted with the managers of the facilities and with one or two employees in the larger facilities. We also interviewed project administrators from four funding agencies interested in financing green charcoal production. The results of our research identify the main barriers as follows: 1) The price of green charcoal is not competitive (it is more labor- and capital-intensive than wood charcoal). 2) There is a problem with quality control and labeling (one finds a wide variety of green charcoal with very different performances). 3) The carbonization of agricultural crop residues is a major bottleneck in green charcoal production. Most briquettes are produced with wood charcoal dust or powder, which is a by-product of wood charcoal. As such, they increase the efficiency of wood charcoal but do not yet replace it. 4) There is almost no marketing chain for the product (most green charcoal is sold directly from producer to consumer without any middleman). 5) The financing institutions are reluctant to lend money for this kind of activity. 6) Storage can be challenging since briquettes can dissolve due to moisture. In conclusion, a number of important barriers need to be overcome before green charcoal can become a serious alternative to wood charcoal.

Keywords: briquettes, competitiveness, deforestation, green charcoal, renewable energy

Procedia PDF Downloads 34
919 The Diurnal and Seasonal Relationships of Pedestrian Injuries Secondary to Motor Vehicles in Young People

Authors: Amina Akhtar, Rory O'Connor

Abstract:

Introduction: There remains significant morbidity and mortality in young pedestrians hit by motor vehicles, even in the era of pedestrian crossings and speed limits. The aim of this study was to compare the incidence and injury severity of motor vehicle-related pedestrian trauma according to time of day and season in a young population, based on the supposition that injuries would be more prevalent during dusk and dawn and during autumn and winter. Methods: Data were retrieved for patients between 10 and 25 years old from the National Trauma Audit and Research Network (TARN) database who had been involved as pedestrians in motor vehicle accidents between 2015 and 2020. The incidence of injuries, their severity (using the Injury Severity Score [ISS]), hospital transfer time, and mortality were analysed according to the hours of daylight, darkness, and season. Results: The study identified a seasonal pattern, showing that autumn was the predominant season and accounted for 34.9% of injuries, with a further 25.4% in winter, in comparison to spring and summer with 21.4% and 18.3% of injuries, respectively. However, visibility alone was not a sufficient factor, as 49.5% of injuries occurred during hours of darkness, while 50.5% occurred during daylight. Importantly, the greatest injury rate (number of injuries per hour) occurred between 1500-1630, corresponding to school pick-up times. A further significant relationship between Injury Severity Score (ISS) and daylight was demonstrated (p-value = 0.0124), with moderate injuries (ISS 9-14) occurring most commonly during the day (72.7%) and more severe injuries (ISS > 15) occurring during the night (55.8%). Conclusion: We have identified a relationship between time of day and the frequency and severity of pedestrian trauma in young people. In addition, particular time groupings correspond to the greatest injury rate, suggesting that reduced visibility coupled with school pick-up times may play a significant role. This could be addressed through a targeted public health approach to implementing change. We recommend targeted public health measures to improve road safety that focus on these times and that increase the visibility of children, combined with education for drivers.

Keywords: major trauma, paediatric trauma, road traffic accidents, diurnal pattern

Procedia PDF Downloads 84
918 Analysis and Design Modeling for Next Generation Network Intrusion Detection and Prevention System

Authors: Nareshkumar Harale, B. B. Meshram

Abstract:

The continued exponential growth of successful cyber intrusions against today's businesses has made it abundantly clear that traditional perimeter security measures are no longer adequate and effective. We have evolved the network trust architecture from trust-untrust to Zero Trust. With Zero Trust, essential security capabilities are deployed in a way that provides policy enforcement and protection for all users, devices, applications, data resources, and the communications traffic between them, regardless of their location. Information exchange over the Internet, in spite of the inclusion of advanced security controls, remains prone to innovative and inventive cyberattacks. The TCP/IP protocol stack, the adopted standard for communication over networks, suffers from inherent design vulnerabilities: its communication and session management protocols, routing protocols and security protocols are the cause of many major attacks. With the explosion of cyber security threats, such as viruses, worms, rootkits, malware, and Denial of Service attacks, accomplishing efficient and effective intrusion detection and prevention has become crucial and challenging. In this paper, we propose a design and analysis model for a next generation network intrusion detection and protection system as part of a layered security strategy. The proposed system design provides intrusion detection for a wide range of attacks with a layered architecture and framework. The proposed network intrusion classification framework deals with cyberattacks on the standard TCP/IP protocol, routing protocols and security protocols. It thereby forms the basis for the detection of attack classes and applies signature-based matching for known cyberattacks and data mining based machine learning approaches for unknown cyberattacks. Our proposed implemented software can effectively detect attacks even when malicious connections are hidden within normal events. The unsupervised learning algorithm applied to network audit data trails results in unknown intrusion detection. Association rule mining algorithms generate new rules from collected audit trail data, resulting in increased intrusion prevention through integrated firewall systems. Intrusion response mechanisms can be initiated in real time, thereby minimizing the impact of network intrusions. Finally, we show how our approach can be validated and how the analysis results can be used for detecting and protecting against new network anomalies.
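
The sketch below illustrates, in a heavily simplified form, the two detection layers the abstract describes: signature matching against known attack patterns and association-rule mining over audit records for unknown ones. The signatures, event field names and thresholds are invented for illustration and are not the paper's rule base.

```python
# Simplified sketch of signature matching plus a tiny association-rule miner over audit events.
from itertools import combinations
from collections import Counter

SIGNATURES = {
    "syn_flood": {"protocol": "tcp", "flags": "S", "rate_class": "high"},
    "icmp_sweep": {"protocol": "icmp", "rate_class": "high"},
}

def match_signatures(event: dict) -> list:
    """Return the names of known-attack signatures matched by one audit event."""
    return [name for name, sig in SIGNATURES.items()
            if all(event.get(k) == v for k, v in sig.items())]

def mine_rules(events: list, min_support=0.3, min_confidence=0.8):
    """Very small association-rule miner over (field=value) items of audit events."""
    n = len(events)
    baskets = [frozenset(f"{k}={v}" for k, v in e.items()) for e in events]
    item_counts = Counter(i for b in baskets for i in b)
    pair_counts = Counter(p for b in baskets for p in combinations(sorted(b), 2))
    rules = []
    for (a, b), cnt in pair_counts.items():
        if cnt / n < min_support:
            continue
        for lhs, rhs in ((a, b), (b, a)):
            conf = cnt / item_counts[lhs]
            if conf >= min_confidence:
                rules.append((lhs, rhs, round(conf, 2)))
    return rules

events = [
    {"protocol": "tcp", "flags": "S", "rate_class": "high", "dst_port": 80},
    {"protocol": "tcp", "flags": "S", "rate_class": "high", "dst_port": 443},
    {"protocol": "udp", "flags": "-", "rate_class": "low", "dst_port": 53},
]
print([match_signatures(e) for e in events])
print(mine_rules(events))
```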

Keywords: network intrusion detection, network intrusion prevention, association rule mining, system analysis and design

Procedia PDF Downloads 219
917 Data Analysis Tool for Predicting Water Scarcity in Industry

Authors: Tassadit Issaadi Hamitouche, Nicolas Gillard, Jean Petit, Valerie Lavaste, Celine Mayousse

Abstract:

Water is a fundamental resource for industry. It is taken from the environment either from municipal distribution networks or from various natural water sources such as the sea, ocean, rivers, aquifers, etc. Once used, water is discharged into the environment or reprocessed at the plant or at treatment plants. These withdrawals and discharges have a direct impact on natural water resources. The impacts can concern the quantity of water available, the quality of the water used, or effects that are less direct and more complex to measure, such as the health of the population downstream of the watercourse. Based on the analysis of data (meteorological data, river characteristics, physicochemical substances), we wish to predict water stress episodes and anticipate prefectoral decrees, which can impact the performance of plants, and to propose improvement solutions, help industrialists in their choice of location for a new plant, visualize possible interactions between companies to optimize exchanges and encourage the pooling of water treatment solutions, and set up circular economies around the issue of water. The development of a system for the collection, processing, and use of data related to water resources requires the functional constraints specific to the latter to be made explicit. Thus, the system will have to be able to store a large amount of data from sensors (which are the main source of data in plants and their environment). In addition, manufacturers need 'near-real-time' processing of information in order to be able to make the best decisions (to be rapidly notified of an event that would have a significant impact on water resources). Finally, the visualization of data must be adapted to its temporal and geographical dimensions. In this study, we set up an infrastructure centered on the TICK application stack (Telegraf, InfluxDB, Chronograf, and Kapacitor), which is a set of loosely coupled but tightly integrated open source projects designed to manage huge amounts of time-stamped information. The software architecture is coupled with the cross-industry standard process for data mining (CRISP-DM) methodology. The robust architecture and the methodology used have demonstrated their effectiveness on the case study of learning the level of a river with a 7-day horizon. The management of water and of the activities within the plants that depend on this resource should be considerably improved thanks, on the one hand, to the learning that allows the anticipation of periods of water stress and, on the other hand, to the information system that is able to warn decision-makers with alerts created from the formalization of prefectoral decrees.
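
As an illustration of the "learning the level of a river with a 7-day horizon" use case, the sketch below trains a direct 7-day-ahead regression on lagged river levels and rainfall. The data are simulated and the feature set and model choice are illustrative assumptions, not the project's actual sensor inputs or pipeline.

```python
# Minimal sketch of a 7-day-ahead river level forecast from lagged level/rainfall features.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(7)
days = pd.date_range("2018-01-01", periods=1000, freq="D")
rain = rng.gamma(2.0, 2.0, len(days))
level = (1.0 + 0.05 * pd.Series(rain).rolling(10, min_periods=1).sum().values
         + rng.normal(0, 0.05, len(days)))
df = pd.DataFrame({"rain": rain, "level": level}, index=days)

# Lag features and a 7-day-ahead target
for lag in (1, 2, 3, 7, 14):
    df[f"level_lag{lag}"] = df["level"].shift(lag)
    df[f"rain_lag{lag}"] = df["rain"].shift(lag)
df["target"] = df["level"].shift(-7)      # river level 7 days ahead
df = df.dropna()

features = [c for c in df.columns if "lag" in c]
split = int(len(df) * 0.8)                # chronological train/test split
X_train, X_test = df[features][:split], df[features][split:]
y_train, y_test = df["target"][:split], df["target"][split:]

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("MAE (m):", round(mean_absolute_error(y_test, model.predict(X_test)), 3))
```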

Keywords: data mining, industry, machine learning, shortage, water resources

Procedia PDF Downloads 113
916 Using Chatbots to Create Situational Content for Coursework

Authors: B. Bricklin Zeff

Abstract:

This research explores the development and application of a specialized chatbot tailored for a nursing English course, with a primary objective of augmenting student engagement through situational content and responsiveness to key expressions and vocabulary. Introducing the chatbot, elucidating its purpose, and outlining its functionality are crucial initial steps in the research study, as they provide a comprehensive foundation for understanding the design and objectives of the specialized chatbot developed for the nursing English course. These elements establish the context for subsequent evaluations and analyses, enabling a nuanced exploration of the chatbot's impact on student engagement and language learning within the nursing education domain. The subsequent exploration of the intricate language model development process underscores the fusion of scientific methodologies and artistic considerations in this application of artificial intelligence (AI). Tailored for educators and curriculum developers in nursing, practical principles extending beyond AI and education are considered. Some insights into leveraging technology for enhanced language learning in specialized fields are addressed, with potential applications of similar chatbots in other professional English courses. The overarching vision is to illuminate how AI can transform language learning, rendering it more interactive and contextually relevant. The presented chatbot is a tangible example, equipping educators with a practical tool to enhance their teaching practices. Methodologies employed in this research encompass surveys and discussions to gather feedback on the chatbot's usability, effectiveness, and potential improvements. The chatbot system was integrated into a nursing English course, facilitating the collection of valuable feedback from participants. Significant findings from the study underscore the chatbot's effectiveness in encouraging more verbal practice of target expressions and vocabulary necessary for performance in role-play assessment strategies. This outcome emphasizes the practical implications of integrating AI into language education in specialized fields. This research holds significance for educators and curriculum developers in the nursing field, offering insights into integrating technology for enhanced English language learning. The study's major findings contribute valuable perspectives on the practical impact of the chatbot on student interaction and verbal practice. Ultimately, the research sheds light on the transformative potential of AI in making language learning more interactive and contextually relevant, particularly within specialized domains like nursing.

Keywords: chatbot, nursing, pragmatics, role-play, AI

Procedia PDF Downloads 47
915 AI-Enabled Smart Contracts for Reliable Traceability in the Industry 4.0

Authors: Harris Niavis, Dimitra Politaki

Abstract:

The manufacturing industry has been collecting vast amounts of data for monitoring product quality thanks to advances in the ICT sector, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. On the other hand, AI techniques are more and more used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion into the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism uniquely combines artificial intelligence models to effectively detect unusual values such as outliers and extreme deviations in the data coming from the sensors. Specifically, autoregressive integrated moving average (ARIMA), long short-term memory (LSTM) and dense autoencoder models, as well as generative adversarial network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger, while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors. The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs to expose the functionality to the end users. The results of this work demonstrate that such a system can increase the quality of the end products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify records about quality through the entire production chain and take advantage of the multitude of monitoring records in their databases.
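
To make one of the named anomaly-detection components concrete, the sketch below trains an LSTM autoencoder on simulated "normal" sensor windows and flags windows whose reconstruction error exceeds a threshold. The architecture, window size and threshold rule are illustrative assumptions, not the configuration used in the described system.

```python
# Hedged sketch of LSTM-autoencoder anomaly detection on simulated sensor windows.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

timesteps, n_features = 30, 1
rng = np.random.default_rng(0)

# Simulated "normal" sensor windows (e.g., temperature readings from the production line)
normal = rng.normal(20.0, 0.5, size=(500, timesteps, n_features))

model = keras.Sequential([
    keras.Input(shape=(timesteps, n_features)),
    layers.LSTM(32),                          # encoder
    layers.RepeatVector(timesteps),
    layers.LSTM(32, return_sequences=True),   # decoder
    layers.TimeDistributed(layers.Dense(n_features)),
])
model.compile(optimizer="adam", loss="mse")
model.fit(normal, normal, epochs=5, batch_size=32, verbose=0)

def is_anomalous(window: np.ndarray, threshold: float) -> bool:
    """Flag a window when its reconstruction error exceeds the calibrated threshold."""
    recon = model.predict(window[None, ...], verbose=0)
    return float(np.mean((window - recon[0]) ** 2)) > threshold

errors = np.mean((normal - model.predict(normal, verbose=0)) ** 2, axis=(1, 2))
threshold = float(np.percentile(errors, 99))     # e.g., 99th percentile of normal error
spike = normal[0].copy(); spike[10:15] += 5.0    # inject an out-of-range excursion
print(is_anomalous(spike, threshold))
```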

Keywords: blockchain, data quality, Industry 4.0, product quality

Procedia PDF Downloads 171
914 Long-Term Resilience Performance Assessment of Dual and Singular Water Distribution Infrastructures Using a Complex Systems Approach

Authors: Kambiz Rasoulkhani, Jeanne Cole, Sybil Sharvelle, Ali Mostafavi

Abstract:

Dual water distribution systems have been proposed as solutions to enhance the sustainability and resilience of urban water systems by improving performance and decreasing energy consumption. The objective of this study was to evaluate the long-term resilience and robustness of dual water distribution systems versus singular water distribution systems under various stressors such as demand fluctuation, aging infrastructure, and funding constraints. To this end, the long-term dynamics of these infrastructure systems were captured using a simulation model that integrates institutional agency decision-making processes with physical infrastructure degradation to evaluate the long-term transformation of water infrastructure. A set of model parameters that vary for dual and singular distribution infrastructure based on system attributes, such as pipe length and material, energy intensity, water demand, water price, average pressure and flow rate, as well as operational expenditures, was considered and input into the simulation model. Accordingly, the model was used to simulate various scenarios of demand changes, funding levels, water price growth, and renewal strategies. The long-term resilience and robustness of each distribution infrastructure were evaluated based on various performance measures including network average condition, break frequency, network leakage, and energy use. An ecologically-based resilience approach was used to examine regime shifts and tipping points in the long-term performance of the systems under different stressors. Also, Classification and Regression Tree (CART) analysis was adopted to assess the robustness of each system under various scenarios. Using data from the City of Fort Collins, the long-term resilience and robustness of the dual and singular water distribution systems were evaluated over a 100-year analysis horizon for various scenarios. The results of the analysis enabled: (i) comparison between dual and singular water distribution systems in terms of long-term performance, resilience, and robustness; (ii) identification of renewal strategies and decision factors that enhance the long-term resilience and robustness of dual and singular water distribution systems under different stressors.
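
The sketch below illustrates the CART-based robustness screening mentioned above: a decision tree classifying simulated long-term scenarios as robust or not from their input parameters. The scenario variables, thresholds and robustness criterion are hypothetical stand-ins, not the study's model outputs.

```python
# Illustrative CART (decision tree) robustness screening on simulated scenario data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)
n = 1000
demand_growth = rng.uniform(-0.5, 1.5, n)     # % per year (hypothetical)
funding_level = rng.uniform(0.5, 1.5, n)      # fraction of baseline renewal budget
price_growth = rng.uniform(0.0, 5.0, n)       # % per year

# Hypothetical robustness criterion: end-of-horizon break frequency below a target
break_freq = 0.2 + 0.15*demand_growth - 0.25*funding_level + rng.normal(0, 0.05, n)
robust = (break_freq < 0.15).astype(int)

X = np.column_stack([demand_growth, funding_level, price_growth])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, robust)
print(export_text(tree, feature_names=["demand_growth", "funding_level", "price_growth"]))
```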

Keywords: complex systems, dual water distribution systems, long-term resilience performance, multi-agent modeling, sustainable and resilient water systems

Procedia PDF Downloads 278
913 Extracting Opinions from Big Data of Indonesian Customer Reviews Using Hadoop MapReduce

Authors: Veronica S. Moertini, Vinsensius Kevin, Gede Karya

Abstract:

Customer reviews have been collected by many kinds of e-commerce websites selling products, services, hotel rooms, tickets and so on. Each website collects its own customer reviews. The reviews can be crawled, collected from those websites and stored as big data. Text analysis techniques can be used to analyze that data to produce summarized information, such as customer opinions. These opinions can then be published by independent service provider websites and used to help customers in choosing the most suitable products or services. As the opinions are analyzed from big data of reviews originating from many websites, it is expected that the results are more trusted and accurate. Indonesian customers write reviews in the Indonesian language, which comes with its own structures and uniqueness. We found that most of the reviews are expressed in "daily language", which is informal, does not follow correct grammar, and has many abbreviations, slang or non-formal words. Hadoop is an emerging platform aimed at storing and analyzing big data in distributed systems. A Hadoop cluster consists of master and slave nodes/computers operated in a network. Hadoop comes with a distributed file system (HDFS) and the MapReduce framework for supporting parallel computation. However, MapReduce has a weakness (i.e., it is inefficient) for iterative computations; specifically, the cost of reading/writing data (I/O cost) is high. Given this fact, we conclude that MapReduce is best adapted for "one-pass" computation. In this research, we develop an efficient technique for extracting or mining opinions from big data of Indonesian reviews, which is based on MapReduce with one-pass computation. In designing the algorithm, we avoid iterative computation and instead adopt a "look-up table" technique. The stages of the proposed technique are: (1) crawling the review data from websites; (2) cleaning and finding root words from the raw reviews; (3) computing the frequency of the meaningful opinion words; (4) analyzing customers' sentiments towards defined objects. The experiments for evaluating the performance of the technique were conducted on a Hadoop cluster with 14 slave nodes. The results show that the proposed technique (stages 2 to 4) discovers useful opinions and is capable of processing big data efficiently and scalably.
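
The sketch below is a single-machine simplification of stages (2)-(3): cleaning review text, mapping words to root forms via a look-up table (avoiding iterative stemming), and counting opinion-word frequencies with a map/reduce pattern. The root-word table and opinion lexicon are tiny invented stand-ins for the real Indonesian resources, and no Hadoop cluster is involved.

```python
# Simplified, single-machine map/reduce-style opinion-word counting with a look-up table.
import re
from collections import Counter
from functools import reduce

ROOT_LOOKUP = {"bagus": "bagus", "bgs": "bagus", "murahnya": "murah", "murah": "murah",
               "lambat": "lambat", "lelet": "lambat"}          # slang/inflected form -> root
OPINION_WORDS = {"bagus": "+", "murah": "+", "lambat": "-"}    # tiny stand-in opinion lexicon

def map_review(review: str) -> Counter:
    """Map phase: emit counts of root opinion words found in one review."""
    tokens = re.findall(r"[a-z]+", review.lower())
    roots = (ROOT_LOOKUP.get(t, t) for t in tokens)
    return Counter(r for r in roots if r in OPINION_WORDS)

def reduce_counts(a: Counter, b: Counter) -> Counter:
    """Reduce phase: merge partial counts from different reviews."""
    return a + b

reviews = [
    "Barangnya bgs banget, pengiriman lambat",
    "Murahnya kebangetan, bagus!",
    "Kurirnya lelet",
]
totals = reduce(reduce_counts, (map_review(r) for r in reviews), Counter())
for word, count in totals.items():
    print(f"{word} ({OPINION_WORDS[word]}): {count}")
```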

Keywords: big data analysis, Hadoop MapReduce, analyzing text data, mining Indonesian reviews

Procedia PDF Downloads 195
912 Learners' Perception of Digitalization of Medical Education in a Low Middle-Income Country – A Case Study of the Lecturio Platform

Authors: Naomi Nathan

Abstract:

Introduction: Digitalization of medical education can revolutionize how medical students learn and interact with the medical curriculum across contexts. With the increasing availability of the internet and mobile connectivity in LMICs, online medical education platforms and digital learning tools are becoming more widely available, providing new opportunities for learners to access high-quality medical education and training. However, the adoption and integration of digital technologies in medical education in LMICs is a complex process influenced by various factors, including learners' perceptions of and attitudes toward digital learning. In Ethiopia, the adoption of digital platforms for medical education has been slow, with traditional face-to-face teaching methods still the norm. However, as access to technology improves and more universities adopt digital platforms, it is crucial to understand how medical students perceive this shift. Methodology: This study investigated medical students' perception of the digitalization of medical education in relation to their access to the Lecturio Digital Medical Education Platform through a capacity-building project. 740 medical students from over 20 medical universities participated in the study. The students were surveyed using a questionnaire that covered their attitudes toward the digitalization of medical education, their frequency of use of the digital platform, and their perceived benefits and challenges. Results: The study results showed that most medical students had a positive attitude toward digitalizing medical education. The most commonly cited benefit was the convenience and flexibility of accessing course material and the curriculum online. Many students also reported that they found the platform more interactive and engaging, leading to a more meaningful learning experience. The study also identified several challenges medical students faced when using the platform. The most commonly reported challenge was unreliable internet access, which made it difficult for students to access content consistently. Overall, the results of this study suggest that medical students in Ethiopia have a positive perception of the digitalization of medical education. Over 97% of students expressed a continuing need for access to the Lecturio platform throughout their studies. Conclusion: Significant challenges still need to be addressed to fully realize the Lecturio digital platform's benefits. Universities, relevant ministries, and various stakeholders must work together to address these challenges to ensure that medical students fully participate in and benefit from digitalized medical education, sustainably and effectively.

Keywords: digital medical education, EdTech, LMICs, e-learning

Procedia PDF Downloads 80
911 Differences in Assessing Hand-Written and Typed Student Exams: A Corpus-Linguistic Study

Authors: Jutta Ransmayr

Abstract:

The digital age has long since arrived at Austrian schools, and both society and educationalists demand that digital means be integrated accordingly into day-to-day school routines. Therefore, the Austrian school-leaving exam (A-levels) can now be written either by hand or by using a computer. However, the choice of writing medium (pen and paper or computer) for written examination papers, which are considered 'high-stakes' exams, raises a number of questions that have not yet been adequately investigated and answered, such as: What effects do the different conditions of text production in the written German A-levels have on normative linguistic accuracy? How do the spelling skills in German A-level papers written with a pen differ from those in papers the students wrote on the computer? How is the teachers' assessment related to this? And which practical desiderata for German didactics can be derived from this? These questions were investigated in a trilateral pilot project of the Austrian Center for Digital Humanities (ACDH) of the Austrian Academy of Sciences and the University of Vienna in cooperation with the Austrian Ministry of Education and the Council for German Orthography. A representative Austrian learner corpus, consisting of around 530 German A-level papers from all over Austria (pen- and computer-written), was set up and subjected to a quantitative (corpus-linguistic and statistical) and qualitative investigation of the high school graduates' spelling and punctuation performance and of the differences between pen- and computer-written papers and their assessments. Relevant studies are currently available mainly from the Anglophone world. These have shown that writing on the computer increases the motivation to write and has positive effects on text length and, in some cases, also on text quality. Depending on the writing situation and other technical aids, better spelling and punctuation results could also be found in the computer-written texts as compared to the handwritten ones. Studies also point towards a tendency among teachers to rate handwritten texts better than computer-written texts. In this paper, the first comparable results from the German-speaking area are presented. The results show that, on the one hand, there are significant differences between handwritten and computer-written work with regard to performance in orthography and punctuation. On the other hand, the corpus-linguistic investigation and the subsequent statistical analysis made it clear that not only do the teachers' assessments of the students' spelling performance vary enormously, but so do the overall assessments of the exam papers; the production medium (pen and paper or computer) also seems to play a decisive role.
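
A minimal sketch of the kind of statistical comparison involved (not the project's actual analysis pipeline) is given below: an independent-samples t-test on error rates per 1,000 tokens for pen- versus computer-written papers, using invented placeholder data.

```python
# Minimal sketch (not the study's analysis): Welch's t-test comparing error
# rates between pen-written and computer-written papers. Placeholder data only.
from scipy import stats

pen_errors = [4.1, 5.3, 3.8, 6.0, 4.7, 5.5]       # errors per 1,000 tokens (invented)
computer_errors = [3.2, 2.9, 4.0, 3.5, 2.7, 3.8]  # errors per 1,000 tokens (invented)

t, p = stats.ttest_ind(pen_errors, computer_errors, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")
```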

Keywords: exam paper assessment, pen and paper or computer, learner corpora, linguistics

Procedia PDF Downloads 157
910 Thermal Comfort and Outdoor Urban Spaces in the Hot Dry City of Damascus, Syria

Authors: Lujain Khraiba

Abstract:

Recently, there has been broad recognition that micro-climate conditions contribute to the quality of life in outdoor urban spaces, from both economic and social viewpoints. The consideration of urban micro-climate and outdoor thermal comfort in urban design and planning processes has become one of the important aspects of current related studies. However, these aspects have so far not been considered in urban planning regulations in practice, and the regulations are often poorly adapted to the local climate and culture. Therefore, there is a huge need to adapt the existing planning regulations to the local climate, especially in cities with extremely hot weather conditions. The overall aim of this study is to point out the complexity of the relationship between urban planning regulations, urban design, micro-climate, and outdoor thermal comfort in the hot, dry city of Damascus, Syria. The main aim is to investigate the temporal and spatial effects of micro-climate on urban surface temperatures and outdoor thermal comfort in different urban design patterns, as shaped by urban planning regulations, during extreme summer conditions. In addition, different alternatives for mitigating surface temperature and thermal stress are studied. The novelty of this study is to highlight the combined effect of urban surface materials and vegetation on improving the thermal environment. The study is based on micro-climate simulations using ENVI-met 3.1. The input data are calibrated against micro-climate fieldwork conducted in different urban zones of Damascus. Different urban forms and geometries, including the old and the modern parts of Damascus, are thermally evaluated. The Physiological Equivalent Temperature (PET) index is used as an indicator for outdoor thermal comfort analysis. The study highlights the shortcomings of existing planning regulations in terms of solar protection, especially at street level. The results show that the surface temperatures in Old Damascus are lower than in the modern part. This is basically due to the difference in urban geometries, which prevents solar radiation in Old Damascus from reaching the ground and heating up the surface, whereas in modern Damascus the streets are prescribed as wide spaces with high Sky View Factor values (SVF of about 0.7). Moreover, the canyons in the old part are paved with cobblestones, whereas asphalt is the main material used in the streets of modern Damascus. Furthermore, Old Damascus is less thermally stressful than the modern part (the difference in the PET index is about 10 °C). The thermal situation is enhanced when different vegetation options are considered (an improvement of 13 °C in surface temperature is recorded in modern Damascus). The study recommends that a detailed street-level landscape code be integrated into the urban regulations of Damascus in order to achieve better urban development in harmony with micro-climate and comfort. Such a strategy would be very useful for decreasing urban warming in the city.
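
As a side illustration of the Sky View Factor values mentioned above (and independent of the ENVI-met workflow used in the study), the following sketch estimates the mid-floor SVF of an idealised, symmetric street canyon from its height-to-width ratio using the common approximation SVF = cos(arctan(2H/W)); the street dimensions are hypothetical examples, not survey data.

```python
# Illustrative sketch: mid-floor Sky View Factor of an idealised, infinitely
# long, symmetric street canyon, SVF = cos(arctan(2H/W)). Dimensions invented.
import math

def canyon_svf(height_m, width_m):
    """Mid-floor SVF of an infinitely long, symmetric street canyon."""
    return math.cos(math.atan(2.0 * height_m / width_m))

narrow_old_street = canyon_svf(height_m=9.0, width_m=4.0)     # deep, shaded canyon
wide_modern_street = canyon_svf(height_m=15.0, width_m=30.0)  # open, exposed street
print(f"old-town canyon SVF ~ {narrow_old_street:.2f}")   # ~0.22
print(f"modern street SVF   ~ {wide_modern_street:.2f}")  # ~0.71
```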

Keywords: micro-climate, outdoor thermal comfort, urban planning regulations, urban spaces

Procedia PDF Downloads 472
909 Advances in Health Risk Assessment of Mycotoxins in Africa

Authors: Wilfred A. Abiaa, Chibundu N. Ezekiel, Benedikt Warth, Michael Sulyok, Paul C. Turner, Rudolf Krska, Paul F. Moundipa

Abstract:

Mycotoxins are a wide range of toxic secondary metabolites of fungi that contaminate various food commodities worldwide, especially in sub-Saharan Africa (SSA). Such contamination seriously compromises food safety and quality, posing a serious problem for human health as well as for trade and the economy. Their concentrations depend on various factors, such as the commodity itself, climatic conditions, storage conditions, seasonal variations, and processing methods. When humans consume foods contaminated by mycotoxins, the toxins exert toxic effects on their health through various modes of action. Rural populations in sub-Saharan Africa are exposed to dietary mycotoxins, but exposure levels and the associated health risks are thought to vary between SSA countries. Dietary exposure and health risk assessment studies have been limited by a lack of equipment for properly assessing the health implications for consumer populations who eat contaminated agricultural products. As such, mycotoxin research is still in its infancy in several SSA nations, and the evaluation of products for mycotoxin loads below or above legislative limits is inadequate. Few nations have health risk assessment reports, and these are mainly based on direct quantification of the toxins in foods ('external exposure') and on linking food levels with data from food frequency questionnaires. Nonetheless, the assessment of exposure and health risk related to mycotoxins requires more than the traditional approaches. Only a fraction of the mycotoxins in contaminated foods reaches the bloodstream and exerts toxicity ('internal exposure'). Also, internal exposure is usually smaller than external exposure; thus, dependence on external exposure alone may introduce confounders into risk assessment. Earlier studies from SSA focused on biomarker analysis, mainly of aflatoxins, while a few recent studies have concentrated on multi-biomarker analysis of exposures in urine, providing probable associations between observed disease occurrences and dietary mycotoxin levels. As a result, new techniques that can assess exposure levels directly in body tissue or fluid, and possibly link them to the disease state of individuals, have become urgently needed.
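
For context on the 'external exposure' approach, the sketch below shows the conventional probable daily intake (PDI) calculation (contamination level multiplied by daily food intake, divided by body weight) compared against a tolerable daily intake; all figures are hypothetical placeholders, not values from the studies discussed.

```python
# Illustrative sketch (not from the paper): conventional "external exposure"
# estimate, PDI = contamination level x daily food intake / body weight,
# compared against a tolerable daily intake (TDI). All numbers are hypothetical.
def probable_daily_intake(toxin_ug_per_kg_food, food_intake_g_per_day, body_weight_kg):
    """Return exposure in micrograms of toxin per kg body weight per day."""
    food_intake_kg = food_intake_g_per_day / 1000.0
    return toxin_ug_per_kg_food * food_intake_kg / body_weight_kg

pdi = probable_daily_intake(toxin_ug_per_kg_food=5.0,   # hypothetical maize contamination
                            food_intake_g_per_day=400.0,
                            body_weight_kg=60.0)
tdi = 2.0   # hypothetical tolerable daily intake, ug/kg bw/day
print(f"PDI = {pdi:.3f} ug/kg bw/day, hazard quotient = {pdi / tdi:.2f}")
```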

Keywords: mycotoxins, biomarkers, exposure assessment, health risk assessment, sub-Saharan Africa

Procedia PDF Downloads 561
908 Foot Self-Monitoring Knowledge, Attitude, Practice, and Related Factors among Diabetic Patients: A Descriptive and Correlational Study in a Taiwan Teaching Hospital

Authors: Li-Ching Lin, Yu-Tzu Dai

Abstract:

Recurrent foot ulcers and foot amputation have a major impact on patients with diabetes mellitus (DM), medical professionals, and society. A critical procedure for foot care is foot self-monitoring. Medical professionals' understanding of patients' foot self-monitoring knowledge, attitude, and practice is beneficial for raising patients' disease awareness. This study investigated these and related factors among patients with DM through a descriptive, correlational study. A scale for measuring the foot self-monitoring knowledge, attitude, and practice of patients with DM was used. Purposive sampling was adopted, and 100 samples were collected from the respondents' self-reports or from interviews. The statistical methods employed were an independent-sample t-test, one-way analysis of variance, the Pearson correlation coefficient, and multivariate regression analysis. The findings were as follows: the respondents scored an average of 12.97 on foot self-monitoring knowledge, and the correct answer rate was 68.26%. The respondents scored relatively lower on foot health screening and recording and on awareness of neuropathy in the foot. The respondents held a positive attitude toward self-monitoring their feet and a negative attitude toward having others check the soles of their feet. The respondents scored an average of 12.64 on foot self-monitoring practice. Their scores were lower for the frequency of self-monitoring their feet, recording their self-monitoring results, checking their pedal pulse, and examining whether their soles were red immediately after taking off their shoes. Significant positive correlations were observed among foot self-monitoring knowledge, attitude, and practice. The correlation coefficient between self-monitoring knowledge and self-monitoring practice was 0.20, and that between self-monitoring attitude and self-monitoring practice was 0.44. Stepwise regression analysis revealed that the main predictive factors of foot self-monitoring practice in patients with DM were foot self-monitoring attitude, prior experience of foot care, and an educational attainment of college or higher. These factors predicted 33% of the variance. This study concludes that patients with DM lacked foot self-monitoring practice and advises that patients' self-monitoring abilities be evaluated first, including whether they have poor eyesight, have difficulty bending forward due to obesity, and have someone who can assist them in self-monitoring. In addition, patient education should emphasize self-monitoring knowledge and practice, such as perceptions regarding the symptoms of foot neurovascular lesions, pulse monitoring methods, and new foot self-monitoring equipment. By doing so, new or recurring ulcers may be discovered in their early stages.
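
A minimal sketch of correlation and regression analyses of this kind (not the study's actual data or code) is shown below, using invented placeholder scores.

```python
# Minimal sketch (not the study's analysis): Pearson correlations between
# knowledge, attitude and practice scores, plus a simple linear regression of
# practice on attitude. The score arrays are invented placeholder data.
import numpy as np
from scipy import stats

knowledge = np.array([13, 15, 10, 12, 16, 11, 14, 9, 13, 12])
attitude = np.array([30, 34, 26, 29, 36, 27, 33, 25, 31, 28])
practice = np.array([12, 15, 10, 12, 17, 11, 14, 9, 13, 12])

r_kp, p_kp = stats.pearsonr(knowledge, practice)
r_ap, p_ap = stats.pearsonr(attitude, practice)
slope, intercept, r, p, se = stats.linregress(attitude, practice)

print(f"knowledge-practice r = {r_kp:.2f} (p = {p_kp:.3f})")
print(f"attitude-practice  r = {r_ap:.2f} (p = {p_ap:.3f})")
print(f"practice ~ {slope:.2f} * attitude + {intercept:.2f}, R^2 = {r**2:.2f}")
```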

Keywords: diabetic foot, foot self-monitoring attitude, foot self-monitoring knowledge, foot self-monitoring practice

Procedia PDF Downloads 183
907 Numerical Investigation of the Effects of Surfactant Concentrations on the Dynamics of Liquid-Liquid Interfaces

Authors: Bamikole J. Adeyemi, Prashant Jadhawar, Lateef Akanji

Abstract:

Theoretically, two mathematical interfaces (fluid-solid and fluid-fluid) exist when a liquid film is present on a solid surface. These interfaces overlap if the mineral surface is oil-wet or mixed-wet, and therefore the effects of disjoining pressure are significant on both boundaries. Hence, dewetting is a necessary process that could detach oil from the mineral surface. However, if the thin water film directly in contact with the surface is thick enough, the disjoining pressure can be taken as zero at the liquid-liquid interface. Recent studies show that the integration of fluid-fluid interactions with fluid-rock interactions is an important step towards a holistic approach to understanding smart water effects. Experiments have shown that the brine solution can alter the micro forces at oil-water interfaces, and these ion-specific interactions lead to oil emulsion formation. The natural emulsifiers present in crude oil behave as polyelectrolytes when the oil interfaces with low-salinity water. Wettability alteration caused by low-salinity waterflooding during the Enhanced Oil Recovery (EOR) process results from the activities of divalent ions. However, polyelectrolytes are said to lose their viscoelastic properties with increasing cation concentrations. In this work, the influence of cation concentrations on the dynamics of viscoelastic liquid-liquid interfaces is numerically investigated. The resultant ion concentrations at the crude oil/brine interfaces were estimated using a surface complexation model. Subsequently, the ion concentration parameter was integrated into a mathematical model to describe its effects on the dynamics of a viscoelastic interfacial thin film. Film growth, stability, and rupture were measured after different time steps for three types of fluids (Newtonian, purely elastic, and viscoelastic). The interfacial films respond to exposure time in a similar manner, with an increasing growth rate that results in the formation of more droplets over time. Increased surfactant accumulation at the interface results in a higher film growth rate, which leads to instability and the subsequent formation of more satellite droplets. Purely elastic and viscoelastic properties limit the film growth rate, and the consequent film instability, compared to the Newtonian fluid. Therefore, low salinity and a reduced concentration of the potential determining ions in injection water will lead to improved interfacial viscoelasticity.

Keywords: liquid-liquid interfaces, surfactant concentrations, potential determining ions, residual oil mobilization

Procedia PDF Downloads 134
906 Designing a Socio-Technical System for Groundwater Resources Management, Applying Smart Energy and Water Meter

Authors: S. Mahdi Sadatmansouri, Maryam Khalili

Abstract:

The world nowadays encounters a serious water scarcity problem. During the past few years, with the advent of the Smart Energy and Water Meter (SEWM) and its installation at the electro-pumps of water wells, it was believed that this could be the golden key to addressing the over-pumping of groundwater resources. In fact, the implementation of these smart meters managed to control the water table drawdown for a short time, but it was not a sustainable approach. The SEWM was initially considered a law enforcement facility; however, solving a complex socioeconomic problem like shared groundwater resources management requires more than just enforcement: participation in conserving common resources. The well owners or farmers, as water consumers, are the main and direct stakeholders of this system; other stakeholders could be government sectors, investors, technology providers, private sectors, or ordinary people. Designing a socio-technical system not only defines the role of each stakeholder but can also facilitate the communication needed to reach the system goals while the benefits of each stakeholder are considered and provided. Farmers, as the key participants in solving the groundwater problem, do not trust governments, but they would trust a fair system in which responsibilities, privileges, and benefits are clear. Technology could help this system remain impartial and productive. Social aspects provide rules, regulations, social objects, and so on for the system and help it to be more human-centered. As the design methodology, Design Thinking provides probable solutions for the challenging problems and ongoing conflicts; it could illuminate the way in which the final system is designed. Using the Human-Centered Design approach of IDEO helps to keep farmers at the center of the solution and provides a vision by which stakeholders' requirements and needs are addressed effectively. Farmers are expected to trust the system and participate in managing their groundwater resources if they find the rules and tools of the system fair and effective. Besides, implementation of the socio-technical system could change farmers' behavior so that they care more about their valuable shared water resources as well as their farm profit. This socio-technical system contains nine main subsystems: 1) Measurement and Monitoring system, 2) Legislation and Governmental system, 3) Information Sharing system, 4) Knowledge-based NGOs, 5) Integrated Farm Management system (using IoT), 6) Water Market and Water Banking system, 7) Gamification, 8) Agribusiness ecosystem, 9) Investment system.
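
As a highly simplified illustration of how the Measurement and Monitoring and Information Sharing subsystems might interact, the sketch below turns hypothetical SEWM readings into a transparent per-well report against an agreed quota; the well IDs, volumes, and quotas are invented, and the real system would involve far richer rules and stakeholders.

```python
# Highly simplified sketch (not the proposed system): smart-meter readings flow
# from a measurement subsystem to an information-sharing layer that reports each
# well's abstraction against an agreed quota. All values are hypothetical.
from dataclasses import dataclass

@dataclass
class MeterReading:
    well_id: str
    pumped_m3: float        # cumulative volume reported by the SEWM

def abstraction_report(readings, quotas_m3):
    """Share each well's usage relative to the agreed quota (transparency rule)."""
    report = {}
    for r in readings:
        quota = quotas_m3[r.well_id]
        report[r.well_id] = {
            "used_m3": r.pumped_m3,
            "quota_m3": quota,
            "share_used": round(r.pumped_m3 / quota, 2),
            "over_quota": r.pumped_m3 > quota,
        }
    return report

readings = [MeterReading("well-01", 820.0), MeterReading("well-02", 1150.0)]
quotas = {"well-01": 1000.0, "well-02": 1000.0}
print(abstraction_report(readings, quotas))
```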

Keywords: human centered design, participatory management, smart energy and water meter (SEWM), social object, socio-technical system, water table drawdown

Procedia PDF Downloads 287
905 Sustainability from Ecocity to Ecocampus: An Exploratory Study on Spanish Universities' Water Management

Authors: Leyla A. Sandoval Hamón, Fernando Casani

Abstract:

Sustainability has been integrated into the cities' agenda due to the impact that cities generate. The dimensions of sustainability most commonly taken as a reference are the economic, social, and environmental ones. Thus, management decisions in sustainable cities seek a balance between these dimensions in order to provide environmentally friendly alternatives. In this context, urban models that manage water consumption, energy consumption, and waste production, among other aspects, in harmony with the environment are known as the Ecocity. A similar model, but on a smaller scale, is the 'Ecocampus', developed in universities (considered 'small cities' due to their complex structure). Thus, sustainable practices are being implemented in the management of university campus activities, following different relevant lines of work. The universities have a strategic role in society, and their activities can strengthen policies, strategies, and measures of sustainability, both internal and external to the organization. Because of their mission in knowledge creation and transfer, these institutions can promote and disseminate more advanced activities in sustainability. Replicating this model also implies challenges in the sustainable management of water, energy, waste, and transportation, among others, inside the campus. The challenge this paper focuses on is water management, taking into account that universities consume large amounts of this resource. The purpose of this paper is to analyze the sustainability experience, with emphasis on water management, of two different campuses belonging to two different Spanish universities: one urban campus in a historic city and one suburban campus on the outskirts of a large city. Both universities are in the top hundred of international rankings of sustainable universities. The methodology adopts a qualitative approach based on in-depth interviews and focus-group discussions with administrative and academic staff of the 'Ecocampus' offices, the organizational units for sustainability management, at the two Spanish universities. The hypotheses indicate that sustainable water management policies work best on campuses without large green spaces and where the buildings are built or rebuilt in a modern style. The sustainability efforts of the university are independent of the kind of campus (urban or suburban), but an important aspect to improve is the university community's degree of awareness of water scarcity. In general, the paper suggests that higher education institutions adapt their sustainability policies depending on the location and features of the campus and on their engagement with water conservation. Many Spanish universities have proposed sustainability policies, good practices, and measures; in fact, some Ecocampus offices or centers have been founded. The originality of this study lies in learning from the different experiences universities have had with sustainability policies.

Keywords: ecocampus, ecocity, sustainability, water management

Procedia PDF Downloads 210
904 Multi-Criteria Evolutionary Algorithm to Develop Efficient Schedules for Complex Maintenance Problems

Authors: Sven Tackenberg, Sönke Duckwitz, Andreas Petz, Christopher M. Schlick

Abstract:

This paper introduces an extension to the well-established Resource-Constrained Project Scheduling Problem (RCPSP) to apply it to complex maintenance problems. The problem is to assign technicians to a team which has to process several tasks with multi-level skill requirements during a work shift. Here, several alternative activities for a task allow both the temporal shifting of activities and the reallocation of technicians and tools. As a result, switches from one valid work process variant to another can be considered and may be selected by the developed evolutionary algorithm based on the present skill level of the technicians or the available tools. An additional complication of the observed scheduling problem is that the locations of the construction sites are only temporarily accessible during a day. Due to intensive rail traffic, the available time slots for maintenance and repair works are extremely short and are often distributed throughout the day. To identify efficient working periods, a first concept of a Bayesian network is introduced and integrated into the extended RCPSP with pre-emptive and non-pre-emptive tasks. Thereby, the Bayesian network is used to calculate the probability that a maintenance task can be processed during a specific period of the shift. Focusing on the domain of maintenance of railway infrastructure in metropolitan areas, as the most unproductive implementation process at the construction site, the paper illustrates how the extended RCPSP can be applied to support maintenance planning. A multi-criteria evolutionary algorithm with a problem representation is introduced which is capable of revising technician-task allocations, while task durations may be stochastic. The approach uses a novel activity-list representation to ensure easily describable and modifiable elements which can be converted into detailed shift schedules. The main objective is to develop a shift plan which maximizes the utilization of each technician by minimizing the waiting times caused by rail traffic. The results of the already implemented core algorithm illustrate a fast convergence towards an optimal team composition for a shift, an efficient sequence of tasks, and a high probability of subsequent implementation given the stochastic durations of the tasks. In the paper, the algorithm for the extended RCPSP is analyzed in an experimental evaluation using real-world example problems of various sizes, resource complexities, tightness, and so forth.
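
To make the activity-list idea concrete, the following sketch (a toy far simpler than the paper's multi-criteria algorithm) runs a small genetic algorithm on a precedence-feasible activity list with a serial schedule generation scheme and a single technician pool; multi-level skills, Bayesian time windows, and stochastic durations are not modelled, and all task data are hypothetical.

```python
# Toy sketch only: genetic algorithm on an activity-list representation for a
# small resource-constrained scheduling problem with one technician pool.
# Task data and parameters are hypothetical.
import random

TASKS = {  # task: (duration, technicians needed, set of predecessors)
    "A": (3, 2, set()), "B": (2, 1, {"A"}), "C": (4, 2, {"A"}),
    "D": (2, 1, {"B"}), "E": (3, 2, {"B", "C"}), "F": (1, 1, {"D", "E"}),
}
CAPACITY = 3   # technicians available in the shift

def random_activity_list():
    """Random precedence-feasible permutation of the tasks."""
    remaining, done, order = dict(TASKS), set(), []
    while remaining:
        eligible = [t for t, (_, _, preds) in remaining.items() if preds <= done]
        t = random.choice(eligible)
        order.append(t); done.add(t); remaining.pop(t)
    return order

def makespan(order):
    """Serial schedule generation scheme: earliest feasible start in list order."""
    finish, usage = {}, {}            # usage[t] = technicians busy at time slot t
    for task in order:
        dur, need, preds = TASKS[task]
        start = max([finish[p] for p in preds], default=0)
        while any(usage.get(t, 0) + need > CAPACITY for t in range(start, start + dur)):
            start += 1
        for t in range(start, start + dur):
            usage[t] = usage.get(t, 0) + need
        finish[task] = start + dur
    return max(finish.values())

def crossover(mother, father, q=None):
    """One-point activity-list crossover; preserves precedence feasibility."""
    q = q or random.randint(1, len(mother) - 1)
    head = mother[:q]
    return head + [t for t in father if t not in head]

def evolve(pop_size=20, generations=40):
    population = [random_activity_list() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=makespan)
        parents = population[: pop_size // 2]
        children = [crossover(random.choice(parents), random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    best = min(population, key=makespan)
    return best, makespan(best)

print(evolve())   # e.g. (['A', 'C', 'B', 'E', 'D', 'F'], 11)
```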

Keywords: maintenance management, scheduling, resource constrained project scheduling problem, genetic algorithms

Procedia PDF Downloads 220