Search results for: Signal Processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5006

56 The Stability of Vegetable-Based Synbiotic Drink during Storage

Authors: Camelia Vizireanu, Daniela Istrati, Alina Georgiana Profir, Rodica Mihaela Dinica

Abstract:

Globally, there is great interest in promoting the consumption of fruit and vegetables to improve health. Due to their content of essential compounds such as antioxidants, important amounts of fruits and vegetables should be included in the daily diet. Juices are good sources of vitamins and can also help increase overall fruit and vegetable consumption. Starting from this trend (the introduction of vegetables and fruits into the daily diet) as well as the desire to diversify the range of functional products for both adults and children, a fermented juice based on root vegetables was made using probiotic microorganisms, with potential beneficial effects in the diet of children, vegetarians, and people with lactose intolerance. The three vegetables selected for this study, red beet, carrot, and celery, bring a significant contribution of functional compounds such as carotenoids, flavonoids, betalain, vitamins B and C, minerals, and fiber. By fermentation, the functional value of the vegetable juice increases due to the improved stability of these compounds. The combination of probiotic microorganisms and vegetable fibers resulted in a nutrient-rich synbiotic product. The stability of the nutritional and sensory qualities of the obtained synbiotic product was tested throughout its shelf life. The evaluation of the physico-chemical changes of the synbiotic drink during storage confirmed that: (i) vegetable juice enriched with honey and vegetable pulp is an important source of nutritional compounds, especially carbohydrates and fiber; (ii) the microwave treatment used to inhibit pathogenic microflora did not significantly affect the nutritional compounds in the vegetable juice; vitamin C concentration remained at baseline and beta-carotene concentration increased due to increased bioavailability; (iii) fermentation improved the nutritional quality of the vegetable juice by increasing the content of B vitamins, polyphenols, and flavonoids, and the product retained good antioxidant capacity throughout the shelf life; (iv) the FTIR and Raman spectra corroborated the results obtained using physicochemical methods. Based on the analysis of IR absorption frequencies, the most prominent bands belong to the frequencies 3330 cm⁻¹, 1636 cm⁻¹, and 1050 cm⁻¹, characteristic of groups of compounds such as polyphenols, carbohydrates, fatty acids, and proteins. Statistical data processing revealed a good correlation between the content of flavonoids, betalain, β-carotene, ascorbic acid, and polyphenols, the fermented juice having a stable antioxidant activity. Also, principal component analysis showed a negative correlation between the evolution of the concentration of B vitamins and antioxidant activity. Acknowledgment: This study was funded by the Francophone University Agency, Project Réseau régional dans le domaine de la santé, la nutrition et la sécurité alimentaire (SaIN), No. 21899/06.09.2017 at Dunarea de Jos University of Galati, and by the Sectorial Operational Programme Human Resources Development of the Romanian Ministry of Education, Research, Youth and Sports through the Financial Agreement POSDRU/159/1.5/S/132397 ExcelDOC.
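As a minimal illustration of the kind of statistical processing mentioned above (correlations between bioactive compounds and principal component analysis), the following Python sketch runs on hypothetical storage-time measurements; all variable names and values are assumptions for illustration, not the study's data.

```python
# Illustrative sketch (hypothetical data): correlation and PCA screening of
# storage-stability measurements for a fermented vegetable juice.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical measurements at five storage time points.
data = pd.DataFrame({
    "flavonoids_mg_ml":    [1.20, 1.18, 1.15, 1.13, 1.10],
    "betalain_mg_ml":      [0.85, 0.84, 0.82, 0.80, 0.79],
    "beta_carotene_ug_ml": [4.10, 4.05, 3.98, 3.95, 3.90],
    "ascorbic_acid_mg_ml": [0.32, 0.31, 0.30, 0.29, 0.28],
    "polyphenols_mg_ml":   [2.40, 2.38, 2.35, 2.31, 2.28],
    "b_vitamins_ug_ml":    [0.90, 0.95, 1.02, 1.08, 1.15],
    "dpph_inhibition_pct": [78.0, 77.5, 77.1, 76.8, 76.5],
})

print(data.corr(method="pearson").round(2))      # pairwise correlations

pca = PCA(n_components=2)
pca.fit(StandardScaler().fit_transform(data))    # standardize, then project
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
```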

Keywords: bioactive compounds, fermentation, synbiotic drink from vegetables, stability during storage

Procedia PDF Downloads 150
55 Big Data Applications for Transportation Planning

Authors: Antonella Falanga, Armando Cartenì

Abstract:

"Big data" refers to extremely vast and complex sets of data, encompassing extraordinarily large and intricate datasets that require specific tools for meaningful analysis and processing. These datasets can stem from diverse origins like sensors, mobile devices, online transactions, social media platforms, and more. The utilization of big data is pivotal, offering the chance to leverage vast information for substantial advantages across diverse fields, thereby enhancing comprehension, decision-making, efficiency, and fostering innovation in various domains. Big data, distinguished by its remarkable attributes of enormous volume, high velocity, diverse variety, and significant value, represent a transformative force reshaping the industry worldwide. Their pervasive impact continues to unlock new possibilities, driving innovation and advancements in technology, decision-making processes, and societal progress in an increasingly data-centric world. The use of these technologies is becoming more widespread, facilitating and accelerating operations that were once much more complicated. In particular, big data impacts across multiple sectors such as business and commerce, healthcare and science, finance, education, geography, agriculture, media and entertainment and also mobility and logistics. Within the transportation sector, which is the focus of this study, big data applications encompass a wide variety, spanning across optimization in vehicle routing, real-time traffic management and monitoring, logistics efficiency, reduction of travel times and congestion, enhancement of the overall transportation systems, but also mitigation of pollutant emissions contributing to environmental sustainability. Meanwhile, in public administration and the development of smart cities, big data aids in improving public services, urban planning, and decision-making processes, leading to more efficient and sustainable urban environments. Access to vast data reservoirs enables deeper insights, revealing hidden patterns and facilitating more precise and timely decision-making. Additionally, advancements in cloud computing and artificial intelligence (AI) have further amplified the potential of big data, enabling more sophisticated and comprehensive analyses. Certainly, utilizing big data presents various advantages but also entails several challenges regarding data privacy and security, ensuring data quality, managing and storing large volumes of data effectively, integrating data from diverse sources, the need for specialized skills to interpret analysis results, ethical considerations in data use, and evaluating costs against benefits. Addressing these difficulties requires well-structured strategies and policies to balance the benefits of big data with privacy, security, and efficient data management concerns. Building upon these premises, the current research investigates the efficacy and influence of big data by conducting an overview of the primary and recent implementations of big data in transportation systems. Overall, this research allows us to conclude that big data better provide to enhance rational decision-making for mobility choices and is imperative for adeptly planning and allocating investments in transportation infrastructures and services.

Keywords: big data, public transport, sustainable mobility, transport demand, transportation planning

Procedia PDF Downloads 60
54 Differential Expression Profile Analysis of DNA Repair Genes in Mycobacterium Leprae by qPCR

Authors: Mukul Sharma, Madhusmita Das, Sundeep Chaitanya Vedithi

Abstract:

Leprosy is a chronic human disease caused by Mycobacterium leprae, which cannot be cultured in vitro. Though treatable with multidrug therapy (MDT), the bacterium has recently been reported to be resistant to multiple antibiotics. Targeting DNA replication and repair pathways can serve as the foundation for developing new anti-leprosy drugs. Due to the absence of an axenic culture medium for the propagation of M. leprae, studying cellular processes, especially those belonging to DNA repair pathways, is challenging. The genome of M. leprae harbors several protein-coding genes with no previously assigned function, known as 'hypothetical proteins'. Here, we report the identification and expression of known and hypothetical DNA repair genes from human skin biopsies and mouse footpads that are involved in base excision repair, direct reversal repair, and the SOS response. Initially, a bioinformatics approach based on sequence similarity and identification of known protein domains was employed to screen the hypothetical proteins in the genome of M. leprae that are potentially related to DNA repair mechanisms. Before testing on clinical samples, pure stocks of bacterial reference DNA of M. leprae (NHDP63 strain) were used to construct standard curves to validate the qPCR experiments and identify their lower detection limit. Primers were designed to amplify the respective transcripts, and PCR products of the predicted size were obtained. Later, excisional skin biopsies of newly diagnosed untreated, treated, and drug-resistant leprosy cases from SIHR & LC Hospital, Vellore, India, were taken for the extraction of RNA. To determine the presence of the predicted transcripts, cDNA was generated from M. leprae mRNA isolated from clinically confirmed leprosy skin biopsy specimens across all the study groups. Melting curve analysis was performed to determine the integrity of the amplification and to rule out primer-dimer formation. The Ct values obtained from qPCR were fitted to the standard curve to determine transcript copy numbers. The same procedure was applied to M. leprae extracted after processing footpads of nude mice harboring drug-sensitive and drug-resistant strains. 16S rRNA was used as a positive control. Of all 16 genes involved in BER, DR, and SOS, a differential expression pattern of the genes was observed in terms of Ct values when compared to human samples; this was attributed to the different host and its immune response. However, no drastic variation in gene expression levels was observed in human samples except for the nth gene. The higher expression of the nth gene could be due to mutations that may be associated with sequence diversity and drug resistance, which suggests an important role in the repair mechanism and remains to be explored. In both human and mouse samples, the SOS system genes lexA and RecA and the BER genes AlkB and Ogt were expressed efficiently to deal with possible DNA damage. Together, the results of the present study suggest that DNA repair genes are constitutively expressed and may provide a reference for molecular diagnosis, therapeutic target selection, determination of treatment, and prognostic judgment in M. leprae pathogenesis.
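The conversion of Ct values to transcript copy numbers via a standard curve, as described above, can be sketched as follows; the dilution series and sample Ct below are hypothetical values used only to show the arithmetic.

```python
# Illustrative sketch (assumed values): converting qPCR Ct values to transcript
# copy numbers via a standard curve built from serial dilutions of reference DNA.
import numpy as np

# Hypothetical standard curve: known copy numbers and measured Ct values.
std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
std_ct     = np.array([15.2, 18.6, 22.0, 25.3, 28.7])

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0      # ~1.0 corresponds to 100 % efficiency

def ct_to_copies(ct):
    """Map a sample Ct back to a copy number using the fitted standard curve."""
    return 10 ** ((ct - intercept) / slope)

sample_ct = 24.1                              # e.g. a DNA-repair transcript
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"estimated copies: {ct_to_copies(sample_ct):.3g}")
```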

Keywords: DNA repair, human biopsy, hypothetical proteins, mouse footpads, Mycobacterium leprae, qPCR

Procedia PDF Downloads 103
53 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit

Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic

Abstract:

Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, are today widely used in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where they would otherwise superimpose within a single energy peak and, as such, could compromise the analysis and produce erroneous results. Naturally, this feature is of great importance when radionuclides and their activity concentrations are being determined and high precision is a necessity. In measurements of this nature, in order to reproduce good and trustworthy results, one must first perform an adequate full-energy peak (FEP) efficiency calibration of the equipment used. However, experimental determination of the response, i.e., efficiency curves for a given detector-sample configuration and its geometry, is not always easy and requires a certain set of reference calibration sources in order to account for and cover broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if it is not properly taken into account. In this study, the optimisation of models of two HPGe detectors through the implementation of the Geant4 toolkit developed at CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The acquired results for both detectors displayed good agreement with the experimental data, within an average statistical uncertainty of ∼4.6% for the XtRa and ∼1.8% for the BEGe detector in the energy ranges of 59.4–1836.1 keV and 59.4–1212.9 keV, respectively.
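Conceptually, the FEP efficiency obtained from a simulation is the fraction of emitted photons whose full energy is deposited in the peak, which can then be compared line by line with efficiencies measured with calibration sources. The sketch below illustrates this bookkeeping on invented numbers; it is not the authors' Geant4 code, only the post-processing arithmetic.

```python
# Illustrative sketch (assumed numbers): full-energy-peak (FEP) efficiency from a
# simulated spectrum and its deviation from experimentally measured efficiency.
import numpy as np

def fep_efficiency(net_peak_counts, photons_emitted):
    """FEP efficiency = events depositing the full photon energy / photons emitted."""
    return net_peak_counts / photons_emitted

energies        = np.array([59.5, 661.7, 1332.5])     # gamma lines in keV
sim_peak_counts = np.array([48200, 21500, 11800])     # from the Monte Carlo model
n_primaries     = 1_000_000                           # photons generated per line
exp_efficiency  = np.array([0.0485, 0.0211, 0.0114])  # from calibration sources

sim_efficiency = fep_efficiency(sim_peak_counts, n_primaries)
rel_dev_pct = 100.0 * (sim_efficiency - exp_efficiency) / exp_efficiency
for e, s, x, d in zip(energies, sim_efficiency, exp_efficiency, rel_dev_pct):
    print(f"{e:7.1f} keV  sim={s:.4f}  exp={x:.4f}  deviation={d:+.1f} %")
```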

Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method

Procedia PDF Downloads 119
52 Describing Cognitive Decline in Alzheimer's Disease via a Picture Description Writing Task

Authors: Marielle Leijten, Catherine Meulemans, Sven De Maeyer, Luuk Van Waes

Abstract:

For the diagnosis of Alzheimer's disease (AD), a large variety of neuropsychological tests are available. In some of these tests, linguistic processing - both oral and written - is an important factor. Language disturbances might serve as a strong indicator for an underlying neurodegenerative disorder like AD. However, the current diagnostic instruments for language assessment mainly focus on product measures, such as text length or number of errors, ignoring the importance of the process that leads to written or spoken language production. In this study, it is our aim to describe and test differences between cognitively healthy and cognitively impaired elderly on the basis of a selection of writing process variables (inter- and intrapersonal characteristics). These process variables are mainly related to pause times, because the number, length, and location of pauses have proven to be an important indicator of the cognitive complexity of a process. Method: Participants enrolled in our research were chosen on the basis of a number of basic criteria necessary to collect reliable writing process data. Furthermore, we opted to match the thirteen cognitively impaired patients (8 MCI and 5 AD) with thirteen cognitively healthy elderly. At the start of the experiment, participants were each given a number of tests, such as the Mini-Mental State Examination (MMSE), the Geriatric Depression Scale (GDS), the forward and backward digit span, and the Edinburgh Handedness Inventory (EHI). Also, a questionnaire was used to collect socio-demographic information (age, gender, education) about the subjects as well as more details on their level of computer literacy. The tests and questionnaire were followed by two typing tasks and two picture description tasks. For the typing tasks, participants had to copy (type) characters, words, and sentences from a screen, whereas the picture description tasks each consisted of an image they had to describe in a few sentences. Both the typing and the picture description tasks were logged with Inputlog, a keystroke logging tool that allows us to log and time-stamp keystroke activity to reconstruct and describe text production processes. The main rationale behind keystroke logging is that writing fluency and flow reveal traces of the underlying cognitive processes. This explains the analytical focus on pause (length, number, distribution, location, etc.) and revision (number, type, operation, embeddedness, location, etc.) characteristics. As in speech, pause times are seen as indexical of cognitive effort. Results: Preliminary analyses already showed some promising results concerning pause times before, within, and after words. For all variables, mixed effects models were used that included participants as a random effect and MMSE scores, GDS scores, and word categories (such as determiners and nouns) as fixed effects. For pause times before and after words, cognitively impaired patients paused longer than healthy elderly. These variables did not show an interaction effect between the group to which participants belonged (cognitively impaired or healthy elderly) and word categories. However, pause times within words did show an interaction effect, which indicates that pause times within certain word categories differ significantly between patients and healthy elderly.
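For readers unfamiliar with the modelling approach, the sketch below fits a linear mixed-effects model of pause time with a group × word-category interaction and a random intercept per participant on synthetic data; it mirrors the structure of the models described above but is not the authors' Inputlog analysis, and every number in it is invented.

```python
# Illustrative sketch (synthetic data): mixed-effects model of pause times with a
# group x word-category interaction and a per-participant random intercept.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
participants = np.repeat(np.arange(26), 40)                 # 13 impaired + 13 healthy
group = np.where(participants < 13, "impaired", "healthy")
word_cat = rng.choice(["determiner", "noun"], size=participants.size)
pause_ms = (250
            + 120 * (group == "impaired")
            + 40 * (word_cat == "noun")
            + 60 * ((group == "impaired") & (word_cat == "noun"))
            + rng.normal(0, 50, participants.size))

df = pd.DataFrame({"participant": participants, "group": group,
                   "word_cat": word_cat, "pause_ms": pause_ms})

model = smf.mixedlm("pause_ms ~ group * word_cat", df, groups=df["participant"])
print(model.fit().summary())
```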

Keywords: Alzheimer's disease, keystroke logging, matching, writing process

Procedia PDF Downloads 366
51 Geochemical Evaluation of Metal Content and Fluorescent Characterization of Dissolved Organic Matter in Lake Sediments

Authors: Fani Sakellariadou, Danae Antivachis

Abstract:

The purpose of this paper is to evaluate the environmental status of a coastal Mediterranean lake, named Koumoundourou, located on the northeastern coast of Elefsis Bay, in the western region of Attiki in Greece, 15 km from Athens. It has been preserved since ancient times and has important archaeological interest. Koumoundourou Lake is also considered a valuable wetland accommodating abundant flora and fauna, with a variety of bird species, including a few of the world's threatened ones. Furthermore, it is a heavily modified lake, affected by various anthropogenic pollutant sources which provide industrial, urban, and agricultural contaminants. The adjacent oil refineries and the military depot are the major pollution sources, contributing crude oil spills and leaks. Moreover, the lake receives a quantity of groundwater leachates from the major landfill of Athens. The environmental status of the lake results from the intensive land uses combined with the permeable lithology of the surrounding area and the existence of karstic springs which discharge from the calcareous mountains. Sediment samples were collected along the shoreline of the lake using a stainless steel Van Veen grab sampler. They were studied for the determination of the total metal content and the metal fractionation in geochemical phases, as well as the characterization of the dissolved organic matter (DOM). These constituents play a significant role in the ecological assessment of the lake. Metals may be responsible for harmful environmental impacts. Metal partitioning offers comprehensive information on the origin, mode of occurrence, biological and physicochemical availability, mobilization, and transport of metals. Moreover, DOM has multifunctional importance, interacting with inorganic and organic contaminants and leading to biogeochemical and ecological effects. The samples were digested using microwave heating with a suitable laboratory microwave unit. For the total metal content, the samples were treated with a mixture of strong acids. Then, a sequential extraction procedure was applied for the removal of the exchangeable, carbonate-hosted, reducible, organic/sulphide, and residual fractions. Metal content was determined by ICP-MS (Perkin Elmer NexION 350D ICP mass spectrometer). Furthermore, the DOM was removed via a gentle extraction procedure and then characterized by fluorescence spectroscopy using a Perkin-Elmer LS 55 luminescence spectrophotometer equipped with the WinLab 4.00.02 software for data processing (Agilent Cary Eclipse Fluorescence). Mono-dimensional emission, excitation, synchronous-scan excitation, and total luminescence spectra were recorded for the classification of chromophoric units present in the aqueous extracts. Total metal concentrations were determined and compared with those of the Elefsis Gulf sediments. Element partitioning showed the anthropogenic sources and the contaminant bioavailability. All fluorescence spectra, as well as humification indices, were evaluated in detail to find out the nature and origin of DOM. All the results were compared and interpreted to evaluate the environmental quality of Koumoundourou Lake and the need for environmental management and protection.
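Sequential extraction results of the kind described above are typically summarised as the percentage of each metal held in each geochemical phase; the first two phases are often combined into a simple mobility (bioavailability) index. The sketch below illustrates this on hypothetical concentrations and is a generic illustration, not the study's data.

```python
# Illustrative sketch (assumed concentrations): phase percentages from a five-step
# sequential extraction and a simple mobility index (exchangeable + carbonate-hosted).
import pandas as pd

phases = ["exchangeable", "carbonate", "reducible", "organic_sulphide", "residual"]
conc_mg_kg = pd.DataFrame(
    {"Cu": [4.2, 11.5, 18.3, 22.1, 35.4],
     "Pb": [1.1, 9.8, 25.6, 12.4, 30.2],
     "Zn": [8.5, 20.3, 31.2, 15.9, 52.6]},
    index=phases)

percent = 100 * conc_mg_kg / conc_mg_kg.sum()          # per-metal phase percentages
mobility_index = percent.loc[["exchangeable", "carbonate"]].sum()

print(percent.round(1))
print("\nEasily mobilisable fraction (%):")
print(mobility_index.round(1))
```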

Keywords: anthropogenic contaminant, dissolved organic matter, lake, metal, pollution

Procedia PDF Downloads 157
50 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence

Authors: Gus Calderon, Richard McCreight, Tammy Schwartz

Abstract:

Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort has had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing applies object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners' understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements which can be mitigated. Geospatial data from FireWatch's defensible space maps was combined with Black Swan's patented approach using 39 other risk characteristics into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
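To make the spectral vegetation index step concrete, the sketch below computes a generic normalised difference vegetation index (NDVI) from red and near-infrared bands and bins it into coarse fuel classes; the band values, thresholds, and class labels are assumptions for illustration and do not represent FireWatch's proprietary algorithms.

```python
# Illustrative sketch (synthetic rasters): NDVI from red/NIR reflectance and a
# coarse fuel classification by thresholding; thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(1)
red = rng.uniform(0.05, 0.40, size=(4, 4))   # red reflectance
nir = rng.uniform(0.10, 0.60, size=(4, 4))   # near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-9)      # small epsilon avoids division by zero

# Coarse classes: bare soil/structure, dry or stressed fuel, healthy green vegetation.
classes = np.digitize(ndvi, bins=[0.2, 0.5])
labels = np.array(["bare/structure", "dry fuel", "green vegetation"])

print(ndvi.round(2))
print(labels[classes])
```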

Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk

Procedia PDF Downloads 108
49 Influence of Bacterial Biofilm on the Corrosive Processes in Electronic Equipment

Authors: Iryna P. Dzieciuch, Michael D. Putman

Abstract:

Humidity is known to degrade Navy ship electronic equipment, especially in hot, moist environments. If left untreated, it can cause significant and permanent damage. Even rigorous inspection and frequent clean-up would not prevent further equipment contamination and degradation because of the constant presence of favorable growth conditions for many microorganisms. Generally, relative humidity levels of less than 60% will inhibit corrosion in electronic equipment, but because Navy electronics often operate in hot and humid environments, prevention via dehumidification is not always possible. Currently, there is no defined research that fully describes the key mechanisms which cause the degradation of electronics and their coatings. The corrosive action of most bacteria is mainly developed through (i) mycelium adherence to the metal plates, (ii) facilitation of the formation of pitting areas, and (iii) production of organic acids such as citric, iso-citric, cis-aconitic, and alpha-ketoglutaric acid, which are corrosive to electronic equipment and its components. Our approach studies corrosive action in electronic equipment: circuit boards, wires, and connections that are exposed to a humid environment that worsens during condensation. In our new approach, the technical task builds on work with bacterial communities in public areas, bacterial genetics, bioinformatics, biostatistics, and Scanning Electron Microscopy (SEM) of corroded circuit boards. Based on these methods, we collect and examine environmental samples from biofilms of the corroded and non-corroded sites, where bacterial contamination of electronic equipment, such as machine racks and shore boats, is an ongoing concern. Sample collection and analysis are focused on addressing the key questions identified above through the following tasks: laboratory sample processing and evaluation under scanning electron microscopy; initial sequencing and data evaluation; and bioinformatics and data analysis. Preliminary results from scanning electron microscopy (SEM) have revealed that metal particulates and alloys in corroded samples consist mostly of tin (< 40%), silicon (< 4%), sulfur (< 1%), aluminum (< 2%), magnesium (< 2%), copper (< 1%), bromine (< 2%), barium (< 1%), and iron (< 2%). We have also examined the same sites at ×12,000 magnification, which confirmed the existence of undisrupted biofilm organelles and crystal structures. Non-corroded sites have revealed a high presence of copper (< 47%); other metals remain at a level comparable to the corroded samples. We have examined the non-corroded sites at ×1,000 magnification and have documented the formation of copper crystals. The next step of this study is to perform metagenomic sequencing at all sites and to compare the bacterial composition present in the environment. While copper is nontoxic to living organisms, the process of bacterial adhesion creates an acidic environment by releasing citric, iso-citric, cis-aconitic, and alpha-ketoglutaric acids, which in turn release copper ions (Cu²⁺) that are highly toxic to bacteria and higher-order living organisms. This phenomenon might explain natural "antibiotic" properties that are lacking in elements such as tin. To prove or refute this hypothesis, we will use next-generation sequencing (NGS) methods to investigate the types and growth cycles of bacteria that form biofilms on the corroded and non-corroded samples.

Keywords: bacteria, biofilm, circuit board, copper, corrosion, electronic equipment, organic acids, tin

Procedia PDF Downloads 160
48 Strategy to Evaluate Health Risks of Short-Term Exposure of Air Pollution in Vulnerable Individuals

Authors: Sarah Nauwelaerts, Koen De Cremer, Alfred Bernard, Meredith Verlooy, Kristel Heremans, Natalia Bustos Sierra, Katrien Tersago, Tim Nawrot, Jordy Vercauteren, Christophe Stroobants, Sigrid C. J. De Keersmaecker, Nancy Roosens

Abstract:

Projected climate changes could lead to exacerbation of respiratory disorders associated with reduced air quality. Air pollution and climate change influence each other through complex interactions. The poor air quality in urban and rural areas includes high levels of particulate matter (PM), ozone (O3), and nitrogen oxides (NOx), representing a major threat to public health, especially for the most vulnerable population strata, such as young children. In this study, we aim to develop generic, standardized policy-supporting tools and methods that will allow future, larger-scale follow-up epidemiological studies to evaluate the risks of the combined short-term effects of O3 and PM on the cardiorespiratory system of children. We will use non-invasive indicators of airway damage/inflammation and of genetic or epigenetic variations by using urine or saliva as alternatives to blood samples. Therefore, a multi-phase field study will be organized in order to assess the sensitivity and applicability of these tests in large cohorts of children during episodes of air pollution. A first test phase was planned in March 2018, not yet taking into account 'critical' pollution periods. Working with non-invasive samples, choosing the right set-up for the field work, and the volunteer selection were parameters to consider, as they significantly influence the feasibility of this type of study. During this test phase, the selection of the volunteers was done in collaboration with medical doctors from the Centre for Student Assistance (CLB), by choosing a class of pre-pubertal children aged 9-11 years in a primary school in Flemish Brabant, Belgium. A questionnaire collecting information on the health and background of the children and an informed consent document were drawn up for the parents, as well as a simplified cartoon version of this document for the children. A detailed study protocol was established, giving clear information on the study objectives, the recruitment, the sample types, the medical examinations to be performed, the strategy to ensure anonymity, and finally the sample processing. Furthermore, the protocol describes how this field study will be conducted in relation to the forecasting and monitoring of air pollutants for the future phases. Potential protein, genetic, and epigenetic biomarkers reflecting the respiratory function and the levels of air pollution will be measured in the collected samples using unconventional technologies. The test phase results will be used to address the most important bottlenecks before proceeding to the following phases of the study, where the combined effect of O3 and PM during pollution peaks will be examined. This feasibility study will allow the identification of possible bottlenecks and provide missing scientific knowledge necessary for the preparation, implementation, and evaluation of federal policies/strategies based on the most appropriate epidemiological studies on the health effects of air pollution. The research leading to these results has been funded by the Belgian Science Policy Office through contract No.: BR/165/PI/PMOLLUGENIX-V2.

Keywords: air pollution, biomarkers, children, field study, feasibility study, non-invasive

Procedia PDF Downloads 176
47 SWOT Analysis on the Prospects of Carob Use in Human Nutrition: Crete, Greece

Authors: Georgios A. Fragkiadakis, Antonia Psaroudaki, Theodora Mouratidou, Eirini Sfakianaki

Abstract:

Research: Within the project "Actions for the optimal utilization of the potential of carob in the Region of Crete", which is financed and supervised by the Region in collaboration with Crete University and Hellenic Mediterranean University, a SWOT (strengths, weaknesses, opportunities, threats) survey was carried out to evaluate the prospects of carob in human nutrition in Crete. Results and conclusions: 1) Strengths: There is local production of carob for human consumption, based on international and local product reports. The data on products in the market (over 100 brands of carob food) indicate a sufficiency of carob materials offered in Crete. The variety of carob food products retailed in Crete indicates a strong demand-production-consumption trend. There is a stable number (core) of businesses that invest significantly (Creta carob, Cretan mills, etc.). The great majority of the relevant food stores (bakery, confectionery, etc.) do offer carob products. The presence of carob products produced in Crete is strong on the internet (over 20 main professionally designed websites). The promotion of the carob food products is based on their variety and on a few historical elements connected with the Cretan diet. 2) Weaknesses: The international prices for carob seed affect the sector; the seed had an international price of €20 per kg in 2021-22 and fell to €8 in 2022, causing losses to carob traders. The local producers do not sort the carobs they deliver for processing, causing 30-40% losses of the product in the industry. The occasional high price triggers the collection of degraded raw material; large losses may emerge due to the action of insects. There are many carob trees whose fruits are not collected, e.g., in Apokoronas, Chania. The nutritional and commercial value of the wild carob fruits is very low. Carob tree production is recorded by the Greek statistical services as "other cultures" in combination with prickly pear, creating difficulties in retrieving data. The percentage of carob used for human nutrition, in contrast to animal feeding, is not known. The exact imports of carob are not closely monitored. We have no data on the recycling of carob by-products in Crete. 3) Opportunities: The development of a culture of respect for the carob trade may improve professional relations in the sector. Monitoring the carob market and connecting production with retail-industry needs may allow better market stability. Raw material evaluation procedures may be implemented to maintain the carob value chain. The state agricultural services may be further involved in carob health protection. The education of farmers on carob cultivation/management can improve the quality of the product. The selection of local productive varieties may improve the sustainability of the culture. Connecting the consumption of carob with health-food products may create added value in the sector. The presence and extent of wild carob trees in Crete potentially represent a target for grafting. 4) Threats: The annual fluctuation of carob yield challenges the programming of local food industry activities. Carob is also a forest species; there is a danger of wrongly classifying crops as forest areas where land ownership is not clear.

Keywords: human nutrition, carob food, SWOT analysis, Crete, Greece

Procedia PDF Downloads 91
46 Improving the Utility of Social Media in Pharmacovigilance: A Mixed Methods Study

Authors: Amber Dhoot, Tarush Gupta, Andrea Gurr, William Jenkins, Sandro Pietrunti, Alexis Tang

Abstract:

Background: The COVID-19 pandemic has driven pharmacovigilance towards a new paradigm. Nowadays, more people than ever before are recognising and reporting adverse reactions from medications, treatments, and vaccines. In the modern era, with over 3.8 billion users, social media has become the most accessible medium for people to voice their opinions and so provides an opportunity to engage with more patient-centric and accessible pharmacovigilance. However, the pharmaceutical industry has been slow to incorporate social media into its modern pharmacovigilance strategy. This project aims to make social media a more effective tool in pharmacovigilance, and so reduce drug costs, improve drug safety and improve patient outcomes. This will be achieved by firstly uncovering and categorising the barriers facing the widespread adoption of social media in pharmacovigilance. Following this, the potential opportunities of social media will be explored. We will then propose realistic, practical recommendations to make social media a more effective tool for pharmacovigilance. Methodology: A comprehensive systematic literature review was conducted to produce a categorised summary of these barriers. This was followed by conducting 11 semi-structured interviews with pharmacovigilance experts to confirm the literature review findings whilst also exploring the unpublished and real-life challenges faced by those in the pharmaceutical industry. Finally, a survey of the general public (n = 112) ascertained public knowledge, perception, and opinion regarding the use of their social media data for pharmacovigilance purposes. This project stands out by offering perspectives from the public and pharmaceutical industry that fill the research gaps identified in the literature review. Results: Our results gave rise to several key analysis points. Firstly, inadequacies of current Natural Language Processing algorithms hinder effective pharmacovigilance data extraction from social media, and where data extraction is possible, there are significant questions over its quality. Social media also contains a variety of biases towards common drugs, mild adverse drug reactions, and the younger generation. Additionally, outdated regulations for social media pharmacovigilance do not align with new, modern General Data Protection Regulations (GDPR), creating ethical ambiguity about data privacy and level of access. This leads to an underlying mindset of avoidance within the pharmaceutical industry, as firms are disincentivised by the legal, financial, and reputational risks associated with breaking ambiguous regulations. Conclusion: Our project uncovered several barriers that prevent effective pharmacovigilance on social media. As such, social media should be used to complement traditional sources of pharmacovigilance rather than as a sole source of pharmacovigilance data. However, this project adds further value by proposing five practical recommendations that improve the effectiveness of social media pharmacovigilance. These include: prioritising health-orientated social media; improving technical capabilities through investment and strategic partnerships; setting clear regulatory guidelines using multi-stakeholder processes; creating an adverse drug reaction reporting interface inbuilt into social media platforms; and, finally, developing educational campaigns to raise awareness of the use of social media in pharmacovigilance. 
Implementation of these recommendations would speed up the efficient, ethical, and systematic adoption of social media in pharmacovigilance.
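To make the data-extraction challenge concrete, the sketch below shows a deliberately naive keyword screen for possible adverse-drug-reaction mentions in short posts. The drug names and term list are hypothetical, and real pharmacovigilance NLP pipelines require far more (entity normalisation to MedDRA terms, negation handling, deduplication, causality signals), which is precisely the inadequacy discussed above.

```python
# Illustrative sketch only: a naive keyword/regex screen for possible adverse-drug-
# reaction (ADR) mentions in social-media posts. Not a production pharmacovigilance tool.
import re

ADR_TERMS = {"headache", "nausea", "rash", "dizziness", "fatigue", "swelling"}
DRUG_TERMS = {"drug x", "vaccine y"}          # hypothetical product names

def screen_post(text: str) -> dict:
    t = text.lower()
    drugs = [d for d in DRUG_TERMS if d in t]
    adrs = [a for a in ADR_TERMS if re.search(rf"\b{a}\b", t)]
    return {"drugs": drugs, "adrs": adrs, "flag": bool(drugs and adrs)}

posts = ["Got vaccine y yesterday and woke up with a rash and headache",
         "Loving the weather today, no complaints!"]
for p in posts:
    print(screen_post(p))
```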

Keywords: adverse drug reaction, drug safety, pharmacovigilance, social media

Procedia PDF Downloads 81
45 The Negative Effects of Controlled Motivation on Mathematics Achievement

Authors: John E. Boberg, Steven J. Bourgeois

Abstract:

The decline in student engagement and motivation through the middle years is well documented and clearly associated with a decline in mathematics achievement that persists through high school. To combat this trend and, very often, to meet high-stakes accountability standards, a growing number of parents, teachers, and schools have implemented various methods to incentivize learning. However, according to Self-Determination Theory, forms of incentivized learning such as public praise, tangible rewards, or threats of punishment tend to undermine intrinsic motivation and learning. By focusing on external forms of motivation that thwart autonomy in children, adults also potentially threaten relatedness measures such as trust and emotional engagement. Furthermore, these controlling motivational techniques tend to promote shallow forms of cognitive engagement at the expense of more effective deep processing strategies. Therefore, any short-term gains in apparent engagement or test scores are overshadowed by long-term diminished motivation, resulting in inauthentic approaches to learning and lower achievement. The current study focuses on the relationships between student trust, engagement, and motivation during these crucial years as students transition from elementary to middle school. In order to test the effects of controlled motivational techniques on achievement in mathematics, this quantitative study was conducted on a convenience sample of 22 elementary and middle schools from a single public charter school district in the south-central United States. The study employed multi-source data from students (N = 1,054), parents (N = 7,166), and teachers (N = 356), along with student achievement data and contextual campus variables. Cross-sectional questionnaires were used to measure the students’ self-regulated learning, emotional and cognitive engagement, and trust in teachers. Parents responded to a single item on incentivizing the academic performance of their child, and teachers responded to a series of questions about their acceptance of various incentive strategies. Structural equation modeling (SEM) was used to evaluate model fit and analyze the direct and indirect effects of the predictor variables on achievement. Although a student’s trust in teacher positively predicted both emotional and cognitive engagement, none of these three predictors accounted for any variance in achievement in mathematics. The parents’ use of incentives, on the other hand, predicted a student’s perception of his or her controlled motivation, and these two variables had significant negative effects on achievement. While controlled motivation had the greatest effects on achievement, parental incentives demonstrated both direct and indirect effects on achievement through the students’ self-reported controlled motivation. Comparing upper elementary student data with middle-school student data revealed that controlling forms of motivation may be taking their toll on student trust and engagement over time. While parental incentives positively predicted both cognitive and emotional engagement in the younger sub-group, such forms of controlling motivation negatively predicted both trust in teachers and emotional engagement in the middle-school sub-group. These findings support the claims, posited by Self-Determination Theory, about the dangers of incentivizing learning. Short-term gains belie the underlying damage to motivational processes that lead to decreased intrinsic motivation and achievement. 
Such practices also appear to thwart basic human needs such as relatedness.
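For readers unfamiliar with how direct and indirect (mediated) effects are separated, the sketch below illustrates the idea on synthetic data using two ordinary regressions; it is a simplified stand-in for the structural equation models actually fitted in the study, and every coefficient and variable value below is invented for illustration.

```python
# Illustrative sketch (synthetic data): direct and indirect effect of parental
# incentives on achievement via controlled motivation, estimated with two OLS fits.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1000
incentives = rng.normal(size=n)                          # parental incentive use
controlled = 0.45 * incentives + rng.normal(size=n)      # controlled motivation
achievement = -0.35 * controlled - 0.05 * incentives + rng.normal(size=n)
df = pd.DataFrame({"incentives": incentives, "controlled": controlled,
                   "achievement": achievement})

a = smf.ols("controlled ~ incentives", df).fit().params["incentives"]
model_b = smf.ols("achievement ~ incentives + controlled", df).fit()
b = model_b.params["controlled"]          # effect of motivation, holding incentives fixed
direct = model_b.params["incentives"]     # direct effect of incentives

print(f"indirect effect (a*b) = {a * b:+.3f}, direct effect = {direct:+.3f}")
```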

Keywords: controlled motivation, student engagement, incentivized learning, mathematics achievement, self-determination theory, student trust

Procedia PDF Downloads 219
44 Supporting 'vulnerable' Students to Complete Their Studies During the Economic Crisis in Greece: The Umbrella Program of International Hellenic University

Authors: Rigas Kotsakis, Nikolaos Tsigilis, Vasilis Grammatikopoulos, Evridiki Zachopoulou

Abstract:

During the last decade, Greece has faced an unprecedented financial crisis, affecting various aspects and functionalities of Higher Education. Besides the restricted funding of academic institutions, students and their families encountered economic difficulties that undoubtedly influenced the effective completion of their studies. In this context, a fairly large number of students on the Alexander campus of the International Hellenic University (IHU) delay, interrupt, or even abandon their studies, especially when they come from low-income families, belong to sensitive social or special-needs groups, have different cultural origins, etc. For this reason, a European project, named "Umbrella", was initiated, aiming at providing the necessary psychological support and counseling, especially to disadvantaged students, towards the completion of their studies. To this end, a network of various academic members (academic staff and students) from IHU, namely iMentor, was involved in different roles. Specifically, experienced academic staff trained students to serve as intermediate links for the integration and educational support of students that fall into the aforementioned sensitive social groups and face problems with the completion of their studies. The main idea of the project rests upon its person-centered character, which facilitates direct student-to-student communication without the intervention of the teaching staff. The backbone of the iMentor network is senior students who face no problems in their academic life and volunteered for this project. It should be noted that there is a provision from the Umbrella structure for substantial and ethical rewards for their engagement. In this context, a well-defined, stringent methodology was implemented for the evaluation of the extent of the problem at IHU and the detection of the profile of the "candidate" disadvantaged students. The first phase included two steps: (a) data collection and (b) data cleansing/preprocessing. The first step involved the data collection process from the Secretary Services of all Schools in IHU, from 1980 to 2019, which resulted in 96,418 records. The data set included the school name, the semester of studies, the student enrollment criteria, the nationality, and the graduation year or the current academic state (still studying, delayed, dropped out, etc.). The second step of the employed methodology involved data cleansing/preprocessing because of the existence of "noisy" data, missing and erroneous values, etc. Furthermore, several assumptions and grouping actions were imposed to achieve data homogeneity and an easy-to-interpret subsequent statistical analysis. Specifically, the 40-year recording period was limited to the last 15 years (2004-2019). In 2004, the Greek Technological Institutions evolved into Higher Education Universities, leading to a stable and unified framework of graduate studies. In addition, the data concerning active students were excluded from the analysis since the initial processing effort was focused on the detection of factors/variables that differentiated graduated students from those who had abandoned their studies. The final working dataset included 21,432 records with only two categories of students: those who hold a degree and those who abandoned their studies. Findings of the first phase are presented across faculties and further discussed.
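A minimal sketch of the filtering steps described above is given below; the file name and column names are hypothetical stand-ins for the secretariat export, and only the logic (restrict to 2004-2019, drop still-active students, keep the two compared groups) follows the text.

```python
# Illustrative sketch (hypothetical columns): data cleansing/preprocessing of the
# student records, following the steps described in the abstract.
import pandas as pd

records = pd.read_csv("secretary_records.csv")       # hypothetical export, ~96,418 rows

records = records[records["enrollment_year"].between(2004, 2019)]   # unified framework era
records = records[records["status"] != "active"]                    # exclude students still studying
records = records[records["status"].isin(["graduated", "abandoned"])]

print(len(records))                                   # ~21,432 in the study's final dataset
print(records.groupby(["school", "status"]).size().unstack(fill_value=0))
```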

Keywords: higher education, student support, economic crisis, mentoring

Procedia PDF Downloads 114
43 Encapsulated Bioflavonoids: Nanotechnology Driven Food Waste Utilization

Authors: Niharika Kaushal, Minni Singh

Abstract:

Citrus fruits are among the commercially grown fruits that constitute an excellent repository of phytochemicals with health-promoting properties. Fruits belonging to the citrus family, when processed by industries, produce tons of agricultural by-products in the form of peels, pulp, and seeds, which normally have no further usage and are commonly discarded. Nevertheless, such residues are of paramount importance due to their richness in valuable compounds; therefore, agro-waste is considered a valuable bioresource for various purposes in the food sector. A range of biological properties, including anti-oxidative, anti-cancer, anti-inflammatory, anti-allergenic, and anti-aging activities, has been reported for these bioactive compounds. Taking advantage of these inexpensive residual sources requires special attention to extract the bioactive compounds. Mandarin (Citrus nobilis × Citrus deliciosa) is a potential source of bioflavonoids with antioxidant properties, and it is increasingly regarded as a functional food. Despite these benefits, flavonoids suffer from a barrier of pre-systemic metabolism in gastric fluid, which impedes their effectiveness. Therefore, colloidal delivery systems can completely overcome the barrier in question. This study involved the extraction and identification of key flavonoids from mandarin biomass. Using a green chemistry approach, supercritical fluid extraction at 330 bar and 40 °C with 10% ethanol as co-solvent was employed for extraction, and the identification of flavonoids was made by mass spectrometry. Given this limitation of flavonoids, the obtained extract was encapsulated in a poly(lactic-co-glycolic acid) (PLGA) matrix using a solvent evaporation method. Additionally, the antioxidant potential was evaluated by the 2,2-diphenylpicrylhydrazyl (DPPH) assay. A release pattern of flavonoids was observed over time using simulated gastrointestinal fluids. From the results, the total flavonoids extracted from the mandarin biomass were estimated at 47.3 ± 1.06 mg/ml rutin equivalents. Notably, the polymethoxyflavones (PMFs) tangeretin and nobiletin were identified in the extract, followed by hesperetin and naringin. The designed flavonoid-PLGA nanoparticles exhibited a particle size between 200 and 250 nm. In addition, the bioengineered nanoparticles had a high entrapment efficiency of nearly 80.0% and maintained stability for more than a year. Flavonoid nanoparticles showed excellent antioxidant activity with an IC50 of 0.55 μg/ml. Morphological studies revealed the smooth and spherical shape of the nanoparticles as visualized by field emission scanning electron microscopy (FE-SEM). Simulated gastrointestinal studies of the free extract and the nanoencapsulated form revealed the degradation of nearly half of the flavonoids under harsh acidic conditions in the case of the free extract. After encapsulation, the flavonoids exhibited sustained release properties, suggesting that polymeric encapsulates are efficient carriers of flavonoids. Thus, such technology-driven and biomass-derived products form the basis for the development of functional foods with improved therapeutic potential and antioxidant properties. As a result, citrus processing waste can be considered a new, high-value resource, and its utilization should be promoted.
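Two of the quantities reported above, entrapment efficiency and the DPPH IC50, follow directly from standard formulas; the sketch below shows the arithmetic on hypothetical readings (the amounts, concentrations, and inhibition values are assumptions, not the study's measurements).

```python
# Illustrative sketch (assumed readings): entrapment efficiency of flavonoid-loaded
# PLGA nanoparticles and an IC50 estimate from a DPPH dose-inhibition series.
import numpy as np
from scipy.optimize import curve_fit

# Entrapment efficiency (%) = (total flavonoid - free, unentrapped flavonoid) / total.
total_mg, free_mg = 10.0, 2.0
print(f"entrapment efficiency: {100 * (total_mg - free_mg) / total_mg:.1f} %")

# Hypothetical DPPH assay: extract concentration (ug/ml) vs. % radical inhibition.
conc = np.array([0.1, 0.25, 0.5, 1.0, 2.0])
inhibition = np.array([18.0, 35.0, 48.0, 68.0, 82.0])

def logistic4(x, bottom, top, ic50, hill):
    """Four-parameter logistic: inhibition reaches (bottom+top)/2 at x = IC50."""
    return bottom + (top - bottom) / (1 + (ic50 / x) ** hill)

popt, _ = curve_fit(logistic4, conc, inhibition, p0=[0, 100, 0.5, 1])
print(f"estimated IC50: {popt[2]:.2f} ug/ml")
```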

Keywords: citrus, agrowaste, flavonoids, nanoparticles

Procedia PDF Downloads 129
42 Gene Expression Meta-Analysis of Potential Shared and Unique Pathways Between Autoimmune Diseases Under anti-TNFα Therapy

Authors: Charalabos Antonatos, Mariza Panoutsopoulou, Georgios K. Georgakilas, Evangelos Evangelou, Yiannis Vasilopoulos

Abstract:

The extended tissue damage and severe clinical outcomes of autoimmune diseases, accompanied by the high annual costs to the overall health care system, highlight the need for efficient therapy. Increasing knowledge of the pathophysiology of specific chronic inflammatory diseases, namely Psoriasis (PsO), Inflammatory Bowel Diseases (IBD), consisting of Crohn's disease (CD) and Ulcerative colitis (UC), and Rheumatoid Arthritis (RA), has provided insights into the underlying mechanisms that lead to the maintenance of the inflammation, such as Tumor Necrosis Factor alpha (TNF-α). Hence, anti-TNFα biological agents pose an ideal therapeutic approach. Despite the efficacy of anti-TNFα agents, several clinical trials have shown that 20-40% of patients do not respond to treatment. Nowadays, high-throughput technologies have been recruited in order to elucidate the complex interactions in multifactorial phenotypes, with the most ubiquitous ones referring to transcriptome quantification analyses. In this context, a random-effects meta-analysis of available gene expression cDNA microarray datasets was performed between responders and non-responders to anti-TNFα therapy in patients with IBD, PsO, and RA. Publicly available datasets were systematically searched from inception to the 10th of November 2020 and selected for further analysis if they assessed the response to anti-TNFα therapy with clinical score indexes from inflamed biopsies. Specifically, 4 IBD (79 responders/72 non-responders), 3 PsO (40 responders/11 non-responders), and 2 RA (16 responders/6 non-responders) datasets were selected. After the separate pre-processing of each dataset, 4 separate meta-analyses were conducted: three disease-specific and a single combined meta-analysis on the disease-specific results. The MetaVolcano R package (v.1.8.0) was utilized for a random-effects meta-analysis through the Restricted Maximum Likelihood (REML) method. The top 1% of the most consistently perturbed genes in the included datasets was highlighted through the TopConfects approach while maintaining a 5% False Discovery Rate (FDR). Genes were considered Differentially Expressed (DEGs) if P ≤ 0.05, |log2(FC)| ≥ log2(1.25), and they were perturbed in at least 75% of the included datasets. Over-representation analysis was performed using Gene Ontology and Reactome Pathways for both up- and down-regulated genes in all 4 performed meta-analyses. Protein-protein interaction networks were also incorporated in the subsequent analyses with STRING v11.5 and Cytoscape v3.9. Disease-specific meta-analyses detected multiple distinct pro-inflammatory and immune-related down-regulated genes for each disease, such as NFKBIA, IL36, and IRAK1, respectively. Pathway analyses revealed unique and shared pathways between each disease, such as Neutrophil Degranulation and Signaling by Interleukins. The combined meta-analysis unveiled 436 DEGs, 86 of which were up- and 350 down-regulated, confirming the aforementioned shared pathways and genes, as well as uncovering genes that participate in anti-inflammatory pathways, namely IL-10 signaling. The identification of key biological pathways and regulatory elements is imperative for the accurate prediction of the patient's response to biological drugs. Meta-analysis of such gene expression data could aid the challenging task of unraveling the complex interactions implicated in the response to anti-TNFα therapy in patients with PsO, IBD, and RA, as well as distinguish gene clusters and pathways that are altered through this heterogeneous phenotype.
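The DEG criteria quoted above (meta-analysis P ≤ 0.05, |log2FC| ≥ log2(1.25), perturbation in at least 75% of datasets) reduce to a simple filter over a per-gene summary table. The sketch below applies that filter; the gene rows and values are hypothetical and only illustrate the thresholds, and the actual study used the MetaVolcano R package rather than this Python stand-in.

```python
# Illustrative sketch (hypothetical table): applying the stated DEG thresholds to a
# per-gene meta-analysis summary.
import numpy as np
import pandas as pd

genes = pd.DataFrame({
    "gene":        ["NFKBIA", "IL36", "IRAK1", "GAPDH"],
    "meta_p":      [0.004, 0.012, 0.030, 0.610],
    "meta_log2fc": [-0.95, -0.60, -0.41, 0.05],
    "n_perturbed": [4, 3, 4, 1],       # datasets in which the gene was perturbed
    "n_datasets":  [4, 4, 4, 4],
})

fc_cut = np.log2(1.25)
is_deg = ((genes["meta_p"] <= 0.05)
          & (genes["meta_log2fc"].abs() >= fc_cut)
          & (genes["n_perturbed"] / genes["n_datasets"] >= 0.75))

print(genes.assign(DEG=is_deg))
```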

Keywords: anti-TNFα, autoimmune, meta-analysis, microarrays

Procedia PDF Downloads 180
41 The Use of Artificial Intelligence in the Context of a Space Traffic Management System: Legal Aspects

Authors: George Kyriakopoulos, Photini Pazartzis, Anthi Koskina, Crystalie Bourcha

Abstract:

The need for securing safe access to and return from outer space, as well as ensuring the viability of outer space operations, maintains vivid the debate over the promotion of organization of space traffic through a Space Traffic Management System (STM). The proliferation of outer space activities in recent years as well as the dynamic emergence of the private sector has gradually resulted in a diverse universe of actors operating in outer space. The said developments created an increased adverse impact on outer space sustainability as the case of the growing number of space debris clearly demonstrates. The above landscape sustains considerable threats to outer space environment and its operators that need to be addressed by a combination of scientific-technological measures and regulatory interventions. In this context, recourse to recent technological advancements and, in particular, to Artificial Intelligence (AI) and machine learning systems, could achieve exponential results in promoting space traffic management with respect to collision avoidance as well as launch and re-entry procedures/phases. New technologies can support the prospects of a successful space traffic management system at an international scale by enabling, inter alia, timely, accurate and analytical processing of large data sets and rapid decision-making, more precise space debris identification and tracking and overall minimization of collision risks and reduction of operational costs. What is more, a significant part of space activities (i.e. launch and/or re-entry phase) takes place in airspace rather than in outer space, hence the overall discussion also involves the highly developed, both technically and legally, international (and national) Air Traffic Management System (ATM). Nonetheless, from a regulatory perspective, the use of AI for the purposes of space traffic management puts forward implications that merit particular attention. Key issues in this regard include the delimitation of AI-based activities as space activities, the designation of the applicable legal regime (international space or air law, national law), the assessment of the nature and extent of international legal obligations regarding space traffic coordination, as well as the appropriate liability regime applicable to AI-based technologies when operating for space traffic coordination, taking into particular consideration the dense regulatory developments at EU level. In addition, the prospects of institutionalizing international cooperation and promoting an international governance system, together with the challenges of establishment of a comprehensive international STM regime are revisited in the light of intervention of AI technologies. This paper aims at examining regulatory implications advanced by the use of AI technology in the context of space traffic management operations and its key correlating concepts (SSA, space debris mitigation) drawing in particular on international and regional considerations in the field of STM (e.g. UNCOPUOS, International Academy of Astronautics, European Space Agency, among other actors), the promising advancements of the EU approach to AI regulation and, last but not least, national approaches regarding the use of AI in the context of space traffic management, in toto. 
Acknowledgment: The present work was co-funded by the European Union and Greek national funds through the Operational Program "Human Resources Development, Education and Lifelong Learning" (NSRF 2014-2020), under the call "Supporting Researchers with an Emphasis on Young Researchers – Cycle B" (MIS: 5048145).

Keywords: artificial intelligence, space traffic management, space situational awareness, space debris

Procedia PDF Downloads 258
40 Recent Findings of Late Bronze Age Mining and Archaeometallurgy Activities in the Mountain Region of Colchis (Southern Lechkhumi, Georgia)

Authors: Rusudan Chagelishvili, Nino Sulava, Tamar Beridze, Nana Rezesidze, Nikoloz Tatuashvili

Abstract:

The South Caucasus is one of the most important centers of prehistoric metallurgy, known for its Colchian bronze culture. Modern Lechkhumi – historical Mountainous Colchis, where the existence of prehistoric metallurgy is confirmed by the discovery of many artifacts – is part of this area. Studies focused on prehistoric smelting sites, related artefacts, and ore deposits have been conducted in Lechkhumi during the last ten years. More than 20 prehistoric smelting sites and artefacts associated with metallurgical activities (ore roasting furnaces, slags, crucible, and tuyère fragments) have been identified so far. Within the framework of integrated studies, it was established that these sites were operating in the 13th-9th centuries B.C. and were used for copper smelting. Palynological studies of slags revealed that chestnut (Castanea sativa) and hornbeam (Carpinus sp.) wood were used as smelting fuel. Geological exploration-analytical studies revealed that copper ore mining, processing, and smelting sites were distributed close to each other. Despite recent complex data, no signs of prehistoric mines (trenches) have been found in this part of the study area so far. Since 2018, the archaeological-geological exploration has focused on the southern part of Lechkhumi and covered the areas of the villages Okureshi and Opitara. Several copper smelting sites (Okureshi 1 and 2, Opitara 1), as well as a Colchian Bronze culture settlement, have been identified here. Three mine workings have been found in the narrow gorge of the river Rtkhmelebisgele in the vicinity of the village Opitara. In order to establish a link between the Opitara-Okureshi archaeometallurgical sites, the Late Bronze Age settlements, and the mines, various analytical methods – petrography of mineralized rocks and slags and atomic absorption spectrophotometry (AAS) – have been applied. The careful examination of the Opitara mine workings revealed that there is a striking difference between mine #1 on the right bank of the river and mines #2 and #3 on the left bank. The first one has all the characteristic features of a Soviet-period mine working (e.g., a high portal with angular ribs and a roof showing signs of blasting). In contrast, mines #2 and #3, which are located very close to each other, have round-shaped portals/entrances, low roofs, and fairly smooth ribs, and are filled with thick layers of river sediments and collapsed weathered rock mass. A thorough review of the publications related to prehistoric mine workings revealed some striking similarities between mines #2 and #3 and their worldwide analogues. Apparently, the ore extraction from these mines was conducted by fire-setting, applying primitive tools. It was also established that the mines are cut in Jurassic mineralized volcanic rocks. Ore minerals (chalcopyrite, pyrite, galena) are related to calcite and quartz veins. The results obtained through the petrochemical and petrographic studies of mineralized rock samples from the Opitara mines and of the prehistoric slags correlate closely with each other, establishing a direct link between copper mining and smelting within the study area. Acknowledgment: This work was supported by the Shota Rustaveli National Science Foundation of Georgia (grant # FR-19-13022).

Keywords: archaeometallurgy, Mountainous Colchis, mining, ore minerals

Procedia PDF Downloads 179
39 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators

Authors: K. O'Malley

Abstract:

Background: The emergence and rapid evolution of large language models (LLMs), i.e., models of generative artificial intelligence (AI), have been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling of vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A structured search strategy was used to interrogate the PubMed electronic database. The keywords ‘ChatGPT’ AND ‘medical education’ OR ‘medical school’ OR ‘medical licensing exam’ were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years and was limited to publications in the English language. Eligibility was ascertained based on the study title and abstract and confirmed by consulting the full-text document. Data were extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent. The mean score across all studies was 82.49 percent (SD = 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents several challenges with which educators have not had to contend before. The overall strong performance of ChatGPT, as outlined above, may lend itself to unfair use (such as the plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the aforementioned studies, ChatGPT exhibited a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education, who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to effectively address them and mitigate potential disruption in academic fora.
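
For illustration only, the database query described in the Methods could be scripted rather than run manually; the minimal sketch below uses Biopython's Entrez utilities against PubMed. The contact e-mail, date filter syntax and result limit are placeholder assumptions, not details reported by the study, which queried PubMed directly.

```python
from Bio import Entrez

# Hypothetical sketch of scripting the PubMed search strategy described above.
Entrez.email = "you@example.org"   # NCBI asks for a contact address

query = (
    '("ChatGPT") AND ("medical education" OR "medical school" OR "medical licensing exam")'
    ' AND ("2019"[Date - Publication] : "3000"[Date - Publication]) AND english[Language]'
)
handle = Entrez.esearch(db="pubmed", term=query, retmax=400)
record = Entrez.read(handle)
handle.close()

print("hits:", record["Count"])
print("first PMIDs:", record["IdList"][:10])
```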

Keywords: artificial intelligence, ChatGPT, generative AI, large language models, licensing exam, medical education, medicine, university

Procedia PDF Downloads 32
38 Integrated Mathematical Modeling and Advance Visualization of Magnetic Nanoparticle for Drug Delivery, Drug Release and Effects to Cancer Cell Treatment

Authors: Norma Binti Alias, Che Rahim Che The, Norfarizan Mohd Said, Sakinah Abdul Hanan, Akhtar Ali

Abstract:

This paper discusses the transport of magnetically targeted drugs through blood within vessels, tissues, and cells. Three integrated mathematical models are discussed and analyzed for the concentration of the drug and the blood flow carrying magnetic nanoparticles. Cell-level therapy has brought advances in the field of nanotechnology to the fight against tumors, and the systematic therapeutic effect on single cells can reduce the growth of cancer tissue. This nanoscale system can be measured and modeled by identifying key parameters and applying fundamental principles of mathematical modeling and simulation. The mathematical modeling of single-cell growth depends on three types of cell densities: proliferative, quiescent, and necrotic cells. The aim of this paper is to enhance the simulation of these three types of models. The first model represents the transport of drugs by coupled parabolic partial differential equations (PDEs) in three dimensions in a cylindrical coordinate system. This model is coupled with non-Newtonian flow equations, with blood as the transporting medium, and with the magnetic force acting on the magnetic nanoparticles. The interaction between the applied magnetic field and the magnetically responsive drug particles produces forces that locally slow the blood flow and guide the drug towards the cancer cells. Nanoscale carriers allow the drug to leave the blood vessels, spread through the tissue, and reach the cancer cells. The second model describes the transport of drug nanoparticles from the vascular system to a single cell. Modeling the vascular system requires identifying parameters such as magnetic nanoparticle targeted delivery, blood flow, momentum transport, density and viscosity of the drug and blood medium, intensity of the magnetic field, and the radius of the capillary. Based on two discretization techniques, the finite difference method (FDM) and the finite element method (FEM), the set of integrated models is transformed over a grid of points into a large system of algebraic equations. The third model is a single-cell density model involving three sets of first-order PDEs describing how the proliferating, quiescent, and necrotic cell densities change over time and space in Cartesian coordinates under different rates of nutrient consumption. In this model, proliferative and quiescent cell growth depends on parameter changes, and the necrotic cells emerge as the tumor core. Several numerical schemes for solving the system of equations are compared and analyzed. Simulation and computation of the discretized model are supported by the Matlab and C programming languages on a single processing unit. Numerical results and analysis of the algorithms are presented in tables, multiple graphs, and multidimensional visualizations. In conclusion, the integration of the three types of mathematical models and the comparison of their numerical performance indicate the most suitable tools and analyses for solving the complete magnetic drug delivery system and assessing its effects on the growth of the targeted cancer cells.
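
As an illustration of the discretization step, the sketch below applies an explicit finite-difference scheme to a much-simplified one-dimensional advection-diffusion equation for a drug concentration. It is not the paper's coupled 3D cylindrical model, and all parameter values are illustrative placeholders.

```python
import numpy as np

# Minimal 1D advection-diffusion sketch for a drug concentration c(x, t):
#   dc/dt = D * d2c/dx2 - v * dc/dx
# solved with an explicit finite-difference scheme (FTCS diffusion, upwind advection).
D, v = 1e-3, 0.05          # diffusion coefficient and advection (blood) velocity, placeholders
L, nx = 1.0, 101           # domain length and number of grid points
dx = L / (nx - 1)
dt = 0.4 * min(dx**2 / (2 * D), dx / v)   # keep the explicit scheme stable

c = np.zeros(nx)
c[0] = 1.0                 # constant drug concentration at the inlet

for _ in range(2000):
    diff = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    adv = -v * (c - np.roll(c, 1)) / dx   # first-order upwind (flow in +x direction)
    c[1:-1] += dt * (diff[1:-1] + adv[1:-1])
    c[0], c[-1] = 1.0, c[-2]              # Dirichlet inlet, simple outflow boundary

print("drug concentration profile:", np.round(c[::20], 3))
```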

Keywords: mathematical modeling, visualization, PDE models, magnetic nanoparticle drug delivery model, drug release model, single cell effects, avascular tumor growth, numerical analysis

Procedia PDF Downloads 428
37 Comparative Proteomic Profiling of Planktonic and Biofilms from Staphylococcus aureus Using Tandem Mass Tag-Based Mass Spectrometry

Authors: Arifur Rahman, Ardeshir Amirkhani, Honghua Hu, Mark Molloy, Karen Vickery

Abstract:

Introduction and Objectives: Staphylococcus aureus and coagulase-negative staphylococci account for approximately 65% of infections associated with medical devices and are well known for their biofilm-forming ability. Biofilm-related infections are extremely difficult to eradicate owing to their high tolerance to antibiotics and host immune defences. Currently, there is no efficient method for early biofilm detection. A better understanding, enabling detection of biofilm-specific proteins in vitro and in vivo, can be achieved by studying planktonic cultures and different growth phases of biofilms using a proteome analysis approach. Our goal was to construct a reference map of planktonic and biofilm-associated proteins of S. aureus. Methods: The S. aureus reference strain (ATCC 25923) was used to grow 24-hour planktonic cultures, a 3-day wet biofilm (3DWB), and a 12-day wet biofilm (12DWB). Bacteria were grown in tryptic soy broth (TSB) liquid medium. Planktonic cultures were harvested in the late logarithmic phase, and the Centers for Disease Control (CDC) biofilm reactor was used to grow the 3-day and 12-day hydrated biofilms, respectively. Samples were subjected to reduction, alkylation, and digestion steps prior to multiplex labelling using the Tandem Mass Tag (TMT) 10-plex reagent (Thermo Fisher Scientific). The labelled samples were pooled and fractionated by high-pH RP-HPLC, followed by loading of the fractions on a nanoflow UPLC system (Eksigent UPLC system, AB SCIEX). Mass spectrometry (MS) data were collected on an Orbitrap Elite (Thermo Fisher Scientific) mass spectrometer. Protein identification and relative quantitation of protein levels were performed using Proteome Discoverer (version 1.3, Thermo Fisher Scientific). After the extraction of protein ratios with Proteome Discoverer, additional processing and statistical analysis were done using the TMTPrePro R package. Results and Discussion: The present study showed that a considerable proteomic difference exists between planktonic cultures and biofilms of S. aureus. We identified 1636 extracellular secreted proteins in total, of which 350 and 137 proteins of 3DWB and 12DWB, respectively, showed significant abundance variation from the planktonic preparation. Of these, proteins such as the extracellular matrix-binding protein Ebh, enolase, transketolase, triosephosphate isomerase, chaperonin, peptidase, pyruvate kinase, hydrolase, aminotransferase, ribosomal proteins, acetyl-CoA acetyltransferase, DNA gyrase subunit A, and glycine glycyltransferase were simultaneously up-regulated in both 3DWB and 12DWB in this biofilm producer. On the contrary, proteins such as alpha- and delta-hemolysin, lipoteichoic acid synthase, enterotoxin I, serine protease, lipase, clumping factor B, regulatory protein Spx, and phosphoglucomutase were simultaneously down-regulated in both 3DWB and 12DWB. In addition, we identified a large proportion of hypothetical proteins, including unique proteins. A comprehensive knowledge of the planktonic and biofilm-associated proteins of S. aureus will therefore provide a basis for future studies on the development of vaccines and diagnostic biomarkers. Conclusions: In this study, we constructed an initial reference map of planktonic and various growth phases of biofilm-associated proteins, which might be helpful to diagnose biofilm-associated infections.
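
The relative quantitation described above was performed in Proteome Discoverer and the TMTPrePro R package; purely for illustration, a minimal Python sketch of the same kind of ratio-based analysis (log fold change plus a significance test) is given below. The reporter-ion values, replicate counts and column names are entirely hypothetical.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Illustrative sketch of relative quantitation from TMT reporter-ion ratios.
# All numbers are hypothetical placeholders, not values from the study.
df = pd.DataFrame({
    "protein": ["ebh", "enolase", "hld"],
    "plk_1":   [1.00, 1.00, 1.00],   # planktonic reference channels
    "plk_2":   [0.95, 1.05, 1.02],
    "b3d_1":   [2.10, 1.80, 0.40],   # 3-day wet biofilm channels
    "b3d_2":   [2.30, 1.70, 0.45],
})

plk = df[["plk_1", "plk_2"]].to_numpy()
b3d = df[["b3d_1", "b3d_2"]].to_numpy()

# Log2 fold change of biofilm vs. planktonic mean intensity per protein
df["log2_fc"] = np.log2(b3d.mean(axis=1) / plk.mean(axis=1))
# Welch's t-test on log-transformed channel intensities per protein
df["p_value"] = [
    stats.ttest_ind(np.log2(b), np.log2(p), equal_var=False).pvalue
    for b, p in zip(b3d, plk)
]
df["regulated"] = (df["log2_fc"].abs() > 1) & (df["p_value"] < 0.05)
print(df[["protein", "log2_fc", "p_value", "regulated"]])
```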

Keywords: bacterial biofilms, CDC bioreactor, S. aureus, mass spectrometry, TMT

Procedia PDF Downloads 171
36 Digital Adoption of Sales Support Tools for Farmers: A Technology Organization Environment Framework Analysis

Authors: Sylvie Michel, François Cocula

Abstract:

Digital agriculture is an approach that exploits information and communication technologies. These encompass data acquisition tools such as mobile applications, satellites, sensors, connected devices, and smartphones. Additionally, it involves transfer and storage technologies such as 3G/4G coverage, low-bandwidth terrestrial or satellite networks, and cloud-based systems. Furthermore, embedded or remote processing technologies, including drones and robots for process automation, along with high-speed communication networks accessible through supercomputers, are integral components of this approach. While farm-level adoption studies regarding digital agricultural technologies have emerged in recent years, they remain relatively limited in comparison to other agricultural practices. To bridge this gap, this study investigates farmers' intention to adopt digital tools, employing the technology-organization-environment (TOE) framework. A qualitative research design was adopted, comprising fifteen semi-structured interviews conducted with key stakeholders both prior to and following the 2020-2021 COVID-19 lockdowns in France. The interview transcripts then underwent thorough thematic content analysis, and the data and verbatim quotations were triangulated for validation. A coding process aimed to systematically organize the data, ensuring an orderly and structured classification. Our research extends its contribution by delineating sub-dimensions within each primary dimension. A total of nine sub-dimensions were identified, categorized as follows: perceived usefulness for communication, perceived usefulness for productivity, and perceived ease of use constitute the first dimension; technological resources, financial resources, and human capabilities constitute the second dimension; while market pressure, institutional pressure, and the COVID-19 situation constitute the third dimension. Furthermore, this analysis enriches the TOE framework by incorporating entrepreneurial orientation as a moderating variable. Managerial orientation emerges as a pivotal factor influencing adoption intention, with producers acknowledging the significance of utilizing digital sales support tools to combat "greenwashing" and elevate their overall brand image. Specifically, the analysis illustrates that producers recognize the potential of digital tools for saving time and streamlining sales processes, leading to heightened productivity. Moreover, it highlights that the intent to adopt digital sales support tools is influenced by a market mimicry effect. Additionally, it demonstrates a negative association between the intent to adopt these tools and the pressure exerted by institutional partners. Finally, this research establishes a positive link between the intent to adopt digital sales support tools and economic fluctuations, notably during the COVID-19 pandemic. The adoption of sales support tools in agriculture is thus a multifaceted challenge encompassing three dimensions and nine sub-dimensions. By examining the adoption of digital farming technologies at the farm level through the TOE framework, this analysis provides insights beneficial for policymakers, stakeholders, and farmers. These insights are instrumental in making informed decisions to facilitate a successful digital transition in agriculture, effectively addressing sector-specific challenges.

Keywords: adoption, digital agriculture, e-commerce, TOE framework

Procedia PDF Downloads 60
35 Optimizing Machine Learning Algorithms for Defect Characterization and Elimination in Liquids Manufacturing

Authors: Tolulope Aremu

Abstract:

The key process steps in producing liquid detergent products, such as formulation, mixing, filling, and packaging, can introduce defects that might compromise product quality, consumer safety, and operational efficiency. Real-time identification and characterization of such defects are of prime importance for maintaining high standards and reducing waste and costs. Usually, defect detection is performed by human inspection or rule-based systems, which are time-consuming, inconsistent, and error-prone. The present study overcomes these limitations by optimizing defect characterization in liquid detergent manufacturing using machine learning algorithms. Several machine learning models were evaluated, namely support vector machines (SVM), decision trees, random forests, and convolutional neural networks (CNN), for detecting and classifying defects such as incorrect viscosity, color deviations, improper bottle filling, and packaging anomalies. These algorithms benefited significantly from a variety of optimization techniques, including hyperparameter tuning and ensemble learning, which greatly improved detection accuracy while minimizing false positives. The study draws on a rich dataset of defect types and production parameters consisting of more than 100,000 samples, including real-time sensor data, imaging data, and historical production records. The results show that optimized machine learning models significantly improve defect detection compared to traditional methods. The CNNs, for instance, achieved 98% and 96% accuracy in detecting packaging anomalies and bottle-filling inconsistencies, respectively, after fine-tuning with real-time imaging data, with a reduction in false positives of about 30%. The optimized SVM model achieved 94% accuracy in detecting formulation defects such as viscosity and color variation. These performance metrics correspond to a large improvement in defect detection accuracy over the roughly 80% level achieved so far by rule-based systems. Moreover, with real-time data processing, the optimized models speed up defect characterization, reducing detection time from an average of 3 minutes with manual inspection to below 15 seconds. This reduction in time is combined with a 25% reduction in production downtime owing to proactive defect identification, which can save millions annually in recall and rework costs. Integrating real-time machine-learning-driven monitoring also drives predictive maintenance and corrective measures, yielding a 20% improvement in overall production efficiency. Therefore, optimizing machine learning algorithms for defect characterization gives liquid detergent companies scalability, efficiency, and improved operational performance, together with higher levels of product quality. In general, this method could be applied across the fast-moving consumer goods industry, leading to improved quality control processes.
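
As a minimal sketch of the hyperparameter-tuning step mentioned above, the example below fits a grid-searched SVM classifier in scikit-learn. The feature matrix and labels are synthetic placeholders standing in for the study's sensor and imaging features, and the parameter grid is an assumption rather than the study's actual search space.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for process features (viscosity, colour, fill level, ...)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))                  # 6 hypothetical process features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 0.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Scale features, then tune C and gamma with cross-validated grid search
pipe = make_pipeline(StandardScaler(), SVC())
grid = GridSearchCV(
    pipe,
    param_grid={"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.1, 0.01]},
    cv=5,
    n_jobs=-1,
)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_)
print("held-out accuracy:", round(grid.score(X_te, y_te), 3))
```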

Keywords: liquid detergent manufacturing, defect detection, machine learning, support vector machines, convolutional neural networks, defect characterization, predictive maintenance, quality control, fast-moving consumer goods

Procedia PDF Downloads 18
34 The Impact of β Nucleating Agents and Carbon-Based Nanomaterials on Water Vapor Permeability of Polypropylene Composite Films

Authors: Glykeria A. Visvini, George N. Mathioudakis, Amaia Soto Beobide, George A. Voyiatzis

Abstract:

Polymer nanocomposites are materials in which a polymer matrix is reinforced with nanoscale inclusions, such as nanoparticles, nanoplates, or nanofibers. These nanoscale inclusions can significantly enhance the mechanical, thermal, electrical, and other properties of the polymer matrix, making them attractive for a wide range of industrial applications. These properties can be tailored by adjusting the type and the concentration of the nanoinclusions, which provides a high degree of flexibility in their design and development. An important property that polymeric membranes can exhibit is water vapor permeability (WVP). This can be achieved by various methods, including the incorporation of micro/nano-fillers into the polymer matrix. In this way, a micro/nano-pore network can be formed, allowing water vapor to permeate through the membrane. At the same time, the membrane can be stretched uni- or bi-axially, creating aligned or cross-linked micropores in the composite, respectively, which can also increase the WVP. In industry today, stretched films reinforced with CaCO3 develop micro-porosity sufficient to give them breathability characteristics. Carbon-based nanomaterials, such as graphene oxide (GO), are expected to be able to improve the WVP of corresponding composite polymer films effectively. The various oxygen-containing functional groups in the GO structure enhance its ability to attract and channel water molecules, exploiting the uniquely large surface area of graphene, which allows the rapid transport of water molecules. Polypropylene (PP) is widely used in various industrial applications due to its desirable properties, including good chemical resistance, excellent thermal stability, low cost, and easy processability. The specific properties of PP are highly influenced by its crystalline behavior, which is determined by its processing conditions. The development of the β-crystalline phase in PP, in combination with stretching, is anticipated to improve the microporosity of the polymer matrix, thereby enhancing its WVP. The aim of the present study is to create breathable PP composite membranes using carbon-based nanomaterials, such as graphene oxide (GO), reduced graphene oxide (rGO), and graphene nanoplatelets (GNPs). Unlike traditional methods that rely on the drawing process to enhance the WVP of PP, this study intends to develop a low-cost approach using melt mixing with β-nucleating agents and carbon fillers to create highly breathable PP composite membranes. The study aims to investigate how the concentration of these additives affects the water vapor transport properties of the resulting PP films/membranes. The presence of β-nucleating agents and carbon fillers is expected to enhance β-phase growth in PP, while an alternation between the β- and α-phases is expected to lead to improved microporosity and WVP. Our ambition is to develop highly breathable PP composite films with superior performance at a lower cost compared to the benchmark. Acknowledgment: This research has been co-financed by the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call «Special Actions "AQUACULTURE"-"INDUSTRIAL MATERIALS"-"OPEN INNOVATION IN CULTURE"» (project code: Τ6YBP-00337).
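
For reference, the WVP of such films is commonly obtained from a gravimetric (cup-type) measurement of the water vapor transmission rate, normalized by film thickness and the vapor pressure gradient. A standard form of this relation is sketched below; the symbols follow common usage in ASTM E96-type tests and are assumptions rather than the authors' stated protocol.

```latex
% Common definition of water vapor permeability from a cup-type test
% (assumed for illustration, not the authors' stated protocol):
%   \Delta m / \Delta t : steady-state mass change of the cup
%   A                   : exposed film area
%   \ell                : film thickness
%   p_{sat}(T)          : saturation vapor pressure at the test temperature
%   RH_1, RH_2          : relative humidities on the two sides of the film
\[
  \mathrm{WVTR} = \frac{\Delta m}{A\,\Delta t}, \qquad
  \mathrm{WVP}  = \frac{\mathrm{WVTR}\cdot \ell}{\Delta p}
               = \frac{\Delta m}{A\,\Delta t}\cdot
                 \frac{\ell}{p_{\mathrm{sat}}(T)\,\bigl(RH_1 - RH_2\bigr)}
\]
```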

Keywords: carbon based nanomaterials, nanocomposites, nucleating agent, polypropylene, water vapor permeability

Procedia PDF Downloads 86
33 Black-Box-Optimization Approach for High Precision Multi-Axes Forward-Feed Design

Authors: Sebastian Kehne, Alexander Epple, Werner Herfs

Abstract:

A new method for the optimal selection of components for multi-axes forward-feed drive systems is proposed, in which the choice of motors, gear boxes, and ball screw drives is optimized. Essential here is the synchronization of the electrical and mechanical frequency behavior of all axes, because even advanced controls (like H∞ controls) can only control a small part of the mechanical modes – namely only those of observable and controllable states whose values can be derived from the positions of external linear length measurement systems and/or rotary encoders on the motor or gear box shafts. A further problem is the unknown processing forces, such as cutting forces in machine tools during normal operation, which make estimation and control via an observer even more difficult. To start with, the open-source Modelica Feed Drive Library, which was developed at the Laboratory for Machine Tools and Production Engineering (WZL), is extended from a one-axis design to a multi-axes design. It is capable of simulating the mechanical, electrical, and thermal behavior of permanent magnet synchronous machines with inverters, different gear boxes, and ball screw drives in a mechanical system. To keep the calculation time down, analytical equations are used for the field- and torque-producing equivalent circuit, heat dissipation, and mechanical torque at the shaft. As a first step, a small machine tool with a working area of 635 x 315 x 420 mm is taken apart, and the mechanical transfer behavior is measured with an impulse hammer and acceleration sensors. From the frequency transfer functions, a mechanical finite element model is built up, which is reduced by substructure coupling to a mass-damper system that models the most important modes of the axes. The model is implemented with the Modelica Feed Drive Library and validated by further relative measurements between machine table and spindle holder with a piezo actuator and acceleration sensors. In a next step, the choice of possible components in motor catalogues is limited by derived analytical formulas, which are based on well-known metrics for the effective power and torque of the components. The simulation in Modelica is run with different permanent magnet synchronous motors, gear boxes, and ball screw drives from different suppliers. To speed up the optimization, different black-box optimization methods (surrogate-based, gradient-based, and evolutionary) are tested on the case. The objective chosen is to minimize the integral of the deviations when a step is given on the position controls of the different axes; small values are a good measure of highly dynamic axes. In each iteration (the evaluation of one set of components), the control variables are adjusted automatically to keep the overshoot below 1%. It is found that the order of the components in the optimization problem has a strong impact on the speed of the black-box optimization. An approach for efficient black-box optimization for multi-axes design is presented in the last part. The authors would like to thank the German Research Foundation DFG for financial support of the project “Optimierung des mechatronischen Entwurfs von mehrachsigen Antriebssystemen (HE 5386/14-1 | 6954/4-1)” (English: Optimization of the Mechatronic Design of Multi-Axes Drive Systems).
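
The structure of the optimization problem can be illustrated with a deliberately small sketch: a hypothetical catalogue of motor/gear/screw options, a crude second-order surrogate of one axis under position control, and the step-response objective described above (integral of the absolute deviation, with the combination chosen to minimize it). A real run would replace the exhaustive enumeration used here with a surrogate-based or evolutionary black-box optimizer and the Modelica model; all numbers are placeholders, not values from the study.

```python
import itertools
import numpy as np

# Hypothetical component catalogue (placeholder ratings, not catalogue data)
catalogue = {
    "motor": {"M1": 8.0, "M2": 15.0},      # torque-constant-like rating
    "gear":  {"G1": 5.0, "G2": 10.0},      # gear ratio
    "screw": {"S1": 0.005, "S2": 0.010},   # screw pitch [m/rev]
}

def step_response_cost(kt, ratio, pitch, t_end=1.0, dt=1e-3):
    # Crude mass-damper surrogate for one axis under P-position control;
    # the cost is the integral of the absolute deviation after a unit step.
    m = 50.0
    k_p = 400.0 * kt * ratio / (pitch * 1000)
    d = 2 * 0.3 * np.sqrt(k_p * m)          # fixed relative damping
    x, v, cost = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = (k_p * (1.0 - x) - d * v) / m
        v += a * dt
        x += v * dt
        cost += abs(1.0 - x) * dt
    return cost

# Exhaustive enumeration over the tiny catalogue (stand-in for black-box search)
best = min(
    itertools.product(*[d.items() for d in catalogue.values()]),
    key=lambda combo: step_response_cost(*(val for _, val in combo)),
)
print("best combination:", [name for name, _ in best])
```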

Keywords: ball screw drive design, discrete optimization, forward feed drives, gear box design, linear drives, machine tools, motor design, multi-axes design

Procedia PDF Downloads 286
32 Bringing Together Student Collaboration and Research Opportunities to Promote Scientific Understanding and Outreach Through a Seismological Community

Authors: Michael Ray Brunt

Abstract:

China has been the site of some of the most significant earthquakes in history; however, earthquake monitoring has long been the province of universities and research institutions. The China Digital Seismographic Network (CDSN) was initiated in 1983 and improved significantly during 1992-1993. Data from the CDSN are widely used by government and research institutions but are generally not readily accessible to middle and high school students. An educational seismic network in China is needed to provide collaboration and research opportunities for students and to engage students around the country in the scientific understanding of earthquake hazards and risks while promoting community awareness. In 2022, the Tsinghua International School (THIS) Seismology Team, made up of enthusiastic students and facilitated by two experienced teachers, was established. As a group, the team’s objective is to install seismographs in schools throughout China, thus creating an educational seismic network, the THIS Educational Seismic Network (THIS-ESN), that shares data and facilitates collaboration. The THIS-ESN initiative will enhance education and outreach in China about earthquake risks and hazards, introduce seismology to a wider audience, stimulate interest in research among students, and develop students’ programming, data collection, and analysis skills. It will also encourage and inspire young minds to pursue science, technology, engineering, the arts, and math (STEAM) career fields. The THIS-ESN utilizes small, low-cost RaspberryShake seismographs as a powerful tool linked into a global network, giving schools and the public access to real-time seismic data from across China, increasing earthquake monitoring capabilities in the respective areas, and adding to the available data sets regionally and worldwide, helping create a denser seismic network. The RaspberryShake seismograph is compatible with free seismic data viewing platforms such as SWARM, while RaspberryShake web programs and mobile apps are designed specifically for teaching seismology and seismic data interpretation, providing opportunities to enhance understanding. The RaspberryShake is powered by an operating system embedded in the Raspberry Pi, which makes it an easy platform for teaching students basic computer communication concepts by utilizing processing tools to investigate, plot, and manipulate data. The THIS Seismology Team believes strongly in creating opportunities for committed students to become part of the seismological community by engaging in the analysis of real-time scientific data with tangible outcomes. Students will feel proud of the important work they are doing to understand the world around them and become advocates spreading their knowledge back into their homes and communities, helping to improve overall community resilience. We trust that, in studying the results the seismograph stations yield, students will grasp how subjects like physics and computer science apply in real life. By spreading this information, we hope students across the country can appreciate how and why earthquakes bear on their lives, develop practical skills in STEAM, and engage in the global seismic monitoring effort. By providing such an opportunity to schools across the country, we are confident that we will be an agent of change for society.
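
As an example of the kind of data handling students can practice, the sketch below pulls a short window of waveform data from a Raspberry Shake station over an FDSN web service using ObsPy and produces a quick-look plot. The station code is a placeholder, and the server URL and network code ("AM") reflect the publicly documented Raspberry Shake data service, so they should be checked against the actual THIS-ESN setup.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# Placeholder server URL and station code -- substitute your own THIS-ESN station.
client = Client(base_url="https://data.raspberryshake.org")

t_end = UTCDateTime()                 # now
t_start = t_end - 30 * 60             # last 30 minutes
stream = client.get_waveforms(
    network="AM", station="R0000", location="00", channel="EHZ",
    starttime=t_start, endtime=t_end,
)

stream.detrend("demean")                               # remove the DC offset
stream.filter("bandpass", freqmin=0.7, freqmax=10.0)   # emphasise local events
stream.plot()                                          # quick-look seismogram
```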

Keywords: collaboration, outreach, education, seismology, earthquakes, public awareness, research opportunities

Procedia PDF Downloads 71
31 Bridging Minds and Nature: Revolutionizing Elementary Environmental Education Through Artificial Intelligence

Authors: Hoora Beheshti Haradasht, Abooali Golzary

Abstract:

Environmental education plays a pivotal role in shaping the future stewards of our planet. Leveraging the power of artificial intelligence (AI) in this endeavor presents an innovative approach to captivate and educate elementary school children about environmental sustainability. This paper explores the application of AI technologies in designing interactive and personalized learning experiences that foster curiosity, critical thinking, and a deep connection to nature. By harnessing AI-driven tools, virtual simulations, and personalized content delivery, educators can create engaging platforms that empower children to comprehend complex environmental concepts while nurturing a lifelong commitment to protecting the Earth. With the pressing challenges of climate change and biodiversity loss, cultivating an environmentally conscious generation is imperative. Integrating AI in environmental education revolutionizes traditional teaching methods by tailoring content, adapting to individual learning styles, and immersing students in interactive scenarios. This paper delves into the potential of AI technologies to enhance engagement, comprehension, and pro-environmental behaviors among elementary school children. Modern AI technologies, including natural language processing, machine learning, and virtual reality, offer unique tools to craft immersive learning experiences. Adaptive platforms can analyze individual learning patterns and preferences, enabling real-time adjustments in content delivery. Virtual simulations, powered by AI, transport students into dynamic ecosystems, fostering experiential learning that goes beyond textbooks. AI-driven educational platforms provide tailored content, ensuring that environmental lessons resonate with each child's interests and cognitive level. By recognizing patterns in students' interactions, AI algorithms curate customized learning pathways, enhancing comprehension and knowledge retention. Utilizing AI, educators can develop virtual field trips and interactive nature explorations. Children can navigate virtual ecosystems, analyze real-time data, and make informed decisions, cultivating an understanding of the delicate balance between human actions and the environment. While AI offers promising educational opportunities, ethical concerns must be addressed. Safeguarding children's data privacy, ensuring content accuracy, and avoiding biases in AI algorithms are paramount to building a trustworthy learning environment. By merging AI with environmental education, educators can empower children not only with knowledge but also with the tools to become advocates for sustainable practices. As children engage in AI-enhanced learning, they develop a sense of agency and responsibility to address environmental challenges. The application of artificial intelligence in elementary environmental education presents a groundbreaking avenue to cultivate environmentally conscious citizens. By embracing AI-driven tools, educators can create transformative learning experiences that empower children to grasp intricate ecological concepts, forge an intimate connection with nature, and develop a strong commitment to safeguarding our planet for generations to come.

Keywords: artificial intelligence, environmental education, elementary children, personalized learning, sustainability

Procedia PDF Downloads 82
30 Obesity and Lifestyle of Students in the Romanian Southeastern Region

Authors: Mariana Stuparu-Cretu, Doina-Carina Voinescu, Rodica-Mihaela Dinica, Daniela Borda, Camelia Vizireanu, Gabriela Iordachescu, Camelia Busila

Abstract:

Obesity is involved in the etiology or the acceleration of progression of important non-communicable diseases, such as metabolic, cardiovascular, rheumatological, and oncological diseases and depression. There is therefore a need to prevent the occurrence of obesity as a key link in disease management. From this point of view, the best approach is to educate youngsters early about the need for a healthy nutritional lifestyle associated with regular physical activity. The objective of the study was to assess correlations between weight status, physical activity, and food preferences of students from South East Romania. Questionnaires were administered to high school students in Galati: 1006 girls and 880 boys, aged between 14 and 19 years (the study was approved by the Local School Inspectorate and the Ethics Committee of the 'Dunarea de Jos' University of Galati). The collected answers were statistically processed using the multivariate regression method (PLS2) in the Unscrambler X program (Camo, Norway). Multiple variables such as age group, body mass index, nutritional habits, and physical activity were analysed separately, depending on gender, and general mathematical models were proposed to explain the obesity trend at an early age. The study results show that overweight and obesity are present in less than a fifth of the adolescents who were surveyed. With a very small variation and a strong correlation of over 86% for 99% of the cases, a general preference for sweet foods, nocturnal eating associated with computer work, and a reduced period of physical activity is noticed for the girls. In addition, the overweight girls consume sweet juices and alcohol, although a percentage of them also go to the gym. There is also a percentage of normal-weight girls who consume high-calorie foods, which predisposes this group to become overweight in time. Within the studied group, statistics for the boys show a positive correlation of almost 87% for over 96% of cases. They prefer high-calorie foods, fast food, and sweet juices, and perform moderate physical activity. Both overweight and underweight boys are more sedentary. Over 15% of girls and over a quarter of boys consume alcohol. All these poor eating habits seem to increase with age for both sexes. To conclude, obesity and overweight assessed in adolescents in South-East Romania reveal nonsignificant percentage differences between boys and girls. However, young people in this area of the country are generally sedentary; a significant percentage prefer sweets, sweet juices, or fast food and eat while working at the computer. The authors consider that at this age it is very useful to adapt nutritional education to new methods of food processing and market supply. This would require an early understanding of the differences among foods and nutrients and of the benefits of physical activity integrated into a healthy everyday lifestyle, as a measure for preventing and managing non-communicable chronic diseases related to nutritional errors and a sedentary lifestyle. Acknowledgment: This study has been partially funded by the Francophone University Agency, Project Réseau régional dans le domaine de la santé, la nutrition et la sécurité alimentaire (SaIN), no. 21899/06.09.2017.
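
The PLS2 analysis reported above was carried out in the Unscrambler X; purely as an illustration of the same family of models, the short sketch below fits a two-component PLS regression with scikit-learn on synthetic placeholder data. The variable names and effect sizes are invented, not the survey's.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: X = lifestyle/questionnaire variables, Y = outcome variables.
rng = np.random.default_rng(1)
n = 300
X = rng.normal(size=(n, 5))     # e.g. sweets intake, screen time, sport hours, ...
Y = np.column_stack([
    0.6 * X[:, 0] - 0.4 * X[:, 2] + rng.normal(scale=0.5, size=n),  # BMI proxy
    0.3 * X[:, 1] + rng.normal(scale=0.5, size=n),                  # second response
])

# PLS2: one model relating the X block to the multi-variable Y block
pls = PLSRegression(n_components=2)
pls.fit(X, Y)
print("cross-validated R^2 per fold:", cross_val_score(pls, X, Y, cv=5).round(2))
print("X loadings:\n", pls.x_loadings_.round(2))
```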

Keywords: adolescents, body mass index, nutritional habits, obesity, physical activity

Procedia PDF Downloads 258
29 Evaluating Viability of Using South African Forestry Process Biomass Waste Mixtures as an Alternative Pyrolysis Feedstock in the Production of Bio Oil

Authors: Thembelihle Portia Lubisi, Malusi Ntandoyenkosi Mkhize, Jonas Kalebe Johakimu

Abstract:

Fertilizers play an important role in maintaining the productivity and quality of plants. Inorganic fertilizers (containing nitrogen, phosphorus, and potassium) are largely used in South Africa as they are considered inexpensive and highly productive. When applied, a portion of the excess fertilizer is retained in the soil, and a portion enters water streams through surface runoff or the irrigation system adopted. Excess nutrients from the fertilizers entering the water stream eventually result in harmful algal blooms (HABs) in freshwater systems, which not only disrupt wildlife but can also produce toxins harmful to humans. The use of agro-chemicals such as pesticides and herbicides has been associated with increased antimicrobial resistance (AMR) in humans, as the treated plants are consumed by humans. This bacterial resistance poses a threat, as it limits the health sector's ability to treat infectious diseases. Archaeological studies have found that pyrolysis liquids were already used in the time of the Neanderthals as a biocide and plant protection product. Pyrolysis is the thermal degradation of plant biomass or organic material under anaerobic conditions, leading to the production of char, bio-oil, and syngas. Bio-oil constituents can be categorized into a water-soluble fraction (wood vinegar) and water-insoluble fractions (tar and light oils). Wood vinegar (pyroligneous acid) is said to contain highly oxygenated compounds, including acids, alcohols, aldehydes, ketones, phenols, esters, furans, and other multifunctional compounds with various molecular weights and compositions, depending on the source biomass and the pyrolysis operating conditions. Various researchers have found wood vinegar to be efficient in the eradication of termites, effective in plant protection and plant growth promotion, and effective in inhibiting micro-organisms such as Candida yeasts and E. coli, as well as having antibacterial characteristics. This study investigated the characterisation of South African forestry product processing waste with the intention of evaluating the potential of using the respective biomass waste as feedstock for bio-oil production via the pyrolysis process. Using biomass waste materials in the production of wood vinegar has the advantage that it not only reduces environmental pollution and landfill requirements, but also does not negatively affect food security. The biomass wastes investigated were from the popular tree types in KwaZulu-Natal, namely pine sawdust (PSD), pine bark (PB), eucalyptus sawdust (ESD), and eucalyptus bark (EB). Furthermore, the research investigates the possibility of mixing the different wastes, with the aim of lowering the cost of raw material separation prior to feeding into the pyrolysis process; mixing also increases the amount of biomass material available for beneficiation. A 50/50 mixture of PSD and ESD (EPSD) and a mixture containing pine sawdust, eucalyptus sawdust, pine bark, and eucalyptus bark (EPSDB) were also considered. Characterisation of the biomass waste will include proximate analysis (volatiles, ash, fixed carbon), ultimate analysis (carbon, hydrogen, nitrogen, oxygen, sulphur), higher heating value, structural analysis (cellulose, hemicellulose, and lignin), and thermogravimetric analysis.
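
Where an experimental calorific value is not yet available, the higher heating value can be roughly estimated from the ultimate analysis. The sketch below uses the widely cited Channiwala-Parikh correlation, included here as an assumption for illustration rather than as the method the study will necessarily adopt, with a placeholder composition loosely typical of softwood sawdust.

```python
def hhv_channiwala_parikh(c, h, s, o, n, ash):
    """Estimate higher heating value (MJ/kg, dry basis) from ultimate analysis
    values in wt% using the Channiwala-Parikh correlation. This is a common
    empirical estimate, not necessarily the method adopted in the study."""
    return (0.3491 * c + 1.1783 * h + 0.1005 * s
            - 0.1034 * o - 0.0151 * n - 0.0211 * ash)

# Placeholder composition, illustrative only (roughly typical of softwood sawdust)
hhv = hhv_channiwala_parikh(c=49.0, h=6.0, s=0.05, o=43.0, n=0.3, ash=1.0)
print(round(hhv, 1), "MJ/kg")   # about 19.7 MJ/kg for this placeholder input
```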

Keywords: characterisation, biomass waste, saw dust, wood waste

Procedia PDF Downloads 68
28 Familiarity with Intercultural Conflicts and Global Work Performance: Testing a Theory of Recognition Primed Decision-Making

Authors: Thomas Rockstuhl, Kok Yee Ng, Guido Gianasso, Soon Ang

Abstract:

Two meta-analyses show that intercultural experience is not related to intercultural adaptation or performance in international assignments. These findings have prompted calls for a deeper grounding of research on international experience in the phenomenon of global work. Two issues, in particular, may limit current understanding of the relationship between international experience and global work performance. First, intercultural experience is too broad a construct and may not sufficiently capture the essence of global work, which to a large part involves sensemaking and managing intercultural conflicts. Second, the psychological mechanisms through which intercultural experience affects performance remain under-explored, resulting in a poor understanding of how experience is translated into learning and performance outcomes. Drawing on recognition-primed decision-making (RPD) research, the current study advances a cognitive processing model that highlights the importance of intercultural conflict familiarity. Compared to intercultural experience, intercultural conflict familiarity is a more targeted construct that captures individuals’ previous exposure to dealing with intercultural conflicts. Drawing on RPD theory, we argue that individuals’ intercultural conflict familiarity enhances their ability to make accurate judgments and generate effective responses when intercultural conflicts arise. In turn, the ability to make accurate situation judgments and effective situation responses is an important predictor of global work performance. A relocation program within a multinational enterprise provided the context to test these hypotheses using a time-lagged, multi-source field study. Participants were 165 employees (46% female; with an average of 5 years of global work experience) from 42 countries who relocated from country offices to regional offices as part of a global restructuring program. Within the first two weeks of transfer to the regional office, employees completed measures of their familiarity with intercultural conflicts, cultural intelligence, cognitive ability, and demographic information. They also completed an intercultural situational judgment test (iSJT) to assess their situation judgment and situation response. The iSJT comprised four validated multimedia vignettes of challenging intercultural work conflicts and prompted employees to provide protocols of their situation judgment and situation response. Two research assistants, trained in intercultural management but blind to the study hypotheses, coded the quality of employees’ situation judgment and situation response. Three months later, supervisors rated employees’ global work performance. Results using multilevel modeling (vignettes nested within employees) support the hypotheses that greater familiarity with intercultural conflicts is positively associated with better situation judgment, and that situation judgment mediates the effect of intercultural familiarity on situation response quality. Also, aggregated situation judgment and situation response quality both predicted supervisor-rated global work performance. Theoretically, our findings highlight the important but under-explored role of familiarity with intercultural conflicts; a shift in attention from the general nature of international experience assessed in terms of the number and length of overseas assignments.
Second, our cognitive approach premised on RPD theory offers a new theoretical lens for understanding the psychological mechanisms through which intercultural conflict familiarity affects global work performance. Third, and importantly, our study contributes to the global talent identification literature by demonstrating that the cognitive processes engaged in resolving intercultural conflicts predict actual performance in the global workplace.
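
A minimal sketch of the "vignettes nested within employees" analysis is given below, using a random-intercept mixed model in statsmodels on synthetic placeholder data. The variable names, effect sizes, and the simple random-intercept specification are assumptions made for illustration, not the study's exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: 165 employees x 4 vignettes, with an employee-level
# familiarity score and a per-vignette situation-judgment rating.
rng = np.random.default_rng(2)
n_emp, n_vig = 165, 4
df = pd.DataFrame({
    "employee": np.repeat(np.arange(n_emp), n_vig),
    "familiarity": np.repeat(rng.normal(size=n_emp), n_vig),
})
emp_intercept = np.repeat(rng.normal(scale=0.5, size=n_emp), n_vig)
df["judgment"] = (0.4 * df["familiarity"] + emp_intercept
                  + rng.normal(scale=0.6, size=len(df)))

# Random intercept per employee; fixed effect of conflict familiarity
# on situation-judgment quality.
model = smf.mixedlm("judgment ~ familiarity", data=df, groups=df["employee"])
print(model.fit().summary())
```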

Keywords: intercultural conflict familiarity, job performance, judgment and decision making, situational judgment test

Procedia PDF Downloads 179
27 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts

Authors: William Michael Short

Abstract:

'Semantic Web' technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized ‘semantic bioinformatics’ have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues, from the perspective of cognitive linguistics and cognitive anthropology, that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called ‘discourse topics’). Second, natural language processing systems tend to operate according to the principle of ‘one token, one tag’. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun or a verb or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable. But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture ‘expert’ technical models rather than ‘folk’ models of knowledge and so may not match users’ common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen, the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to help capture variations within technical vocabularies – rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis – they fail to capture this dimension of linguistic usage.
The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.
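
The 'one token, one tag' point can be made concrete with a few lines of code: a standard part-of-speech tagger must commit to a single tag for sound in each sentence, even though the intended readings differ. The sketch below uses NLTK purely for illustration; it is not a tool discussed in the paper, and the sentences are invented examples.

```python
import nltk

# Defensive downloads: resource names changed between NLTK versions, so try both.
for resource in ("punkt", "punkt_tab",
                 "averaged_perceptron_tagger", "averaged_perceptron_tagger_eng"):
    try:
        nltk.download(resource, quiet=True)
    except Exception:
        pass

# One deterministic tag per token, even though the intended readings differ.
sentences = [
    "The sound of the machine worried the patient.",     # intended reading: noun
    "The doctors sound the alarm about the infection.",  # intended reading: verb
    "Her lungs are sound after the treatment.",          # intended reading: adjective
]
for sentence in sentences:
    tags = dict(nltk.pos_tag(nltk.word_tokenize(sentence)))
    print(f"{sentence!r:55} -> sound/{tags['sound']}")
```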

Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics

Procedia PDF Downloads 132