Search results for: degree-days methods
12917 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients
Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho
Abstract:
Multiple sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of detail it provides, is the gold-standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, well beyond the limits of normal aging. Brain volume quantification therefore becomes an essential task for the subsequent analysis of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to extract important information manually. This manual analysis is prone to errors and is time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was brain extraction by skull stripping from the original image. In the skull stripper for brain MRI, the algorithm registers a grayscale atlas image to the grayscale patient image, and the associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (the brain-skull border is evolved with dedicated expansion, curvature, and advection terms). In the second step, the brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5). Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper
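A minimal sketch of the voxel-counting step described in this abstract, assuming the skull-stripping stage has already produced a binary brain mask stored as a NIfTI file; the file name and the use of the nibabel/numpy libraries are illustrative assumptions, not details from the original study.

```python
# Sketch: brain volume from a binary segmentation mask (voxel count x voxel volume).
# Assumptions: NIfTI mask with 1 = brain, 0 = background; file name is hypothetical.
import nibabel as nib
import numpy as np

mask_img = nib.load("brain_mask.nii.gz")            # hypothetical mask from skull stripping
mask = np.asarray(mask_img.dataobj) > 0             # binary brain mask

voxel_dims_mm = mask_img.header.get_zooms()[:3]     # voxel size in mm
voxel_volume_mm3 = float(np.prod(voxel_dims_mm))    # volume of one voxel in mm^3

brain_volume_cc = mask.sum() * voxel_volume_mm3 / 1000.0   # 1 cc = 1000 mm^3
print(f"Brain volume: {brain_volume_cc:.1f} cc")
```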
Procedia PDF Downloads 147
12916 The Impact of Non-Surgical and Non-Medical Interventions on the Treatment of Infertile Women with Ovarian Reserve Below One and Early Menopause Symptoms
Authors: Flora Tajiki
Abstract:
This study investigates the effectiveness of non-surgical and non-medical interventions in treating infertile women with severely diminished ovarian reserve (below one), low Anti-Müllerian Hormone (AMH) levels, and symptoms of early menopause. The intervention included yoga, sunlight exposure, vitamin and mineral supplementation, relaxation techniques, and daily prayers performed both before sleep and upon waking. These methods were applied to women who had shown poor response to high-dose fertility treatments, such as IVF and microinjection cycles, leading to low-quality egg production. The focus was on women with severely reduced ovarian reserve and early menopause symptoms, some of whom continued to experience relatively regular menstrual cycles despite the onset of these symptoms. This treatment was aimed at women for whom conventional fertility methods had been ineffective. The study sample consisted of 120 married women, aged 25 to 45, from the provinces of Tehran, Alborz, and western Iran, with 35 participants completing the intervention. Individual factors such as residence, education, employment status, marriage duration, family infertility history, and previous infertility treatments were examined, with income considered as a contextual variable. The results indicate that AMH may not be a definitive marker of ovarian reserve, as lifestyle modifications, such as those implemented in this study, were associated with increased AMH levels, the return of regular menstrual cycles, and successful pregnancies. No short- or long-term complications were reported during the two-year follow-up, highlighting the potential benefits of non-surgical interventions for women with early menopause symptoms and diminished ovarian reserve.Keywords: anti-müllerian hormone, infertility, ovarian reserve, early menopause, fertility, women’s health, lifestyle modification, pregnancy
Procedia PDF Downloads 27
12915 Numerical Investigation of Turbulent Inflow Strategy in Wind Energy Applications
Authors: Arijit Saha, Hassan Kassem, Leo Hoening
Abstract:
Ongoing climate change demands the increasing use of renewable energies. Wind energy plays an important role in this context since it can be applied almost everywhere in the world. To reduce the costs of wind turbines and make them more competitive, simulations are very important, since experiments are often too costly, if possible at all. A wind turbine on a vast open area experiences turbulence generated by the atmosphere, so it was of central interest in this research to generate that turbulence in the computational simulation domain through various inlet turbulence generation methods, such as the precursor cyclic method and Kaimal Spectrum Exponential Coherence (KSEC). To be able to validate computational fluid dynamics simulations of wind turbines against experimental data, it is crucial to set up the simulation conditions as close to reality as possible. The present work therefore aims at investigating the turbulent inflow strategy and boundary conditions of KSEC and providing a comparative analysis alongside the precursor cyclic method for Large Eddy Simulation within the context of wind energy applications. For the generation of the turbulent box through the KSEC method, the constrained data were first collected from an auxiliary channel flow and later processed with the open-source tool PyconTurb, whereas for the precursor cyclic method the data from the auxiliary channel alone were sufficient. The functionality of these methods was studied through various statistical properties, such as variance and turbulence intensity, with respect to different bulk Reynolds numbers, and a conclusion was drawn on the feasibility of the KSEC method. Furthermore, it was found necessary to verify the obtained data against a DNS case setup to assess its applicability to real-field CFD simulations. Keywords: Inlet Turbulence Generation, CFD, precursor cyclic, KSEC, large Eddy simulation, PyconTurb
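The abstract evaluates the generated inflow through statistical properties such as variance and turbulence intensity. Below is a minimal sketch of how those two quantities might be computed from a velocity time series; the synthetic signal and its parameters are placeholders, not data from the study.

```python
# Illustrative computation of variance and turbulence intensity for an inflow velocity signal.
# The synthetic series stands in for data from a turbulence generator (e.g. KSEC or precursor).
import numpy as np

rng = np.random.default_rng(0)
u = 10.0 + rng.normal(0.0, 1.2, size=10_000)    # streamwise velocity samples [m/s] (placeholder)

u_mean = u.mean()                                # mean velocity
u_var = u.var()                                  # velocity variance
turbulence_intensity = np.sqrt(u_var) / u_mean   # TI = sigma_u / U_mean

print(f"mean = {u_mean:.2f} m/s, variance = {u_var:.3f}, TI = {turbulence_intensity:.1%}")
```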
Procedia PDF Downloads 98
12914 White Wine Discrimination Based on Deconvoluted Surface Enhanced Raman Spectroscopy Signals
Authors: Dana Alina Magdas, Nicoleta Simona Vedeanu, Ioana Feher, Rares Stiufiuc
Abstract:
Food and beverage authentication using rapid and inexpensive analytical tools represents an important challenge nowadays. In this regard, the potential of vibrational techniques in food authentication has gained increased attention during the last years. For wine discrimination, Raman spectroscopy appears more feasible than IR (infrared) spectroscopy because of the relatively weak water bending mode in the vibrational fingerprint range. Despite this, the use of the Raman technique in wine discrimination is still at an early stage. Taking this into consideration, the wine discrimination potential of the surface-enhanced Raman scattering (SERS) technique is reported in the present work. The novelty of this study, compared with previously reported applications of vibrational techniques to wine discrimination, is that the wines are differentiated based on the individual signals obtained from deconvoluted spectra. In order to achieve wine classification with respect to variety, geographical origin, and vintage, the peak intensities obtained after spectral deconvolution were compared using supervised chemometric methods such as Linear Discriminant Analysis (LDA). For this purpose, a set of 20 white Romanian wines of four varieties, from different Romanian viticultural regions, was considered. Chemometric methods applied directly to the raw SERS experimental spectra proved their efficiency, but the identification of discrimination markers was found to be very difficult due to the overlapping signals as well as the band shifts. By using the present approach, a better overall view of the compositional differences among the wines could be reached. Keywords: chemometry, SERS, variety, wines discrimination
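A short sketch of the supervised classification step described above, using Linear Discriminant Analysis from scikit-learn. The feature matrix below is random placeholder data; in the study each row would hold the deconvoluted SERS peak intensities of one wine, with labels for variety, origin, or vintage.

```python
# Sketch: LDA classification of wines from deconvoluted peak intensities (placeholder data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 12))          # 20 wines x 12 deconvoluted peak intensities (placeholder)
y = np.repeat(np.arange(4), 5)         # 4 varieties, 5 wines each (placeholder labels)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)    # cross-validated classification accuracy
print(f"mean CV accuracy: {scores.mean():.2f}")
```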
Procedia PDF Downloads 162
12913 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters
Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev
Abstract:
Humanity is confronted more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain the earliest possible signals about events that are occurring or may occur and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods for selecting and processing texts from the global Internet were developed. Information in Romanian is of special interest for us. In order to obtain the mentioned tools, several steps are required, divided into a preparatory stage and a processing stage. During the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters, amounting to more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets was used. We deal with the problem of evacuating inhabitants in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique will be used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Nets (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis, were used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics. Keywords: lexicon of disasters, modelling, Petri nets, text annotation, social disasters
Procedia PDF Downloads 198
12912 Performance Comparison of Droop Control Methods for Parallel Inverters in Microgrid
Authors: Ahmed Ismail, Mustafa Baysal
Abstract:
Although the world's energy supply is still mainly based on fossil fuels, there is a need for alternative generation systems that are more economic and environmentally friendly, due to the continuously increasing demand for electric energy and the limits of existing power resources and networks. Distributed Energy Resources (DERs) such as fuel cells, wind, and solar power have recently become widespread as alternative generation. In order to solve several problems that may be encountered when integrating DERs into the power system, the microgrid concept has been proposed. A microgrid can operate in both grid-connected and island modes to benefit both the utility and customers. For most distributed energy resources connected in parallel in the LV grid, such as micro-turbines, wind plants, fuel cells, and PV cells, electrical power is generated as direct current (DC) and converted to alternating current (AC) by inverters, so the inverters are considered primary components of a microgrid. There are many control techniques for parallel inverters to manage active and reactive load sharing, some of which are based on the droop method. In the literature, studies usually focus on improving the transient performance of the inverters. In this study, the performance of two different controllers based on the droop control method is compared for inverters operated in parallel without any communication feedback. To this end, a microgrid is designed in which the inverters are controlled by a conventional droop controller and a modified droop controller; the modified controller is obtained by adding a PID term to the conventional droop control. The active and reactive power sharing performance and the voltage and frequency responses of these control methods are measured in several operational cases. The study cases have been simulated in MATLAB-Simulink. Keywords: active and reactive power sharing, distributed generation, droop control, microgrid
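For readers unfamiliar with the droop method mentioned above, the sketch below shows the conventional P-f / Q-V droop laws on which both controllers build: each inverter lowers its frequency reference with active power and its voltage reference with reactive power, so parallel units share load without communication. The coefficient values are illustrative, not taken from the study.

```python
# Minimal sketch of conventional droop control set-point calculation (illustrative values).
def droop_setpoints(p_kw, q_kvar,
                    f_nom=50.0, v_nom=230.0,     # nominal frequency [Hz] and voltage [V]
                    m=0.001, n=0.02):            # droop slopes (illustrative)
    """Return the frequency and voltage references of one inverter
    given its measured active power P [kW] and reactive power Q [kvar]."""
    f_ref = f_nom - m * p_kw      # active power lowers the frequency reference
    v_ref = v_nom - n * q_kvar    # reactive power lowers the voltage reference
    return f_ref, v_ref

print(droop_setpoints(p_kw=10.0, q_kvar=5.0))    # -> (49.99, 229.9)
```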
Procedia PDF Downloads 593
12911 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm
Authors: Annalakshmi G., Sakthivel Murugan S.
Abstract:
This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. The classification of coral reef images using texture features is difficult due to the dissimilarities among class samples. In coral reef image classification, texture features are extracted using the proposed method, called the local directional encoded derivative binary pattern (LDEDBP). The proposed approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge response available in a particular region, thereby achieving an extra discriminative feature value. Typically, the LDP extracts the edge details in all eight directions. Integrating the edge responses with the local binary pattern yields a more robust texture descriptor than the other descriptors used in texture feature extraction methods. Finally, the proposed technique is applied to an extreme learning machine (ELM) with a meta-heuristic algorithm known as the weighted distance grey wolf optimizer (GWO) to optimize the input weights and biases of single-hidden-layer feed-forward neural networks (SLFN). In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms. The proposed method achieves the highest overall classification accuracy of 94% compared to the other state-of-the-art methods. Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization
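The LDEDBP descriptor builds on the basic local binary pattern. As a point of reference, the sketch below computes the standard 8-neighbour LBP code for a single pixel; it is not the authors' LDEDBP method, and the patch values are placeholders.

```python
# Illustrative 8-neighbour LBP code for one pixel (the building block LDEDBP extends).
import numpy as np

def lbp_code(patch_3x3: np.ndarray) -> int:
    """Return the LBP code of the centre pixel of a 3x3 patch."""
    center = patch_3x3[1, 1]
    # clockwise neighbour order starting at the top-left pixel
    neighbours = [patch_3x3[0, 0], patch_3x3[0, 1], patch_3x3[0, 2],
                  patch_3x3[1, 2], patch_3x3[2, 2], patch_3x3[2, 1],
                  patch_3x3[2, 0], patch_3x3[1, 0]]
    bits = [1 if n >= center else 0 for n in neighbours]   # threshold against the centre
    return sum(b << i for i, b in enumerate(bits))         # pack bits into a code in [0, 255]

patch = np.array([[52, 60, 49],
                  [55, 54, 70],
                  [40, 58, 90]])
print(lbp_code(patch))
```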
Procedia PDF Downloads 164
12910 Formal Models of Sanitary Inspections Teams Activities
Authors: Tadeusz Nowicki, Radosław Pytlak, Robert Waszkowski, Jerzy Bertrandt, Anna Kłos
Abstract:
This paper presents methods for the formal modeling of the activities of sanitary inspection teams during outbreaks of food-borne diseases. The models make it possible to measure the characteristics of sanitary inspection activities and, as a result, to improve the performance of sanitary services and thus food security. Keywords: food-borne disease, epidemic, sanitary inspection, mathematical models
Procedia PDF Downloads 304
12909 From Clients to Colleagues: Supporting the Professional Development of Survivor Social Work Students
Authors: Stephanie Jo Marchese
Abstract:
This oral presentation is a reflective piece regarding current social work teaching methods that value and devalue the lived experiences of survivor students. This presentation grounds the term ‘survivor’ in feminist frameworks. A survivor-defined approach to feminist advocacy assumes an individual’s agency, considers each case and needs independent of generalizations, and provides resources and support to empower victims. Feminist ideologies are ripe arenas to update and influence the rapport-building schools of social work have with these students. Survivor-based frameworks are rooted in nuanced understandings of intersectional realities, staunchly combat both conscious and unconscious deficit lenses wielded against victims, elevate lived experiences to the realm of experiential expertise, and offer alternatives to traditional power structures and knowledge exchanges. Actively importing a survivor framework into the methodology of social work teaching breaks open barriers many survivor students have faced in institutional settings, this author included. The profession of social work is at an important crux of change, both in the United States and globally. The United States is currently undergoing a radical change in its citizenry and outlier communities have taken to the streets again in opposition to their othered-ness. New waves of students are entering this field, emboldened by their survival of personal and systemic oppressions- heavily influenced by third-wave feminism, critical race theory, queer theory, among other post-structuralist ideologies. Traditional models of sociological and psychological studies are actively being challenged. The profession of social work was not founded on the diagnosis of disorders but rather a grassroots-level activism that heralded and demanded resources for oppressed communities. Institutional and classroom acceptance and celebration of survivor narratives can catapult the resurgence of these values needed in the profession’s service-delivery models and put social workers back in the driver's seat of social change (a combined advocacy and policy perspective), moving away from outsider-based intervention models. Survivor students should be viewed as agents of change, not solely former victims and clients. The ideas of this presentation proposal are supported through various qualitative interviews, as well as reviews of ‘best practices’ in the field of education that incorporate feminist methods of inclusion and empowerment. Curriculum and policy recommendations are also offered.Keywords: deficit lens bias, empowerment theory, feminist praxis, inclusive teaching models, strengths-based approaches, social work teaching methods
Procedia PDF Downloads 291
12908 Assessment Client Satisfaction with Family Physician in Health Care Centers of Jiroft County and Its Relationship with Physician’ Demographic Variables
Authors: Babak Nemat Shahrbabaki, Arezo Fallahi, Masoomeh Hashemian
Abstract:
Introduction: Health and safety are basic components of civil rights. Health care systems in different countries are influenced by political, economic, and cultural circumstances. In order to deliver health services to people, these systems are organized in different forms and methods, such as prevention, treatment, and rehabilitation, and among these, public satisfaction with the services provided is important. This study aimed to determine client satisfaction with the family physician and its relationship with the physicians' demographic variables in the health care centers of Jiroft county, Iran. Methods: This is a descriptive-analytical study. The data collection tool was a self-made questionnaire with two parts: the first part comprised demographic characteristics, and the second part contained 11 items for the assessment of satisfaction with the family physician from different aspects. The questionnaire's reliability and validity were confirmed. Simple random sampling was used to select the samples, and 234 people referred to the health centers filled out the questionnaire. The data were analyzed using SPSS software, and inferential statistical analysis was performed. Findings: The majority of the study population were women, married, and aged between 18 and 62 years (mean = 30.09 ± 10.71). The total average satisfaction score was 42.63 ± 3.68. Overall satisfaction was very high for 9.47%, high for 30.04%, moderate for 33.09%, low for 15.12%, and very low for 12.28% of respondents. Except for the family physician's place of residence, none of the physicians' demographic variables had an effect on the satisfaction index. Discussion & Conclusion: The results showed that the mean satisfaction index of family physicians was high and that the family physician's place of residence affected this index. Informing people about the main goals of the family physician program will help to promote the quality of the program and increase people's satisfaction. Keywords: family physician program, satisfaction, health-care centers, client
Procedia PDF Downloads 446
12907 Fabrication of Superhydrophobic Galvanized Steel by Sintering Zinc Nanopowder
Authors: Francisco Javier Montes Ruiz-Cabello, Guillermo Guerrero-Vacas, Sara Bermudez-Romero, Miguel Cabrerizo Vilchez, Miguel Angel Rodriguez-Valverde
Abstract:
Galvanized steel is one of the most widespread metallic materials used in industry. It consists of an iron-based alloy (steel) coated with a zinc layer of variable thickness; the zinc prevents the inner steel from corrosion and staining. Its production is cheaper than that of stainless steel, which is why it is employed in the construction of large-dimension structures in aeronautics, urban and industrial building, or ski resorts. In all these applications, turning the natural hydrophilicity of the metal surface into superhydrophobicity is particularly interesting and would open up a wide variety of additional functionalities. However, producing a superhydrophobic surface on galvanized steel can be a very difficult task. Superhydrophobic surfaces are characterized by a specific surface texture, which is reached either by coating the surface with a material that incorporates such a texture or by applying one of several roughening methods. Since galvanized steel is already a coated material, the incorporation of a second coating may be undesirable. On the other hand, the methods commonly used to create the surface texture leading to superhydrophobicity in metals are aggressive and may damage the surface. In this work, we used a novel strategy whose goal is to produce superhydrophobic galvanized steel by a two-step, non-aggressive process. The first step creates a hierarchical structure by sintering zinc nanoparticles onto the surface at a temperature slightly below the melting point of zinc. The second step is hydrophobization by the deposition of a thick fluoropolymer layer. The wettability of the samples is characterized in terms of tilting-plate and bouncing-drop experiments, while the roughness is analyzed by confocal microscopy. The durability of the produced surfaces was also explored. Keywords: galvanized steel, superhydrophobic surfaces, sintering nanoparticles, zinc nanopowder
Procedia PDF Downloads 151
12906 Creating Database and Building 3D Geological Models: A Case Study on Bac Ai Pumped Storage Hydropower Project
Authors: Nguyen Chi Quang, Nguyen Duong Tri Nguyen
Abstract:
This article is a first step toward researching and outlining the structure of the geotechnical database used in the geological survey of a power project; in this context, the database was created for the Bac Ai pumped storage hydropower project. In order to provide a method for organizing and storing geological and topographic survey data and experimental results in a spatial database, the RockWorks software is used, bringing optimal efficiency to the process of exploiting, using, and analyzing data in the service of design work in power engineering consulting. Three-dimensional (3D) geotechnical models are created from the survey data, covering stratigraphy, lithology, porosity, etc. In the case of the Bac Ai pumped storage hydropower project, the 3D geotechnical model comprises six closely stacked stratigraphic formations built with the Horizons method, whereas the engineering geological parameters are modeled by geostatistical methods. Accuracy and reliability are assessed through error statistics, empirical evaluation, and expert methods. The three-dimensional model allows better visualization of volumetric calculations, excavation and backfilling of the lake area, tunneling of the power pipelines, and calculation of on-site construction material reserves. In general, the application of engineering geological modeling makes the design work more intuitive and comprehensive, helping designers better identify and offer the most optimal design solutions for the project. The database ensures continuous updating and synchronization and enables the 3D modeling of geological and topographic data to be integrated with the design data according to building information modeling. This is also the base platform for BIM and GIS integration. Keywords: database, engineering geology, 3D Model, RockWorks, Bac Ai pumped storage hydropower project
Procedia PDF Downloads 171
12905 Overall Function and Symptom Impact of Self-Applied Myofascial Release in Adult Patients With Fibromyalgia. A Seven-Week Pilot Study
Authors: Domenica Tambasco, Riina Bray, Sophia Jaworski, Gillian Grant, Celeste Corkery
Abstract:
Fibromyalgia is a chronic condition characterized by widespread musculoskeletal pain, fatigue, and reduced function. Management of symptoms includes medications, physical treatments, and mindfulness therapies. Myofascial release is a modality that has been successfully applied in various musculoskeletal conditions. However, to the authors' best knowledge, it is not yet recognized as a self-management therapy option in fibromyalgia. In this study, we investigated whether self-applied myofascial release (SMR) is associated with overall improved function and symptoms in fibromyalgia. Methods: Eligible adult patients with a confirmed diagnosis of fibromyalgia at Women's College Hospital were recruited to SMR. Sessions ran for 1 hour once a week for 7 weeks, led by the same two physiotherapists knowledgeable in this physical treatment modality. The main outcome measure was an overall impact score for function and symptoms based on the validated assessment tool for fibromyalgia, the Revised Fibromyalgia Impact Questionnaire (FIQR), measured pre- and post-intervention. Both descriptive and analytical methods were applied and reported. Results: We analyzed the results using a paired t-test to determine whether there was a statistically significant difference in mean FIQR scores between the initial (pre-intervention) and final (post-intervention) scores. A clinically significant difference in FIQR was defined as a reduction in score of 10 or more points. Conclusions: Our pilot study showed that SMR appeared to be a safe and effective intervention for our fibromyalgia participants, with an overall impact on function and symptoms occurring in only 7 weeks. Further studies with larger sample sizes comparing SMR to other physical treatment modalities (such as stretching) in an RCT are recommended. Keywords: fibromyalgia, myofascial release, physical therapy, FIQR
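A minimal sketch of the paired pre/post comparison described in the abstract, using a paired t-test from SciPy. The FIQR scores below are invented placeholders, not the study's data.

```python
# Sketch: paired t-test on pre- vs post-intervention FIQR scores (placeholder values).
import numpy as np
from scipy import stats

fiqr_pre  = np.array([62.0, 55.5, 70.2, 48.0, 66.3])    # placeholder pre-intervention scores
fiqr_post = np.array([50.1, 47.0, 58.4, 40.2, 55.0])    # placeholder post-intervention scores

t_stat, p_value = stats.ttest_rel(fiqr_pre, fiqr_post)
mean_change = (fiqr_post - fiqr_pre).mean()

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, mean change = {mean_change:.1f} points")
# A reduction of 10 or more points was taken as clinically significant in the study.
```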
Procedia PDF Downloads 77
12904 Evaluation of the Efficiency of French Language Educational Software for Learners in Semnan Province, Iran
Authors: Alireza Hashemi
Abstract:
In recent decades, language teaching methodology has undergone significant changes due to the advent of computers and the growth of educational software. French language education has also benefited from these developments, and various software has been produced to facilitate the learning of this language. However, the question arises whether these software programs meet the educational needs of Iranian learners, particularly in Semnan Province. The aim of this study is to evaluate the efficiency and effectiveness of French language educational software for learners in Semnan Province, considering educational, cultural, and technical criteria. In this study, content analysis and performance evaluation methods were used to examine the educational software ‘Français Facile’. This software was evaluated based on criteria such as teaching methods, cultural compatibility, and technical features. To collect data, standardized questionnaires and semi-structured interviews with learners in Semnan Province were used. Additionally, the SPSS statistical software was employed for quantitative data analysis, and the thematic analysis method was used for qualitative data. The results indicated that the ‘Français Facile’ software has strengths such as providing diverse educational content and an interactive learning environment. However, some weaknesses include the lack of alignment of educational content with the learning culture of learners in Semnan Province and technical issues in software execution. Statistical data showed that 65% of learners were satisfied with the educational content, but 55% reported issues related to cultural alignment with their needs. This study indicates that to enhance the efficiency of French language educational software, there is a need to localize educational content and improve technical infrastructure. Producing locally adapted educational software can improve the quality of language learning and increase the motivation of learners in Semnan Province. This research emphasizes the importance of understanding the cultural and educational needs of learners in the development of educational software and recommends that developers of educational software pay special attention to these aspects.Keywords: educational software, French language, Iran, learners in Semnan province
Procedia PDF Downloads 44
12903 Preparation, Characterisation, and Measurement of the in vitro Cytotoxicity of Mesoporous Silica Nanoparticles Loaded with Cytotoxic Pt(II) Oxadiazoline Complexes
Authors: G. Wagner, R. Herrmann
Abstract:
Cytotoxic platinum compounds play a major role in the chemotherapy of a large number of human cancers. However, due to the severe side effects for the patient and other problems associated with their use, there is a need for the development of more efficient drugs and new methods for their selective delivery to the tumours. One way to achieve the latter could be in the use of nanoparticular substrates that can adsorb or chemically bind the drug. In the cell, the drug is supposed to be slowly released, either by physical desorption or by dissolution of the particle framework. Ideally, the cytotoxic properties of the platinum drug unfold only then, in the cancer cell and over a longer period of time due to the gradual release. In this paper, we report on our first steps in this direction. The binding properties of a series of cytotoxic Pt(II) oxadiazoline compounds to mesoporous silica particles has been studied by NMR and UV/vis spectroscopy. High loadings were achieved when the Pt(II) compound was relatively polar, and has been dissolved in a relatively nonpolar solvent before the silica was added. Typically, 6-10 hours were required for complete equilibration, suggesting the adsorption did not only occur to the outer surface but also to the interior of the pores. The untreated and Pt(II) loaded particles were characterised by C, H, N combustion analysis, BET/BJH nitrogen sorption, electron microscopy (REM and TEM) and EDX. With the latter methods we were able to demonstrate the homogenous distribution of the Pt(II) compound on and in the silica particles, and no Pt(II) bulk precipitate had formed. The in vitro cytotoxicity in a human cancer cell line (HeLa) has been determined for one of the new platinum compounds adsorbed to mesoporous silica particles of different size, and compared with the corresponding compound in solution. The IC50 data are similar in all cases, suggesting that the release of the Pt(II) compound was relatively fast and possibly occurred before the particles reached the cells. Overall, the platinum drug is chemically stable on silica and retained its activity upon prolonged storage.Keywords: cytotoxicity, mesoporous silica, nanoparticles, platinum compounds
Procedia PDF Downloads 322
12902 Computer-Integrated Surgery of the Human Brain, New Possibilities
Authors: Ugo Galvanetto, Pirto G. Pavan, Mirco Zaccariotto
Abstract:
The discipline of Computer-integrated surgery (CIS) will provide equipment able to improve the efficiency of healthcare systems and, which is more important, clinical results. Surgeons and machines will cooperate in new ways that will extend surgeons’ ability to train, plan and carry out surgery. Patient specific CIS of the brain requires several steps: 1 - Fast generation of brain models. Based on image recognition of MR images and equipped with artificial intelligence, image recognition techniques should differentiate among all brain tissues and segment them. After that, automatic mesh generation should create the mathematical model of the brain in which the various tissues (white matter, grey matter, cerebrospinal fluid …) are clearly located in the correct positions. 2 – Reliable and fast simulation of the surgical process. Computational mechanics will be the crucial aspect of the entire procedure. New algorithms will be used to simulate the mechanical behaviour of cutting through cerebral tissues. 3 – Real time provision of visual and haptic feedback A sophisticated human-machine interface based on ergonomics and psychology will provide the feedback to the surgeon. The present work will address in particular point 2. Modelling the cutting of soft tissue in a structure as complex as the human brain is an extremely challenging problem in computational mechanics. The finite element method (FEM), that accurately represents complex geometries and accounts for material and geometrical nonlinearities, is the most used computational tool to simulate the mechanical response of soft tissues. However, the main drawback of FEM lies in the mechanics theory on which it is based, classical continuum Mechanics, which assumes matter is a continuum with no discontinuity. FEM must resort to complex tools such as pre-defined cohesive zones, external phase-field variables, and demanding remeshing techniques to include discontinuities. However, all approaches to equip FEM computational methods with the capability to describe material separation, such as interface elements with cohesive zone models, X-FEM, element erosion, phase-field, have some drawbacks that make them unsuitable for surgery simulation. Interface elements require a-priori knowledge of crack paths. The use of XFEM in 3D is cumbersome. Element erosion does not conserve mass. The Phase Field approach adopts a diffusive crack model instead of describing true tissue separation typical of surgical procedures. Modelling discontinuities, so difficult when using computational approaches based on classical continuum Mechanics, is instead easy for novel computational methods based on Peridynamics (PD). PD is a non-local theory of mechanics formulated with no use of spatial derivatives. Its governing equations are valid at points or surfaces of discontinuity, and it is, therefore especially suited to describe crack propagation and fragmentation problems. Moreover, PD does not require any criterium to decide the direction of crack propagation or the conditions for crack branching or coalescence; in the PD-based computational methods, cracks develop spontaneously in the way which is the most convenient from an energy point of view. Therefore, in PD computational methods, crack propagation in 3D is as easy as it is in 2D, with a remarkable advantage with respect to all other computational techniques.Keywords: computational mechanics, peridynamics, finite element, biomechanics
Procedia PDF Downloads 82
12901 Characteristics of Middle Grade Students' Solution Strategies While Reasoning the Correctness of the Statements Related to Numbers
Authors: Ayşegül Çabuk, Mine Işıksal
Abstract:
Mathematics is a sense-making activity, so it requires meaningful learning. Based on this idea, meaningful mathematical connections are necessary to learn mathematics. The major question then becomes which educational methods can provide opportunities to build mathematical connections and to understand mathematics. The amalgam of reasoning and proof can be one of the methods that creates opportunities to learn mathematics in a meaningful way. However, even though reasoning and proof should be included from prekindergarten to grade 12, studies in the literature generally involve secondary school students and pre-service mathematics teachers. In light of the idea that the amalgam of reasoning and proof has a significant effect on middle school students' mathematical learning, this study aims to investigate middle grade students' tendencies while reasoning about the correctness of statements related to numbers. The sample included 272 middle grade students: 69 sixth grade students (25.4%), 101 seventh grade students (37.1%), and 102 eighth grade students (37.5%). Data were gathered through an achievement test including two essay-type problems about algebra. The answers to the two items were analyzed both quantitatively and qualitatively in terms of the students' solution strategies while reasoning about the correctness of the statements. Similar to the findings in the literature, most of the students, at all grade levels, used numerical examples to judge the statements. Moreover, the results also showed that the majority of these students appear to believe that providing one or more selected examples is sufficient to show the correctness of a statement. Hence, based on the findings of the study, even though students at earlier ages have proving and reasoning abilities, their reasoning is generally based on empirical evidence. Therefore, it is suggested that examples and example-based reasoning can play a fundamental role in generating systematic reasoning and proof insight at earlier ages. Keywords: reasoning, mathematics learning, middle grade students
Procedia PDF Downloads 425
12900 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models
Authors: Ainouna Bouziane
Abstract:
The ability of Electron Tomography to recover the 3D structure of catalysts, with spatial resolution in the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM) have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (Compressed Sensing-total variation minimization) algorithms to reveal more reliable quantitative information out of the 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not been properly addressed yet, because a perfectly known reference is needed. The problem particularly complicates in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters like the range of tilt angles, image noise level or object orientation. The approach is based on the analysis of material-realistic, 3D phantoms, which include the most relevant features of the system under analysis.Keywords: electron tomography, supported catalysts, nanometrology, error assessment
Procedia PDF Downloads 90
12899 Evaluation of Elements Impurities in Drugs According to Pharmacopoeia by use FESEM-EDS Technique
Authors: Rafid Doulab
Abstract:
The control of elemental impurities in the pharmaceutical industry is indispensable to ensure pharmaceutical safety for 24 elements. Although atomic absorption and inductively coupled plasma are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with energy dispersive spectrometers (EDS) can be applied as an alternative analysis method, providing quantitative and qualitative results for a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by a short analysis time, less contamination, no reagent consumption, the generation of minimal residue or waste, limited sample preparation time, and minimal analysis error. With simple dilution for powders or direct analysis for liquids, we evaluated the usefulness of the EDS method in testing with field emission scanning electron microscopy (FESEM, SUPRA 55, Carl Zeiss, Germany) with an X-ray energy dispersion detector (XFlash 6l10, Bruker, Germany). The samples were analyzed directly, without coating, by applying 5 µ of diluted sample of known concentration onto a carbon stub, with the accelerating voltage chosen according to sample thickness; the result for each spot was given in atomic percentage and converted to micrograms via an Avogadro-based conversion factor. Conclusion and recommendation: The conclusion of this study is that applying FESEM-EDS within the US Pharmacopeia and the ICH Q3D guideline provides a high-precision, accurate method for the analysis of elemental impurities in drugs or bulk materials and for determining the permitted daily exposure (PDE) in liquid or solid specimens, with better results than other techniques; it does not require complex methods or chemicals for digestion, which can interfere with the final results, and the sample can be kept for re-analysis at any time. The recommendation is to adopt this technique in the pharmacopeias as a standard method, alongside inductively coupled plasma in its ICP-AES, ICP-OES, and ICP-MS variants. Keywords: pharmacopoeia, FESEM-EDS, element impurities, atomic concentration
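The abstract mentions converting EDS atomic-percent readings into mass-based results. As a rough illustration of that kind of post-processing, the sketch below converts atomic percent to mass fractions using molar masses; the element list and values are hypothetical and the exact conversion used in the study is not specified in the abstract.

```python
# Illustrative conversion of EDS atomic percent to mass percent (placeholder values).
atomic_percent = {"Pb": 0.5, "Cd": 0.2, "As": 0.3}        # hypothetical EDS readings (at.%)
molar_mass = {"Pb": 207.2, "Cd": 112.41, "As": 74.92}     # g/mol

total = sum(atomic_percent[el] * molar_mass[el] for el in atomic_percent)
mass_percent = {el: 100.0 * atomic_percent[el] * molar_mass[el] / total
                for el in atomic_percent}

for el, wt in mass_percent.items():
    print(f"{el}: {wt:.1f} wt.%")
```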
Procedia PDF Downloads 119
12898 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods
Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino
Abstract:
In the winter of 2014, a series of measurements was performed to evaluate the behavior of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city in Southern Italy. Data were collected both inside and outside the plant, at air quality monitoring sites, and simultaneous measurements of PM2.5 and PM1 were performed. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an Ecochem PAS 2000 analyzer, and (2) collection on PM2.5 and PM1 quartz filters followed by analysis with gas chromatography/mass spectrometry (GC/MS). Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. The detected PM2.5 and PM1 levels were higher inside than outside the plant, while the real-time PAH values were higher outside than inside. As regards PAHs, inside the plant the Ecochem PAS 2000 revealed concentrations not significantly different from those determined on the filters on low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, the Ecochem PAS 2000 real-time concentrations were consistently higher than those on the filters. In the same way, real-time black carbon values were consistently lower than the EC concentrations obtained by the Sunset EC/OC instrument at the inner site, while outside the plant the real-time values were comparable to the Sunset EC values. The results showed that, in a coke plant, real-time PAH and black carbon analyzers in their factory configuration provide only qualitative information, with poor accuracy and an underestimation of the concentrations. A site-specific calibration is needed for these instruments before their installation at highly polluted sites. Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer
Procedia PDF Downloads 346
12897 Utilization of Activated Carbon for the Extraction and Separation of Methylene Blue in the Presence of Acid Yellow 61 Using an Inclusion Polymer Membrane
Authors: Saâd Oukkass, Abderrahim Bouftou, Rachid Ouchn, L. Lebrun, Miloudi Hlaibi
Abstract:
We invariably exist in a world steeped in colors, whether in our clothing, food, cosmetics, or even medications. However, most of the dyes we use pose significant problems, being both harmful to the environment and resistant to degradation. Among these dyes, methylene blue and acid yellow 61 stand out, commonly used to dye various materials such as cotton, wood, and silk. Fortunately, various methods have been developed to treat and remove these polluting dyes, among which membrane processes play a prominent role. These methods are praised for their low energy consumption, ease of operation, and their ability to achieve effective separation of components. Adsorption on activated carbon is also a widely employed technique, complementing the basic processes. It proves particularly effective in capturing and removing organic compounds from water due to its substantial specific surface area while retaining its properties unchanged. In the context of our study, we examined two crucial aspects. Firstly, we explored the possibility of selectively extracting methylene blue from a mixture containing another dye, acid yellow 61, using a polymer inclusion membrane (PIM) made of PVA. After characterizing the morphology and porosity of the membrane, we applied kinetic and thermodynamic models to determine the values of permeability (P), initial flux (J0), association constant (Kass), and apparent diffusion coefficient (D*). Subsequently, we measured activation parameters (activation energy (Ea), enthalpy (ΔH#ass), entropy (ΔS#)). Finally, we studied the effect of activated carbon on the processes carried out through the membrane, demonstrating a clear improvement. These results make the membrane developed in this study a potentially pivotal player in the field of membrane separation.Keywords: dyes, methylene blue, membrane, activated carbon
Procedia PDF Downloads 82
12896 Association Between Disability and Obesity Status Among US Adults: Findings From 2019-2021 National Health Interview Survey (NHIS)
Authors: Chimuanya Osuji, Kido Uyamasi, Morgan Bradley
Abstract:
Introduction: Obesity is a major risk factor for many chronic diseases, with higher rates occurring among certain populations. Even though disparities in obesity rates exist for those with disabilities, few studies have assessed the association between disability and obesity status. This study aims to examine the association between type of disability and obesity status among US adults during the Covid-19 pandemic (2019-2021). Methods: Data for this cross-sectional study were obtained from the 2019, 2020, and 2021 NHIS. Multinomial logistic regressions were used to assess the relationship between each type of disability and obesity status (reference = normal/underweight). Each model adjusted for demographic, health status, and health-related quality of life variables. Statistical analyses were conducted using SAS version 9.4. Results: Of the 82,632 US adults who completed the NHIS in 2019, 2020, and 2021, 8.9% (n = 7,354) reported at least one disability-related condition. Respondents reported having a disability across vision (1.5%), hearing (1.5%), mobility (5.3%), communication (0.8%), cognition (2.4%), and self-care (1.1%) domains. After adjusting for covariates, adults with at least one disability-related condition were about 30% more likely to have moderate-severe obesity (AOR=1.3; 95% CI=1.11, 1.53). Mobility was the only disability category positively associated with mild obesity (AOR=1.16; 95% CI=1.01, 1.35) and moderate/severe obesity (AOR=1.6; 95% CI=1.35, 1.89). Individuals with vision disability were about 35% less likely to have mild obesity (AOR=0.66; 95% CI=0.51, 0.86) and moderate-severe obesity (AOR=0.66; 95% CI=0.48, 0.9). Individuals with hearing disability were 28% less likely to have mild obesity (AOR=0.72; 95% CI=0.56, 0.94). Individuals with communication disability were about 30% less likely to be overweight (AOR=0.66; 95% CI=0.47, 0.93) and 50% less likely to have mild obesity (AOR=0.45; 95% CI=0.29, 0.71). Individuals with cognitive disability were about 25% less likely to have mild obesity and about 35% less likely to have moderate-severe obesity. Individuals with self-care disability were about 30% less likely to be overweight. Conclusion: Mobility-related disabilities are significantly associated with obesity status among adults residing in the United States. Researchers and policy makers should implement obesity intervention methods that can address the gap in obesity prevalence rates between those with and without disabilities. Keywords: cognition, disability, mobility, obesity
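The study's analyses were run in SAS 9.4; the sketch below is an illustrative Python equivalent of a multinomial logistic regression of obesity status on a disability indicator, with adjusted odds ratios obtained by exponentiating the coefficients. The data are simulated placeholders, not the NHIS sample, and only one covariate (age) stands in for the full adjustment set.

```python
# Sketch: multinomial logistic regression with simulated data (statsmodels MNLogit).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "disability": rng.integers(0, 2, n),   # 1 = at least one disability-related condition
    "age": rng.integers(18, 85, n),
})
# outcome: 0 = normal/underweight (reference), 1 = overweight, 2 = mild, 3 = moderate/severe obesity
df["obesity_status"] = rng.integers(0, 4, n)

X = sm.add_constant(df[["disability", "age"]])
model = sm.MNLogit(df["obesity_status"], X).fit(disp=False)

odds_ratios = np.exp(model.params)    # adjusted odds ratios vs. the reference category
print(odds_ratios)
```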
Procedia PDF Downloads 71
12895 Uncontrollable Inaccuracy in Inverse Problems
Authors: Yu Menshikov
Abstract:
In this paper, the influence of errors in the derivatives of functions at the initial time, obtained by experiment (uncontrollable inaccuracy), on the results of the inverse problem solution was investigated. It was shown that these errors, as a rule, distort the inverse problem solution near the beginning of the interval on which the solution is analyzed. Several methods for removing the influence of uncontrollable inaccuracy have been suggested. Keywords: inverse problems, filtration, uncontrollable inaccuracy
Procedia PDF Downloads 509
12894 Comparative Study on Daily Discharge Estimation of Soolegan River
Authors: Redvan Ghasemlounia, Elham Ansari, Hikmet Kerem Cigizoglu
Abstract:
Hydrological modeling in arid and semi-arid regions is very important. Iran has many regions with such climate conditions, including Chaharmahal and Bakhtiari province, which need attention and appropriate management. Forecasting hydrological parameters and estimating hydrological events of catchments provide important information that is widely used for the design, management, and operation of water resources such as river systems and dams. Discharge in rivers is one of these parameters. This study presents the application and comparison of several estimation methods, namely the Feed-Forward Back Propagation Neural Network (FFBPNN), Multi Linear Regression (MLR), Gene Expression Programming (GEP), and Bayesian Network (BN), to predict the daily flow discharge of the Soolegan River, located in Chaharmahal and Bakhtiari province, Iran. The Soolegan station was considered in this study. This station is located on the Soolegan River, at longitude 51° 14՜ and latitude 31° 38՜, in the North Karoon basin, 2086 meters above sea level. The data used in this study are the daily discharge and daily precipitation at the Soolegan station. The FFBPNN, MLR, GEP, and BN models were developed using the same input parameters for the estimation of Soolegan's daily discharge. The results of the estimation models were compared with observed discharge values to evaluate the performance of the developed models. The results of all methods were compared and are shown in tables and charts. Keywords: ANN, multi linear regression, Bayesian network, forecasting, discharge, gene expression programming
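A minimal sketch, under stated assumptions, of two of the model families listed above: a multiple linear regression and a feed-forward neural network predicting daily discharge from precipitation. The synthetic data and scikit-learn implementation are placeholders for the Soolegan station records and the models actually used in the study.

```python
# Sketch: MLR vs. feed-forward neural network for daily discharge estimation (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
precip = rng.gamma(2.0, 2.0, size=(1000, 1))                    # daily precipitation [mm]
discharge = 0.8 * precip[:, 0] + rng.normal(0, 0.5, 1000)       # synthetic daily discharge [m3/s]

X_tr, X_te, y_tr, y_te = train_test_split(precip, discharge, test_size=0.3, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

for name, model in [("MLR", mlr), ("FFBPNN", ann)]:
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: RMSE = {rmse:.3f} m3/s")
```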
Procedia PDF Downloads 562
12893 Capturing Public Voices: The Role of Social Media in Heritage Management
Authors: Mahda Foroughi, Bruno de Anderade, Ana Pereira Roders
Abstract:
Social media platforms have been increasingly used by locals and tourists to express their opinions about buildings, cities, and built heritage in particular. Most recently, scholars have been using social media to conduct innovative research on built heritage and heritage management. Still, the application of artificial intelligence (AI) methods to analyze social media data for heritage management is seldom explored. This paper investigates the potential of short texts (sentences and hashtags) shared through social media as a data source and artificial intelligence methods for data analysis for revealing the cultural significance (values and attributes) of built heritage. The city of Yazd, Iran, was taken as a case study, with a particular focus on windcatchers, key attributes conveying outstanding universal values, as inscribed on the UNESCO World Heritage List. This paper has three subsequent phases: 1) state of the art on the intersection of public participation in heritage management and social media research; 2) methodology of data collection and data analysis related to coding people's voices from Instagram and Twitter into values of windcatchers over the last ten-years; 3) preliminary findings on the comparison between opinions of locals and tourists, sentiment analysis, and its association with the values and attributes of windcatchers. Results indicate that the age value is recognized as the most important value by all interest groups, while the political value is the least acknowledged. Besides, the negative sentiments are scarcely reflected (e.g., critiques) in social media. Results confirm the potential of social media for heritage management in terms of (de)coding and measuring the cultural significance of built heritage for windcatchers in Yazd. The methodology developed in this paper can be applied to other attributes in Yazd and also to other case studies.Keywords: social media, artificial intelligence, public participation, cultural significance, heritage, sentiment analysis
Procedia PDF Downloads 117
12892 Digital Portfolio as Mediation to Enhance Willingness to Communicate in English
Authors: Saeko Toyoshima
Abstract:
This research discusses whether performance tasks with technology can enhance students' willingness to communicate. The present study investigated how Japanese learners of English would change their attitude toward communication in their target language by experiencing a performance task, called a 'digital portfolio', in the classroom, applying the concepts of action research. The study adopted questionnaires including four-point Likert and open-ended questions as mixed-methods research. There were 28 students in the class. Many Japanese university students with low proficiency (A1 in the Common European Framework of Reference for Languages) have difficulty communicating in English due to their low proficiency and the lack of practice in and outside of the classroom in secondary education. They need to mediate between themselves in the worlds of the L1 and the L2 by completing a performance task for communication. This paper introduces the practice of a CALL class in which A1-level students made their 'digital portfolio' related to the topics of TED® (Technology, Entertainment, Design) Talk materials. The students had a 'Portfolio Session' twice in one term, once in the middle and once at the end of the course, where they introduced their portfolios to their classmates and to international students in English. The present study asked the students to answer a questionnaire about willingness to communicate twice, once at the end of the first term and once at the end of the second term. The four-point Likert questions were statistically analyzed with a t-test, and the answers to the open-ended questions were analyzed to clarify the difference between the two administrations. The results showed that the students developed a more positive attitude toward communication in English and enhanced their willingness to communicate through the experience of the task. The implication of this paper is that making and presenting a portfolio as a performance task would lead learners to construct themselves in English and enable them to communicate with others enjoyably and autonomously. Keywords: action research, digital portfolio, computer-assisted language learning, ELT with CALL system, mixed methods research, Japanese English learners, willingness to communicate
Procedia PDF Downloads 120
12891 Jagiellonian-PET: A Novel TOF-PET Detector Based on Plastic Scintillators
Authors: P. Moskal, T. Bednarski, P. Bialas, E. Czerwinski, A. Gajos, A. Gruntowski, D. Kaminska, L. Kaplon, G. Korcyl, P. Kowalski, T. Kozik, W. Krzemien, E. Kubicz, Sz. Niedzwiecki, M. Palka, L. Raczynski, Z. Rudy, P. Salabura, N. G. Sharma, M. Silarski, A. Slomski, J. Smyrski, A. Strzelecki, A. Wieczorek, W. Wislicki, M. Zielinski, N. Zon
Abstract:
A new concept and the results of performance tests of the TOF-PET detection system developed at the Jagiellonian University will be presented. The novelty of the concept lies in employing long strips of polymer scintillators instead of crystals as detectors of annihilation quanta, and in using predominantly the timing of signals instead of their amplitudes for the reconstruction of Lines-of-Response. The diagnostic chamber consists of plastic scintillator strips, read out by pairs of photomultipliers, arranged axially around a cylindrical surface. To take advantage of the superior timing properties of plastic scintillators, the signals are probed in the voltage domain with an accuracy of 20 ps by newly developed electronics, and the data are collected by a novel trigger-less and reconfigurable data acquisition system. The hit position and hit time are reconstructed by dedicated methods based on compressive sensing theory and a library of synchronized model signals. The solutions are the subject of twelve patent applications. So far, a time-of-flight resolution of ~120 ps (sigma) has been achieved for a double-strip prototype with a 30 cm field-of-view (FOV). This is better by more than a factor of two than the TOF resolution achievable in current TOF-PET modalities, and at the same time the 30 cm FOV of the prototype is significantly larger than that of typical commercial PET devices. The Jagiellonian PET (J-PET) detector with plastic scintillators arranged axially also possesses another advantage: its diagnostic chamber is free of any electronic devices and magnetic materials, giving unique possibilities of combining J-PET with CT and J-PET with MRI for scanning the same part of a patient at the same time with both methods. Keywords: PET-CT, PET-MRI, TOF-PET, scintillator
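A small illustrative sketch (not the J-PET reconstruction code) of the two timing relations that strip-based TOF-PET relies on: the axial hit position obtained from the time difference between the two photomultipliers of a strip, and the position of the annihilation point along the Line-of-Response obtained from the time-of-flight difference between the two hits. The effective signal speed in the strip and the example times are assumed values.

```python
# Minimal sketch of the timing relations behind strip-based TOF-PET.

C_LIGHT = 29.98   # cm/ns, speed of light in vacuum
V_EFF = 12.6      # cm/ns, assumed effective speed of light signals along the plastic strip

def hit_position(t_pm_a, t_pm_b, v_eff=V_EFF):
    """Axial hit position (cm) relative to the strip centre, from the photomultiplier time difference."""
    return 0.5 * v_eff * (t_pm_a - t_pm_b)

def annihilation_offset(t_hit_1, t_hit_2, c=C_LIGHT):
    """Offset (cm) of the annihilation point from the LOR midpoint, from the TOF difference."""
    return 0.5 * c * (t_hit_1 - t_hit_2)

# Example with assumed times: a 0.20 ns difference between PMs gives ~1.26 cm along the strip,
# and a 0.20 ns TOF difference shifts the annihilation point ~3 cm along the LOR.
print(hit_position(t_pm_a=1.10, t_pm_b=0.90))            # ~1.26 cm
print(annihilation_offset(t_hit_1=2.30, t_hit_2=2.10))   # ~3.0 cm
```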
Procedia PDF Downloads 498
12890 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru
Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar
Abstract:
Nowadays, heritage building information modeling (HBIM) is considered an efficient tool to represent and manage information on cultural heritage (CH). The basis of this tool is a 3D model generally obtained from a cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection among these methods depends on the desired level of development (LOD), level of information (LOI), and grade of generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit, and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural elements (e.g., regular walls, columns, floors, and wall openings) and architectural elements (e.g., cornices, moldings, and other minor details) using the point cloud as a reference. Then, Dynamo is used for the generative modeling of complex structural elements such as vaults, infills, and domes. Finally, semantic information (e.g., materials, typology, state of conservation) and pathologies are added within the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH through a relatively simple process that ensures adequate LOD, LOI, and GOG levels. In addition, the method is easy to implement and uses only one BIM software package, with its respective plugin, for the scan-to-BIM modeling process, so it can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license. Keywords: cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit
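To make the generative-modeling step more concrete, the following Python sketch shows the kind of parametric logic a Dynamo script might use to generate a grid of points on a barrel vault. The dimensions and point counts are assumptions, and an actual workflow would push such geometry into Revit through the Dynamo interface rather than print it.

```python
# Minimal sketch: parametric point grid for a barrel vault (semicircular section swept along an axis).
import math

span = 4.0         # m, assumed vault span (diameter of the semicircular section)
length = 8.0       # m, assumed vault length
n_arc, n_axis = 9, 17  # assumed sampling density across the section and along the axis

radius = span / 2.0
vault_points = []
for j in range(n_axis):
    y = length * j / (n_axis - 1)              # position along the vault axis
    for i in range(n_arc):
        theta = math.pi * i / (n_arc - 1)      # 0..pi sweeps the semicircular section
        x = radius * math.cos(theta)
        z = radius * math.sin(theta)           # springing line at z = 0
        vault_points.append((round(x, 3), round(y, 3), round(z, 3)))

print(len(vault_points), "points, e.g.", vault_points[:3])
```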
Procedia PDF Downloads 147
12889 Effectiveness of Cold Calling on Students’ Behavior and Participation during Class Discussions: Punishment or Opportunity to Shine
Authors: Maimuna Akram, Khadija Zia, Sohaib Naseer
Abstract:
Pedagogical objectives and the nature of the course content may lead instructors to take varied approaches to selecting a student for a cold call, particularly in a studio setup where students work on different projects independently and show work in progress from time to time at scheduled critiques. Cold calling often proves to be an effective tool for eliciting a response without imposing judgment on the recipients. Students who are cold-called exhibit a mixed range of behaviors, and their responses can be classified from anxiety-provoking to inspiring; a greater understanding is needed of how to use these exchanges to bring about fruitful and engaging studio discussions. This study aims to unravel the dimensions of using the cold-call approach in didactic exchange within studio pedagogy. A questionnaire survey was conducted in an undergraduate class at an Arts and Design School. The impact of cold calling on students’ participation was examined through various parameters, including course choice, participation frequency, students’ comfort level, and teaching methodology. After the surveys were analyzed, selected classroom teachers were interviewed to provide a qualitative faculty perspective. It was concluded that cold calling increases students’ participation frequency and also increases preparation for class. Around 67% of students responded that teaching methods play an important role in learning activities and in students’ participation during class discussions, and 84% of participants agreed that cold calling is an effective way of learning. The findings also suggest that cold calling can be used frequently without making students uncomfortable. As a result, this study supports the use of this instructional method to encourage more students to participate in class discussions. Keywords: active learning, class discussion, class participation, cold calling, pedagogical methods, student engagement
Procedia PDF Downloads 38
12888 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures
Authors: Salima Kouici, Abdelkader Khelladi
Abstract:
In this work, we begin by presenting the Tθ family of usual similarity measures for multidimensional binary data. Subsequently, some properties of these measures are proposed. Finally, the impact of using different inter-element measures on the results of agglomerative hierarchical clustering methods is studied. Keywords: binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering
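A minimal sketch of the experimental setup described: a similarity measure is computed on multidimensional binary data and the resulting distances are fed into agglomerative hierarchical clustering. The Jaccard measure is used here only as a stand-in, since the Tθ measures themselves are not reproduced, and the data are invented for illustration.

```python
# Minimal sketch: binary similarity -> distance matrix -> agglomerative hierarchical clustering.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# 6 hypothetical elements described by 8 binary attributes.
X = np.array([
    [1, 0, 1, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 0, 1, 0],
    [0, 1, 0, 1, 1, 0, 0, 1],
    [0, 1, 0, 1, 1, 1, 0, 1],
    [1, 1, 1, 0, 0, 0, 1, 0],
    [0, 0, 0, 1, 1, 1, 0, 1],
], dtype=bool)

dist = pdist(X, metric="jaccard")       # distance = 1 - similarity; another binary measure could be swapped in here
tree = linkage(dist, method="average")  # agglomerative clustering with average linkage
print(fcluster(tree, t=2, criterion="maxclust"))  # cut the dendrogram into 2 clusters
```

The choice of linkage method (average here) is itself a parameter; studying how the inter-element measure interacts with it is exactly the kind of comparison the abstract describes.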
Procedia PDF Downloads 484