Search results for: methods of Lagrange multipliers
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14926

13936 Role of Social Media for Institutional Branding: Ethics of Communication Review

Authors: Iva Ariani, Mohammad Alvi Pratama

Abstract:

Currently, the world of communication is developing rapidly. Many new means of communication have emerged alongside advances in science, which have produced technologies that accelerate the development of communication systems. However, despite the convenience it offers society, this development has not been accompanied by corresponding values and regulations. As a result, false information, or hoaxes, can spread and influence society's mindset. This research aims to examine the role of social media in shaping an institution's reputation through a communication ethics study. It is a qualitative study that uses interviews, observation, and literature review for data collection. The data are then analyzed using philosophical methods, namely hermeneutics and deduction. This research is expected to show, from an ethical perspective, the role of social media in building an institutional reputation.

Keywords: social media, ethics, communication, reputation

Procedia PDF Downloads 190
13935 Morphological Characteristics and Pollination Requirement in Red Pitaya (Hylocereus Spp.)

Authors: Dinh Ha, Tran, Chung-Ruey Yen

Abstract:

This study explored the morphological characteristics and the effects of pollination methods on fruit set and fruit characteristics in four red pitaya (Hylocereus spp.) clones. Distinctive stem, flower, and fruit features allowed the clones to be recognized and classified morphologically. The fruit production season extended from the beginning of May to the end of August or beginning of September, with 6-7 flowering cycles per year. The floral stage lasted 15-19 days, and fruit development took 30-32 days. VN White was fully self-compatible, obtaining high fruit set rates (80.0-90.5%) in all pollination treatments and maximum fruit weights of 402.6 g under hand self-pollination and 403.4 g under open pollination. Chaozhou 5 was partially self-compatible, while Orejona and F11 were completely self-incompatible. Hand cross-pollination significantly increased fruit set (95.8, 88.4, and 90.2%) and fruit weight (374.2, 281.8, and 416.3 g) in Chaozhou 5, Orejona, and F11, respectively. TSS contents were not much influenced by pollination method.

Keywords: Hylocereus spp., morphology, floral phenology, pollination requirement

Procedia PDF Downloads 288
13934 The Mechanical Properties of a Small-Size Seismic Isolation Rubber Bearing for Bridges

Authors: Yi F. Wu, Ai Q. Li, Hao Wang

Abstract:

Taking a novel type of bridge bearing with a diameter of 100 mm as an example, theoretical analysis, experimental research, and numerical simulation of the bearing were conducted. Since standard compression-shear machines cannot be applied to such a small-size bearing, an improved device to test its properties was proposed and fabricated. The simulation of the bearing was performed with the explicit finite element software ANSYS/LS-DYNA, and some parameters of the finite element model were modified to effectively reduce the computational cost. Results show that all the research methods are capable of revealing the fundamental properties of small-size bearings, and that a combined use of these methods can better capture both the overall properties and the detailed internal mechanical behavior of the bearing.

Keywords: ANSYS/LS-DYNA, compression shear, contact analysis, explicit algorithm, small-size

Procedia PDF Downloads 166
13933 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals

Authors: Bahareh Ansari

Abstract:

Background: The Open Government Data (OGD) movement of the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual's cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users' performance and experience in working with a visualization tool. This list can be used to evaluate OGD visualization practices and to inform future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of publications in top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results were complemented by a search in the references of the identified articles and by a search for the 'open data visualization' and 'visualization evaluation' keywords in the IEEE Xplore and ACM digital libraries. Articles were included if they provided empirical evidence from controlled user experiments or reviewed such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they were shown to positively affect the visualization outcomes.
Findings: The keyword search yielded 760 studies, of which 30 were included after title/abstract review. Classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently report positive effects of these elements on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design's effect on user engagement, enjoyment, recall, and performance, which could indicate that specific conditions are required for the use of this method. The last two methods, aesthetics and animation, appear less frequently in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By using these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and ultimately take part in government policy-making procedures.

Keywords: best practices, data visualization, literature review, open government data

Procedia PDF Downloads 90
13932 Mining Multicity Urban Data for Sustainable Population Relocation

Authors: Xu Du, Aparna S. Varde

Abstract:

In this research, we propose to conduct diagnostic and predictive analyses of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract urban development trends, as land-use change patterns, from a variety of data sources. The results are treated as part of urban big data, together with other information such as population change and economic conditions. Multiple data mining methods are deployed on these data to analyze nonlinear relationships between parameters. The results identify the driving forces of population relocation with respect to urban sprawl, urban sustainability, and their related parameters. Experiments so far reveal that the data mining methods discover useful knowledge from the multicity urban data. This work sets the stage for developing a comprehensive urban simulation model that caters to specific questions from targeted users, and it contributes towards achieving sustainability as a whole.

Keywords: data mining, environmental modeling, sustainability, urban planning

Procedia PDF Downloads 283
13931 Innovative Technologies: Functional Methods of Dental Research

Authors: Sergey N. Ermoliev, Margarita A. Belousova, Aida D. Goncharenko

Abstract:

Application of a diagnostic complex of highly informative functional methods (electromyography, reodentography, laser Doppler flowmetry, reoperiodontography, vital computer capillaroscopy, optical tissue oximetry, and laser fluorescence diagnosis) allows a multifactorial analysis of dental status and the prescription of complex etiopathogenetic treatment. Introduction: A complex of innovative, highly informative, and safe functional diagnostic methods is needed to improve the quality of patient treatment through early detection of stomatologic diseases. The purpose of the present study was to investigate the etiology and pathogenesis of functional disorders identified in pathology of the hard tissues, dental pulp, periodontium, oral mucosa, and chewing function, and to create new approaches to the diagnosis of dental diseases. Material and methods: 172 patients were examined. The density of the hard tissues of the teeth and jaw bone was studied by intraoral ultrasonic densitometry (USD). The activity of the masticatory muscles was assessed by electromyography (EMG). The functional state of the dental pulp vessels was assessed by reodentography (RDG) and laser Doppler flowmetry (LDF). Regional blood flow in the periodontal tissues was studied by reoperiodontography (RPG). The periodontal microcirculation was studied by vital computer capillaroscopy (VCC) and laser Doppler flowmetry (LDF). The metabolic level of the mucous membrane was determined by optical tissue oximetry (OTO) and laser fluorescence diagnosis (LFD). Results and discussion: The results revealed changes in the mineral density of the hard tissues of the teeth and jaw bone, the bioelectric activity of the masticatory muscles, and the regional blood flow and microcirculation in the dental pulp and periodontal tissues. The LDF and OTO methods captured fluctuations in the saturation level and oxygen transport in the periodontal microvasculature.
LFD identified changes in the concentration of compounds (nicotinamide, flavins, lipofuscin, porphyrins) involved in metabolic processes. Our preliminary results also confirmed the feasibility and safety of the intraoral ultrasonic densitometry technique for measuring the density of periodontal bone tissue. Conclusion: Application of the diagnostic complex of the above-mentioned highly informative functional methods allows a multifactorial analysis of dental status and the prescription of complex etiopathogenetic treatment.

Keywords: electromyography (EMG), reodentography (RDG), laser Doppler flowmetry (LDF), reoperiodontography method (RPG), vital computer capillaroscopy (VCC), optical tissue oximetry (OTO), laser fluorescence diagnosis (LFD)

Procedia PDF Downloads 263
13930 Multiscale Modelling of Textile Reinforced Concrete: A Literature Review

Authors: Anicet Dansou

Abstract:

Textile reinforced concrete (TRC) is increasingly used nowadays in various fields, in particular civil engineering, where it is mainly used for the reinforcement of damaged reinforced concrete structures. TRC is a composite material composed of multi- or uni-axial textile reinforcements coupled with a fine-grained cementitious matrix. The TRC composite is an alternative to the traditional fiber reinforced polymer (FRP) composite: it offers good mechanical performance and better temperature stability, and it also better meets the criteria of sustainable development. TRCs are highly anisotropic composite materials with nonlinear hardening behavior; their macroscopic behavior depends on multi-scale mechanisms. The characterization of these materials through numerical simulation has been the subject of many studies. Since TRCs are multiscale materials by definition, numerical multi-scale approaches have emerged as among the most suitable methods for their simulation. These approaches aim to incorporate information about microscale constituent behavior, mesoscale behavior, and macro-scale structural response within a unified model that enables rapid simulation of structures. The computational cost is hence significantly reduced compared to a standard simulation at the fine scale. The fine-scale information can be introduced implicitly into the macro-scale model: approaches of this type are called non-classical. A representative volume element is defined, and the fine-scale information is homogenized over it. Analytical and computational homogenization and nested mesh methods belong to these approaches. In classical approaches, on the other hand, the fine-scale information is introduced explicitly into the macro-scale model. Such approaches include adaptive mesh refinement strategies, sub-modelling, domain decomposition, and multigrid methods. This research presents the main principles of numerical multiscale approaches.
Advantages and limitations are identified according to several criteria: the assumptions made (fidelity), the number of input parameters required, the computational cost (efficiency), etc. A bibliographic study of recent results and advances, and of the scientific obstacles to be overcome to achieve an effective simulation of textile reinforced concrete in civil engineering, is presented. A comparative study is further carried out between several methods for the simulation of TRCs used for the structural reinforcement of reinforced concrete structures.

Keywords: composites structures, multiscale methods, numerical modeling, textile reinforced concrete

Procedia PDF Downloads 92
13929 Comparative Study and Parallel Implementation of Stochastic Models for Pricing of European Options Portfolios using Monte Carlo Methods

Authors: Vinayak Bassi, Rajpreet Singh

Abstract:

Over the years, with the emergence of sophisticated computers and algorithms, finance has been quantified using computational prowess. Asset valuation is one of the key components of quantitative finance; indeed, it has become one of the first steps in determining the risk associated with a portfolio, the main goal of quantitative finance. This study draws a comparison between the valuation outputs generated by two stochastic dynamic models, namely the Black-Scholes model and Dupire's bi-dimensional model. Both models are formulated to compute the valuation function for a portfolio of European options using Monte Carlo simulation methods. Although Monte Carlo algorithms have a slower convergence rate than calculus-based techniques (like finite difference methods), they work quite effectively for high-dimensional dynamic models. A fidelity gap is analyzed between the static (historical) and stochastic inputs for a sample portfolio of underlying assets. To enhance the performance of the models, the study emphasizes the use of variance reduction methods and customized random number generators to enable parallelization. An attempt has been made to further implement Dupire's model on a GPU to achieve higher computational performance. Furthermore, ideas are discussed around performance enhancement and bottleneck identification in the implementation of option-pricing models on GPUs.
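As a minimal sketch of the kind of computation involved (assuming Black-Scholes dynamics for a single European call; the authors' portfolio aggregation, Dupire model, and GPU parallelization are not reproduced here), a Monte Carlo pricer with antithetic variates as the variance reduction technique might look like this:

```python
import math
import random

def bs_call_price(s0, k, r, sigma, t):
    """Closed-form Black-Scholes price of a European call (reference value)."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s0 * cdf(d1) - k * math.exp(-r * t) * cdf(d2)

def mc_call_price(s0, k, r, sigma, t, n_paths=100_000, seed=42):
    """Monte Carlo estimate under geometric Brownian motion, with
    antithetic variates as a simple variance reduction technique."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        for zz in (z, -z):  # antithetic pair: z and -z share one draw
            st = s0 * math.exp(drift + vol * zz)
            total += max(st - k, 0.0)
    return math.exp(-r * t) * total / (2 * n_paths)
```

Because each independent path is a pure function of its own random draws, the loop parallelizes naturally once each worker is given its own random number stream, which is one reason such pricers map well to GPUs.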

Keywords: monte carlo, stochastic models, computational finance, parallel programming, scientific computing

Procedia PDF Downloads 144
13928 Electroencephalogram Based Alzheimer Disease Classification using Machine and Deep Learning Methods

Authors: Carlos Roncero-Parra, Alfonso Parreño-Torres, Jorge Mateo Sotos, Alejandro L. Borja

Abstract:

In this research, different methods based on machine/deep learning algorithms are presented for the classification and diagnosis of patients with mental disorders such as Alzheimer's disease. For this purpose, the signals obtained from 32 unipolar electrodes by non-invasive EEG were examined, and their basic properties were extracted. More specifically, several well-known machine learning classifiers have been used, i.e., support vector machine (SVM), Bayesian linear discriminant analysis (BLDA), decision tree (DT), Gaussian Naïve Bayes (GNB), K-nearest neighbor (KNN), and convolutional neural network (CNN). A total of 668 patients from five different hospitals were studied over the period from 2011 to 2021. The best accuracy obtained was around 93% in both ADM and ADA classifications. It can be concluded that such classification will enable the training of algorithms that can be used to identify and classify different mental disorders with high accuracy.
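As a toy illustration of one of the classifiers compared (KNN), the sketch below classifies two-dimensional synthetic feature vectors; the features and labels are invented stand-ins for properties extracted from the EEG channels, not the study's data or pipeline.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector x by majority vote among the k nearest
    training examples (Euclidean distance)."""
    neighbors = sorted(
        (math.dist(xi, x), yi) for xi, yi in zip(train_X, train_y)
    )[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Synthetic "features" standing in for EEG-derived properties
# (hypothetical values, for illustration only).
features = [(0.2, 0.1), (0.1, 0.3), (0.3, 0.2),
            (2.1, 2.0), (1.9, 2.2), (2.2, 1.8)]
labels = ["control"] * 3 + ["AD"] * 3
```

The same train/predict interface generalizes to the other classifiers in the comparison, which is what makes this kind of side-by-side benchmark straightforward to set up.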

Keywords: alzheimer, machine learning, deep learning, EEG

Procedia PDF Downloads 106
13927 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and the chemical and physical phenomena of mixtures involving polymers are sometimes poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. These methods can learn the trends of a process without any knowledge of its particular physical and chemical laws, and are therefore useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate carried out in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number average molecular weight, and weight average molecular weight. This process is associated with nonlinear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which selects more samples in the regions where the values vary most. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor, and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
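The idea of concentrating samples where the response varies most can be sketched with a simple greedy refinement scheme; this is a generic illustration of the principle, not the authors' specific adaptive sampling algorithm.

```python
import math

def adaptive_sample(f, lo, hi, n_init=5, n_total=20):
    """Greedy adaptive sampling: start from a uniform grid, then
    repeatedly bisect the interval whose endpoint values differ most,
    so regions where f varies strongly receive more samples."""
    xs = [lo + i * (hi - lo) / (n_init - 1) for i in range(n_init)]
    ys = [f(x) for x in xs]
    while len(xs) < n_total:
        # index of the adjacent pair with the largest change in f
        i = max(range(len(xs) - 1), key=lambda j: abs(ys[j + 1] - ys[j]))
        xm = 0.5 * (xs[i] + xs[i + 1])
        xs.insert(i + 1, xm)
        ys.insert(i + 1, f(xm))
    return xs, ys
```

For a steep response such as tanh(10x) on [-1, 1], most of the added points end up near the transition at x = 0, mimicking how samples would be concentrated around the sharp gel-effect region of a conversion curve.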

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 291
13926 Axial Load Capacity of Drilled Shafts from In-Situ Test Data at Semani Site, in Albania

Authors: Neritan Shkodrani, Klearta Rrushi, Anxhela Shaha

Abstract:

Generally, the design axial load capacity of deep foundations is based on data provided by field tests, such as the SPT (Standard Penetration Test) and CPT (Cone Penetration Test). This paper reports the results of an axial load capacity analysis of drilled shafts at a construction site at Semani, in Fier county, Fier prefecture, Albania. The analyses are based on data from 416 SPT tests and 12 CPTU tests, carried out at the site in 12 boreholes (10 borings to a depth of 30.0 m and 2 borings to a depth of 80.0 m). The considered foundation diameters range from 0.5 m to 2.5 m, and the foundation embedment length is fixed at 25 m. SPT-based analytical methods from Japanese design practice (Building Standard Law of Japan) and the CPT-based analytical method of Eslami and Fellenius are used to obtain the ultimate axial load capacity of the drilled shafts. The considered drilled shaft (25 m long and 0.5-2.5 m in diameter) is analyzed for the soil conditions of each borehole. The values obtained from the sets of calculations are shown in different charts. The axial load capacity values acquired from the SPT and CPTU data are then compared, and some conclusions are drawn regarding the mentioned calculation methods.
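SPT-based methods of this kind generally sum an end-bearing term and a shaft-friction term, each scaled from SPT N-values. The sketch below shows that structure only; the coefficients alpha and beta are illustrative placeholders, not the values prescribed by the Japanese code or by Eslami and Fellenius.

```python
import math

def drilled_shaft_capacity(diameter, n_tip, layers, alpha=100.0, beta=3.3):
    """Generic SPT-based ultimate axial capacity sketch:
        Ru = (alpha * N_tip) * A_tip + sum((beta * N_i) * perimeter * t_i)
    with unit stresses in kPa, dimensions in m, and the result in kN.
    layers: list of (SPT N-value, layer thickness in m) along the shaft.
    alpha and beta are hypothetical placeholders; design codes prescribe
    their own coefficients and upper limits for each soil type."""
    area_tip = math.pi * diameter ** 2 / 4.0     # tip area (m^2)
    perimeter = math.pi * diameter               # shaft perimeter (m)
    q_end = alpha * n_tip * area_tip             # end bearing (kN)
    q_shaft = sum(beta * n * perimeter * t for n, t in layers)  # friction (kN)
    return q_end + q_shaft
```

Running the same function over the layer profile of each borehole, for each trial diameter, reproduces the kind of parametric charts described in the abstract.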

Keywords: deep foundations, drilled shafts, axial load capacity, ultimate load capacity, allowable load capacity, SPT test, CPTU test

Procedia PDF Downloads 93
13925 Research on Community-based Nature Education Design at the Gateway Communities of National Parks

Authors: Yulin Liang

Abstract:

Against the background of ecological protection, nature education is an effective way for people to understand nature. At the same time, it is a new means of sustainable eco-tourism development that can improve the functions of China's protected areas and create new business formats for the development of national parks. This study takes national park gateway communities as its research object and uses literature review, inductive reasoning, and other research methods to trace the development of nature education in China and the research progress on nature education design in national park gateway communities. Finally, it discusses how gateway communities can use nature education to transform their development approaches, providing a theoretical and practical basis for the development of gateway communities in national parks.

Keywords: nature education, gateway communities, national park, sustainable development

Procedia PDF Downloads 47
13924 Active Learning Methods in Mathematics

Authors: Daniela Velichová

Abstract:

Plenty of ideas on how to adopt active learning methods in education are available nowadays. Mathematics is a subject where the active involvement of students is particularly required in order to achieve desirable results in terms of sustainable knowledge and deep understanding. The present article is based on the outcomes of the Erasmus+ project DrIVE-MATH, which aimed at developing a novel, integrated framework for teaching mathematics classes in engineering courses at the university level. It is fundamental for students from the early years of their academic life to have agile minds. They must be prepared to adapt to their future working environments, where enterprises' views are always evolving, where everyone collaborates in teams, and where relations between peers serve the well-being of the whole: workers and company profit. This reality imposes new requirements on higher education in terms of adopting different pedagogical methods, such as the project-based and active-learning methods used within the course curricula. Active learning methodologies are regarded as an effective way to prepare students to meet the challenges posed by enterprises and to help them build critical thinking, analytic reasoning, and insight into complex problems from different perspectives. Fostering learning-by-doing activities in the pedagogical process can help students achieve learning independence, as they can acquire deeper conceptual understanding by experimenting with abstract concepts in a more interesting, useful, and meaningful way. Clear information about learning outcomes and goals can help students take more responsibility for their learning results. The active learning methods implemented by the project team members in their teaching practice, eduScrum and Jigsaw in particular, proved to provide better support for scientific and soft skills than classical teaching methods.
The eduScrum method enables teachers to create a working environment that stimulates students' working habits and self-initiative, as students become aware of their responsibilities within the team, of their own acquired knowledge, and of their ability to solve problems independently, though in collaboration with other team members. This method enhances collaborative learning: students work in teams towards a common goal, knowledge acquisition, while interacting with each other and being evaluated individually. Teams of 4-5 students work together on a list of problems, the sprint; each member is responsible for solving one of them, while the group leader, the master, is responsible for the whole team. A similar principle underlies the Jigsaw technique, where the classroom activity makes students dependent on each other to succeed. Students are divided into groups, and assignments are split into pieces, which the whole group must assemble to complete the (jigsaw) puzzle. In this paper, an analysis of students' perceptions concerning the achievement of deeper conceptual understanding in mathematics and the development of soft skills, such as self-motivation, critical thinking, flexibility, leadership, responsibility, teamwork, negotiation, and conflict management, is presented. Some new challenges brought by introducing active learning methods into basic mathematics courses are discussed, and a few examples of sprints developed and used in teaching basic maths courses at technical universities are also presented.

Keywords: active learning methods, collaborative learning, conceptual understanding, eduScrum, Jigsaw, soft skills

Procedia PDF Downloads 38
13923 Evaluation of Antioxidants in Medicinal Plant Limoniastrum guyonianum

Authors: Assia Belfar, Mohamed Hadjadj, Messaouda Dakmouche, Zineb Ghiaba

Abstract:

Introduction: This study aims at phytochemical screening, extraction of the active compounds, and estimation of the antioxidant effectiveness of the desert medicinal plant Limoniastrum guyonianum (Zeïta) from southern Algeria. Methods: Total phenolic content and total flavonoid content were determined using the Folin-Ciocalteu and aluminum chloride colorimetric methods, respectively. The total antioxidant capacity was estimated by the following methods: DPPH (1,1-diphenyl-2-picrylhydrazyl) radical scavenging and the reducing power assay. Results: Phytochemical screening of the plant part revealed the presence of phenols, saponins, flavonoids, and tannins, while alkaloids and terpenoids were absent. The acetonic extract of L. guyonianum was extracted successively with ethyl acetate and butanol. Extraction yield varied widely, ranging from 0.9425% to 11.131%. The total phenolic content ranged from 53.33 to 672.79 mg GAE/g DW. The total flavonoid concentrations varied from 5.45 to 21.71 mg/100 g. IC50 values ranged from 0.02 ± 0.0004 to 0.13 ± 0.002 mg/ml. All extracts showed very good ferric reducing power; the highest was in the butanol fraction (23.91 mM), more effective than BHA, BHT, and vitamin C. Conclusions: This study demonstrated that the acetonic extract of L. guyonianum contains a considerable quantity of phenolic compounds and possesses good antioxidant activity. It can be used as an easily accessible source of natural antioxidants, as a possible food supplement, and in the pharmaceutical industry.
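The arithmetic behind the reported DPPH figures is simple: scavenging activity is computed from control and sample absorbances, and IC50 is the concentration giving 50% inhibition. A minimal sketch (using linear interpolation between bracketing points; the study does not state its exact fitting procedure, and the numbers below are made up for illustration):

```python
def percent_inhibition(a_control, a_sample):
    """DPPH radical scavenging activity (%) from absorbance readings."""
    return (a_control - a_sample) / a_control * 100.0

def ic50_linear(concs, inhibitions):
    """Estimate IC50 (same units as concs) by linear interpolation
    between the two concentrations bracketing 50% inhibition.
    Assumes concs is sorted ascending with increasing inhibition."""
    points = list(zip(concs, inhibitions))
    for (c0, i0), (c1, i1) in zip(points, points[1:]):
        if i0 <= 50.0 <= i1:
            return c0 + (50.0 - i0) * (c1 - c0) / (i1 - i0)
    raise ValueError("50% inhibition is not bracketed by the data")
```

A lower IC50 means less extract is needed to quench half the radical, i.e., stronger antioxidant activity, which is why the extracts with IC50 near 0.02 mg/ml outperform those near 0.13 mg/ml.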

Keywords: limoniastrum guyonianum, phenolics compounds, flavonoid compound, antioxidant activity

Procedia PDF Downloads 328
13922 Challenges and Solutions to Human Capital Development in Thailand

Authors: Nhabhat Chaimongkol

Abstract:

Human capital is one of the factors vital for economic growth, especially as humans will face ever more forms of disruptive technology in the near future. There is therefore a need to develop human capital in order to overcome the current uncertainty in the global economy and the future of jobs. In recent years, Thailand has devoted increasing attention to developing its human capital. The Thai government has placed this issue on its national agenda as part of its 20-year national strategy. Currently, there are multiple challenges and solutions regarding this issue. This study aims to find out what the challenges and solutions to human capital development in Thailand are. The study uses mixed methods, combining quantitative and qualitative research. The results show that while Thailand has many plans to develop human capital, it still lacks the actions and integration required to achieve its goals. Finally, the challenges and solutions are discussed in detail.

Keywords: challenges, human capital, solutions, Thailand

Procedia PDF Downloads 155
13921 Determination of Non-CO2 Greenhouse Gas Emission in Electronics Industry

Authors: Bong Jae Lee, Jeong Il Lee, Hyo Su Kim

Abstract:

Both developed and developing countries adopted the decision to join the Paris Agreement to reduce greenhouse gas (GHG) emissions at the Conference of the Parties (COP) 21 meeting in Paris. As a result, they must submit their Intended Nationally Determined Contributions (INDCs) by 2020, and each country will be assessed on its performance in reducing GHG; thereafter, every five years, each shall propose a reduction target higher than the previous one. Therefore, an accurate method for calculating greenhouse gas emissions is essential as a rationale for implementing GHG reduction measures based on the reduction targets. Non-CO2 GHGs (CF4, NF3, N2O, SF6, and so on) are widely used in the fabrication processes of semiconductor manufacturing and in the etching/deposition processes of display manufacturing. The Global Warming Potential (GWP) values of non-CO2 gases are much higher than that of CO2, which means they have a greater per-unit effect on global warming. Therefore, GHG calculation methods for the electronics industry are provided by the Intergovernmental Panel on Climate Change (IPCC) and the U.S. Environmental Protection Agency (EPA), and they will be discussed at the ISO/TC 146 meeting. As discussed above, precision and accuracy in calculating non-CO2 GHG emissions are becoming more important, so this study discusses the implications of the calculation methods by comparing those of the IPCC and the EPA. In conclusion, after analyzing the two methods, the EPA method is more detailed and also provides a calculation for N2O. Regarding the default emission factors, the IPCC provides more conservative results than the EPA: the IPCC factor was developed for calculating national GHG emissions, while the EPA factor was developed specifically for the U.S., to address its environmental issues.
The semiconductor factory 'A' measured F-gas emissions according to the EPA Destruction and Removal Efficiency (DRE) protocol and estimated its own DRE, which was observed to be higher than the default DRE factors of the IPCC and the EPA. Therefore, each country can improve its GHG emission calculation by developing its own emission factors (where possible) when reporting its Nationally Determined Contributions (NDCs). Acknowledgements: This work was supported by the Korea Evaluation Institute of Industrial Technology (No. 10053589).
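The core arithmetic of such inventories is a weighted sum: each gas mass is multiplied by its 100-year GWP, and abatement is credited through the DRE. The sketch below uses AR4-era GWP values; which factors an inventory must actually apply depends on the IPCC assessment report and protocol it follows.

```python
# 100-year GWP values as tabulated in IPCC AR4; later assessment
# reports revise these numbers.
GWP_100 = {"CF4": 7390.0, "NF3": 17200.0, "SF6": 22800.0, "N2O": 298.0}

def co2_equivalent(emissions_kg):
    """Convert a dict of {gas: mass emitted in kg} to kg CO2-equivalent."""
    return sum(mass * GWP_100[gas] for gas, mass in emissions_kg.items())

def emitted_after_abatement(gas_kg, dre):
    """Mass reaching the atmosphere when an abatement device destroys a
    fraction dre (destruction and removal efficiency) of the gas fed to it."""
    return gas_kg * (1.0 - dre)
```

Because the GWP multipliers are in the thousands, even a modest improvement in the measured DRE translates into a large change in reported CO2-equivalents, which is why factory 'A' benefits from measuring its own DRE rather than using the default.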

Keywords: non-CO2 GHG, GHG emission, electronics industry, measuring method

Procedia PDF Downloads 271
13920 The Effectiveness of Concept Mapping as a Tool for Developing Critical Thinking in Undergraduate Medical Education: A BEME Systematic Review: BEME Guide No. 81

Authors: Marta Fonseca, Pedro Marvão, Beatriz Oliveira, Bruno Heleno, Pedro Carreiro-Martins, Nuno Neuparth, António Rendas

Abstract:

Background: Concept maps (CMs) visually represent hierarchical connections among related ideas. They foster logical organization and clarify the relationships between ideas, potentially aiding medical students in critical thinking (thinking clearly and rationally about what to do or what to believe). However, claims about the use of CMs in undergraduate medical education are inconsistent. Our three research questions are: 1) What studies have been published on concept mapping in undergraduate medical education? 2) What was the impact of CMs on students' critical thinking? 3) How and why have these interventions had an educational impact? Methods: Eight databases were systematically searched (plus a manual search and an additional search). After eliminating duplicate entries, titles/abstracts and full texts were independently screened by two authors. Data extraction and quality assessment of the studies were also performed independently by two authors. Qualitative and quantitative data were integrated using mixed methods. The results were reported using the structured approach to the reporting in healthcare education of evidence synthesis (STORIES) statement and BEME guidance. Results: Thirty-nine studies from 26 journals were included (19 quantitative, 8 qualitative, and 12 mixed-methods studies). CMs were considered a tool to promote critical thinking, both in the perception of students and tutors and in the assessment of students' knowledge and/or skills. In addition to their role as facilitators of knowledge integration and critical thinking, CMs were considered both teaching and learning methods. Conclusions: CMs are teaching and learning tools which seem to help medical students develop critical thinking, owing to the flexibility of the tool as a facilitator of knowledge integration and as a learning and teaching method.
The wide range of contexts, purposes, and variations in how CMs and the instruments to assess critical thinking are used increases our confidence that the positive effects are consistent.

Keywords: concept map, medical education, undergraduate, critical thinking, meaningful learning

Procedia PDF Downloads 82
13919 Morphological and Molecular Studies (ITS1) of Hydatid Cysts in Slaughtered Sheep in Mashhad Area

Authors: G. R. Hashemi Tabar, G. R. Razmi, F. Mirshekar

Abstract:

Echinococcus granulosus comprises multiple strains, designated G1 to G9, each associated with a particular intermediate host. The morphology, epidemiology, treatment and control of these strains differ. Many morphological and molecular methods exist to differentiate Echinococcus strains; using both together, however, provides better information for identifying each strain. The aim of this study was to identify the Echinococcus granulosus strain of hydatid cysts in slaughtered sheep in the Mashhad area using morphological and molecular methods. In the present study, livers and lungs infected with hydatid cysts were collected and transferred to the laboratory. The hydatid cyst fluid was extracted, and the morphological characters of the rostellar hooks of protoscolices were measured using an ocular micrometer. The total length of large hooks, blade length of large hooks, total length of small hooks, blade length of small hooks, and number of hooks per protoscolex were 23±0.3 μm, 11.7±0.5 μm, 19.3±1.1 μm, 8±1.1 μm, and 33.7±0.7, respectively. In the molecular section of the study, DNA of each sample was extracted with the MBST kit, followed by PCR using specific primers (EgF, EgR) which amplify a fragment of the ITS1 gene. The PCR product was digested with the Bsh1236I enzyme. Based on the PCR-RFLP pattern (four bands), the G1, G2 and G3 strains of Echinococcus granulosus were candidates. The three strains were then differentiated by sequencing analysis, and the G1 strain was diagnosed. The agreement between the molecular results and the morphometric characters of the rostellar hooks confirmed the presence of the G1 strain of Echinococcus in the slaughtered sheep of the Mashhad area.
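The PCR-RFLP step can be illustrated computationally. The sketch below is a hedged toy example, not the authors' protocol: it assumes the commonly listed CG^CG recognition site for Bsh1236I, and the amplicon string is an invented placeholder rather than a real ITS1 sequence.

```python
# Toy in-silico digest: predict RFLP fragment sizes for a Bsh1236I cut.
# Assumption: Bsh1236I cuts bluntly at CG^CG. The amplicon below is an
# invented placeholder, not a real ITS1 sequence.

def bsh1236i_fragments(seq):
    """Return fragment lengths after cutting seq at every CG^CG site."""
    site, offset = "CGCG", 2  # cut falls between the two CG pairs
    cuts, i = [], seq.find(site)
    while i != -1:
        cuts.append(i + offset)
        i = seq.find(site, i + 1)
    bounds = [0] + cuts + [len(seq)]
    return [bounds[k + 1] - bounds[k] for k in range(len(bounds) - 1)]

amplicon = "ATGC" * 10 + "CGCG" + "TTAA" * 5 + "CGCG" + "GGCC" * 8
print(bsh1236i_fragments(amplicon))  # → [42, 24, 34]
```

A strain-specific band pattern on the gel would then correspond to a distinct list of predicted fragment sizes.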

Keywords: Echinococcus granulosus, Hydatid cyst, PCR, sheep

Procedia PDF Downloads 500
13918 Creative Application of Cognitive Linguistics and Communicative Methods to Eliminate Common Learners' Mistakes in Academic Essay Writing

Authors: Ekaterina Lukianchenko

Abstract:

This article sums up six years of experience of teaching English as a foreign language to over 900 university students at MGIMO (Moscow University of International Relations, Russia), all of them native speakers of Russian aged 16 to 23. By combining a modern communicative approach to teaching with cognitive linguistics theories, one can deal more effectively with deeply rooted mistakes that individual students make and that conventional methods have failed to eliminate. If language items are understood as concepts and frames, and classroom activities as meaningful parts of language competence development, this might help to solve such problems as incorrect use of words, unsuitable register, and confused tenses, as well as logical or structural mistakes, and even certain psychological issues concerning essay writing. Along with classic teaching methods, such classroom practice includes plenty of interaction between students: playing special classroom games aimed at eliminating particular mistakes, working in pairs and groups, and integrating all skills in one class. The main conclusion the author draws is that academic essay writing classes demand a balanced plan. This should not only include writing as such but also feature elements of listening, reading, and speaking activities specifically chosen according to the skills and language students will need to write the particular type of essay.

Keywords: academic essay writing, creative teaching, cognitive linguistics, competency-based approach, communicative language teaching, frame, concept

Procedia PDF Downloads 279
13917 Evaluating Multiple Diagnostic Tests: An Application to Cervical Intraepithelial Neoplasia

Authors: Areti Angeliki Veroniki, Sofia Tsokani, Evangelos Paraskevaidis, Dimitris Mavridis

Abstract:

The plethora of diagnostic test accuracy (DTA) studies has led to the increased use of systematic reviews and meta-analyses of DTA studies. Clinicians and healthcare professionals often consult DTA meta-analyses to make informed decisions regarding the optimum test to choose and use in a given setting. For example, human papilloma virus (HPV) DNA, mRNA, and cytology tests can be used for the diagnosis of cervical intraepithelial neoplasia grade 2+ (CIN2+). But which test is the most accurate? Studies directly comparing test accuracy are not always available, and comparisons between multiple tests create a network of DTA studies that can be synthesized through a network meta-analysis of diagnostic tests (DTA-NMA). The aim is to summarize the DTA-NMA methods for at least three index tests presented in the methodological literature. We illustrate the application of the methods using a real data set for the comparative accuracy of HPV DNA, HPV mRNA, and cytology tests for cervical cancer. A search was conducted in PubMed, Web of Science, and Scopus from inception until the end of July 2019 to identify full-text research articles that describe a DTA-NMA method for three or more index tests. Since the joint classification of the results of one index test against those of another, amongst those with and without the target condition, is rarely reported in DTA studies, only methods requiring the 2x2 tables of the results of each index test against the reference standard were included. Studies of any design published in English were eligible for inclusion. Relevant unpublished material was also included. Ten relevant studies were finally included, and their methodology was evaluated. The DTA-NMA methods that have been presented in the literature are described, together with their advantages and disadvantages.
In addition, using 37 studies for cervical cancer obtained from a published Cochrane review as a case study, an application of the identified DTA-NMA methods to determine the most promising test (in terms of sensitivity and specificity) for use as the best screening test to detect CIN2+ is presented. In conclusion, different approaches to the comparative DTA meta-analysis of multiple tests may lead to different results and hence may influence decision-making. Acknowledgment: This research is co-financed by Greece and the European Union (European Social Fund-ESF) through the Operational Programme «Human Resources Development, Education and Lifelong Learning 2014-2020» in the context of the project “Extension of Network Meta-Analysis for the Comparison of Diagnostic Tests” (MIS 5047640).
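Because the included methods all start from per-study 2x2 tables of each index test against the reference standard, the basic accuracy quantities can be sketched as follows. This is a deliberately naive pooling illustration with invented counts, not one of the DTA-NMA models reviewed, which model between-study heterogeneity rather than simply summing tables.

```python
# Sensitivity and specificity from 2x2 tables (index test vs. reference
# standard); the counts are invented for one hypothetical index test.

def sens_spec(tp, fp, fn, tn):
    return tp / (tp + fn), tn / (tn + fp)

# (TP, FP, FN, TN) per study
studies = [(90, 10, 15, 85), (45, 5, 5, 45), (120, 20, 10, 150)]

tp, fp, fn, tn = (sum(s[i] for s in studies) for i in range(4))
sens, spec = sens_spec(tp, fp, fn, tn)
print(round(sens, 3), round(spec, 3))  # → 0.895 0.889
```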

Keywords: colposcopy, diagnostic test, HPV, network meta-analysis

Procedia PDF Downloads 124
13916 Real-Time Nonintrusive Heart Rate Measurement: Comparative Case Study of LED Sensorics' Accuracy and Benefits in Heart Monitoring

Authors: Goran Begović

Abstract:

In recent years, many researchers have focused on non-intrusive measuring methods for human biosignals. These methods provide solutions for everyday use, whether for health monitoring or fine-tuning a workout routine. One of the biggest issues with these solutions is that sensor accuracy is highly variable due to many factors, such as ambient light, skin color diversity, etc. That is why we wanted to explore different outcomes under those kinds of circumstances in order to find the optimal algorithm(s) for extracting heart rate (HR) information. The optimization of such algorithms can enable wider, cheaper, and safer home health monitoring, reducing how often medical professionals must be visited to observe heart irregularities. In this study, we explored the accuracy of infrared (IR), red, and green LED sensorics in a controlled environment and compared the results with a medically accurate ECG monitoring device.
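The HR-extraction step being compared can be sketched in its simplest form. The toy peak counter below (invented sampling rate, clean synthetic signal) only illustrates the principle; the algorithms evaluated in such studies additionally handle noise, ambient light, and motion artifacts.

```python
# Naive HR estimate from a PPG-like LED signal: count local maxima above
# the mean and convert to beats per minute. The signal and sampling rate
# are synthetic stand-ins for real sensor data.
import math

def estimate_hr(signal, fs):
    mean = sum(signal) / len(signal)
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > mean and signal[i - 1] < signal[i] >= signal[i + 1]]
    return 60.0 * len(peaks) / (len(signal) / fs)

fs = 50                                    # Hz, assumed sampling rate
t = [i / fs for i in range(fs * 10)]       # 10 s of samples
sig = [math.sin(2 * math.pi * 1.2 * ti) for ti in t]  # 1.2 Hz pulse ≈ 72 bpm
print(round(estimate_hr(sig, fs)))  # → 72
```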

Keywords: data science, ECG, heart rate, holter monitor, LED sensors

Procedia PDF Downloads 110
13915 A Review of Effective Gene Selection Methods for Cancer Classification Using Microarray Gene Expression Profile

Authors: Hala Alshamlan, Ghada Badr, Yousef Alohali

Abstract:

Cancer is a dreadful disease that causes a considerable number of deaths in humans. DNA microarray-based gene expression profiling has emerged as an efficient technique for cancer classification, as well as for diagnosis, prognosis, and treatment purposes. In recent years, the DNA microarray technique has gained more attention in both scientific and industrial fields. It is important to determine the informative genes that cause cancer in order to improve early diagnosis and give effective chemotherapy treatment. To gain deep insight into the cancer classification problem, it is necessary to take a closer look at the proposed gene selection methods, which we believe should be an integral preprocessing step for cancer classification. Furthermore, finding an accurate gene selection method is a very significant issue in cancer classification because it reduces the dimensionality of the microarray dataset and selects informative genes. In this paper, we classify and review the state-of-the-art gene selection methods. We proceed by evaluating the performance of each gene selection approach based on classification accuracy and the number of informative genes. In our evaluation, we use four benchmark microarray datasets for cancer diagnosis (leukemia, colon, lung, and prostate). In addition, we compare the performance of the gene selection methods to identify an effective method that can select a small set of marker genes and ensure high cancer classification accuracy. To the best of our knowledge, this is the first attempt to compare gene selection approaches for cancer classification using microarray gene expression profiles.
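One classical filter-style criterion of the kind reviewed here is the signal-to-noise ratio (SNR). The sketch below ranks genes by SNR on invented toy expression values, not on any of the benchmark microarray datasets; it is an illustration of the idea, not the paper's evaluation pipeline.

```python
# Filter-style gene selection sketch: rank genes by signal-to-noise ratio
# between two classes. Expression values below are invented toy data.
from statistics import mean, pstdev

def snr(class0, class1):
    """|mu0 - mu1| / (sigma0 + sigma1): higher means more discriminative."""
    return abs(mean(class0) - mean(class1)) / (pstdev(class0) + pstdev(class1))

# expression of 3 genes across 4 tumor and 4 normal samples
genes = {
    "gene_a": ([5.1, 5.0, 4.9, 5.2], [5.0, 5.1, 5.2, 4.8]),  # uninformative
    "gene_b": ([8.0, 8.2, 7.9, 8.1], [3.1, 2.9, 3.0, 3.2]),  # informative
    "gene_c": ([6.0, 5.5, 6.5, 5.8], [5.0, 5.6, 4.9, 5.3]),
}
ranked = sorted(genes, key=lambda g: snr(*genes[g]), reverse=True)
print(ranked[0])  # → gene_b
```

A marker panel would then be the top-k genes of such a ranking, fed to a downstream classifier.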

Keywords: gene selection, feature selection, cancer classification, microarray, gene expression profile

Procedia PDF Downloads 433
13914 A Descriptive Study on Comparison of Maternal and Perinatal Outcome of Twin Pregnancies Conceived Spontaneously and by Assisted Conception Methods

Authors: Aishvarya Gupta, Keerthana Anand, Sasirekha Rengaraj, Latha Chathurvedula

Abstract:

Introduction: Advances in assisted reproductive technology and an increase in the proportion of infertile couples have both contributed to the steep increase in the incidence of twin pregnancies in past decades. Maternal and perinatal complications are higher in twins than in singleton pregnancies. Studies comparing the maternal and perinatal outcomes of ART twin pregnancies versus spontaneously conceived twin pregnancies report heterogeneous results, making it unclear whether the complications are due to twin gestation per se or to the assisted reproductive techniques. The present study aims to compare maternal and perinatal outcomes between twin pregnancies conceived spontaneously and those conceived by assisted methods, so that targeted steps can be undertaken to improve the outcomes of twins. Objectives: To study perinatal and maternal outcomes in twin pregnancies conceived spontaneously as well as with assisted methods and to compare the outcomes between the two groups. Setting: Women delivering at JIPMER (a tertiary care institute), Pondicherry. Population: 380 women with twin pregnancies who delivered at JIPMER between June 2015 and March 2017 were included in the study. Methods: The study population was divided into two cohorts: one conceived by spontaneous conception and the other by assisted reproductive methods. The association of various maternal and perinatal outcomes with the method of conception was assessed using the chi-square test or Student's t test as appropriate. Multiple logistic regression analysis was done to assess the independent association of assisted conception with maternal outcomes after adjusting for age, parity and BMI, and with perinatal outcomes after adjusting for age, parity, BMI, chorionicity, gestational age at delivery and the presence of hypertension or gestational diabetes in the mother.
A p value of < 0.05 was considered significant. Result: There was an increased proportion of women with GDM (21% vs. 4.29%) and premature rupture of membranes (35% vs. 22.85%) in the assisted conception group, and more anemic women in the spontaneous group (71.27% vs. 55.1%). However, assisted conception per se increased the incidence of GDM among twin gestations (OR 3.39, 95% CI 1.34–8.61) and did not influence any of the other maternal outcomes. Among the perinatal outcomes, assisted conception per se increased the risk of having very preterm (<32 weeks) neonates (OR 3.013, 95% CI 1.432–6.337). The mean birth weight did not differ significantly between the two groups (p = 0.429). Though there was a higher proportion of babies admitted to the NICU in the assisted conception group (48.48% vs. 36.43%), assisted conception per se did not increase the risk of admission to the NICU (OR 1.23, 95% CI 0.76–1.98). There was no significant difference in perinatal mortality rates between the two groups (p = 0.829). Conclusion: Assisted conception per se increases the risk of developing GDM in women with twin gestation and increases the risk of delivering very preterm babies. Hence, measures should be taken to ensure appropriate screening for GDM and suitable neonatal care in such pregnancies.
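For readers unfamiliar with the reported effect measure, the sketch below shows how an unadjusted odds ratio and its 95% confidence interval are computed from a 2x2 table. The counts are invented for illustration; the ORs in the abstract come from adjusted logistic regression, so they will not reproduce exactly.

```python
# Unadjusted odds ratio with a Wald-type 95% CI from a 2x2 table.
# Counts below are invented, not the study's data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = outcome yes/no in exposed; c,d = outcome yes/no in unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# e.g. GDM among assisted-conception vs. spontaneous twin pregnancies
or_, lo, hi = odds_ratio_ci(21, 79, 12, 268)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 5.94 2.8 12.6
```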

Keywords: assisted conception, maternal outcomes, perinatal outcomes, twin gestation

Procedia PDF Downloads 187
13913 A New Approach to Image Stitching of Radiographic Images

Authors: Somaya Adwan, Rasha Majed, Lamya'a Majed, Hamzah Arof

Abstract:

In order to produce images of whole body parts, X-rays of different portions of the body are assembled using image stitching methods. A new method for image stitching that jointly exploits a feature-based method and a direct-based method to identify and merge pairs of X-ray medical images is presented in this paper. The performance of the proposed method based on this hybrid approach is investigated, and its ability to stitch and merge overlapping pairs of images is demonstrated. Our proposed method displays comparable, if not superior, performance to other feature-based methods mentioned in the literature on the standard databases. These results are promising and demonstrate the potential of the proposed method for further development to tackle more advanced stitching problems.
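The direct (intensity-based) half of such a hybrid can be illustrated in one dimension. This toy sketch finds the overlap between two strips by minimizing the mean squared intensity difference; real radiographic stitching operates on 2-D images and adds the feature-based matching the paper describes.

```python
# Direct-method stitching sketch in 1-D: try candidate overlap widths and
# keep the one with the smallest mean squared difference. Strips are toy
# intensity profiles, not real radiographs.

def best_overlap(left, right, min_overlap=3):
    """Overlap width w minimizing disagreement of left's tail and right's head."""
    best_w, best_score = 0, float("inf")
    for w in range(min_overlap, min(len(left), len(right)) + 1):
        score = sum((a - b) ** 2 for a, b in zip(left[-w:], right[:w])) / w
        if score < best_score:
            best_score, best_w = score, w
    return best_w

left = [0, 1, 2, 3, 4, 5, 6, 7]
right = [5, 6, 7, 8, 9, 10]
w = best_overlap(left, right)
stitched = left + right[w:]   # drop the duplicated overlap
print(w, stitched)            # overlap width and stitched strip
```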

Keywords: image stitching, direct based method, panoramic image, X-ray

Procedia PDF Downloads 525
13912 Behaviour of Non-local Correlations and Quantum Information Theoretic Measures in Frustrated Molecular Wheels

Authors: Amit Tribedi

Abstract:

Genuine quantumness present in quantum systems is the resource for implementing quantum information and computation protocols which can outperform their classical counterparts. These quantumness measures encompass non-local ones, known as quantum entanglement (QE), and quantum information theoretic (QIT) ones, e.g. quantum discord (QD). In this paper, some well-known measures of QE and QD in some wheel-like frustrated molecular magnetic systems have been studied. One of the systems has already been synthesized using coordination chemistry, and the other is hypothetical; in these systems, the dominant interaction is the spin-spin exchange interaction. Exact analytical methods and exact numerical diagonalization methods have been used. Some counter-intuitive, non-trivial features indicated by the behaviour of the correlations and the QIT measures have been found, such as non-monotonicity of quantum correlations with temperature and persistence of multipartite entanglement over bipartite entanglement. The measures, being operational ones, can be used to realize the resource of quantumness in experiments.
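As a concrete, much simpler example of the entanglement measures involved: for a pure two-qubit state a|00> + b|01> + c|10> + d|11>, the concurrence reduces to C = 2|ad - bc|. The thermal mixed states of the molecular wheels require the full Wootters construction instead, so this is only a limiting-case sketch.

```python
# Concurrence of a pure two-qubit state a|00> + b|01> + c|10> + d|11>:
# C = 2|ad - bc|, ranging from 0 (product state) to 1 (maximally entangled).
import math

def concurrence_pure(a, b, c, d):
    return 2 * abs(a * d - b * c)

s = 1 / math.sqrt(2)
print(concurrence_pure(s, 0, 0, s))   # Bell state (maximally entangled)
print(concurrence_pure(1, 0, 0, 0))   # product state |00>
```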

Keywords: 0D Magnets, discord, entanglement, frustration

Procedia PDF Downloads 212
13911 Slope Stability and Landslides Hazard Analysis, Limitations of Existing Approaches, and a New Direction

Authors: Alisawi Alaa T., Collins P. E. F.

Abstract:

The analysis and evaluation of slope stability and landslide hazards are critically important in civil engineering projects and broader considerations of safety. The level of slope stability risk should be identified due to its significant and direct financial and safety effects. Slope stability hazard analysis is performed considering static and/or dynamic loading circumstances. To reduce and/or prevent the failure hazard caused by landslides, a sophisticated and practical hazard analysis method using advanced constitutive modeling should be developed and linked to an effective solution that corresponds to the specific type of slope stability and landslide failure risk. Previous studies on slope stability analysis methods identify the failure mechanism and its corresponding solution. The commonly used approaches include limit equilibrium methods, empirical approaches for rock slopes (e.g., slope mass rating and Q-slope), finite element or finite difference methods, and distinct element codes. This study presents an overview and evaluation of these analysis techniques. Contemporary source materials are used to examine these various methods on the basis of hypotheses, factor of safety estimation, soil types, load conditions, and analysis conditions and limitations. Limit equilibrium methods play a key role in assessing the level of slope stability hazard. The slope stability safety level can be defined by comparing the shear stress with the shear strength. The slope is considered stable when the movement resistance forces are greater than those that drive the movement, with a factor of safety (the ratio of the resisting forces to the driving forces) greater than 1.00. However, popular and practical methods, including limit equilibrium approaches, are not effective when the slope experiences complex failure mechanisms, such as progressive failure, liquefaction, internal deformation, or creep. The present study represents the first episode of an ongoing project that involves the identification of the types of landslide hazards; assessment of the level of slope stability hazard; development of a sophisticated and practical hazard analysis method; linkage of the failure type of specific landslide conditions to the appropriate solution; and application of an advanced computational method for mapping slope stability properties in the United Kingdom and elsewhere through a geographical information system (GIS) and the inverse distance weighted (IDW) spatial interpolation technique. This study investigates and assesses the different analysis and solution techniques to enhance knowledge of the mechanism of slope stability and landslide hazard analysis and to determine the available solutions for each potential landslide failure risk.
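The limit-equilibrium idea can be made concrete with the classical infinite-slope expression for drained conditions; the parameter values below are invented for illustration and do not describe any site in the study.

```python
# Infinite-slope factor of safety (drained, limit equilibrium):
# FoS = (c' + (gamma*z*cos^2(beta) - u) * tan(phi'))
#       / (gamma * z * sin(beta) * cos(beta))
# c' cohesion (kPa), phi' friction angle, gamma unit weight (kN/m^3),
# z slip-surface depth (m), beta slope angle, u pore pressure (kPa).
import math

def infinite_slope_fos(c, phi_deg, gamma, z, beta_deg, u=0.0):
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# invented example: 5 m deep slip surface on a 25 deg slope
fos = infinite_slope_fos(c=10, phi_deg=30, gamma=19, z=5, beta_deg=25)
print(round(fos, 2))  # FoS > 1.00 indicates a nominally stable slope
```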

Keywords: slope stability, finite element analysis, hazard analysis, landslides hazard

Procedia PDF Downloads 84
13910 Mathematical Modeling for Diabetes Prediction: A Neuro-Fuzzy Approach

Authors: Vijay Kr. Yadav, Nilam Rathi

Abstract:

Accurate prediction of glucose level in diabetes mellitus is required to avoid affecting the functioning of major organs of the human body. This study describes the fundamental assumptions and two different methodologies of blood glucose prediction. The first is based on the back-propagation algorithm of the Artificial Neural Network (ANN), and the second is based on the Neuro-Fuzzy technique, called the Fuzzy Inference System (FIS). Errors of the proposed methods are further discussed through various statistical measures such as the mean square error (MSE) and normalised mean absolute error (NMAE). The main objective of the present study is to develop a mathematical model for blood glucose prediction 12 hours in advance, using a data set of three patients over 60 days. Comparative studies of accuracy against other existing models are also made with the same data set.
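The two error measures named here can be sketched directly. Note that NMAE has several normalization conventions; the range-based one below is an assumption, as are the toy glucose values.

```python
# MSE and NMAE on invented glucose readings (mg/dL). NMAE is taken here
# as MAE divided by the range of the observations, one common convention.

def mse(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def nmae(actual, predicted):
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
    return mae / (max(actual) - min(actual))

actual = [110, 145, 160, 130, 120]
predicted = [115, 140, 150, 135, 118]
print(mse(actual, predicted), round(nmae(actual, predicted), 3))  # → 35.8 0.108
```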

Keywords: back-propagation, diabetes mellitus, fuzzy inference system, neuro-fuzzy

Procedia PDF Downloads 239
13909 Parameters Tuning of a PID Controller on a DC Motor Using Honey Bee and Genetic Algorithms

Authors: Saeid Jalilzadeh

Abstract:

PID controllers are widely used to control industrial plants because of their robustness and simple structure. Tuning the controller's parameters to get a desired response is difficult and time-consuming. With the development of computer technology and artificial intelligence in the automatic control field, many PID parameter tuning methods have emerged, bringing much energy to the study of PID controllers, but many advanced tuning methods do not perform as well as expected. The Honey Bee algorithm (HBA) and the genetic algorithm (GA) are extensively used for real-parameter optimization in diverse fields of study. This paper describes an application of HBA and GA to the problem of designing a PID controller whose parameters comprise the proportional constant, integral constant and derivative constant. The presence of three parameters to optimize makes the task of designing a PID controller more challenging than designing conventional P, PI, and PD controllers. The suitability of the proposed approach has been demonstrated through computer simulation using MATLAB/SIMULINK.
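The search-based tuning idea can be sketched with plain random search standing in for HBA/GA, on a discretized first-order plant. The plant, cost function, and search ranges are all invented for illustration, and MATLAB/SIMULINK is not involved.

```python
# Search-based PID tuning sketch: score (Kp, Ki, Kd) candidates by the
# integrated squared tracking error of a unit-step response on the toy
# plant dy/dt = -y + u, and keep the best. Random search stands in for
# the HBA/GA of the paper.
import random

def simulate(kp, ki, kd, steps=200, dt=0.05):
    """Closed-loop step response; returns integrated squared error."""
    y, integ, prev_err, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        err = 1.0 - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control law
        y += dt * (-y + u)                       # explicit Euler plant step
        prev_err = err
        cost += err * err * dt
    return cost

random.seed(1)
best = min(
    ((random.uniform(0, 10), random.uniform(0, 5), random.uniform(0, 1))
     for _ in range(300)),
    key=lambda g: simulate(*g),
)
print(simulate(*best) < simulate(1.0, 0.0, 0.0))  # beats an untuned P controller
```

A GA or HBA would replace the independent random draws with populations that recombine and refine the best candidates.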

Keywords: controller, GA, optimization, PID, PSO

Procedia PDF Downloads 524
13908 Optimized Deep Learning-Based Facial Emotion Recognition System

Authors: Erick C. Valverde, Wansu Lim

Abstract:

Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health illnesses (e.g., depression and stress) as well as support better human social interactions with smart technologies. The FER system involves two steps: 1) a face detection task and 2) a facial emotion recognition task. It classifies the human expression into various categories such as angry, disgust, fear, happy, sad, surprise, and neutral. This system requires intensive research to address issues with human diversity, various unique human expressions, and the variety of human facial features due to age differences. These issues generally affect the ability of the FER system to detect human emotions with high accuracy. Early FER systems used simple supervised classification algorithms like K-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy due to their inefficiency in extracting significant features of several human emotions. To increase the accuracy of FER systems, deep learning (DL)-based methods, like convolutional neural networks (CNN), have been proposed. These methods can find more complex features in the human face by means of the deeper connections within their architectures. However, the inference speed and computational cost of a DL-based FER system are often disregarded in exchange for higher accuracy. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods. Specifically, network pruning and quantization are used to lower computational costs and reduce memory usage, respectively. To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to incorporate advanced optimization techniques into the Xception model to improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the objectives of the network optimization methods used. As a result, the proposed approach can be used to create an efficient and real-time FER system.
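The two compression steps named above can be sketched on a toy weight matrix; real frameworks apply them per layer with calibration, so this is only an illustration of the arithmetic.

```python
# Magnitude pruning + symmetric linear quantization on a toy weight
# matrix (invented values, not Xception weights).

def prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)
    threshold = flat[k - 1] if k > 0 else -1.0
    return [[0.0 if abs(w) <= threshold else w for w in row] for row in weights]

def quantize(weights, bits=8):
    """Map weights to signed integers of width `bits` (127 levels per sign)."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for row in weights for w in row) / qmax
    return [[round(w / scale) for w in row] for row in weights], scale

w = [[0.9, -0.02, 0.46], [-0.01, -0.8, 0.03], [0.05, 0.6, -0.04]]
pruned = prune(w, sparsity=0.5)   # sparsified: fewer multiply-accumulates
q, scale = quantize(pruned)       # int8 storage: less memory per weight
print(pruned)
print(q)
```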

Keywords: deep learning, face detection, facial emotion recognition, network optimization methods

Procedia PDF Downloads 103
13907 Sensory Gap Analysis on Port Wine Promotion and Perceptions

Authors: José Manue Carvalho Vieira, Mariana Magalhães, Elizabeth Serra

Abstract:

The Port Wine industry is essential to Portugal because it carries a tangible cultural heritage and matters for social and economic reasons. Positioned as a luxury product, brands need to pay more attention to the new generation's habits, preferences, languages, and sensory perceptions. Healthy lifestyles, anti-alcohol campaigns, and the digitalisation of the buying decision process need to be better understood in order to understand the wine market of the future. The purpose of this study is to clarify the sensory perception gap between Port Wine promotional descriptors and the new generation's perceptions, to help wineries align their strategies. Based on an interpretivist approach - multiple methods and techniques (mixed methods), different world views, different assumptions, and different data collection and analysis methods - this research integrated qualitative semi-structured interviews, Port Wine promotion contents, and social media perceptions mined by the Sentiment Analysis Enginius algorithm. Findings confirm that Port Wine CEOs' strategies, brands' promotional content, and social perceptions are not sufficiently aligned. The central insight for Port Wine brand managers is that long and continuous work is needed to understand and associate their descriptors with the most relevant perceptual values and criteria of their targets, in order to reposition (when necessary) and sustainably revitalise their brands. Finally, this study hypothesised a sensory gap that leads to a decrease in consumption, and it sought recommendations on how to transform that gap into an advantage for better attracting the young age group (18-25).

Keywords: port wine, consumer habits, sensory gap analysis, wine marketing

Procedia PDF Downloads 224