Search results for: comprehensive feature extraction
4757 Lexical Semantic Analysis to Support Ontology Modeling of Maintenance Activities – Case Study of Offshore Riser Integrity
Authors: Vahid Ebrahimipour
Abstract:
Word representation and context meaning of text-based documents play an essential role in knowledge modeling. Business procedures written in natural language are meant to store technical and engineering information, management decisions, and operation experience during the production system life cycle. Context meaning representation is highly dependent upon word sense, lexical relativity, and semantic features of the argument. This paper proposes a method for lexical semantic analysis and context meaning representation of maintenance activity in a mass production system. Our approach constructs a straightforward lexical semantic analysis of the semantic and syntactic features of the context structure of maintenance reports to facilitate the translation, interpretation, and conversion of human-readable text into a computer-readable representation with less heterogeneity and ambiguity. The methodology will enable users to obtain a representation format that maximizes shareability and accessibility for multi-purpose usage. It provides a contextualized structure to obtain a generic context model that can be utilized during the system life cycle. At first, it employs a co-occurrence-based clustering framework to recognize a group of highly frequent contextual features that correspond to a maintenance report text. Then the keywords are identified for syntactic and semantic extraction analysis. The analysis exercises causality-driven logic of keywords' senses to divulge the structural and meaning dependency relationships between the words in a context. The output is a word contextualized representation of maintenance activity accommodating computer-based representation and inference using OWL/RDF.
Keywords: lexical semantic analysis, metadata modeling, contextual meaning extraction, ontology modeling, knowledge representation
Procedia PDF Downloads 105
4756 A Scientific Method of Drug Development Based on Ayurvedic Bhaishajya Knowledge
Authors: Rajesh S. Mony, Vaidyaratnam Oushadhasala
Abstract:
An attempt is made in this study to evolve a drug development modality based on the classical Ayurvedic knowledge base as well as on modern scientific methodology. The present study involves (a) identification of a specific ailment condition, (b) the selection of a polyherbal formulation, (c) deciding a suitable extraction procedure, (d) confirming the efficacy of the combination by in-vitro trials, and (e) fixing up the recommended dose. The ailment segment selected is the arthritic condition. The selected herbal combination is Kunturushka, Vibhitaki, Guggulu, Haridra, Maricha and Nirgundi. The herbs were selected as per classical Ayurvedic references and authenticated as per the API (Ayurvedic Pharmacopoeia of India). Extraction of each drug was done with hydroalcoholic menstruums in different ratios. Each extract, after removal of residual solvent, was assessed in-vitro for anti-inflammatory and anti-arthritic activities as well as for COX enzyme inhibition (by UV-Vis spectrophotometer with positive control). The extracts showing good in-vitro activity were selected, and QC testing of each selected extract, including HPTLC, established the in-process QC specifications. The single dose of the mixture of selected extracts was decided as per the level of in-vitro activity and the available toxicology data. Quantification of major groups like phenolics, flavonoids, alkaloids, and bitters was done with both standard spectrophotometric and gravimetric methods. A method for marker assay was developed and validated by HPTLC, and a well-resolved HPTLC fingerprint was developed for the single-dosage API (Active Pharmaceutical Ingredient, the mixture of extracts). Three batches were prepared to fix the in-process and API QC specifications.
Keywords: drug development, anti-inflammatory, quality standardisation, planar chromatography
Procedia PDF Downloads 99
4755 Methodology for the Determination of Triterpenic Compounds in Apple Extracts
Authors: Mindaugas Liaudanskas, Darius Kviklys, Kristina Zymonė, Raimondas Raudonis, Jonas Viškelis, Norbertas Uselis, Pranas Viškelis, Valdimaras Janulis
Abstract:
Apples are among the most commonly consumed fruits in the world. Based on data from the year 2014, approximately 84.63 million tons of apples are grown per annum. Apples are widely used in the food industry to produce various products and drinks (juice, wine, and cider); they are also used unprocessed. Apples in the human diet are an important source of different groups of biologically active compounds that can positively contribute to the prevention of various diseases. They are a source of various biologically active substances – especially vitamins, organic acids, micro- and macro-elements, pectins, and phenolic, triterpenic, and other compounds. Triterpenic compounds, which are characterized by versatile biological activity, are the biologically active compounds found in apples that are among the most promising and most significant for human health. A specific analytical procedure including sample preparation and High Performance Liquid Chromatography (HPLC) analysis was developed, optimized, and validated for the detection of triterpenic compounds in samples of whole apples, their peels, and flesh from the widespread apple cultivars 'Aldas', 'Auksis', 'Connel Red', 'Ligol', 'Lodel', and 'Rajka' grown in Lithuanian climatic conditions. The conditions for triterpenic compound extraction were optimized: the solvent of the extraction was 100% (v/v) acetone, and the extraction was performed in an ultrasound bath for 10 min. Isocratic elution (the eluent ratio being 88% (solvent A) and 12% (solvent B)) was performed for a rapid separation of triterpenic compounds. The validation of the methodology was performed on the basis of the ICH recommendations. The following validation characteristics were evaluated: the selectivity of the method (specificity), precision, the detection and quantitation limits of the analytes, and linearity. The obtained parameter values confirm the suitability of the methodology for the analysis of triterpenic compounds. Using the optimised and validated HPLC technique, four triterpenic compounds were separated and identified, and their specificity was confirmed. These compounds were corosolic acid, betulinic acid, oleanolic acid, and ursolic acid. Ursolic acid was the dominant compound in all the tested apple samples. The detected amount of betulinic acid was the lowest of all the identified triterpenic compounds. The greatest amounts of triterpenic compounds were detected in whole apple and apple peel samples of the 'Lodel' cultivar, and thus apples and apple extracts of this cultivar are potentially valuable for use in medical practice, for the prevention of various diseases, for adjunct therapy, for the isolation of individual compounds with a specific biological effect, and for the development and production of dietary supplements and functional food enriched in biologically active compounds. Acknowledgements: This work was supported by a grant from the Research Council of Lithuania, project No. MIP-17-8.
Keywords: apples, HPLC, triterpenic compounds, validation
Procedia PDF Downloads 173
4754 Technology Road Mapping in the Fourth Industrial Revolution: A Comprehensive Analysis and Strategic Framework
Authors: Abdul Rahman Hamdan
Abstract:
The Fourth Industrial Revolution (4IR) has brought unprecedented technological advancements that have disrupted many industries worldwide. To keep up with the rapid disruption brought forth by the 4IR, technology road mapping has emerged as one of the critical tools for organizations to leverage. Technology road mapping can guide companies to become more adaptable, anticipate future transformation and innovation, and avoid becoming redundant or irrelevant due to the rapid changes in technological advancement. This research paper provides a comprehensive analysis of technology road mapping within the context of the 4IR. The objectives of the paper are to provide companies with practical insights and a strategic framework of technology road mapping for them to navigate the fast-changing nature of the 4IR. This study also contributes to the understanding and practice of technology road mapping in the 4IR and, at the same time, provides organizations with the necessary tools and critical insight to navigate the 4IR transformation by leveraging technology road mapping. Based on the literature review and case studies, the study analyses key principles, methodologies, and best practices in technology road mapping and integrates them with the unique characteristics and challenges of the 4IR. The research paper gives the background of the Fourth Industrial Revolution. It explores the disruptive potential of technologies in the 4IR and the critical need for technology road mapping, consisting of strategic planning and foresight, to remain competitive and relevant in the 4IR era. It also highlights the importance of technology road mapping as an organisation's proactive approach to aligning its objectives and resources with its technology and product development in meeting the fast-evolving technological 4IR landscape. The paper also includes the theoretical foundations of technology road mapping, examines various methodological approaches, and identifies participants in the process, such as external experts, stakeholders, collaborative platforms, and cross-functional teams, to ensure an integrated and robust technology roadmap for the organisation. Moreover, this study presents a comprehensive framework for technology road mapping in the 4IR by incorporating key elements and processes such as technology assessment, competitive intelligence, risk analysis, and resource allocation. It provides a framework for implementing technology road mapping from strategic planning, goal setting, and technology scanning to road mapping visualisation, implementation planning, monitoring, and evaluation. In addition, the study addresses the challenges and limitations related to technology road mapping in the 4IR, including the gap analysis. In conclusion, the study proposes a set of practical recommendations for organizations that intend to leverage technology road mapping as a strategic tool in the 4IR to drive innovation and become competitive in the current and future ecosystem.
Keywords: technology management, technology road mapping, technology transfer, technology planning
Procedia PDF Downloads 69
4753 Thermodynamic Evaluation of Coupling APR-1400 with a Thermal Desalination Plant
Authors: M. Gomaa Abdoelatef, Robert M. Field, Yong-Kwan Lee
Abstract:
Growing human populations have placed increased demands on water supplies and heightened interest in desalination infrastructure. Key elements of the economics of desalination projects are the thermal and electrical inputs. With growing concerns over the use of fossil fuels to (indirectly) supply these inputs, coupling of desalination with nuclear power production represents a significant opportunity. Individually, nuclear and desalination technologies have a long history and are relatively mature. For desalination, Reverse Osmosis (RO) has the lowest energy inputs. However, the economically driven output quality of the water produced using RO, which uses only electrical inputs, is lower than the output water quality from thermal desalination plants. Therefore, modern desalination projects consider that RO should be coupled with thermal desalination technologies (MSF, MED, or MED-TVC) with attendant steam inputs to permit blending to produce various qualities of water. A large nuclear facility is well positioned to dispatch large quantities of both electrical and thermal power. This paper considers the supply of thermal energy to a large desalination facility to examine the heat balance impact on the nuclear steam cycle. The APR-1400 nuclear plant is selected as prototypical from both a capacity and turbine cycle heat balance perspective to examine steam supply and the impact on electrical output. Extraction points and quantities of steam are considered parametrically along with various types of thermal desalination technologies to form the basis for further evaluations of economically optimal approaches to the interface of nuclear power production with desalination projects. In our study, the thermodynamic evaluation is executed with DE-TOP, the IAEA desalination program, which is capable of analyzing power generation systems coupled to desalination systems through various steam extraction positions, taking into consideration the isolation loop between the APR-1400 and the thermal desalination plant for safety concerns.
Keywords: APR-1400, desalination, DE-TOP, IAEA, MSF, MED, MED-TVC, RO
Procedia PDF Downloads 532
4752 Thorium Extraction with Cyanex272 Coated Magnetic Nanoparticles
Authors: Afshin Shahbazi, Hadi Shadi Naghadeh, Ahmad Khodadadi Darban
Abstract:
In the Magnetically Assisted Chemical Separation (MACS) process, tiny ferromagnetic particles coated with a solvent extractant are used to selectively separate radionuclides and hazardous metals from aqueous waste streams. The contaminant-loaded particles are then recovered from the waste solutions using a magnetic field. In the present study, Cyanex272 or C272 (bis (2,4,4-trimethylpentyl) phosphinic acid) coated magnetic particles are being evaluated for possible application in the extraction of thorium (IV) from nuclear waste streams. The uptake behaviour of Th(IV) from nitric acid solutions was investigated in a batch system, including adsorption isotherm and adsorption kinetic studies of Th(IV) onto the Cyanex272-coated nanoparticles. The factors influencing Th(IV) adsorption were investigated and described in detail as a function of parameters such as the initial pH value, contact time, adsorbent mass, and initial Th(IV) concentration. The MACS process adsorbent showed the best results for the fast adsorption of Th(IV) from aqueous solution at an aqueous phase acidity value of 0.5 molar. In addition, more than 80% of Th(IV) was removed within the first 2 hours, and the time required to achieve adsorption equilibrium was only 140 minutes. The Langmuir and Freundlich adsorption models were used for the mathematical description of the adsorption equilibrium. Equilibrium data agreed very well with the Langmuir model, with a maximum adsorption capacity of 48 mg g⁻¹. Adsorption kinetics data were tested using pseudo-first-order, pseudo-second-order and intra-particle diffusion models. Kinetic studies showed that the adsorption followed a pseudo-second-order kinetic model, indicating that chemical adsorption was the rate-limiting step.
Keywords: thorium (IV) adsorption, MACS process, magnetic nanoparticles, Cyanex272
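As an illustration of the equilibrium modelling described in this abstract, the following is a minimal sketch (not the authors' code; the data points are invented placeholders) of fitting the Langmuir isotherm, q = q_max * K * C / (1 + K * C), to batch adsorption data with SciPy:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, k_l):
    """Langmuir isotherm: q = q_max * K * C / (1 + K * C)."""
    return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

# Placeholder equilibrium data: concentrations (mg/L) and uptakes (mg/g).
c_eq = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
q_eq = np.array([12.0, 20.0, 30.0, 38.0, 44.0, 47.0])

# Nonlinear least-squares fit; p0 is an assumed starting guess.
params, _ = curve_fit(langmuir, c_eq, q_eq, p0=[50.0, 0.05])
q_max, k_l = params
print(f"q_max = {q_max:.1f} mg/g, K_L = {k_l:.3f} L/mg")
```

The same `curve_fit` pattern extends directly to the Freundlich model or to the pseudo-second-order kinetic equation, which is how the competing models in the abstract would be compared.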
Procedia PDF Downloads 339
4751 An Approach to Solving Some Inverse Problems for Parabolic Equations
Authors: Bolatbek Rysbaiuly, Aliya S. Azhibekova
Abstract:
Problems concerning the interpretation of well testing results belong to the class of inverse problems of subsurface hydromechanics. The distinctive feature of such problems is that the available additional information depends on the capabilities of oilfield experiments. Another factor that should not be overlooked is the existence of errors in the test data. To determine reservoir properties, some inverse problems for parabolic equations were investigated. An approach to solving the inverse problems based on the method of regularization is proposed.
Keywords: iterative approach, inverse problem, parabolic equation, reservoir properties
Procedia PDF Downloads 428
4750 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes
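As a rough illustration of the workflow this abstract describes, the sketch below trains and cross-validates the listed regressors with scikit-learn and XGBoost. The file name, column names, and hyperparameters are assumptions, not details from the study:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeRegressor
from xgboost import XGBRegressor

# Assumed schema: one row per task, with scope-change attributes and a cost-impact target.
df = pd.read_csv("project_tasks.csv")  # hypothetical file
features = ["productivity_rate", "estimated_cost", "actual_cost", "duration",
            "task_dependencies", "scope_change_magnitude", "scope_change_timing"]
X, y = df[features], df["cost_impact"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "linear": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "tree": DecisionTreeRegressor(max_depth=6),
    "forest": RandomForestRegressor(n_estimators=300),
    "gbm": GradientBoostingRegressor(),
    "xgb": XGBRegressor(n_estimators=300, learning_rate=0.05),
}
for name, model in models.items():
    # 5-fold cross-validation on the training set to check generalization.
    cv_r2 = cross_val_score(model, X_train, y_train, cv=5, scoring="r2").mean()
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: CV R2={cv_r2:.3f}, "
          f"test MSE={mean_squared_error(y_test, pred):.1f}, "
          f"test R2={r2_score(y_test, pred):.3f}")
```

A second target column (e.g., schedule impact) would be handled the same way, and the fitted tree-based models expose `feature_importances_` for the importance analysis mentioned above.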
Procedia PDF Downloads 41
4749 Deep Learning for Image Correction in Sparse-View Computed Tomography
Authors: Shubham Gogri, Lucia Florescu
Abstract:
Medical diagnosis and radiotherapy treatment planning using Computed Tomography (CT) rely on the quantitative accuracy and quality of the CT images. At the same time, requirements for CT imaging include reducing the radiation dose exposure to patients and minimizing scanning time. A solution to this is the sparse-view CT technique, based on a reduced number of projection views. This, however, introduces a new problem: the incomplete projection data results in lower quality of the reconstructed images. To tackle this issue, deep learning methods have been applied to enhance the quality of the sparse-view CT images. A first approach involved employing Mir-Net, a dedicated deep neural network designed for image enhancement. This showed promise, utilizing an intricate architecture comprising encoder and decoder networks, along with the incorporation of the Charbonnier Loss. However, this approach was computationally demanding. Subsequently, a specialized Generative Adversarial Network (GAN) architecture, rooted in the Pix2Pix framework, was implemented. This GAN framework involves a U-Net-based Generator and a Discriminator based on Convolutional Neural Networks. To bolster the GAN's performance, both Charbonnier and Wasserstein loss functions were introduced, collectively focusing on capturing minute details while ensuring training stability. The integration of the perceptual loss, calculated based on feature vectors extracted from the VGG16 network pretrained on the ImageNet dataset, further enhanced the network's ability to synthesize relevant images. A series of comprehensive experiments with clinical CT data were conducted, exploring various GAN loss functions, including Wasserstein, Charbonnier, and perceptual loss. The outcomes demonstrated significant image quality improvements, confirmed through pertinent metrics such as Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) between the corrected images and the ground truth. Furthermore, learning curves and qualitative comparisons added evidence of the enhanced image quality and the network's increased stability, while preserving pixel value intensity. The experiments underscored the potential of deep learning frameworks in enhancing the visual interpretation of CT scans, achieving outcomes with SSIM values close to one and PSNR values reaching up to 76.
Keywords: generative adversarial networks, sparse view computed tomography, CT image correction, Mir-Net
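For illustration, here is a minimal PyTorch sketch of a combined generator loss of the kind described above (Charbonnier, VGG16 perceptual, and Wasserstein-style adversarial terms). It is not the authors' implementation; the loss weights and the VGG layer cutoff are assumptions:

```python
import torch
import torch.nn as nn
from torchvision.models import VGG16_Weights, vgg16  # torchvision >= 0.13 API

class CharbonnierLoss(nn.Module):
    """Smooth L1-like loss: mean of sqrt((x - y)^2 + eps^2)."""
    def __init__(self, eps: float = 1e-3):
        super().__init__()
        self.eps = eps

    def forward(self, pred, target):
        return torch.sqrt((pred - target) ** 2 + self.eps ** 2).mean()

class PerceptualLoss(nn.Module):
    """L1 distance between VGG16 feature maps (ImageNet-pretrained, frozen)."""
    def __init__(self, layers: int = 16):  # cutoff layer is an assumption
        super().__init__()
        self.features = vgg16(weights=VGG16_Weights.IMAGENET1K_V1).features[:layers].eval()
        for p in self.features.parameters():
            p.requires_grad_(False)

    def forward(self, pred, target):
        # VGG expects 3-channel input; replicate single-channel CT slices.
        pred3, target3 = pred.repeat(1, 3, 1, 1), target.repeat(1, 3, 1, 1)
        return nn.functional.l1_loss(self.features(pred3), self.features(target3))

charbonnier, perceptual = CharbonnierLoss(), PerceptualLoss()

def generator_loss(fake_img, real_img, disc_score_fake,
                   w_charb=1.0, w_perc=0.1, w_adv=0.01):  # assumed weights
    adv = -disc_score_fake.mean()  # Wasserstein-style adversarial term
    return (w_charb * charbonnier(fake_img, real_img)
            + w_perc * perceptual(fake_img, real_img)
            + w_adv * adv)
```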
Procedia PDF Downloads 162
4748 The Role of Leadership in Enhancing Health Information Systems to Improve Patient Outcomes in China
Authors: Nisar Ahmad, Xuyi, Ali Akbar
Abstract:
As healthcare systems worldwide strive for improvement, the integration of advanced health information systems (HIS) has emerged as a pivotal strategy. This study aims to investigate the critical role of leadership in the implementation and enhancement of HIS in Chinese hospitals and how such leadership can drive improvements in patient outcomes and overall healthcare satisfaction. We propose a comprehensive study to be conducted across various hospitals in China, targeting healthcare professionals as the primary population. The research will leverage established theories of transformational leadership and technology acceptance to underpin the analysis. In our approach, data will be meticulously gathered through surveys and interviews, focusing on the experiences and perceptions of healthcare professionals regarding HIS implementation and its impact on patient care. The study will utilize SPSS and SmartPLS software for robust data analysis, ensuring precise and comprehensive insights into the correlation between leadership effectiveness and HIS success. We hypothesize that strong, visionary leadership is essential for the successful adoption and optimization of HIS, leading to enhanced patient outcomes and increased satisfaction with healthcare services. By applying advanced statistical methods, we aim to identify key leadership traits and practices that significantly contribute to these improvements. Our research will provide actionable insights for policymakers and healthcare administrators in China, offering evidence-based recommendations to foster leadership that champions HIS and drives continuous improvement in healthcare delivery. This study will contribute to the global discourse on health information systems, emphasizing the future role of leadership in transforming healthcare environments and outcomes.
Keywords: health information systems, leadership, patient outcomes, healthcare satisfaction
Procedia PDF Downloads 36
4747 Multiscale Hub: An Open-Source Framework for Practical Atomistic-To-Continuum Coupling
Authors: Masoud Safdari, Jacob Fish
Abstract:
Despite the vast amount of existing theoretical knowledge, the implementation of a universal multiscale modeling, analysis, and simulation software framework remains challenging. Existing multiscale software and solutions are often domain-specific, closed-source, and mandate a high level of experience and skills in both multiscale analysis and programming. Furthermore, tools currently existing for Atomistic-to-Continuum (AtC) multiscaling are developed with assumptions such as the users' access to high-performance computing facilities. These and many other challenges have reduced the adoption of multiscale methods in academia and especially in industry. In the current work, we introduce Multiscale Hub (MsHub), an effort towards making AtC more accessible through cloud services. As a joint effort between academia and industry, MsHub provides a universal web-enabled framework for practical multiscaling. Developed on top of the universally acclaimed scientific programming language Python, the package currently provides an open-source, comprehensive, easy-to-use framework for AtC coupling. MsHub offers an easy-to-use interface to prominent molecular dynamics and multiphysics continuum mechanics packages such as LAMMPS and MFEM (a free, lightweight, scalable C++ library for finite element methods). In this work, we first report on the design philosophy of MsHub and the challenges and issues faced regarding its implementation. MsHub takes advantage of a comprehensive set of tools and algorithms developed for AtC that can be used for a variety of governing physics. We then briefly report key AtC algorithms implemented in MsHub. Finally, we conclude with a few examples illustrating the capabilities of the package and its future directions.
Keywords: atomistic, continuum, coupling, multiscale
Procedia PDF Downloads 177
4746 Exploring the Applications of Neural Networks in the Adaptive Learning Environment
Authors: Baladitya Swaika, Rahul Khatry
Abstract:
Computer Adaptive Tests (CATs) are one of the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), which performs item selection and ability estimation using the statistical methods of maximum information selection (or selection from the posterior) and maximum-likelihood (ML) or maximum a posteriori (MAP) estimation, respectively. This study aims at combining both classical and Bayesian approaches to IRT to create a dataset which is then fed to a neural network that automates the process of ability estimation, and at comparing it to traditional CAT models designed using IRT. This study uses Python as the base coding language, pymc for statistical modelling of the IRT, and scikit-learn for the neural network implementations. On creation of the model and on comparison, it is found that the neural-network-based model performs 7-10% worse than the IRT model for score estimations. Although performing poorly compared to the IRT model, the neural network model can be beneficially used in back-ends for reducing time complexity, as the IRT model would have to re-calculate the ability every time it gets a request, whereas the prediction from a neural network could be done in a single step for an existing trained regressor. This study also proposes a new kind of framework whereby the neural network model could be used to incorporate feature sets other than the normal IRT feature set and use a neural network's capacity for learning unknown functions to give rise to better CAT models. Categorical features like test type, etc., could be learnt and incorporated in IRT functions with the help of techniques like logistic regression and can be used to learn functions that may not be trivial to express via equations. This kind of framework, when implemented, would be highly advantageous in psychometrics and cognitive assessments. This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by being able to incorporate newer and better datasets, which would eventually lead to higher quality testing.
Keywords: computer adaptive tests, item response theory, machine learning, neural networks
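As a rough sketch of the pipeline this abstract outlines (simulate IRT responses, estimate abilities, then train a neural regressor to mimic the estimator in a single forward pass), the following Python example uses SciPy and scikit-learn rather than pymc for brevity; all parameter values are assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_items, n_examinees = 30, 2000
a = rng.uniform(0.5, 2.0, n_items)      # discrimination parameters
b = rng.normal(0.0, 1.0, n_items)       # difficulty parameters
theta = rng.normal(0.0, 1.0, n_examinees)

def p_correct(th, a, b):
    """2PL IRT model: probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (th - b)))

responses = (rng.random((n_examinees, n_items))
             < p_correct(theta[:, None], a, b)).astype(float)

def ml_ability(resp):
    """Classical ML ability estimate for one response vector."""
    def nll(th):
        p = np.clip(p_correct(th, a, b), 1e-9, 1 - 1e-9)
        return -np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))
    return minimize_scalar(nll, bounds=(-4, 4), method="bounded").x

theta_ml = np.array([ml_ability(r) for r in responses])

# Train a regressor to map raw response patterns directly to ability estimates,
# replacing the per-request likelihood optimization with one forward pass.
net = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500).fit(responses, theta_ml)
print("train R^2:", net.score(responses, theta_ml))
```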
Procedia PDF Downloads 175
4745 Development of Programmed Cell Death Protein 1 Pathway-Associated Prognostic Biomarkers for Bladder Cancer Using Transcriptomic Databases
Authors: Shu-Pin Huang, Pai-Chi Teng, Hao-Han Chang, Chia-Hsin Liu, Yung-Lun Lin, Shu-Chi Wang, Hsin-Chih Yeh, Chih-Pin Chuu, Jiun-Hung Geng, Li-Hsin Chang, Wei-Chung Cheng, Chia-Yang Li
Abstract:
The emergence of immune checkpoint inhibitors (ICIs) targeting proteins like PD-1 and PD-L1 has changed the treatment paradigm of bladder cancer. However, not all patients benefit from ICIs, with some experiencing early death. There is a significant need for biomarkers associated with the PD-1 pathway in bladder cancer. Current biomarkers focus on tumor PD-L1 expression, but a more comprehensive understanding of PD-1-related biology is needed. Our study has developed a seven-gene risk score panel, employing a comprehensive bioinformatics strategy, which could serve as a potential prognostic and predictive biomarker for bladder cancer. This panel incorporates the FYN, GRAP2, TRIB3, MAP3K8, AKT3, CD274, and CD80 genes. Additionally, we examined the relationship between this panel and immune cell function, utilizing validated tools such as ESTIMATE, TIDE, and CIBERSORT. Our seven-gene panel has been found to be significantly associated with bladder cancer survival in two independent cohorts. The panel was also significantly correlated with tumor-infiltrating lymphocytes, immune scores, and tumor purity. These factors have been previously reported to have clinical implications for ICIs. The findings suggest the potential of a PD-1 pathway-based transcriptomic panel as a prognostic and predictive biomarker in bladder cancer, which could help optimize treatment strategies and improve patient outcomes.
Keywords: bladder cancer, programmed cell death protein 1, prognostic biomarker, immune checkpoint inhibitors, predictive biomarker
Procedia PDF Downloads 78
4744 Economic Impact and Benefits of Integrating Augmented Reality Technology in the Healthcare Industry: A Systematic Review
Authors: Brenda Thean I. Lim, Safurah Jaafar
Abstract:
Augmented reality (AR) in the healthcare industry has been gaining popularity in recent years, principally in the areas of medical education, patient care, and digital health solutions. One of the drivers in deciding to invest in AR technology is the potential economic benefits it could bring for patients and healthcare providers, including the pharmaceutical and medical technology sectors. The literature has shown that the benefits and impact of AR technologies have left trails of achievements in improving medical education and patient health outcomes. However, little has been published on the economic impact of AR in healthcare, a very resource-intensive industry. This systematic review was performed on studies focused on the benefits and impact of AR in healthcare to appraise whether they meet the established quality criteria and to identify relevant publications for an in-depth analysis of the economic impact assessment. The literature search was conducted using multiple databases such as PubMed, Cochrane, Science Direct, and Nature. Inclusion criteria included research papers on AR implementation in healthcare, from education to diagnosis and treatment. Only papers written in the English language were selected. Studies on AR prototypes were excluded. Although many articles have addressed the benefits of AR in the healthcare industry in the areas of medical education, treatment and diagnosis, and dental medicine, very few publications identified the specific economic impact of the technology within the healthcare industry. There were 13 publications included in the analysis based on the inclusion criteria. Out of the 13 studies, none comprised a systematically comprehensive cost impact evaluation. An outline of the cost-effectiveness and cost-benefit framework was made based on an AR article from another industry as a reference. This systematic review found that while AR technology is advancing rapidly and industries are starting to adopt it into their respective sectors, the technology and its advancements in healthcare were still in their early stages. There is still plenty of room for further advancement and integration of AR into different sectors within the healthcare industry. Future studies will require more comprehensive economic analyses and costing evaluations to enable economic decisions for or against implementing AR technology in healthcare. This systematic review concluded that the current literature lacks detailed examination and conduct of economic impact and benefit analyses. A recommendation for future research would be to include details of the initial investment and operational costs for the AR infrastructure in healthcare settings while comparing the intervention to its conventional counterparts or alternatives, so as to provide a comprehensive comparison of impact, benefit, and cost differences.
Keywords: augmented reality, benefit, economic impact, healthcare, patient care
Procedia PDF Downloads 207
4743 Exploring the Underlying Factors of Student Dropout in Makawanpur Multiple Campus: A Comprehensive Analysis
Authors: Uttam Aryal, Shekhar Thapaliya
Abstract:
This research paper presents a comprehensive analysis of the factors contributing to student dropout at Makawanpur Multiple Campus, utilizing primary data collected directly from dropped-out as well as regular students and academic staff. Employing a mixed-method approach combining qualitative and quantitative methods, this study delves into the complicated issue of student dropout. Data collection methods included surveys, interviews, and a thorough examination of academic records covering multiple academic years. The study focused on students who left their programs prematurely, as well as current students and academic staff, providing a well-rounded perspective on the issue. The analysis reveals a nuanced understanding of the factors influencing student dropout, encompassing both academic and non-academic dimensions. These factors include academic challenges, personal choices, socioeconomic barriers, peer influences, and institution-related issues. Importantly, the study highlights the most influential factors for dropout, such as the pursuit of education abroad, financial constraints, and employment opportunities, shedding light on the complex web of circumstances that lead students to discontinue their education. The insights derived from this study offer actionable recommendations for campus administrators, policymakers, and educators to develop targeted interventions aimed at reducing dropout rates and improving student retention. The study underscores the importance of addressing the diverse needs and challenges faced by students, with the ultimate goal of fostering a supportive academic environment that encourages student success and program completion.
Keywords: drop out, students, factors, opportunities, challenges
Procedia PDF Downloads 65
4742 The Effects of Peer Education on Condom Use Intentions: A Comprehensive Sex Education Quality Improvement Project
Authors: Janell Jayamohan
Abstract:
A pilot project based on the Theory of Planned Behavior was completed at a single-sex female international high school in order to improve the quality of comprehensive sex education in a 12th grade classroom. The student sample is representative of a growing phenomenon of "Third Culture Kids" or global nomads; often in today's world, culture transcends any one dominant influence and blends values from multiple sources. The objective was to improve intentions of condom use during the students' first or next intercourse. A peer-education session focused on condom attitudes, social norms, and self-efficacy - central tenets of the Theory of Planned Behavior - was added to an existing curriculum in order to achieve this objective. Peer educators were given liberty in creating and executing the lesson for their homeroom, a sample of 23 senior students, with minimal intervention from faculty, the desired outcome being that the students themselves would be the best judges of what is culturally relevant and important to their peers. The school nurse and school counselor acted as faculty facilitators but did not assist in the creation or delivery of the lesson; they only checked for medical accuracy. The participating sample of students completed a pre- and post-test with validated questions assessing changes in attitudes and overall satisfaction with the peer education lesson. As this intervention took place during the Covid-19 pandemic, the peer education session was conducted in a virtual classroom environment, limiting the modes of information delivery available to the peer educators, but it is planned to be replicated in an in-person environment in subsequent cycles.
Keywords: adolescents, condoms, peer education, sex education, theory of planned behavior, third culture kids
Procedia PDF Downloads 129
4741 Characteristics and Feature Analysis of PCF Labeling among Construction Materials
Authors: Sung-mo Seo, Chang-u Chae
Abstract:
Product Carbon Footprint (PCF) Labeling has been run for more than four years by the Ministry of Environment, and there are a number of products labeled by KEITI, declaring their carbon emissions during life cycle stages. There are several categories for certifying products by the characteristics of usage; building products are applied to a building as combined components. In this paper, the current status of PCF labeling has been compared with the LCI DB in terms of data composition. Based on this comparative analysis, we suggest directions for carbon labeling development.
Keywords: carbon labeling, LCI DB, building materials, life cycle assessment
Procedia PDF Downloads 421
4740 Edible and Ecofriendly Packaging – A Trendsetter of the Modern Era – Standardization and Properties of Films and Cutleries from Food Starch
Authors: P. Raajeswari, S. M. Devatha, R. Pragatheeswari
Abstract:
Edible packaging is a new trendsetter in the era of modern packaging. Researchers and food scientists recognise edible packaging as a useful alternative or addition to conventional packaging to reduce waste and to create novel applications for improving product stability. Starch was extracted from sources that contain it abundantly, such as potato, tapioca, rice, wheat, and corn. Starch-based edible films and cutleries were developed as an alternative to conventional packages, providing a nutritional benefit when consumed along with the food. The starch-based edible films were developed from starch extracted from various raw ingredients at lab scale. The films were developed by employing a plasticiser at different concentrations of 1.5 ml and 2 ml. Glycerol was used as a plasticiser in the filmogenic solution to increase the flexibility and plasticity of the films; it reduces intra- and intermolecular forces in starch and increases the mobility of starch-based edible films. The films were tested for functional properties such as thickness, tensile strength, elongation at break, moisture permeability, moisture content, and puncture strength. The cutleries, like spoons and cups, were prepared by making a dough and rolling the starch along with water. The overall results showed that starch-based edible films absorbed less moisture, and they also contributed low moisture permeability with high tensile strength. Food colorants extracted from red onion peel, pumpkin, and red amaranth add nutritive value, colour, and attraction when incorporated in edible cutleries, and they do not influence the functional properties. The addition of a low quantity of glycerol in edible films and of colour extracted from onion peel, pumpkin, and red amaranth enhances biodegradability and provides a good quantity of nutrients when consumed. Therefore, due to its multiple advantages, food starch can serve as the best response for eco-friendly industrial products aimed at replacing single-use plastics at low cost.
Keywords: edible films, edible cutleries, plasticizer, glycerol, starch, functional property
Procedia PDF Downloads 185
4739 Implementation of a Serializer to Represent PHP Objects in the Extensible Markup Language
Authors: Lidia N. Hernández-Piña, Carlos R. Jaimez-González
Abstract:
Interoperability in distributed systems is an important feature that refers to the communication of two applications written in different programming languages. This paper presents a serializer and a de-serializer of PHP objects to and from XML, which is an independent library written in the PHP programming language. The XML generated by this serializer is independent of the programming language and can be used by other existing Web Objects in XML (WOX) serializers and de-serializers, which allow interoperability with other object-oriented programming languages.
Keywords: interoperability, PHP object serialization, PHP to XML, web objects in XML, WOX
Procedia PDF Downloads 237
4738 Investigating Software Engineering Challenges in Game Development
Authors: Fawad Zaidi
Abstract:
This paper discusses a variety of challenges and solutions involved in creating computer games and the issues faced by the software engineers working in this field. This review further investigates the articles' coverage of project scope and the problem of feature creep that appears to be inherent in game development. The paper tries to answer the following question: Is this a problem caused by a shortage, or by bad software engineering practices, or is this outside the control of the software engineering component of the game production process?
Keywords: software engineering, computer games, software applications, development
Procedia PDF Downloads 475
4737 Positioning Organisational Culture in Knowledge Management Research
Authors: Said Al Saifi
Abstract:
This paper proposes a conceptual model for understanding the impact of organisational culture on knowledge management processes and their link with organisational performance. It is suggested that organisational culture should be assessed as a multi-level construct comprising artifacts, espoused beliefs and values, and underlying assumptions. A holistic view of organisational culture and knowledge management processes, and their link with organisational performance, is presented. A comprehensive review of previous literature was undertaken in the development of the conceptual model. Taken together, the literature and the proposed model reveal possible relationships between organisational culture, knowledge management processes, and organisational performance. Potential implications of organisational culture levels for the creation, sharing, and application of knowledge are elaborated. In addition, the paper offers possible new insight into the impact of organisational culture on various knowledge management processes and their link with organisational performance. A number of possible relationships between organisational culture factors and knowledge management processes, and their link with organisational performance, were examined. The research model highlights the multi-level components of organisational culture: the artifacts, the espoused beliefs and values, and the underlying assumptions. Through a conceptualisation of the relationships between organisational culture, knowledge management processes, and organisational performance, the study provides practical guidance for practitioners during the implementation of knowledge management processes. The focus of previous research on knowledge management has been on understanding organisational culture from the limited perspective of promoting knowledge creation and sharing. This paper proposes a more comprehensive approach to understanding organisational culture in that it draws on artifacts, espoused beliefs and values, and underlying assumptions, and reveals their impact on the creation, sharing, and application of knowledge, which can affect overall organisational performance.
Keywords: knowledge application, knowledge creation, knowledge management, knowledge sharing, organisational culture, organisational performance
Procedia PDF Downloads 576
4736 Optimum Design of Support and Care Home for the Elderly
Authors: P. Shahabi
Abstract:
The increase in average human life expectancy has led to a growing elderly population. This demographic shift has brought forth various challenges related to the mental and physical well-being of the elderly, often resulting in a lack of dignity and respect for this valuable segment of society. These emerging social issues have cast a shadow on the lives of families, prompting the need for innovative solutions to enhance the lives of the elderly. In this study, within the context of architecture, we aim to create a pleasant and nurturing environment that combines traditional Iranian and modern architectural elements to cater to the unique needs of the elderly. Our primary research objectives encompass the following: recognizing the societal demand for nursing homes due to the increasing elderly population; addressing the need for a conducive environment that promotes physical and mental well-being among the elderly; and developing spatial designs that are specifically tailored to the elderly population, ensuring their comfort and convenience. To achieve these objectives, we have undertaken a comprehensive exploration of the challenges and issues faced by the elderly. We have also laid the groundwork for the architectural design of nursing homes, culminating in the presentation of an architectural plan aimed at minimizing the difficulties faced by the elderly and enhancing their quality of life. It is noteworthy that many of the existing nursing homes in Iran lack the welfare and safety conditions required for the elderly. Hence, our research aims to establish comprehensive and suitable criteria for the optimal design of nursing homes. We believe that through optimal design, we can create spaces that are not only diverse, attractive, and dynamic but also significantly improve the quality of life for the elderly. We hope that these homes will serve as beacons of hope and tranquility for all individuals in their later years.
Keywords: care home, elderly, optimum design, support
Procedia PDF Downloads 77
4735 Wet Processing of Algae for Protein and Carbohydrate Recovery as Co-Product of Algal Oil
Authors: Sahil Kumar, Rajaram Ghadge, Ramesh Bhujade
Abstract:
Historically, lipid extraction from dried algal biomass remained a focus area of algal research. It has been realized over the past few years that the lipid-centric approach and conversion technologies that require dry algal biomass have several challenges. Algal culture in cultivation systems contains more than 99% water, with algal concentrations of just a few hundred milligrams per liter (< 0.05 wt%), which makes harvesting and drying energy intensive. Drying the algal biomass followed by extraction also entails the loss of water and nutrients. In view of these challenges, focus has shifted toward developing processes that will enable oil production from wet algal biomass without drying. Hydrothermal liquefaction (HTL), an emerging technology, is a thermo-chemical conversion process that converts wet biomass to oil and gas using water as a solvent at high temperature and high pressure. HTL processes wet algal slurry containing more than 80% water and significantly reduces the adverse cost impact of drying the algal biomass. HTL is inherently feedstock-agnostic, i.e., it can also convert carbohydrates and proteins to fuels, and it recovers water and nutrients. It is most effective with low-lipid (10-30%) algal biomass, and the bio-crude yield is two to four times higher than the lipid content in the feedstock. In the early 2010s, research remained focused on increasing the oil yield by optimizing the process conditions of HTL. However, various techno-economic studies showed that simply converting algal biomass to only oil does not make economic sense, particularly in view of low crude oil prices. Making the best use of every component of algae is key for the economic viability of the algae-to-oil process. On investigation of HTL reactions at the molecular level, it has been observed that sequential HTL has the potential to recover value-added products along with biocrude and improve the overall economics of the process. This potential of sequential HTL makes it a most promising technology for converting wet waste to wealth. In this presentation, we will share our experience on the techno-economic and engineering aspects of sequential HTL for the conversion of algal biomass to algal bio-oil and co-products.
Keywords: algae, biomass, lipid, protein
Procedia PDF Downloads 214
4734 Fermented Fruit and Vegetable Discard as a Source of Feeding Ingredients and Functional Additives
Authors: Jone Ibarruri, Mikel Manso, Marta Cebrián
Abstract:
A high amount of food is lost or discarded in the world every year. In addition, in the last decades, an increasing demand for new alternative and sustainable sources of proteins and other valuable compounds has been observed in the food and feed sectors; therefore, the use of food by-products as nutrients for these purposes is very attractive from the environmental and economical point of view. However, the direct use of discarded fruit and vegetables, which present, in general, a low protein content, is not interesting as a feed ingredient except as a source of fiber for ruminants. Especially in the case of aquaculture, several alternatives to the use of fish meal and other vegetable protein sources have been extensively explored due to the scarcity of fish stocks and the unsustainability of fishing for these purposes. Fish mortality is also of great concern in this sector, as this problem highly reduces its economic feasibility. So, the development of new functional and natural ingredients that could reduce the need for vaccination is also of great interest. In this work, several fermentation tests were carried out at lab scale using a selected mixture of fruit and vegetable discards from a wholesale market located in the Basque Country to increase their protein content and also to produce bioactive extracts that could be used as additives in aquaculture. Fruit and vegetable mixtures (60/40, w/w) were centrifuged for humidity reduction and crushed to a 2-5 mm particle size. Samples were inoculated with a selected Rhizopus oryzae strain and fermented for 7 days in controlled conditions (humidity between 65 and 75% and 28ºC) in Petri plates (120 mm) in triplicate. The results obtained indicated that the final fermented product presented a twofold protein content (from 13 to 28% d.w.). The fermented product was further processed to determine its possible functionality as a feed additive. Extraction tests were carried out to obtain an ethanolic extract (60:40 ethanol:water, v/v) and a remaining biomass that could also present applications in the food or feed sectors. The extract presented a polyphenol content of about 27 mg GAE/g d.w. with an antioxidant activity of 8.4 mg TEAC/g d.w. The remaining biomass is mainly composed of fiber (51%), protein (24%), and fat (10%). The extracts also presented antibacterial activity according to the results obtained in agar diffusion and Minimum Inhibitory Concentration (MIC) tests determined against several food and fish pathogen strains. In vitro digestibility was also assessed to obtain preliminary information about the expected effect of the extraction procedure on the digestibility of the fermented product. First results indicated that the remaining biomass after extraction does not seem to improve digestibility in comparison to the initial fermented product. These preliminary results show that fermented fruit and vegetables can be a useful source of functional ingredients for aquaculture applications and a substitute for other protein sources in the feed sector. Further validation will also be carried out through "in vivo" tests with trout and bass.
Keywords: fungal solid state fermentation, protein increase, functional extracts, feed ingredients
Procedia PDF Downloads 64
4733 Ontology-Based Fault Detection and Diagnosis System: Querying and Reasoning Examples
Authors: Marko Batic, Nikola Tomasevic, Sanja Vranes
Abstract:
One of the strongholds of the ubiquitous efforts related to energy conservation and energy efficiency improvement is the retrofit of high energy consumers in buildings. In general, HVAC systems represent the highest energy consumers in buildings. However, they usually suffer from mal-operation and/or malfunction, causing even higher energy consumption than necessary. Various Fault Detection and Diagnosis (FDD) systems can be successfully employed for this purpose, especially when it comes to application at a single device/unit level. In the case of more complex systems, where multiple devices are operating in the context of the same building, significant energy efficiency improvements can only be achieved through the application of comprehensive FDD systems relying on additional higher-level knowledge, such as the devices' geographical location, served area, and their intra- and inter-system dependencies. This paper presents a comprehensive FDD system that relies on the utilization of a common knowledge repository that stores all critical information. The discussed system is deployed as a test-bed platform at the Fiumicino and Malpensa airports in Italy. This paper aims at presenting the advantages of implementing the knowledge base through the utilization of an ontology and offers improved functionalities of such a system through examples of typical queries and reasoning that enable the derivation of high-level energy conservation measures (ECM). Therefore, key SPARQL queries and SWRL rules, based on the two instantiated airport ontologies, are elaborated. The detection of high-level irregularities in the operation of airport heating/cooling plants is discussed, and an estimation of energy savings is reported.
Keywords: airport ontology, knowledge management, ontology modeling, reasoning
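As an illustration of the kind of SPARQL query the paper elaborates, the following minimal Python/rdflib sketch flags units whose measured power exceeds their rated power. The namespace, class, and property names are hypothetical stand-ins, not the actual airport ontology schema:

```python
from rdflib import Graph

g = Graph()
g.parse("airport_ontology.ttl", format="turtle")  # hypothetical ontology file

# Find HVAC units whose measured power exceeds their rated power,
# together with the building zone they serve: a typical high-level irregularity check.
query = """
PREFIX ex: <http://example.org/airport-fdd#>
SELECT ?unit ?zone ?measured ?rated
WHERE {
  ?unit a ex:HVACUnit ;
        ex:servesZone ?zone ;
        ex:measuredPower ?measured ;
        ex:ratedPower ?rated .
  FILTER (?measured > ?rated)
}
"""
for row in g.query(query):
    print(f"{row.unit} serving {row.zone}: {row.measured} kW > rated {row.rated} kW")
```

The advantage of this design is that such checks operate on shared, high-level knowledge (zones, dependencies, ratings) rather than on raw sensor streams of a single device.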
Procedia PDF Downloads 537
4732 A Comprehensive Approach to Sustainable Building Design: Bridging Design for Adaptability and Circular Economy with LCA
Authors: Saba Baienat, Ivanka Iordanova, Bechara Helal
Abstract:
Incorporating the principles of Design for Adaptability (DfAd) and Circular Economy (CE) into the service life planning of buildings and construction engineering projects can significantly enhance sustainable development. By employing DfAd, both the service life and design process can be optimized, gradually postponing the building's End of Life (EoL) and extending the service life of buildings, thereby closing material cycles and making them more circular. This paper presents a comprehensive framework that addresses adaptability strategies and considerations to objectively assess the role of DfAd in circularity. The framework aims to provide a streamlined approach for accessing DfAd strategies and identifying the most effective ones for enhancing a project's adaptability. Key strategies include anticipating changes in requirements, enabling adaptations and transformations of the building for better use and reuse, preparing for future lives of the building and its components, and contributing to the circular material life cycle. Furthermore, the framework seeks to enhance the awareness of stakeholders about the subject of Design for Adaptability through the lens of the Circular Economy. Additionally, this paper integrates Life Cycle Assessment (LCA) methodologies to evaluate the environmental impacts of implementing DfAd strategies within the context of the Circular Economy. By utilizing LCA, the framework provides a quantitative basis for assessing the sustainability benefits of adaptable building designs, offering insights into how these strategies can minimize resource consumption, reduce emissions, and enhance overall environmental performance. This holistic approach underscores the critical role of LCA in bridging DfAd and CE, ultimately fostering more resilient and sustainable construction practices.
Keywords: circular economy (CE), design for adaptability (DfAd), life cycle assessment (LCA), sustainable development
Procedia PDF Downloads 33
4731 Challenges & Barriers for Neuro Rehabilitation in Developing Countries
Authors: Muhammad Naveed Babur, Maria Liaqat
Abstract:
Background & Objective: People with disabilities, especially neurological disabilities, have many unmet health and rehabilitation needs, face barriers in accessing mainstream health-care services, and consequently have poor health. There are not sufficient epidemiological studies from Pakistan that assess barriers to neurorehabilitation and ways to counter them. Objectives: The objective of the study was to determine the challenges and to evaluate the barriers to neurorehabilitation services in developing countries. Methods: This is an exploratory sequential qualitative study based on the panel discussion forum at the International Rehabilitation Sciences Congress and the National Rehabilitation Conference 2017. The panel group discussion was conducted in February 2017 with a sample of eight professionals, including a rehabilitation medicine physician, physical therapist, speech-language therapist, occupational therapist, clinical psychologist, and rehabilitation nurse working in multidisciplinary/interdisciplinary teams. A comprehensive audio-videography was developed, recorded, transcribed, and documented. Thematic analysis of the transcripts, along with their characteristics, was performed manually, and the data were verified with the help of two separate coders. Results: After extraction by the two separate coders, the following results emerged. General-category themes are disease profile, demographic profile, training and education, research, barriers, governance, global funding, informal care, resources, and cultural beliefs and public awareness. Barriers identified at the patient level are high cost, stigma, and lengthy course of recovery. Hospital-related barriers are lack of social support and of individually tailored goal-setting processes. Organizational barriers identified are lack of basic diagnostic facilities, lack of funding, and lack of human resources. Recommendations given by the panelists were investment in education, capacity building, infrastructure, governance support, and strategies to promote communication and realistic goals. Conclusion: It is concluded that neurorehabilitation in developing countries needs attention in the following categories: disease profile, demographic profile, training and education, research, barriers, governance, global funding, informal care, resources, and cultural beliefs and public awareness. The study also revealed barriers at the patient, hospital, and organizational levels.Keywords: disability, neurorehabilitation, telerehabilitation
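The abstract reports that data were verified with the help of two separate coders but gives no agreement statistic. As an illustration of common practice in such thematic analyses (not a step reported in the study), Cohen's kappa can quantify inter-coder reliability; the coding labels below are invented:

```python
# Cohen's kappa for two coders over the same set of coded segments.
# Labels are invented for illustration; not the study's data.
from collections import Counter

coder_a = ["barriers", "governance", "resources", "barriers", "research", "barriers"]
coder_b = ["barriers", "governance", "funding",   "barriers", "research", "governance"]

def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    freq_a, freq_b = Counter(a), Counter(b)
    # Chance agreement: probability both coders pick the same label independently.
    expected = sum(freq_a[lbl] * freq_b[lbl] for lbl in set(a) | set(b)) / n**2
    return (observed - expected) / (1 - expected)

print(f"kappa = {cohens_kappa(coder_a, coder_b):.2f}")  # 0.56 here
```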
Procedia PDF Downloads 191
4730 Development of an Appropriate Method for the Determination of Multiple Mycotoxins in Pork Processing Products by UHPLC-TCFLD
Authors: Jason Gica, Yi-Hsieng Samuel Wu, Deng-Jye Yang, Yi-Chen Chen
Abstract:
Mycotoxins, harmful secondary metabolites produced by certain fungal species, pose significant risks to animals and humans worldwide. Their stability leads to contamination during grain harvesting, transportation, and storage, as well as in processed food products. The prevalence of mycotoxin contamination has attracted significant attention due to its adverse impact on food safety and global trade. Secondary contamination via animal products has been identified as an important route of exposure, posing health risks for livestock and for humans consuming contaminated products. Pork, one of the most consumed meat products in Taiwan according to the National Food Consumption Database, plays a critical role in the nation's diet and economy. Given its substantial consumption, pork processing products are a significant component of the food supply chain and a potential source of mycotoxin contamination. This study is paramount for formulating effective regulations and strategies to mitigate mycotoxin-related risks in the food supply chain. By establishing a reliable analytical method, this research contributes to safeguarding public health and enhancing the quality of pork processing products. The findings will serve as valuable guidance for policymakers, food industries, and consumers in ensuring a safer food supply chain in the face of emerging mycotoxin challenges. An innovative and efficient analytical approach is proposed using Ultra-High Performance Liquid Chromatography coupled with a Temperature Control Fluorescence Light Detector (UHPLC-TCFLD) to determine multiple mycotoxins in pork meat samples, owing to its exceptional capacity to detect multiple mycotoxins at very low concentrations, which makes it highly sensitive and reliable for comprehensive mycotoxin analysis. Additionally, its ability to detect multiple mycotoxins simultaneously in a single run significantly reduces the time and resources required for analysis, making it a cost-effective solution for monitoring mycotoxin contamination in pork processing products. The research aims to optimize an efficient QuEChERS extraction method for mycotoxins and to rigorously validate its accuracy and precision. The results will provide crucial insights into mycotoxin levels in pork processing products.Keywords: multiple-mycotoxin analysis, pork processing products, QuEChERS, UHPLC-TCFLD, validation
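The abstract promises rigorous validation of accuracy and precision without showing the arithmetic. The sketch below illustrates the standard ICH-style calculations (LOD and LOQ from calibration-curve residuals, plus spike recovery) on invented calibration data; the concentrations, responses, and units are assumptions, not the study's dataset:

```python
# Illustrative method-validation arithmetic with invented data
# (ICH-style LOD/LOQ from calibration residuals, plus spike recovery).
import numpy as np

conc = np.array([0.5, 1.0, 2.5, 5.0, 10.0])          # ng/g, assumed levels
signal = np.array([12.1, 24.3, 60.2, 121.5, 240.8])  # detector response, invented

slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope   # limit of detection
loq = 10.0 * residual_sd / slope  # limit of quantification
print(f"LOD ~ {lod:.3f} ng/g, LOQ ~ {loq:.3f} ng/g")

# Spike recovery: a spiked blank quantified against the calibration curve.
measured_spike = 118.9  # invented response for a 5.0 ng/g spike
found = (measured_spike - intercept) / slope
print(f"Recovery ~ {100 * found / 5.0:.1f} %")
```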
Procedia PDF Downloads 69
4729 Using Equipment Telemetry Data for Condition-Based Maintenance Decisions
Authors: John Q. Todd
Abstract:
Given that modern equipment can provide comprehensive health, status, and error-condition data via built-in sensors, maintenance organizations have a new and valuable source of insight to take advantage of. This presentation will show what these data payloads might look like and how they can be filtered, visualized, calculated into metrics, used for machine learning, and turned into alerts that prompt further action.Keywords: condition based maintenance, equipment data, metrics, alerts
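As a minimal sketch of the pipeline the abstract outlines (filter raw telemetry, compute a metric, raise an alert), the following uses an invented payload shape, sensor names, and threshold; none of these come from the presentation:

```python
# Minimal telemetry-to-alert pipeline (invented payload shape and threshold).
from statistics import mean

telemetry = [  # hypothetical sensor payloads from one machine
    {"sensor": "bearing_temp_C", "value": 71.2},
    {"sensor": "bearing_temp_C", "value": 74.8},
    {"sensor": "vibration_mm_s", "value": 2.1},
    {"sensor": "bearing_temp_C", "value": 82.5},
]

# Filter: keep one sensor's readings.
temps = [p["value"] for p in telemetry if p["sensor"] == "bearing_temp_C"]

# Metric: average of the retained readings.
avg_temp = mean(temps)

# Alert: threshold chosen for illustration only.
THRESHOLD_C = 75.0
if avg_temp > THRESHOLD_C:
    print(f"ALERT: mean bearing temperature {avg_temp:.1f} C exceeds {THRESHOLD_C} C")
else:
    print(f"OK: mean bearing temperature {avg_temp:.1f} C")
```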
Procedia PDF Downloads 188
4728 Survey on Awareness, Knowledge and Practices: Managing Osteoporosis among Practitioners in a Tertiary Hospital, Malaysia
Authors: P. H. Tee, S. M. Zamri, K. M. Kasim, S. K. Tiew
Abstract:
This study evaluates the management of osteoporosis in a tertiary care government hospital in Malaysia. As the number of patients admitted with osteoporotic fractures is on the rise, osteoporotic medications are an increasing financial burden on government hospitals because they account for half of the orthopedic budget and expenditure. Comprehensive knowledge among practitioners is important for detecting this preventable disease early and avoiding its serious complications. The purpose of this study is to evaluate the awareness, knowledge, and practices in managing osteoporosis among practitioners in Hospital Tengku Ampuan Rahimah (HTAR), Klang. A questionnaire from an overseas study on managing osteoporosis among primary care physicians, adapted to Malaysia’s Clinical Practice Guideline on Osteoporosis 2012 (revised 2015) and to international guidelines, was distributed to all orthopedic practitioners in HTAR Klang (including surgeons and orthopedic medical officers), endocrinologists, rheumatologists, and geriatricians. The participants were evaluated on their expertise in diagnosis, prevention, treatment decisions, and medications for osteoporosis. Collected data were subjected to descriptive and statistical analyses as appropriate. All 45 participants responded to the questionnaire. Participants scored highest on expertise in prevention, followed by diagnosis, treatment decision, and lastly medication. Most practitioners stated that self-initiated continuing professional education from articles and books was the most effective way to update their knowledge, followed by attendance at conferences on osteoporosis. This study confirms the importance of comprehensive training and education regarding osteoporosis among tertiary care physicians and surgeons, predominantly in pharmacotherapy, to deliver wholesome care for osteoporotic patients.Keywords: awareness, knowledge, osteoporosis, practices
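The abstract reports domain-level results (prevention highest, then diagnosis, treatment decision, and medication) without describing the tabulation. A hedged sketch of how per-domain questionnaire scores are typically aggregated and ranked follows; the responses and values are invented, not the study's data:

```python
# Aggregating questionnaire scores by domain (invented responses).
from statistics import mean

# Each participant's correct-answer rate per domain; values invented.
responses = [
    {"prevention": 0.90, "diagnosis": 0.80, "treatment": 0.70, "medication": 0.60},
    {"prevention": 0.85, "diagnosis": 0.75, "treatment": 0.65, "medication": 0.55},
    {"prevention": 0.95, "diagnosis": 0.85, "treatment": 0.75, "medication": 0.70},
]

domain_means = {
    domain: mean(r[domain] for r in responses)
    for domain in ["prevention", "diagnosis", "treatment", "medication"]
}

# Rank domains from strongest to weakest mean score.
for domain, score in sorted(domain_means.items(), key=lambda kv: -kv[1]):
    print(f"{domain:>10}: {score:.2f}")
```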
Procedia PDF Downloads 130