Search results for: Data quality
26619 A Human Activity Recognition System Based on Sensory Data Related to Object Usage
Authors: M. Abdullah Al-Wadud
Abstract:
Sensor-based activity recognition systems usually account for which sensors have been activated to perform an activity. The system then combines the conditional probabilities of those sensors to represent different activities and makes its decision on that basis. However, information about the sensors that are not activated may also be of great help in deciding which activity has been performed. This paper proposes an approach in which sensory data related to both the usage and non-usage of objects are utilized to classify activities. Experimental results also show the promising performance of the proposed method. Keywords: Naïve Bayesian based classification, activity recognition, sensor data, object-usage model
26618 Use of Polymeric Materials in the Architectural Preservation
Authors: F. Z. Benabid, F. Zouai, A. Douibi, D. Benachour
Abstract:
Fluorinated polymers and polyacrylics have found wide use in the field of historical monuments. PVDF offers great ease of processing, good UV resistance and good chemical inertia. Despite the good physical characteristics of PMMA and its low price relative to PVDF, its deterioration under UV radiation limits its use as a protective agent for stone. On the other hand, a PVDF/PMMA blend is a promising compromise for architectural restoration, since blending is the best method, in terms of quality and price, for making new polymeric materials with enhanced properties. Films of different compositions based on the two polymers in an adequate solvent (DMF) were prepared and subjected to artificial ageing, salt fog exposure, spectroscopic analysis (FTIR and UV) and optical analysis (refractive index). Given its great interest for the building field, a variety of standard tests was carried out for the first time at the central laboratory of ENAP (Souk-Ahras) in order to evaluate the blend's performance. The results obtained allowed the behavior of the different blend compositions to be observed under the various tests. The addition of PVDF to PMMA enhances the properties of the latter, namely resistance to natural and artificial ageing and to salt fog. Conversely, PMMA enhances the optical properties of the blend. Finally, the 70/30 blend composition agrees with the results of previous works and is the appropriate proportion for an eventual application. Keywords: blend, PVDF, PMMA, preservation, historic monuments
26617 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field
Authors: Nastaran Moosavi, Mohammad Mokhtari
Abstract:
Seismic inversion is a technique that has been in use for years; its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it combines seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock-physics properties such as P-impedance, S-impedance and density, while post-stack seismic inversion can only estimate P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs. Keywords: density, P-impedance, S-impedance, post-stack seismic inversion, pre-stack seismic inversion
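For context on how post-stack inversion recovers P-impedance, the standard normal-incidence relation between layer impedances and reflection coefficients (textbook background, not taken from this abstract) can be written as:

```latex
% Reflectivity of the interface between layers i and i+1,
% with Z_i = \rho_i v_{P,i} the P-impedance of layer i:
r_i = \frac{Z_{i+1} - Z_i}{Z_{i+1} + Z_i}
\qquad\Longrightarrow\qquad
Z_{i+1} = Z_i \,\frac{1 + r_i}{1 - r_i}
```

Given an impedance value for the first layer (e.g., from a well log), the recursion propagates impedance downward through the estimated reflectivity series.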
26616 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors
Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui
Abstract:
Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detect small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the Q and T2 statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data. Keywords: data-driven method, process control, anomaly detection, dimensionality reduction
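To make the combined T2-EWMA/Q-EWMA scheme concrete, here is a minimal sketch; the synthetic data, the three retained components, and the smoothing constant lambda = 0.2 are illustrative assumptions, not values from the paper:

```python
# Minimal sketch: PCA monitoring statistics (T2, Q) smoothed with EWMA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))    # correlated variables
X_test = X_train[:100] + np.r_[np.zeros((50, 6)),
                               0.5 * np.ones((50, 6))]           # small mean shift

pca = PCA(n_components=3).fit(X_train)

def t2_q(X):
    scores = pca.transform(X)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)     # Hotelling's T2
    resid = X - pca.inverse_transform(scores)                    # residual subspace
    q = np.sum(resid**2, axis=1)                                 # Q (SPE) statistic
    return t2, q

def ewma(stat, lam=0.2):
    z = np.zeros_like(stat)
    z[0] = stat[0]
    for i in range(1, len(stat)):            # z_i = lam * x_i + (1 - lam) * z_{i-1}
        z[i] = lam * stat[i] + (1 - lam) * z[i - 1]
    return z

t2, q = t2_q(X_test)
t2_ewma, q_ewma = ewma(t2), ewma(q)          # smoothing accumulates small shifts
```

The EWMA statistics would then be compared against control limits; a persistent small mean shift pushes the smoothed statistics over the limit even when individual T2 or Q values stay below it.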
26615 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects
Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town
Abstract:
The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to achieve better decision-making in developing mining production and maintaining safety. This paper highlights the advantages of Power BI, a powerful business intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information. Its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques for improving decision-making. Leveraging some of the most complex techniques in data science, advanced analytics is used for everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, specific visualizations required by geotechnical engineers may face limitations. This paper studies the capability to use Python or R programming within the Power BI dashboard to enable advanced analytics, additional functionalities, and customized visualizations. The dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data metrics, including spatial representation on maps, field and lab test results, and subsurface rock and soil characteristics. Advanced visualizations such as borehole logs and stereonets were implemented using Python programming within the Power BI dashboard, enhancing the understanding and communication of geotechnical information. Moreover, the dashboard's flexibility allows for the incorporation of additional data and visualizations based on the project scope and available data, such as pit design, rockfall analyses, rock mass characterization, and drone data. This further enhances the dashboard's usefulness in future projects, including operation, development, closure, and rehabilitation phases, and helps minimize the need to use multiple software programs on a project. This geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding informed decision-making and efficient project management throughout the various project stages. Its ability to generate dynamic reports and share them with clients in a collaborative manner further enhances decision-making processes and facilitates effective communication within geotechnical projects in the mining industry. Keywords: geotechnical data analysis, Power BI, visualization, decision-making, mining industry
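As a rough illustration of the Python-in-Power-BI pattern the paper relies on: a Python visual receives the selected fields as a pandas DataFrame named dataset and renders the current matplotlib figure. The column names below are hypothetical, and this simplified depth plot only gestures at a full borehole log:

```python
# Minimal sketch of a Python visual inside Power BI. Power BI injects the
# selected fields as a DataFrame called `dataset`; the columns here are
# hypothetical stand-ins for real geotechnical fields.
import matplotlib.pyplot as plt

df = dataset.dropna(subset=["depth_m", "rqd_percent"])  # hypothetical columns

fig, ax = plt.subplots(figsize=(4, 6))
ax.plot(df["rqd_percent"], df["depth_m"], marker="o", lw=0.8)
ax.invert_yaxis()                  # depth increases downward, like a borehole log
ax.set_xlabel("RQD (%)")
ax.set_ylabel("Depth (m)")
ax.set_title("Simplified borehole log")
plt.show()                         # Power BI renders the current matplotlib figure
```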
26614 Material and Parameter Analysis of the PolyJet Process for Mold Making Using Design of Experiments
Authors: A. Kampker, K. Kreisköther, C. Reinders
Abstract:
Since additive manufacturing technologies constantly advance, the use of this technology in mold making seems reasonable. Many manufacturers of additive manufacturing machines, however, do not offer any suggestions on how to parameterize the machine to achieve optimal results for mold making. The purpose of this research is to determine the interdependencies of different materials and parameters within the PolyJet process by using design of experiments (DoE), in order to additively manufacture molds, e.g. for thermoforming and injection molding applications. To that end, the general requirements of thermoforming molds, such as heat resistance, surface quality and hardness, were first identified. Then, different materials and parameters of the PolyJet process, such as the orientation of the printed part, the layer thickness, the printing mode (matte or glossy), the distance between printed parts and the scaling of parts, were examined. The multifactorial analysis covers the following properties of the printed samples: tensile strength, tensile modulus, bending strength, elongation at break, surface quality, heat deflection temperature and surface hardness. The key objective of this research is that, by joining the DoE results with the requirements of mold making, optimal and tailored molds can be additively manufactured with the PolyJet process. These additively manufactured molds can then be used in prototyping processes, in process testing and in small to medium batch production. Keywords: additive manufacturing, design of experiments, mold making, PolyJet, 3D printing
26613 Pharmacovigilance: An Empowerment in Safe Utilization of Pharmaceuticals
Authors: Pankaj Prashar, Bimlesh Kumar, Ankita Sood, Anamika Gautam
Abstract:
Pharmacovigilance (PV) has become a rapidly growing discipline in the pharmaceutical industry as an integral part of clinical research and drug development over the past few decades. PV carries a breadth of scope, from drug manufacturing to regulation and safer utilization. The fundamental steps of PV include not only data collection and verification, coding of drugs with adverse drug reactions (ADRs), causality assessment and timely reporting to the authorities, but also monitoring drug manufacturing, safety issues and product quality, and conducting due diligence. Standardization of adverse event information, collaboration of multiple departments across different companies, and preparation of documents in accordance with both governmental and non-governmental organizations (FDA, EMA, GVP, ICH) are the advances in the discipline of PV. De-harmonization, the lack of predictive drug safety models, inadequate government funding, under-reporting, and the non-acceptance in developing countries of ADR reports submitted directly by patients to the monitoring centres are the major drawbacks of PV. Mandatory pharmacovigilance reporting, frequent inspections, government funding, and educating and training medical students, pharmacists and nurses in this segment can bring about empowerment in PV. This area needs to be addressed with a sense of urgency for the safe utilization of pharmaceuticals. Keywords: pharmacovigilance, regulatory, adverse event, drug safety
26612 Nanoparticles in Drug Delivery and Therapy of Alzheimer's Disease
Authors: Nirupama Dixit, Anyaa Mittal, Neeru Sood
Abstract:
Alzheimer's disease (AD) is a progressive form of dementia, contributing up to 70% of cases; it is mostly observed in the elderly but is not restricted to old age. The pathophysiology of the disease is characterized by specific pathological changes in the brain. These changes (i.e. accumulation of metal ions in the brain, formation of extracellular β-amyloid (Aβ) peptide aggregates, and tangles of hyperphosphorylated tau protein inside neurons) damage the neuronal connections irreversibly. The current obstacles to improving the quality of life of Alzheimer's patients lie in the fact that diagnosis is made at a late stage of the disease and that the medications do not treat the underlying causes of Alzheimer's. Targeted drug delivery across the blood-brain barrier (BBB) faces several limitations via traditional approaches. To overcome these drug delivery limitations, nanoparticles provide a promising solution. This review focuses on current strategies for efficient targeted drug delivery using nanoparticles and on improving the quality of therapy provided to the patient. Nanoparticles can be used to encapsulate a drug (which is generally hydrophobic) to ensure its passage into the brain; they can be conjugated to metal-ion chelators to reduce the metal load in neural tissue, thus lowering the harmful effects of oxidative damage; and they can be conjugated with drugs and monoclonal antibodies against endogenous BBB receptors. Finally, this review covers how nanoparticles can play a role in diagnosing the disease. Keywords: Alzheimer's disease, β-amyloid plaques, blood brain barrier, metal chelators, nanoparticles
26611 An Investigation of E-Government by Using GIS and Establishing E-Government in Developing Countries Case Study: Iraq
Authors: Ahmed M. Jamel
Abstract:
Electronic government initiatives and public participation in them are among today's development indicators for countries. After two consequent wars, Iraq's current position in, for example, the UN's e-government ranking is quite concerning and has not improved in recent years either. In preparing this work, we were motivated by the fact that handling geographic data on public facilities and resources is needed in most e-government projects. Geographical information systems (GIS) provide the most common tools not only to manage spatial data but also to integrate such data with non-spatial attributes of the features. With this background, this paper proposes that establishing a working GIS in the health sector of Iraq would improve e-government applications. As the case study, investigating hospital locations in Erbil is chosen. Keywords: e-government, GIS, Iraq, Erbil
26610 Evaluation of Classification Algorithms for Diagnosis of Asthma in Iranian Patients
Authors: Taha SamadSoltani, Peyman Rezaei Hachesu, Marjan GhaziSaeedi, Maryam Zolnoori
Abstract:
Introduction: Data mining is defined as a process of finding patterns and relationships in data in a database in order to build predictive models. Applications of data mining have extended into vast sectors such as healthcare services. Medical data mining aims to solve real-world problems in the diagnosis and treatment of diseases. This method applies various techniques and algorithms, which have different accuracy and precision. The purpose of this study was to apply knowledge discovery and data mining techniques to the diagnosis of asthma based on patient symptoms and history. Method: Data mining includes several steps and decisions to be made by the user. It starts with developing an understanding of the scope of the application and of previous knowledge in the area, and identifying the knowledge discovery process from the stakeholders' point of view; it finishes with acting on the discovered knowledge: using it, integrating it with other systems, and documenting and reporting it. In this study, a stepwise methodology was followed to achieve a logical outcome. Results: The sensitivity, specificity and accuracy of the KNN, SVM, Naïve Bayes, NN, classification tree and CN2 algorithms, and of related similar studies, were evaluated, and ROC curves were plotted to show the performance of the system. Conclusion: The results show that asthma can be diagnosed accurately, at approximately ninety percent, based on demographic and clinical data. The study also showed that methods based on pattern discovery and data mining have a higher sensitivity than expert and knowledge-based systems. On the other hand, medical guidelines and evidence-based medicine should remain the basis of diagnostic methods; it is therefore recommended that machine learning algorithms be used in combination with knowledge-based algorithms. Keywords: asthma, data mining, classification, machine learning
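As a rough illustration of the kind of classifier comparison reported here, the sketch below trains several of the named algorithms and reports sensitivity, specificity and AUC; the dataset is a synthetic stand-in, not the Iranian patient data:

```python
# Minimal sketch of comparing classifiers on sensitivity/specificity/AUC.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score, confusion_matrix

X, y = make_classification(n_samples=600, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(probability=True),
    "Naive Bayes": GaussianNB(),
    "Classification tree": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    prob = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    pred = (prob > 0.5).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    sens, spec = tp / (tp + fn), tn / (tn + fp)   # sensitivity / specificity
    print(f"{name}: sens={sens:.2f} spec={spec:.2f} "
          f"AUC={roc_auc_score(y_te, prob):.2f}")
```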
26609 Decision Support System in Air Pollution Using Data Mining
Authors: E. Fathallahi Aghdam, V. Hosseini
Abstract:
Environmental pollution is not limited to a specific region or country; that is why sustainable development, as a necessary process for improvement, pays attention to issues such as the destruction of natural resources, degradation of biological systems, global pollution, and climate change, especially in developing countries. According to the World Health Organization, Tehran (the capital of Iran), as a developing city, is one of the most polluted cities in the world in terms of air pollution. In this study, three pollutants, including particulate matter less than 10 microns (PM10), nitrogen oxides, and sulfur dioxide, were evaluated in Tehran using data mining techniques through the CRISP approach. Data from 21 air pollution measuring stations in different areas of Tehran were collected from 1999 to 2013. The commercial software Clementine was selected for this study and used to divide Tehran into distinct clusters in terms of the mentioned pollutants. As a data mining technique, clustering is usually used as a prologue to other analyses; therefore, the similarity of the clusters was evaluated in this study by analyzing local conditions, traffic behavior, and industrial activities. In fact, the results of this research can support decision-making systems, help managers improve performance and decision making, and assist in urban studies. Keywords: data mining, clustering, air pollution, CRISP approach
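A minimal sketch of the clustering step is shown below; the study used Clementine, so this scikit-learn version with synthetic station values is only an illustration of the approach:

```python
# Minimal sketch of clustering pollution-monitoring stations by pollutant
# levels (the 21 station rows are synthetic, not the Tehran measurements).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# rows = stations, columns = mean PM10, NOx, SO2 levels
stations = rng.uniform([20, 10, 5], [180, 90, 40], size=(21, 3))

X = StandardScaler().fit_transform(stations)   # put pollutants on one scale
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for k in range(4):
    print(f"cluster {k}: stations {np.where(labels == k)[0].tolist()}")
```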
26608 A Study of Heavy Hydrocarbons Upgrading by Microwave Pyrolysis
Authors: Thanida Sritangthong, Suksun Amornraksa
Abstract:
By-product upgrading is crucial in the hydrocarbon industries, as it can increase the overall profit margin of the business. Microwave-assisted pyrolysis is a relatively new technique in which heat is induced directly in the raw material, resulting in a more energy-saving and energy-efficient process. It is also a promising method for enhancing and accelerating chemical reactions, thereby reducing the pyrolysis reaction time and increasing the quality of value-added products from different kinds of feedstock. In this study, the opportunity to upgrade fuel oil by-product from an olefins plant is investigated by means of microwave pyrolysis. The experiment was conducted in a lab-scale quartz reactor placed inside a 1,100 W household microwave oven. The operating temperature was varied from 500 to 900 °C to observe its consequences on the quality of the pyrolysis products. Several microwave receptors, i.e. activated carbon, silicon carbide (SiC) and copper oxide (CuO), were used as materials to enhance heating and reaction in the reactor. The effect of residence time was determined by adjusting the flow rate of the N2 carrier gas. The chemical composition and product yields were analyzed using gas chromatography (GC) and gas chromatography/mass spectrometry (GC/MS). The results showed that hydrogen, methane, ethylene, and ethane were obtained as the main gaseous products at all operating temperatures, while the main liquid products belonged to the alkane, cycloalkane and polycyclic aromatic groups. The results indicated that microwave pyrolysis has the potential to upgrade low-value hydrocarbons into high-value products. Keywords: fuel oil, heavy hydrocarbons, microwave pyrolysis, pyrolysis
26607 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm
Authors: Anuradha Chug, Sunali Gandhi
Abstract:
Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because there are often insufficient resources to re-execute all the test cases in a time-constrained environment, efforts are underway to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective and optimized test data, so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses in the software or product. Inspired by the natural food-searching behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm to optimize the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied to effectively filter out redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code for testing. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC) and Ant Colony Optimization (ACO). The output of this study should be useful to testers, as they can achieve 100% path coverage for testing with a minimum number of test cases. Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm
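For readers unfamiliar with the bat algorithm, the sketch below implements its core update rules (frequency, velocity, local walk, loudness-gated acceptance) on a stand-in objective; mapping the objective to test-suite cost and coverage is the part specific to the paper and is not reproduced here:

```python
# Minimal sketch of the bat algorithm's core update rules.
import numpy as np

def fitness(x):
    # Stand-in objective: in the paper this would score a candidate test
    # suite (coverage, size); here smaller is simply better.
    return float(np.sum(x**2))

rng = np.random.default_rng(0)
n_bats, dim = 20, 5
f_min, f_max = 0.0, 2.0          # frequency range
loudness, pulse_rate = 0.9, 0.5  # loudness A and pulse emission rate r

x = rng.uniform(-5, 5, (n_bats, dim))   # bat positions = candidate solutions
v = np.zeros((n_bats, dim))             # velocities
best = x[np.argmin([fitness(b) for b in x])].copy()

for _ in range(100):
    for i in range(n_bats):
        f = f_min + (f_max - f_min) * rng.random()
        v[i] += (x[i] - best) * f            # pull toward the global best
        candidate = x[i] + v[i]
        if rng.random() > pulse_rate:        # occasional local walk near best
            candidate = best + 0.01 * rng.normal(size=dim)
        if fitness(candidate) < fitness(x[i]) and rng.random() < loudness:
            x[i] = candidate
        if fitness(x[i]) < fitness(best):
            best = x[i].copy()

print("best solution:", best, "fitness:", fitness(best))
```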
26606 Prediction of Physical Properties and Sound Absorption Performance of Automotive Interior Materials
Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Seong-Jin Cho, Tae-Hyeon Oh, Dae-Kyu Park
Abstract:
The sound absorption coefficient is considered important in design because noise affects the perceived quality of a car. In the field, it is tuned through extensive experiments, because predicting it for multi-layer materials is unreliable. In this paper, we present the design of sound absorption for automotive interior materials with multiple layers, using estimation software that predicts the sound absorption coefficient for a reverberation chamber. Additionally, we introduce a method for estimating the physical properties required to predict the sound absorption coefficient of car interior materials with multiple layers. These properties are calculated by an inverse algorithm, which makes it very economical to obtain information about physical properties without expensive equipment. A correlation test, using sound absorption coefficients measured in the reverberation chamber, is carried out to ensure the reliability of the accuracy. In this way, designing automotive interior materials becomes economical and efficient, and design optimization for the sound absorption coefficient is also easy to implement. Keywords: sound absorption coefficient, optimization design, inverse algorithm, automotive interior material, multiple-layer nonwoven, scaled reverberation chamber, sound impedance tubes
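The inverse step can be illustrated with a generic least-squares fit: unknown material parameters are adjusted until a forward model reproduces measured absorption coefficients. The forward model below is a made-up placeholder, not the paper's reverberation-chamber model:

```python
# Minimal sketch of an inverse (parameter-fitting) approach; the forward
# model and its two parameters are hypothetical placeholders.
import numpy as np
from scipy.optimize import least_squares

freqs = np.array([250.0, 500.0, 1000.0, 2000.0, 4000.0])
measured = np.array([0.15, 0.35, 0.60, 0.78, 0.85])     # measured alpha values

def forward_model(params, f):
    a, b = params                                       # hypothetical parameters
    return 1.0 - np.exp(-a * (f / 1000.0) ** b)         # placeholder curve

def residuals(params):
    return forward_model(params, freqs) - measured

fit = least_squares(residuals, x0=[0.5, 0.5], bounds=([0, 0], [10, 3]))
print("estimated parameters:", fit.x)                   # recovered properties
```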
26605 A Retrospective Study of Vaginal Stenosis Following Treatment of Cervical Cancers and the Effectiveness of Rehabilitation Interventions
Authors: Manjusha R. Vagal, Shyam K. Shrivastava, Umesh Mahantshetty, Sudeep Gupta, Supriya Chopra, Reena Engineer, Amita Maheshwari, Atul Buduk
Abstract:
Vaginal stenosis is a common side effect of pelvic radiotherapy in cervical cancer patients; it affects women's health negatively and prevents adequate vaginal/cervical examination. Vaginal dilation with a dilator is routine practice and is internationally advocated as a prophylactic measure to preserve vaginal patency. This retrospective study was carried out to assess the usefulness of vaginal dilation following pelvic radiation therapy in cervical cancer patients in India. Data were collected from the medical records of 183 cervical cancer patients who met the study criteria, covering the stage of disease, treatment received, commencement of dilation after radiation therapy, sexual status, and side effects associated with dilation practice. Data on vaginal dimensions, measured as the insertion length of small, medium and large dilators, were collected at regular follow-ups until 36 months and beyond. The vaginal dimensions measured with the medium dilator were used to analyze the results of dilation therapy with a paired t-test. Patients who underwent vaginal dilation with a dilator maintained vaginal patency, and the mean vaginal length increased significantly, from 8.02 ± 2.69 cm to 9.96 ± 2.89 cm (p < 0.001). No significant difference in vaginal patency was found across different intervals of initiating dilation therapy. At three years and beyond following dilation therapy, a significant increase in vaginal length was observed (p = 0.0001) in both sexually active and inactive patients. The compilation of vaginal dosage during brachytherapy was inadequate, and hence the secondary objective of the study, to determine the effect of radiotherapy on the outcome of the rehabilitation intervention, was not studied in detail. This retrospective study found that dilation therapy with vaginal dilators after pelvic radiotherapy is effective in preventing vaginal stenosis and improving vaginal patency, and that it cannot be substituted by vaginal intercourse. Assessment of sexual quality of life in the Indian population needs much more attention. Keywords: dilator, sexually active, vaginal dilation, vaginal stenosis
26604 Modified InVEST for WhatsApp Messages Forensic Triage and Search through Visualization
Authors: Agria Rhamdhan
Abstract:
WhatsApp, as the most popular mobile messaging app, has been used as evidence in many criminal cases. As the use of mobile messaging generates large amounts of data, forensic investigation faces the challenge of large-data problems. The hardest part of finding this important evidence is that current practice relies on tools and techniques that require manual analysis of all messages; analyzing large sets of mobile messaging data this way takes a great deal of time and effort. Our work offers a methodology based on forensic triage that reduces large datasets to manageable sets that are easier to review in detail, and then shows the results through interactive visualization, surfacing important terms, entities and relationships through intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and a Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results. Keywords: forensics, triage, visualization, WhatsApp
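A minimal sketch of the ranking side of the methodology, with three invented messages standing in for a WhatsApp export:

```python
# Minimal sketch: TF-IDF term ranking plus LDA topics over chat messages.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

messages = [
    "meet me at the warehouse tonight",
    "transfer the payment to the usual account",
    "the shipment arrives at the warehouse friday",
]

tfidf = TfidfVectorizer()
weights = tfidf.fit_transform(messages).sum(axis=0).A1   # aggregate term weight
terms = tfidf.get_feature_names_out()
top = sorted(zip(weights, terms), reverse=True)[:5]
print("top terms:", [t for _, t in top])

counts = CountVectorizer().fit(messages)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts.transform(messages))                      # topic-word distributions
for k, comp in enumerate(lda.components_):
    words = counts.get_feature_names_out()[comp.argsort()[-3:]]
    print(f"topic {k}:", list(words))
```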
26603 Low Cost Webcam Camera and GNSS Integration for Updating Home Data Using AI Principles
Authors: Mohkammad Nur Cahyadi, Hepi Hapsari Handayani, Agus Budi Raharjo, Ronny Mardianto, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
PDAM (the local water company) determines customer charges by considering the customer's building or house. Charge determination significantly affects both PDAM income and customer costs, because PDAM applies a subsidy policy for customers classified as small households. Periodic updates are needed so that pricing stays in line with the target, and a thorough customer survey in Surabaya is needed to update customer building data. However, the surveys carried out so far have deployed officers to survey each PDAM customer one by one, which requires a great deal of effort and cost. For this reason, this research offers a technology called mobile mapping, a mapping method that is more efficient in terms of time and cost. The device is simple to use: it is installed on a car so that it can record the surrounding buildings while the car is moving. Mobile mapping technology generally uses lidar sensors combined with GNSS, but this technology is costly. To overcome this problem, this research develops a low-cost mobile mapping technology using webcam camera sensors together with GNSS and IMU sensors. The cameras used have 3 MP specifications with a resolution of 720 and a diagonal field of view of 78°. The principle of this invention is to integrate four webcam camera sensors with GNSS and IMU, acquiring photo data tagged with location (latitude, longitude) and orientation (roll, pitch, yaw). The device is also equipped with a tripod and a vacuum mount for attachment to the car's roof so it does not fall off while driving. The output data are analyzed with artificial intelligence to reduce similar data (cosine similarity) and then to classify building types. Data reduction is used to eliminate near-duplicate images while retaining the image that shows the complete house, so that it can be processed for the later classification of buildings. The AI method used is transfer learning, utilizing a pre-trained model named VGG-16. The analysis of similarity data showed that data reduction reached 50%. Georeferencing is then done using the Google Maps API to obtain address information for the coordinates in the data. After that, a geographic join is performed to link the survey data with the customer data already owned by PDAM Surya Sembada Surabaya. Keywords: mobile mapping, GNSS, IMU, similarity, classification
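The similarity-based reduction can be sketched as follows; the file names and the 0.9 similarity threshold are hypothetical, and VGG-16 is used here only as a feature extractor, as the abstract describes:

```python
# Minimal sketch of near-duplicate removal with VGG-16 features and
# cosine similarity (image paths and threshold are illustrative).
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.keras.preprocessing import image
from sklearn.metrics.pairwise import cosine_similarity

model = VGG16(weights="imagenet", include_top=False, pooling="avg")

def embed(path):
    img = image.load_img(path, target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), 0))
    return model.predict(x, verbose=0)[0]            # 512-d feature vector

paths = ["house_001.jpg", "house_002.jpg", "house_003.jpg"]  # hypothetical
feats = np.stack([embed(p) for p in paths])
sim = cosine_similarity(feats)

keep = []
for i in range(len(paths)):                          # greedy: drop near-duplicates
    if all(sim[i, j] < 0.9 for j in keep):
        keep.append(i)
print("kept images:", [paths[i] for i in keep])
```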
26602 Patient Perspectives on Telehealth During the Pandemic in the United States
Authors: Manal Sultan Alhussein, Xiang Michelle Liu
Abstract:
Telehealth is an advanced technology that uses digital information and telecommunication facilities to provide access to health services from a distance. It slowed the transmission of COVID-19 during the pandemic, especially for elderly patients and patients with chronic diseases. Understanding patient perspectives on telehealth services, and the factors influencing their choice of telehealth services, will therefore shed light on the measures healthcare providers can take to improve the quality of telehealth services. This study aimed to evaluate perceptions of telehealth services among different patient groups and to explore various aspects of telehealth utilization in the United States during the COVID-19 pandemic. An online survey distributed via social media platforms was used to collect the research data. In addition to descriptive statistics, both correlation and regression analyses were conducted to test the research hypotheses. The empirical results highlighted that factors such as accessibility to telehealth services and the type of specialty clinic patients required play important roles in the effectiveness of the telehealth services they received. However, the results showed that patients' waiting time to receive telehealth services and their annual income did not significantly influence their desire to receive healthcare services via telehealth. The limitations of the study and future research directions are discussed. Keywords: telehealth, patient satisfaction, pandemic, healthcare, survey
26601 An Investigation into the Views of Distant Science Education Students Regarding Teaching Laboratory Work Online
Authors: Abraham Motlhabane
Abstract:
This research analysed the written views of science education students regarding the teaching of laboratory work in the online mode. The research adopted a qualitative methodology aimed at investigating small and distinct groups, normally regarded as a single-site study. Qualitative research was used to describe and analyze the phenomena from the students' perspective. This means the research began with worldview assumptions and used theoretical lenses on the research problem to inquire into the meanings made by individual students. The research was conducted with three groups of students studying for the Postgraduate Certificate in Education, the Bachelor of Education and the honours Bachelor of Education, respectively. In each of the study programmes, the science education module is compulsory. Five science education students from each study programme were purposively selected, so 15 students participated in the research. For the analysis, the data were first printed and the hard copies were used. The data were read several times, and key concepts and ideas were highlighted. Themes and patterns were identified to describe the data, and coding was used as a process for organising and sorting the data. The findings of the study are very diverse: some students are in favour of online laboratory work, whereas others argue that science can only be learnt through hands-on experimentation. Keywords: online learning, laboratory work, views, perceptions
26600 The Levels of Neurosteroid 7β-Hydroxy-Epiandrosterone in Men and Pregnant Women
Authors: J. Vitku, L. Kolatorova, T. Chlupacova, J. Heracek, M. Hill, M. Duskova, L. Starka
Abstract:
Background: 7β-hydroxy-epiandrosterone (7β-OH-EpiA) is an endogenous steroid that has been shown to exert neuroprotective and anti-inflammatory effects in vitro as well as in animal models. However, to the best of our knowledge, no information is available about the concentration of this androgen metabolite in the human population. The aim of the study was to measure and compare levels of 7β-OH-EpiA in men and pregnant women in different biological fluids and to evaluate the relationship between 7β-OH-EpiA in men and their sperm quality. Methods: First, a sensitive isotope-dilution high performance liquid chromatography-mass spectrometry method for the measurement of 7β-OH-EpiA in different biological fluids was developed; its validation met the requirements of FDA guidelines. Afterwards, 7β-OH-EpiA was analysed in the plasma and seminal plasma of 191 men with different degrees of infertility (healthy men and lightly, moderately and severely infertile men). Furthermore, the levels of 7β-OH-EpiA were measured in the plasma of 34 pregnant women in the 37th week of gestation and in the corresponding cord plasma, which reflects steroid levels in the fetus. Results: Concentrations of 7β-OH-EpiA in seminal plasma were significantly higher in severely infertile men in comparison with healthy and lightly infertile men, and the same trend was observed in blood plasma. Furthermore, plasma 7β-OH-EpiA correlated negatively with sperm concentration (-0.215; p < 0.01) and total sperm count (-0.15; p < 0.05). Seminal 7β-OH-EpiA was negatively associated with motility (-0.26; p < 0.01), progressively motile sperm (-0.233; p < 0.01) and non-progressively motile sperm (-0.188; p < 0.05). Plasma 7β-OH-EpiA levels in men were generally higher than in pregnant women: levels were under the lower limit of quantification (LLOQ) in the majority of samples from pregnant women and cord plasma, and only 4 plasma samples from pregnant women and 7 cord blood plasma samples were above the LLOQ, in the range of units of pg/ml. Conclusion: Based on the available information, this is the first study measuring 7β-OH-EpiA in human samples. 7β-OH-EpiA is associated with lower sperm quality, and its role in this field is certainly worth exploring thoroughly. Interestingly, levels of 7β-OH-EpiA in pregnant women were extremely low, despite the fact that steroid levels, including androgens, are generally higher during pregnancy. Acknowledgements: This work was supported by project MH CR 17-30528A from the Czech Health Research Council, by MH CZ - DRO (Institute of Endocrinology - EU, 00023761) and by the MEYS CR (OP RDE, Excellent research - ENDO.CZ). Keywords: 7β-hydroxy-epiandrosterone, steroid, sperm quality, pregnancy
26599 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
Modern experiments in high energy physics impose great demands on the reliability, efficiency, and data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes of the iFDAQ during the 2016 run. From the software point of view, it can be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the possibilities for debugging, online monitoring of communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided. Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP
26598 Mining Scientific Literature to Discover Potential Research Data Sources: An Exploratory Study in the Field of Haemato-Oncology
Authors: A. Anastasiou, K. S. Tingay
Abstract:
Background: Discovering suitable datasets is an important part of health research, particularly for projects working with clinical data from patients organized in cohorts (cohort data); but with the proliferation of so many national and international initiatives, it is becoming increasingly difficult for research teams to locate the real-world datasets most relevant to their project objectives. We present a method for identifying healthcare institutes in the European Union (EU) which may hold haemato-oncology (HO) data. A key enabler of this research was the bibInsight platform, a scientometric data management and analysis system developed by the authors at Swansea University. Method: A PubMed search was conducted using HO clinical terms taken from previous work. The resulting XML file was processed using the bibInsight platform, linking affiliations to the Global Research Identifier Database (GRID). GRID is an international, standardized list of institutions, including the city and country in which each institution exists, as well as a category for the main business type, e.g., Academic, Healthcare, Government, Company. Countries were limited to the 28 current EU members, and institute type to 'Healthcare'. An article was considered valid if at least one author was affiliated with an EU-based healthcare institute. Results: The PubMed search produced 21,310 articles, comprising 9,885 distinct affiliations with correspondence in GRID. Of these articles, 760 were from EU countries, and 390 of these involved healthcare institutes; one affiliation was excluded as being a veterinary hospital. Two EU countries did not have any publications in our analysis dataset. The results were analysed by country and by individual healthcare institute. Collaboration networks, both within the EU and internationally, may suggest a willingness to share data for research purposes, and geographical mapping can help ensure that data have broad population coverage. Collaborations with industry or government may exclude healthcare institutes that have embargoes or additional costs associated with data access. Conclusions: Data reuse is becoming increasingly important both for ensuring the validity of results and for the economy of available resources. The ability to identify potential, specific data sources from over twenty thousand articles in less than an hour could assist in improving knowledge of, and access to, data sources. As our method has not yet established whether these healthcare institutes actually hold data, or merely publish on the topic, future work will involve text mining of data-specific concordant terms to identify numbers of participants, demographics, study methodologies, and sub-topics of interest. Keywords: data reuse, data discovery, data linkage, journal articles, text mining
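Since bibInsight itself is not publicly available, the sketch below uses plain ElementTree parsing to stand in for the affiliation-extraction step; the file name is hypothetical:

```python
# Minimal sketch of extracting author affiliations from a PubMed XML export.
import xml.etree.ElementTree as ET
from collections import Counter

tree = ET.parse("pubmed_ho_search.xml")          # hypothetical export file
affiliations = Counter()
for aff in tree.getroot().iter("Affiliation"):   # PubMed's affiliation element
    if aff.text:
        affiliations[aff.text.strip()] += 1

for name, count in affiliations.most_common(10):
    print(count, name)
# A real pipeline would then match these strings against GRID records
# (institution name, city, country, type) and filter to EU healthcare sites.
```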
26597 Knowledge Management Processes as a Driver of Knowledge-Worker Performance in Public Health Sector of Pakistan
Authors: Shahid Razzaq
Abstract:
Governments around the globe have started taking knowledge management dynamics into consideration, with or without conscious realization, while formulating, implementing and evaluating strategies for public sector organizations and public policy development. The Health Department of Punjab province in Pakistan is striving to deliver quality healthcare services to the community through an efficient and effective service delivery system. Despite this effort, some employee performance issues still pose a challenge to the government. To overcome these issues, the department has taken several steps, including HR strategies, the use of technologies and a focus on hard issues. Consequently, this study attempts to highlight the importance of a soft issue, knowledge management in its true essence, in tackling these performance issues. Knowledge management in the public sector is quite a neglected area within knowledge management, itself a growing multidisciplinary research discipline. The knowledge-based view of the firm asserts that knowledge is the most deliberate resource that can result in competitive advantage for an organization over competing organizations. In the context of our study, this means that to improve employee performance, organizations have to increase their heterogeneous knowledge bases. The study uses a cross-sectional, quantitative research design. The data were collected from knowledge workers of the Health Department of Punjab, the biggest province of Pakistan; a total sample size of 341 was achieved, and SmartPLS 3 version 2.6 was used to analyze the data. The data examination revealed that knowledge management processes have a strong impact on knowledge worker performance, and all hypotheses were accepted according to the results. It can therefore be concluded that knowledge management activities should be implemented to increase employee performance. The Health Department of Punjab introduced knowledge management infrastructure and systems to make knowledge effectively available to service staff. This infrastructure increased knowledge management processes in remote hospitals, basic health units and care centers, which resulted in greater service provision to the public. This study has both theoretical and practical significance. In terms of theoretical contribution, it establishes the relationship between knowledge management and performance in this setting for the first time. As its practical contribution, it gives public sector organizations and government insight into the role of knowledge management in employee performance; public policymakers are therefore strongly advised to implement knowledge management activities to enhance the performance of knowledge workers. The current research validated the substantial role of knowledge management in shaping employee attitudes and behavioral objectives. To the best of the authors' knowledge, the examination of the impact of knowledge management on employee performance constitutes this study's originality. Keywords: employee performance, knowledge management, public sector, soft issues
26596 Using Data Mining Technique for Scholarship Disbursement
Authors: J. K. Alhassan, S. A. Lawal
Abstract:
This work concerns decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used in order to determine the generic rules for disbursing the scholarship. Based on the rules defined by the tree, the system is able to determine the class (status) to which an applicant belongs: Granted or Not Granted. Applicants who fall into the Granted class acquire the scholarship successfully, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can classify applicants based on the rules from the tree-based classification was also developed. Tree-based classification is adopted because of its efficiency, effectiveness, and easy-to-comprehend features. The system was tested with data from the National Information Technology Development Agency (NITDA), Abuja, a parastatal of the Federal Ministry of Communication Technology that is mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations. Keywords: classification, data mining, decision tree, scholarship
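A minimal sketch of the described tree-based classification; the features and training rows are invented for illustration, not taken from the NITDA data:

```python
# Minimal sketch: a decision tree deciding Granted / Not Granted.
from sklearn.tree import DecisionTreeClassifier, export_text

# columns: [grade_point, household_income_k, is_state_resident]
X = [[4.5, 20, 1], [3.0, 85, 1], [4.8, 15, 0], [2.5, 60, 0], [4.0, 30, 1]]
y = ["Granted", "Not Granted", "Granted", "Not Granted", "Granted"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["gpa", "income_k", "resident"]))

applicant = [[3.8, 25, 1]]
print("decision:", tree.predict(applicant)[0])   # class: Granted / Not Granted
```

The printed rule text corresponds to the "generic rules" the abstract describes: each root-to-leaf path is one disbursement rule.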
26595 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. Too many features are known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios; their impact on model accuracy is captured using the coefficient of determination (R-square) and root mean square error (RMSE). Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
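A minimal sketch of Boruta feature selection around a random forest, assuming a compatible installation of the open-source boruta package; the synthetic data stands in for the 79-feature housing dataset:

```python
# Minimal sketch: Boruta wrapper feature selection with a random forest.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from boruta import BorutaPy

X, y = make_regression(n_samples=300, n_features=20, n_informative=6,
                       random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
selector = BorutaPy(rf, n_estimators="auto", random_state=0)
selector.fit(X, y)                        # compares real vs. shadow features

confirmed = np.where(selector.support_)[0]
print("confirmed features:", confirmed.tolist())
X_selected = selector.transform(X)        # keep only confirmed features
```

The confirmed subset would then feed the downstream random forest models evaluated at the different partitioning ratios.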
26594 Quantifying the Second-Level Digital Divide on Sub-National Level with a Composite Index
Authors: Vladimir Korovkin, Albert Park, Evgeny Kaganer
Abstract:
The paper studies the second-level digital divide (the one defined by how digital technology is used in everyday life) between regions of the Russian Federation. It offers a systematic review of the literature on measuring the digital divide and, based upon this, suggests a composite Digital Life Index that captures the complex, multi-dimensional character of the phenomenon. The index model studies digital supply and demand separately across seven independent dimensions, providing 14 subindices. The index is based on Internet-borne data, a distinction from traditional research approaches that rely on official statistics or surveys. Regression analysis is used to determine the relative importance of factors such as income, human capital, and policy in determining the digital divide. The result of the analysis suggests that the digital divide is driven more by differences in demand (defined by consumer competencies) than in supply; the role of income is insignificant, and the quality of human capital is the key determinant of the divide. The paper advances the existing methodological literature on the issue and can also inform practical decision-making regarding national and regional digital development strategies. Keywords: digital transformation, second-level digital divide, composite index, digital policy, regional development, Russia
26593 Design of a Low-Cost, Portable, Sensor Device for Longitudinal, At-Home Analysis of Gait and Balance
Authors: Claudia Norambuena, Myissa Weiss, Maria Ruiz Maya, Matthew Straley, Elijah Hammond, Benjamin Chesebrough, David Grow
Abstract:
The purpose of this project is to develop a low-cost, portable sensor device that can be used at home for long-term analysis of gait and balance abnormalities. One area of particular concern involves the asymmetries in movement and balance that can accompany certain types of injuries and/or the associated devices used in the repair and rehabilitation process (e.g. the use of splints and casts), which can often increase the chance of falls and additional injuries. This device has the capacity to monitor a patient during the rehabilitation process after injury or operation, increasing the patient's access to healthcare while decreasing the number of visits to the patient's clinician. The sensor device may thereby improve the quality of the patient's care, particularly in rural areas where access to the clinician may be limited, while simultaneously decreasing the overall cost of the patient's care. The device consists of nine interconnected accelerometer/gyroscope/compass chips (9-DOF IMU, Adafruit, New York, NY). The sensors attach to, and are used to determine the orientation and acceleration of, the patient's lower abdomen, C7 vertebra (lower neck), L1 vertebra (middle back), the anterior side of each thigh and tibia, and the dorsal side of each foot. In addition, pressure sensors are embedded in shoe inserts, with one sensor (ESS301, Tekscan, Boston, MA) beneath the heel and three sensors (Interlink 402, Interlink Electronics, Westlake Village, CA) beneath the metatarsal bones of each foot. These sensors measure the distribution of the weight applied to each foot as well as stride duration. A small microcontroller (Arduino Mega, Arduino, Ivrea, Italy) is used to collect the data from these sensors into a CSV file. MATLAB is then used to analyze the data and output the hip, knee, ankle, and trunk angles projected onto the sagittal plane. The open-source program Processing is then used to generate an animation of the patient's gait. The accuracy of the sensors was validated through comparison to goniometric measurements (±2° error). The sensor device was also shown to have sufficient sensitivity to observe various gait abnormalities. Several patients used the sensor device, and the data collected from each represented the patient's movements. Further, the sensors were found to be able to observe gait abnormalities caused by the addition of a small amount of weight (4.5-9.1 kg) to one side of the patient. The user-friendly interface and portability of the sensor device will help to build a bridge between patients and their clinicians with fewer inpatient visits required. Keywords: biomedical sensing, gait analysis, outpatient, rehabilitation
26592 Students' and Clinical Supervisors' Experiences of Occupational Therapy Practice Education: A Structured Critical Review
Authors: Hamad Alhamad, Catriona Khamisha, Emma Green, Yvonne Robb
Abstract:
Introduction: Practice education is a key component of occupational therapy education. This critical review aimed to explore students' and clinical supervisors' experiences of practice education and to make recommendations for research. Method: The literature was systematically searched using five databases; qualitative, quantitative and mixed methods studies were included. The Critical Appraisal Skills Programme checklist for qualitative studies and the Mixed Methods Assessment Tool for quantitative and mixed methods studies were used to assess study quality. Findings: Twenty-two studies with high quality scores were included: 16 qualitative, 3 quantitative and 3 mixed methods, conducted in Australia, Canada, the USA and the UK. During practice education, students learned professional, practical, clinical and problem-solving skills, and improved their confidence and creativity. Supervisors had an opportunity to reflect on their practice and gain experience of supervising students. However, clear objectives and expectations for students, and sufficient theoretical knowledge, preparation and resources for supervisors, were required. Conclusion: Practice education provides the different skills and experiences necessary to become competent professionals, but some areas of practice education need to improve. Studies in non-Western countries are needed to explore the perspectives of students and clinical supervisors in different cultures, to ensure the practice education models adopted are relevant. Keywords: occupational therapy, practice education, fieldwork, students, clinical supervisors
26591 Image-Based (RGB) Technique for Estimating Phosphorus Levels of Different Crops
Authors: M. M. Ali, Ahmed Al- Ani, Derek Eamus, Daniel K. Y. Tan
Abstract:
In this glasshouse study, we developed a new image-based, non-destructive technique for detecting the leaf P status of different crops such as cotton, tomato and lettuce. Plants were grown on nutrient media containing different P concentrations, i.e. 0%, 50% and 100% of the recommended P concentration (P0 = no P; P1 = 2.5 mL 10 L-1 of P; and P2 = 5 mL 10 L-1 of P as NaH2PO4). After 10 weeks of growth, plants were harvested; data on leaf P contents were collected using the standard destructive laboratory method, and at the same time leaf images were collected with a handheld crop image sensor. We calculated the leaf area, leaf perimeter and RGB (red, green and blue) values of these images. These data were then used in linear discriminant analysis (LDA) to estimate leaf P contents, which successfully classified the plants on the basis of their leaf P contents. The data indicated that P deficiency in crop plants can be predicted using image and morphological data. Our proposed non-destructive imaging method is precise in estimating the P requirements of different crop species. Keywords: image-based techniques, leaf area, leaf P contents, linear discriminant analysis
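A minimal sketch of the described pipeline, with invented RGB and geometry values standing in for the measured cotton/tomato/lettuce leaves:

```python
# Minimal sketch: mean RGB plus leaf geometry fed into LDA classification.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# columns: [mean_R, mean_G, mean_B, leaf_area_cm2, perimeter_cm]
X = np.array([
    [120, 180, 90, 35.0, 28.0],   # full recommended dose (P2)
    [125, 175, 95, 33.5, 27.0],
    [140, 150, 85, 22.0, 21.5],   # half dose (P1)
    [138, 148, 88, 23.0, 22.0],
    [160, 120, 80, 12.0, 15.0],   # no P (P0)
    [158, 124, 82, 13.0, 16.0],
])
y = ["P2", "P2", "P1", "P1", "P0", "P0"]

lda = LinearDiscriminantAnalysis().fit(X, y)
new_leaf = [[150, 135, 84, 18.0, 19.0]]
print("predicted P class:", lda.predict(new_leaf)[0])
```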
26590 Evaluation of Groundwater Quality in North-West Region of Punjab, India
Authors: Jeevan Jyoti Mohindroo, Umesh Kumar Garg
Abstract:
The district of Tarntaran is located 25 km south of Amritsar city in the Punjab State of northwestern India and covers an area of 5,059 sq km. It is bounded by Amritsar in the north, Kapurthala in the east, Ferozepur in the south and Pakistan in the west. Patti town, a municipal council of Tarntaran district, is located 45 km from Amritsar at geographical coordinates 31°16'51" N, 74°51'25" E, and spreads over an area of 50 sq km. The moisture content of the air is very low, as the area falls within the semi-arid region and frequently faces water scarcity as well as water quality problems. The major sources of employment are agriculture, horticulture and animal husbandry, engaging almost 80% of the workforce. Water samples were collected from 400 locations in 20 villages along the Patti-Khem Karan highway, with 20 samples from each village, and were subjected to analysis of their chemical characteristics. Based on the hydro-chemical analysis, the type of water that predominates in the study area is the Ca-Mg-HCO3 type. In addition, the suitability of the water for irrigation was evaluated based on the sodium adsorption ratio (SAR), residual sodium carbonate (RSC), sodium percent and salinity hazard. Other physico-chemical parameters, such as pH, TDS and conductance, were also determined using a water analysis kit, and the water samples were additionally analysed for heavy metals. Keywords: groundwater, chemical classification, SAR, RSC, USSL diagram
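The irrigation indices named above follow standard formulas, with ion concentrations expressed in meq/L; a small sketch with illustrative sample values:

```python
# Minimal sketch of standard irrigation-suitability indices (inputs in meq/L;
# the sample values below are illustrative, not measured data from the study).
import math

def sar(na, ca, mg):
    """Sodium adsorption ratio: Na / sqrt((Ca + Mg) / 2)."""
    return na / math.sqrt((ca + mg) / 2)

def rsc(hco3, co3, ca, mg):
    """Residual sodium carbonate: (HCO3 + CO3) - (Ca + Mg)."""
    return (hco3 + co3) - (ca + mg)

def sodium_percent(na, k, ca, mg):
    """Sodium percent: 100 * (Na + K) / (Na + K + Ca + Mg)."""
    return 100 * (na + k) / (na + k + ca + mg)

s = dict(na=8.2, k=0.3, ca=3.1, mg=2.4, hco3=6.0, co3=0.2)   # meq/L
print("SAR  =", round(sar(s["na"], s["ca"], s["mg"]), 2))
print("RSC  =", round(rsc(s["hco3"], s["co3"], s["ca"], s["mg"]), 2))
print("Na % =", round(sodium_percent(s["na"], s["k"], s["ca"], s["mg"]), 1))
```

SAR and salinity hazard together place a sample on the USSL diagram referenced in the keywords.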