Search results for: information value method
24043 Design and Development of a Computerized Medical Record System for Hospitals in Remote Areas
Authors: Grace Omowunmi Soyebi
Abstract:
A computerized medical record system is a collection of medical information about a person that is stored on a computer. One principal problem of most hospitals in rural areas is the use of a file management system for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may cause something unexpected to happen to the patient. This application is to be designed using a structured system analysis and design method, which will support a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to quickly retrieve a patient's record with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.
Keywords: programming, data, software development, innovation
Procedia PDF Downloads 87
24042 Health Literacy: Collaboration between Clinician and Patient
Authors: Cathy Basterfield
Abstract:
Issue: To engage patients in their own health care, health professionals need to be aware of an individual’s specific skills and abilities so as to communicate most effectively. One of the most discussed of these is health literacy, a skill too often assumed in adults. Background: A review of publicly available health content suggests that writers assume all adult readers have a broad and full capacity to read at a high level of literacy, often at a post-school education level. Health information writers and clinicians need to recognise one critical reason why there may be little or no change in a person’s behaviour, or why patients do not show up for appointments: perhaps unintentionally, they are miscommunicating with the majority of the adult population. Health information spans many literacy domains. It usually includes technical medical terms or jargon. Many fact sheets and other materials require scientific literacy, with or without specific numerical literacy; they may include graphs, percentages, timing, distance, or weights. Each additional word or concept in these domains decreases readers' ability to meaningfully read, understand, and know what to do with the information. Long or unfamiliar words in a heading alone will reduce readers' motivation to attempt to read. Critically, people who have low literacy are overwhelmed when pages are covered with lots of words. People attending a health environment may be unwell or anxious about a diagnosis, which makes it harder to read, understand, and act on the information. Access to health information must also consider an even wider range of adults, including those with poor school attainment, migrants, refugees, homeless people, people with mental health illnesses, and people who are ageing.
People with low literacy may also include people with lifelong disabilities, people with acquired disabilities, people who read English as a second (or third) language, people who are Deaf, or people who are vision impaired. Outcome: This paper will discuss Easy English, which is developed for adults. It uses the audience’s everyday words, short sentences, short words, and no jargon. It uses concrete language and concrete, specific images to support the text. It has been developed in Australia since the mid-2000s. This paper will showcase various projects in the health domain that use Easy English to improve the understanding and functional use of written information for the large number of adults in our communities who do not have the health literacy to manage a range of day-to-day reading tasks. Examples are drawn from consent forms, fact sheets, choice options, instructions, and other functional documents where Easy English has been developed. This paper will ask individuals to reflect on their own work practice and consider what written information must be available in Easy English. It does not matter how cutting-edge a new treatment is; when adults cannot read or understand what it is about, including its positive and negative outcomes, they are less likely to be engaged in their own health journey.
Keywords: health literacy, inclusion, Easy English, communication
Procedia PDF Downloads 125
24041 Analyzing the Use of Augmented and Virtual Reality to Teach Social Skills to Students with Autism
Authors: Maggie Mosher, Adam Carreon, Sean Smith
Abstract:
A systematic literature review was conducted to explore the evidence base on the use of augmented reality (AR), virtual reality (VR), mixed reality (MR), and extended reality (XR) to present social skill instruction to school-age students with autism spectrum disorder (ASD). Specifically, the systematic review focused on (a) the participants and intervention agents using AR, VR, MR, and XR for social skill acquisition, (b) the social skills taught through these mediums, and (c) the social validity measures (i.e., goals, procedures, and outcomes) reported in these studies. Forty-one articles met the inclusion criteria. Researchers in six studies taught social skills to students through AR, in 27 studies through non-immersive VR, and in 10 studies through immersive VR. No studies used MR or XR. The primary targeted social skills were relationship skills, emotion recognition, social awareness, cooperation, and executive functioning. An intervention to improve many social skills was implemented by 73% of researchers, 17% taught a single skill, and 10% did not clearly state the targeted skill. The intervention was considered effective in 26 of the 41 studies (63%), not effective in four studies (10%), and 11 studies (27%) reported mixed results. No researchers reported information for all 17 social validity indicators; the number of indicators reported ranged from two to 14. Social validity measures on the feelings toward and use of the technology were provided in 22 studies (54%). Findings indicated both AR and VR are promising platforms for providing social skill instruction to students with ASD. Studies utilizing this technology report a number of social validity indicators. However, the limited information provided on the various interventions, participant characteristics, and validity measures offers insufficient evidence of the impact of these technologies in teaching social skills to students with ASD.
Future research should develop a protocol for training treatment agents to assess the role of different variables (i.e., whether agents are customizing content, monitoring student learning, or using intervention-specific vocabulary in their day-to-day instruction). Sustainability may be increased by providing training in the technology to both treatment agents and participants. Providing scripts of the instruction occurring within the intervention would supply the information needed to determine the primary method of teaching within the intervention. These variables play a role in maintenance and generalization of the social skills. Understanding the type of feedback provided would help researchers determine whether students felt rewarded for progressing through the scenarios or whether students require rewarding elements within the intervention (e.g., badges, trophies). AR has the potential to generalize instruction, and VR has the potential to provide a practice environment for performance deficits. Combining these two technologies into a mixed reality intervention may provide a more cohesive and effective intervention.
Keywords: autism, augmented reality, social and emotional learning, social skills, virtual reality
Procedia PDF Downloads 109
24040 Design of Intelligent Scaffolding Learning Management System for Vocational Education
Authors: Seree Chadcham, Niphon Sukvilai
Abstract:
This study is a research and development effort intended to: 1) design an Intelligent Scaffolding Learning Management System (ISLMS) for vocational education, and 2) assess the suitability of that design. Its methods are divided into two phases. Phase 1 is the design of the ISLMS for vocational education, and phase 2 is the assessment of the suitability of the design. The samples used in this study are 15 professionals in the fields of intelligent scaffolding, learning management systems, vocational education, and information and communication technology in education, selected using the purposive sampling method. Data were analyzed by arithmetic mean and standard deviation. The results showed that the ISLMS for vocational education consists of two main components: 1) the Intelligent Learning Management System for Vocational Education, and 2) the Intelligent Scaffolding Management System. The professionals' assessment of the system's suitability is in the highest range.
Keywords: intelligent, scaffolding, learning management system, vocational education
Procedia PDF Downloads 795
24039 The Influence of Water on the Properties of Cellulose Fibre Insulation
Authors: Pablo Lopez Hurtado, Antroine Rouilly, Virginie Vandenbossche
Abstract:
Cellulose fibre insulation is an eco-friendly building material made from recycled paper fibres, treated with borates for fungal and fire resistance. It is comparable in thermal and acoustic performance to mineral wool and other insulation materials based on non-renewable resources. The main method of application consists in separating and blowing the fibres into attics or closed wall cavities. Another method, known as the “wet spray method”, is gaining interest. With this method the fibres are projected together with pulverized water so that they stick to the wall cavities. The issue with the wet spray technique is that the water dosage can be difficult to control. A high water dosage implies not only a longer drying time, depending on ambient conditions, but also a change in the performance of the material itself. In our work we studied the thermal and mechanical properties of wet-sprayed cellulose insulation in order to understand how water dosage affects these properties. The material was first characterized to study the chemical and physical properties of the fibres. Then representative samples of wet-sprayed cellulose with varying applied water dosages were subjected to thermal conductivity and compression testing in order to better understand how changes in the fibres induced by drying affect these properties.
Keywords: cellulose fibre, recycled paper, moisture sorption, thermal insulation
Procedia PDF Downloads 303
24038 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection
Authors: S. Delgado, C. Cerrada, R. S. Gómez
Abstract:
This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges in voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times: these repeated voxels incur costly memory operations that add no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing a triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability. Written as a single compute shader in the OpenGL Shading Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency but also its ability to produce accurate, 26-tunnel-free voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces.
Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insight into the impact of triangle orientation on scan-line-based voxelization methods. It also aids in understanding how the Gap Detection technique improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods. By addressing gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.
Keywords: voxelization, GPU acceleration, computer graphics, compute shaders
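The core idea of the equidistant scan-line traversal can be illustrated outside the GPU. The following Python sketch is a simplified 2D analogue (the paper's method is a 3D GLSL compute shader): horizontal scan-lines traverse a triangle's interior, and a set guarantees each cell is recorded exactly once. The function name and `step` parameter are illustrative assumptions, not the authors' code.

```python
# Minimal 2D analogue of equidistant scan-line voxelization: visit each
# cell covered by a triangle exactly once, with no repeated writes.

def scanline_fill(v0, v1, v2, step=1.0):
    """Return the set of integer cells covered by equidistant horizontal
    scan-lines traversing the triangle (v0, v1, v2)."""
    ys = sorted([v0, v1, v2], key=lambda p: p[1])
    cells = set()
    y = ys[0][1]
    while y <= ys[2][1]:
        xs = []
        # Intersect the scan-line at height y with each triangle edge.
        for a, b in ((v0, v1), (v1, v2), (v2, v0)):
            (x1, y1), (x2, y2) = a, b
            if y1 == y2:
                continue  # horizontal edge: no unique intersection
            if min(y1, y2) <= y <= max(y1, y2):
                t = (y - y1) / (y2 - y1)
                xs.append(x1 + t * (x2 - x1))
        if len(xs) >= 2:
            x = min(xs)
            while x <= max(xs):
                cells.add((int(x), int(y)))  # set: each cell stored once
                x += step
        y += step
    return cells
```

In the GPU version the same single-visit property is what removes the redundant memory traffic; here the `set` plays that role, and the Gap Detection pass the paper adds would patch cells the discrete scan-lines can skip on steep edges.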
Procedia PDF Downloads 72
24037 Analysis of Heat Exchanger Area of Two Stage Cascade Refrigeration System Using Taguchi Methodology
Authors: A. D. Parekh
Abstract:
The present work describes the relative contributions of operating parameters to the required heat transfer area of the three heat exchangers, viz. evaporator, condenser, and cascade condenser, of a two-stage R404A-R508B cascade refrigeration system using the Taguchi method. The operating parameters considered in the present study include (1) the condensing temperatures of the high-temperature and low-temperature cycles, (2) the evaporating temperature of the low-temperature cycle, (3) the degree of superheating in the low-temperature cycle, and (4) the refrigerating effect. Heat transfer areas of the three heat exchangers are studied under variation of the above operating parameters, and the optimum working level of each operating parameter is obtained for minimum heat transfer area of each heat exchanger using the Taguchi method. The analysis reveals that the evaporating temperature of the low-temperature cycle and the refrigerating effect contribute most to the evaporator area. The condenser area is mainly influenced by both the condensing temperature of the high-temperature cycle and the refrigerating effect. The cascade condenser area is mainly affected by the refrigerating effect, with the effects of the other operating parameters minimal.
Keywords: cascade refrigeration system, Taguchi method, heat transfer area, ANOVA, optimal solution
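Since heat transfer area is to be minimized, a Taguchi analysis of this kind rests on the "smaller-the-better" signal-to-noise ratio, S/N = -10 log10(mean(y^2)). The sketch below illustrates that criterion; the areas and level names are made-up placeholders, not the paper's data.

```python
import math

# Taguchi "smaller-the-better" signal-to-noise (S/N) ratio, the criterion
# used when a response such as heat transfer area is to be minimized.

def sn_smaller_is_better(ys):
    # S/N = -10 * log10(mean(y^2)); a larger S/N means a smaller,
    # more consistent response.
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

# Placeholder example: evaporator areas (m^2) observed at two levels of
# one operating parameter; the level with the higher S/N is optimal.
level_1 = [1.20, 1.30, 1.25]
level_2 = [0.90, 0.95, 0.92]
best = max([(sn_smaller_is_better(level_1), "level 1"),
            (sn_smaller_is_better(level_2), "level 2")])[1]
```

Computing this S/N for each level of each parameter in the orthogonal array, and comparing the ranges (or ANOVA sums of squares) across parameters, is what yields the relative-contribution ranking the abstract reports.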
Procedia PDF Downloads 384
24036 Functional Electrical Stimulator and Neuromuscular Electro Stimulator System Analysis for Foot Drop
Authors: Gül Fatma Türker, Hatice Akman
Abstract:
Portable muscle stimulators for real-time applications were first introduced by Liberson in 1961, and these systems have since advanced considerably. In this study, FES (Functional Electrical Stimulator) and NMES (Neuromuscular Electrostimulator) systems are analyzed in terms of their hardware and the quality-of-life improvements they offer foot drop patients. FES and NMES systems are used by people whose leg muscles and leg neural connections are healthy but who are not able to walk properly because of an injured central nervous system, as in spinal cord injuries. These systems stimulate nerves or muscles, using information from other movements to program the stimulation so as to obtain a natural gait, and they are accepted as a rehabilitation method for the correction of foot drop. They help the person approach a natural form of walking. Foot drop is a gait abnormality characterized by steppage gait. These systems assist the patient with plantarflexion and dorsiflexion movements, which are difficult for foot drop patients to perform.
Keywords: FES, foot drop, NMES, stimulator
Procedia PDF Downloads 388
24035 Modeling of Crack Propagation Path in Concrete with Coarse Trapezoidal Aggregates by Boundary Element Method
Authors: Chong Wang, Alexandre Urbano Hoffmann
Abstract:
The interaction between a crack and a trapezoidal aggregate in a single-edge-notched concrete beam is simulated using the boundary element method with an automatic crack extension program. The stress intensity factors of the growing crack are obtained from the J-integral. Three crack extension paths are observed: deflecting around the particulate, growing along the interface, and penetrating into the particulate, depending on the mismatch in mechanical characteristics between the matrix and the particulate. The toughening is quantified by the ratio of stress intensity factors. The results reveal that, as stress shielding occurs, toughening is obtained when the crack approaches a stiff, strong aggregate weakly bonded to a relatively soft matrix. The present work is intended to aid the design of aggregate-reinforced concretes.
Keywords: aggregate concrete, boundary element method, two-phase composite, crack extension path, crack/particulate interaction
Procedia PDF Downloads 426
24034 Improving Grade Control Turnaround Times with In-Pit Hyperspectral Assaying
Authors: Gary Pattemore, Michael Edgar, Andrew Job, Marina Auad, Kathryn Job
Abstract:
As critical commodities become more scarce, significant time and resources have been devoted to better understanding complicated ore bodies and extracting their full potential. These challenging ore bodies present several pain points for geologists and engineers; poor handling of these issues flows downstream to the processing plant, affecting throughput rates and recovery. Many open-cut mines utilise blast hole drilling to extract additional information to feed back into the modelling process. This method requires samples to be collected during or after blast hole drilling. Samples are then sent for assay, with turnaround times varying from 1 to 12 days. This method is time-consuming and costly, requires human exposure on the bench, and collects elemental data only. To address this challenge, research has been undertaken to utilise hyperspectral imaging across a broad spectrum to scan samples and collars, or to take down-hole measurements, for mineralogy, moisture content, and grade abundances. Automating this process with unmanned vehicles and on-board processing reduces in-pit human exposure, ensuring ongoing safety, and allows data to be integrated into modelling workflows immediately. The preliminary results demonstrate numerous direct and indirect benefits from this new technology, including rapid and accurate estimates of grade, moisture content, and mineralogy. These benefits allow for faster geological model updates, better-informed mine scheduling, and improved downstream blending and processing practices. The paper presents recommendations for implementation of the technology in open-cut mining environments.
Keywords: grade control, hyperspectral scanning, artificial intelligence, autonomous mining, machine learning
Procedia PDF Downloads 113
24033 Synthesis of 5-Substituted 1H-Tetrazoles in Deep Eutectic Solvent
Authors: Swapnil A. Padvi, Dipak S. Dalal
Abstract:
The chemistry of tetrazoles has grown tremendously in the past few years because tetrazoles are an important and useful class of heterocyclic compounds with widespread applications in medicinal chemistry as anticancer, antimicrobial, analgesic, antibacterial, antifungal, antihypertensive, and anti-allergic drugs. Furthermore, tetrazoles have applications in materials science as explosives, rocket propellants, and in information recording systems, as well as a wide range of applications in coordination chemistry as ligands. Deep eutectic solvents (DES) have emerged over the current decade as a novel class of green reaction media and have been applied in various fields of science because of their unique physical and chemical properties, similar to those of ionic liquids, such as low vapor pressure, non-volatility, high thermal stability, and recyclability. In addition, the components of DES are cheaply available, low in toxicity, and biodegradable, which makes them particularly suitable for effective large-scale industrial application. Herein we report that the [2+3] cycloaddition reaction of organic nitriles with sodium azide affords the corresponding 5-substituted 1H-tetrazoles in six different choline chloride-based deep eutectic solvents under mild reaction conditions. Choline chloride:ZnCl2 (1:2) showed the best results for the synthesis of 5-substituted 1H-tetrazoles. This method avoids disadvantages such as the use of toxic metals and expensive reagents, drastic reaction conditions, and the presence of dangerous hydrazoic acid. The environment-friendly approach, short reaction times, good to excellent yields, safe process, and simple workup make this method an attractive and useful contribution to the green organic synthesis of 5-substituted 1H-tetrazoles. All synthesized compounds were characterized by IR, 1H NMR, 13C NMR, and mass spectroscopy. The DES can be recovered and reused three times with very little loss in activity.
Keywords: click chemistry, choline chloride, green chemistry, deep eutectic solvent, tetrazoles
Procedia PDF Downloads 231
24032 Applied Complement of Probability and Information Entropy for Prediction in Student Learning
Authors: Kennedy Efosa Ehimwenma, Sujatha Krishnamoorthy, Safiya Al‑Sharji
Abstract:
The probability of an event lies in the interval [0, 1], with values determined by the number of outcomes of the event in a sample space S. The probability Pr(A) that an event A will never occur is 0, and the probability Pr(B) that an event B will certainly occur is 1; both A and B are then certainties. Furthermore, the sum of probabilities Pr(E₁) + Pr(E₂) + … + Pr(Eₙ) of a finite set of events in a given sample space S equals 1; conversely, the difference between two probabilities of events that will certainly occur is 0. This paper first discusses Bayes' rule, the complement of probability, and the difference of probabilities for occurrences of learning events before applying them to the prediction of learning objects in student learning. Given that probabilities sum to 1, to make a recommendation for student learning, this paper proposes that the difference between argMaxPr(S) and the probability of student performance quantifies the weight of learning objects for students. Using a skill-set dataset, the computational procedure demonstrates i) the probability of skill-set events that have occurred, which would lead to higher-level learning; ii) the probability of the events that have not occurred, which requires subject-matter relearning; iii) the accuracy of the decision tree in the prediction of student performance into class labels; and iv) information entropy about the skill-set data and its implication for student cognitive performance and the recommendation of learning.
Keywords: complement of probability, Bayes' rule, prediction, pre-assessments, computational education, information theory
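The two quantities the abstract combines, the complement of probability and information entropy, can be sketched in a few lines of Python. The skill names, probabilities, and weighting rule below are illustrative assumptions, not the paper's dataset or exact formula.

```python
import math

# Sketch of the complement of probability and Shannon entropy applied
# to a (made-up) skill-set distribution.

def complement(p):
    # Pr(not A) = 1 - Pr(A)
    return 1.0 - p

def entropy(probs):
    # Shannon entropy H = -sum(p * log2(p)) over nonzero probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative probabilities that each prerequisite skill event has occurred.
skills = {"algebra": 0.75, "logic": 0.25}

# Weight of a learning object for recommendation: the gap between
# certainty (probability 1) and the student's performance probability,
# in the spirit of the abstract's difference-of-probabilities idea.
weights = {s: complement(p) for s, p in skills.items()}
```

A skill with low occurrence probability gets a large weight (here "logic" at 0.75), flagging it for relearning, while the entropy of the skill-set distribution summarizes how uncertain the student's overall state is.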
Procedia PDF Downloads 161
24031 Clinicians’ Experiences with IT Systems in a UK District General Hospital: A Qualitative Analysis
Authors: Sunny Deo, Eve Barnes, Peter Arnold-Smith
Abstract:
Introduction: Healthcare technology is a rapidly expanding field, with enthusiasts suggesting a revolution in the quality and efficiency of healthcare delivery based on the utilisation of better e-healthcare, including the move to paperless healthcare. The role and use of computers and programmes in healthcare have been increasing over the past 50 years. Despite this, there is no standardised method of assessing the quality of the hardware and software utilised by frontline healthcare workers. Methods and subjects: Based on standard Patient Related Outcome Measures, a questionnaire was devised with the aim of providing quantitative and qualitative data on clinicians’ perspectives of their hospital’s Information Technology (IT). The survey was distributed via the institution’s intranet to all contracted doctors, and its qualitative results were analysed. Opinions were grouped as positive, neutral, or negative and further sub-grouped into speed/usability, software/hardware, integration, IT staffing, clinical risk, and wellbeing. Analysis was undertaken by doctor seniority and by specialty. Results: There were 196 responses, with 51% from senior doctors (consultant grades) and the rest from junior grades; the largest group of respondents (52%) came from medicine specialties. Differences in the proportions of the principal groups and sub-groups were noted by seniority and specialty. Negative themes were by far the commonest opinion type, occurring in almost two-thirds of responses (63%), while positive comments occurred in fewer than 1 in 10 (8%). Conclusions: This survey confirms strongly negative attitudes to the current state of electronic documentation and IT in a large single-centre cohort of hospital-based frontline physicians, after two decades of so-called progress towards a paperless healthcare system.
Greater use of such surveys would provide further insights and could help focus development and delivery to improve the quality and effectiveness of IT for clinicians and their patients.
Keywords: information technology, electronic patient records, digitisation, paperless healthcare
Procedia PDF Downloads 92
24030 Parametric Analysis and Optimal Design of Functionally Graded Plates Using Particle Swarm Optimization Algorithm and a Hybrid Meshless Method
Authors: Foad Nazari, Seyed Mahmood Hosseini, Mohammad Hossein Abolbashari, Mohammad Hassan Abolbashari
Abstract:
The present study is concerned with the optimal design of functionally graded (FG) plates using the particle swarm optimization (PSO) algorithm. The meshless local Petrov-Galerkin (MLPG) method is employed to obtain the FG plate’s natural frequencies. The effects of two parameters, the thickness-to-height ratio and the volume fraction index, on the natural frequencies and total mass of the plate are studied using the MLPG results. Then the first natural frequency of the plate, for conditions where MLPG data are not available, is predicted by an artificial neural network (ANN) trained with the back-error propagation (BEP) technique. The ANN results show that the predicted data are in good agreement with the actual data. To maximize the first natural frequency and minimize the mass of the FG plate simultaneously, the weighted-sum optimization approach and the PSO algorithm are used. The proposed optimization process can provide designers of FG plates with useful data.
Keywords: optimal design, natural frequency, FG plate, hybrid meshless method, MLPG method, ANN approach, particle swarm optimization
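The weighted-sum PSO step can be sketched independently of the MLPG/ANN models. The Python sketch below minimizes a stand-in quadratic surrogate for the two conflicting goals (maximize first natural frequency, minimize mass); the surrogate objective, swarm parameters, and bounds are illustrative assumptions, not the paper's model.

```python
import random

# Weighted-sum objective: combine "maximize frequency" and "minimize mass"
# into a single function to minimize. The terms below are placeholder
# surrogates with a known optimum at x = (1, 2), not the FG-plate model.
def weighted_objective(x, w=0.5):
    freq_term = -(x[0] - 1.0) ** 2   # surrogate frequency (to maximize)
    mass_term = (x[1] - 2.0) ** 2    # surrogate mass (to minimize)
    return -w * freq_term + (1 - w) * mass_term

def pso(f, dim=2, n=20, iters=200, lo=-5.0, hi=5.0, seed=0):
    """Minimal particle swarm optimizer: inertia 0.7, cognitive and
    social coefficients 1.5 (common textbook settings)."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [list(x) for x in xs]            # personal bests
    gbest = min(pbest, key=f)                # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.5 * r1 * (pbest[i][d] - xs[i][d])
                            + 1.5 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = list(xs[i])
        gbest = min(pbest, key=f)
    return gbest
```

In the study's setting, `weighted_objective` would wrap the trained ANN predictor of the first natural frequency and the plate mass model, with the design variables being the thickness-to-height ratio and volume fraction index.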
Procedia PDF Downloads 367
24029 Analysis of Ancient and Present Lightning Protection Systems of Large Heritage Stupas in Sri Lanka
Authors: J.R.S.S. Kumara, M.A.R.M. Fernando, S. Venkatesh, D.K. Jayaratne
Abstract:
Protection of heritage monuments against lightning has become extremely important as far as their historical value is concerned. When such structures are large and tall, the risk from lightning initiated both from cloud and from ground can be high. This paper presents a lightning risk analysis of three giant stupas of the Anuradhapura era (fourth century BC onwards) in Sri Lanka. The three stupas are Jethawaaramaya (269-296 AD), Abayagiriya (88-76 BC), and Ruwanweliseya (161-137 BC), the third, fifth, and seventh largest ancient structures in the world. These stupas are solid brick structures consisting of a base, a near-hemispherical dome, and a conical spire on top. The ancient construction, with a dielectric crystal on the top connected to the ground through a conducting material, was taken as the hypothesis for the original lightning protection technique. At present, however, all three stupas are protected with Franklin-rod-type air termination systems located on top of the spire. First, a risk analysis was carried out according to IEC 62305, considering the isokeraunic level of the area and the height of the stupas. Then the standard protective angle method and the rolling sphere method were used to locate the possible touching points on the surface of the stupas. The study was extended to estimate the critical current which could strike the unprotected areas of the stupas. The equations proposed by Uman (2001) and Cooray (2007) were used to find the striking distances. A modified version of the rolling sphere method was also applied to examine the effects of upward leaders. All these studies were carried out for two scenarios: with the original (i.e., ancient) lightning protection system and with the present (i.e., new) air termination system. The field distribution on the surface of the stupa in the presence of a downward leader was obtained using the finite-element-based commercial software COMSOL Multiphysics for further investigation of lightning risks.
The obtained results were analyzed and compared with each other to evaluate the performance of the ancient and new lightning protection methods and to identify suitable methods for designing lightning protection systems for stupas. According to IEC standards, all three stupas, with both the new and the ancient lightning protection systems, have Level IV protection as per the protection angle method; according to the rolling sphere method applied with Uman’s equation, however, the protection level is III, and the same method applied with Cooray’s equation always shows a higher risk than with Uman’s equation. It was found that there is a risk of lightning strikes on the dome and square chamber of the stupa, and the corresponding critical current values differed with respect to the equations used in the rolling sphere method and the modified rolling sphere method.
Keywords: stupa, heritage, lightning protection, rolling sphere method, protection level
Procedia PDF Downloads 252
24028 Mathematics Anxiety and Attitude among Nigerian University Library and Information Science Undergraduate Students
Authors: Fredrick Olatunji Ajegbomogun, Clement Ola Adekoya
Abstract:
Mathematics has, for ages, been an essential subject in education curricula across the globe. The word mathematics scares the majority of undergraduate students, and even more so library and information science (LIS) students who have not seen the pertinence of the subject to their academic pursuits. This study investigated mathematics anxiety and attitudes among LIS undergraduate students in Nigerian universities. The study adopted a descriptive survey research design, with multi-stage and convenience sampling techniques. Data were collected using a questionnaire and analyzed using descriptive statistical tools. It was found that mathematics is important in LIS education, yet the students displayed a high level of anxiety toward mathematics and a negative attitude toward it. However, the hypotheses tested revealed that while the female LIS undergraduate students displayed low levels of anxiety and a positive attitude toward mathematics, the male undergraduate students' anxiety was high and their attitude toward mathematics was negative. It was recommended that LIS undergraduate students develop a positive attitude towards mathematics and appreciate that the paradigm shift in the practice of librarianship is towards mathematics as a means of developing technological tools (hardware and software) to facilitate the effective delivery of library services.
Keywords: anxiety, attitude, library and information science, mathematics anxiety, undergraduate students, Nigerian universities
Procedia PDF Downloads 157
24027 Improving Detection of Illegitimate Scores and Assessment in Most Advantageous Tenders
Authors: Hao-Hsi Tseng, Hsin-Yun Lee
Abstract:
The Most Advantageous Tender (MAT) has been criticized for its susceptibility to dictatorial situations and for its handling of same-score, same-rank issues. This study applies the four criteria from Arrow's Impossibility Theorem to construct a mechanism for revealing illegitimate scores in scoring methods. While ranking methods are commonly used to mitigate problems resulting from extreme scores, they hide significant defects that adversely affect selection fairness. To address these shortcomings, this study relies mainly on the overall evaluated score method, using standardized scores plus a normal cumulative distribution function conversion to calculate the evaluation of vendor preference. This allows for free score evaluations, which reduces the influence of dictatorial behavior and avoids same-score, same-rank issues. Large-scale simulations confirm that this method outperforms currently used methods when evaluated against the criteria of the Impossibility Theorem. Keywords: Arrow's impossibility theorem, cumulative normal distribution function, most advantageous tender, scoring method
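The standardized-score-plus-normal-CDF conversion the abstract describes can be sketched as follows (a minimal illustration of the score transformation only; the full MAT mechanism with aggregation across evaluators and illegitimate-score detection is not shown):

```python
import math

def standardize(scores):
    """Convert raw evaluator scores to z-scores (population std. dev.)."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / n)
    return [(s - mean) / sd for s in scores]

def normal_cdf(z):
    """Standard normal cumulative distribution function via erf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def overall_evaluated_scores(raw_scores):
    """Map each vendor's standardized score through Phi(z) to a (0, 1)
    preference value; extreme (possibly dictatorial) scores are squashed
    toward the tails rather than dominating the evaluation."""
    return [normal_cdf(z) for z in standardize(raw_scores)]

# Four vendors scored by one evaluator; the 98 is an extreme score.
prefs = overall_evaluated_scores([60.0, 75.0, 75.0, 98.0])
```

The transformation is monotone, so the ordering of vendors is preserved while the influence of an extreme score on the aggregate is bounded.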
Procedia PDF Downloads 464
24026 Study of Electron Cyclotron Resonance Acceleration by Cylindrical TE₀₁₁ Mode
Authors: Oswaldo Otero, Eduardo A. Orozco, Ana M. Herrera
Abstract:
In this work, we present results from analytical and numerical studies of electron acceleration by a TE₀₁₁ cylindrical microwave mode in a static homogeneous magnetic field under the electron cyclotron resonance (ECR) condition. The stability of the orbits is analyzed using particle orbit theory. In order to better understand the wave-particle interaction, we decompose the azimuthal electric field component into a superposition of right- and left-hand circularly polarized standing waves. The trajectory, energy, and phase shift of the electron are found through a numerical solution of the relativistic Newton-Lorentz equation, discretized by finite differences using the Boris method. It is shown that an electron longitudinally injected with an energy of 7 keV at a radial position r = Rc/2, where Rc is the cavity radius, is accelerated up to an energy of 90 keV by an electric field strength of 14 kV/cm at a frequency of 2.45 GHz. This energy can be used to produce X-rays for medical imaging. These results can serve as a starting point for studying the acceleration of electrons in a magnetic field changing slowly in time (GYRAC), which has important applications such as the electron cyclotron resonance ion proton accelerator (ECR-IPAC) for cancer therapy and the control of plasma bunches with relativistic electrons. Keywords: Boris method, electron cyclotron resonance, finite difference method, particle orbit theory, X-ray
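The Boris method named above advances the velocity with a half electric kick, a rotation about the magnetic field, and a second half kick. A minimal nonrelativistic sketch is below (the relativistic version used in the paper additionally divides the rotation vector by the Lorentz factor γ; field values here are illustrative):

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def boris_step(v, E, B, q_over_m, dt):
    """One Boris step: half electric kick, exact-norm magnetic rotation,
    half electric kick. Nonrelativistic for brevity."""
    half = 0.5 * dt * q_over_m
    v_minus = tuple(vi + half * Ei for vi, Ei in zip(v, E))
    t = tuple(half * Bi for Bi in B)
    t2 = sum(ti * ti for ti in t)
    v_prime = tuple(vm + c for vm, c in zip(v_minus, cross(v_minus, t)))
    s = tuple(2.0 * ti / (1.0 + t2) for ti in t)
    v_plus = tuple(vm + c for vm, c in zip(v_minus, cross(v_prime, s)))
    return tuple(vp + half * Ei for vp, Ei in zip(v_plus, E))

# Pure gyration (E = 0) in a field whose cyclotron frequency is ~2.45 GHz:
# the Boris rotation conserves kinetic energy exactly.
v = (1.0e6, 0.0, 0.0)                       # m/s
for _ in range(1000):
    v = boris_step(v, (0.0, 0.0, 0.0), (0.0, 0.0, 0.0875), -1.76e11, 1e-12)
speed = math.sqrt(sum(vi * vi for vi in v))
```

Because the magnetic update is a pure rotation, the particle's speed is preserved to machine precision, which is why the Boris scheme is the standard pusher for long cyclotron-resonance simulations.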
Procedia PDF Downloads 159
24025 Development of Building Information Modeling for Cultural Heritage: The Case of West Theater in Gadara (Umm Qais), Jordan
Authors: Amal Alatar
Abstract:
The architectural legacy is a significant factor that has left its mark on the shape of buildings and on historical and archaeological sites all over the world. In this framework, this paper focuses on the town of Umm Qais in northern Jordan, which includes the archaeological remains of the ancient Decapolis city of Gadara and still bears witness to the originality and architectural identity of the city. 3D modeling is a public asset and a valuable resource for cultural heritage: the technique makes it possible to produce accurate representations of objects, structures, and surfaces, and such representations become valuable assets for cultural heritage work. Heritage Building Information Modeling (HBIM) is an effective tool for representing information on cultural heritage (CH) that can be used for documentation, restoration, conservation, presentation, and research purposes. This paper therefore focuses on the interdisciplinary project of virtualizing the West Theater in Gadara (Umm Qais) for 3D documentation and structural studies. The derived 3D model of the cultural heritage is the basis for further archaeological studies; the challenges of the work lie in the acquisition, processing, and integration of multi-resolution data, as well as in their interactive visualization. Keywords: archaeology, 3D modeling, Umm Qais, cultural heritage, Jordan
Procedia PDF Downloads 101
24024 The Analysis of Own Signals of PM Electrical Machines – Example of Eccentricity
Authors: Marcin Baranski
Abstract:
This article presents a vibration diagnostic method designed for permanent magnet (PM) traction motors, machines commonly used in the traction drives of electric vehicles. The method exploits a specific structural property of machines excited by permanent magnets: the electromotive force (EMF) generated due to vibrations. This work presents a field-circuit model, results of static tests, and results of calculations and simulations. Keywords: electrical vehicle, permanent magnet, traction drive, vibrations, electrical machine, eccentricity
Procedia PDF Downloads 628
24023 Modern Proteomics and the Application of Machine Learning Analyses in Proteomic Studies of Chronic Kidney Disease of Unknown Etiology
Authors: Dulanjali Ranasinghe, Isuru Supasan, Kaushalya Premachandra, Ranjan Dissanayake, Ajith Rajapaksha, Eustace Fernando
Abstract:
Proteomics studies of organisms are considered significantly more information-rich than their genomic counterparts because the proteome represents the expressed state of all proteins of an organism at a given time. In modern top-down and bottom-up proteomics workflows, the primary analysis methods are gel-based techniques such as two-dimensional (2D) electrophoresis and mass spectrometry based methods. Machine learning (ML) and artificial intelligence (AI) have been used increasingly in modern biological data analyses. In particular, the fields of genomics, DNA sequencing, and bioinformatics have seen an incremental trend in the usage of ML and AI techniques in recent years. The use of these techniques in proteomics studies is only now beginning to materialise. Although there is a wealth of information in the scientific literature pertaining to proteomics workflows, no comprehensive review addresses the combined use of proteomics and machine learning. The objective of this review is to provide a comprehensive outlook on the application of machine learning to known proteomics workflows in order to extract more meaningful information that could be useful in a plethora of applications such as medicine, agriculture, and biotechnology. Keywords: proteomics, machine learning, gel-based proteomics, mass spectrometry
Procedia PDF Downloads 151
24022 The Impact of Motivation on Employee Performance in South Korea
Authors: Atabong Awung Lekeazem
Abstract:
The purpose of this paper is to identify the impact of incentives on employee performance, with particular emphasis on Korean workers. The process involves defining and explaining the different types of motivation and, in doing so, bringing out the difference between the two major types. The second phase of the paper involves gathering data from a sample population and then analyzing it. The analysis reveals the broadly similar value that Korean workers attach to motivation, with a slightly different view coming only from top management personnel. The last phase presents the data and draws conclusions about how managers and potential managers can bring out the best in their employees. Keywords: motivation, employee performance, Korean workers, business information systems
Procedia PDF Downloads 414
24021 Modified Fuzzy Delphi Method to Incorporate Healthcare Stakeholders' Perspectives in Selecting Quality Improvement Projects' Criteria
Authors: Alia Aldarmaki, Ahmad Elshennawy
Abstract:
There is a global shift in healthcare systems toward engaging different stakeholders in selecting quality improvement initiatives and incorporating their preferences to improve healthcare efficiency and outcomes. Although experts bring scientific knowledge based on the scientific model and their personal experience, other stakeholders can bring new insights and information into the decision-making process. This study explores the impact of incorporating different stakeholders' preferences when identifying the most significant criteria for selecting improvement projects in healthcare. A framework based on a modified Fuzzy Delphi Method (FDM) was built. In addition to subject matter experts, groups of doctors/physicians, nurses, administrators, and managers contribute to the selection process. The research identifies potential criteria for evaluating projects in healthcare, then utilizes FDM to capture expert knowledge. The first round of FDM is intended to validate the identified list of criteria with experts, which includes collecting additional criteria that the literature might have overlooked. When an acceptable level of consensus has been reached, a second round is conducted to obtain experts' and other related stakeholders' opinions on the appropriate weight of each criterion's importance using linguistic variables. The FDM analysis then eliminates or retains criteria to produce a final list of the critical criteria for selecting improvement projects in healthcare. Finally, reliability and validity were investigated using Cronbach's alpha and factor analysis, respectively. Two case studies were carried out in a public hospital in the United Arab Emirates to test the framework.
Both cases demonstrate that even though there were common criteria between the experts and the other stakeholders, the stakeholders' perceptions bring additional critical criteria into the evaluation process, which can affect the outcomes. Experts selected criteria related to strategic and managerial aspects, while the other participants preferred criteria related to social aspects such as health and safety and patient satisfaction. The health and safety criterion had the highest importance weight in both cases. The analysis showed that the Cronbach's alpha value is 0.977 and all criteria have factor loadings greater than 0.3. In conclusion, the inclusion of stakeholders' perspectives is intended to enhance stakeholder engagement, improve transparency throughout the decision process, and support robust decisions. Keywords: Fuzzy Delphi Method, fuzzy number, healthcare, stakeholders
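The FDM screening step can be sketched with triangular fuzzy numbers: linguistic ratings are mapped to fuzzy triples, aggregated across respondents, defuzzified, and compared against a consensus threshold. The 3-point scale, aggregation rule, and 0.7 threshold below are illustrative assumptions, not the study's exact parameters:

```python
# Triangular fuzzy numbers (l, m, u) for a 3-point linguistic scale (assumed).
SCALE = {"low": (0.0, 0.25, 0.5),
         "medium": (0.25, 0.5, 0.75),
         "high": (0.5, 0.75, 1.0)}

def aggregate(responses):
    """Aggregate ratings into one triangular fuzzy number:
    (min of lowers, geometric mean of modes, max of uppers)."""
    fuzzy = [SCALE[r] for r in responses]
    l = min(f[0] for f in fuzzy)
    m = 1.0
    for f in fuzzy:
        m *= f[1]
    m **= 1.0 / len(fuzzy)
    u = max(f[2] for f in fuzzy)
    return (l, m, u)

def defuzzify(tfn):
    """Simple centre-of-gravity defuzzification of a triangular number."""
    return sum(tfn) / 3.0

def screen(criteria_ratings, threshold=0.7):
    """Retain criteria whose defuzzified consensus value meets the threshold."""
    return {c: defuzzify(aggregate(r))
            for c, r in criteria_ratings.items()
            if defuzzify(aggregate(r)) >= threshold}

# Hypothetical ratings from three respondents per criterion.
ratings = {"health_and_safety": ["high", "high", "high"],
           "implementation_cost": ["low", "medium", "low"]}
kept = screen(ratings)
```

Here a strongly rated criterion such as health and safety survives the screen, while a weakly rated one is eliminated, mirroring the retain/eliminate step described above.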
Procedia PDF Downloads 128
24020 A Decision Tree Approach to Estimate Permanent Residents Using Remote Sensing Data in Lebanese Municipalities
Authors: K. Allaw, J. Adjizian Gerard, M. Chehayeb, A. Raad, W. Fahs, A. Badran, A. Fakherdin, H. Madi, N. Badaro Saliba
Abstract:
Population estimation using Geographic Information Systems (GIS) and remote sensing faces many obstacles, such as determining permanent residents. A permanent resident is an individual who stays and works in his village during all four seasons; all those who move to other cities or villages are excluded from this category. The aim of this study is to identify the factors affecting the percentage of permanent residents in a village and to determine the weight attributed to each factor. To do so, six factors were chosen (slope, precipitation, temperature, number of services, travel time to the Central Business District (CBD), and proximity to conflict zones), and each factor was evaluated using one of the following data sources: a 50 m contour line map, a precipitation map, four temperature maps, and data collected through surveys. The weighting procedure was carried out using the decision tree method. As a result of this procedure, temperature (50.8%) and precipitation (46.5%) are the most influential factors. Keywords: remote sensing, GIS, permanent residence, decision tree, Lebanon
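Decision-tree weighting of this kind typically rests on how much each factor reduces label entropy. A minimal information-gain sketch on toy data (hypothetical factors and values, not the study's dataset) shows how normalized gains become factor weights:

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    h = 0.0
    for v in set(labels):
        p = labels.count(v) / n
        h -= p * math.log2(p)
    return h

def information_gain(feature, labels):
    """Entropy reduction from splitting the labels on a categorical feature."""
    n = len(labels)
    gain = entropy(labels)
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Toy data: is a household a permanent resident (1) or not (0)?
permanent = [1, 1, 0, 0, 1, 0]
factors = {
    "services": ["many", "many", "few", "few", "many", "few"],  # perfect predictor
    "slope":    ["low", "high", "high", "low", "low", "high"],  # weak predictor
}
gains = {f: information_gain(x, permanent) for f, x in factors.items()}
total = sum(gains.values())
weights = {f: g / total for f, g in gains.items()}  # normalized factor weights
```

Normalizing the gains yields percentage-style weights analogous to the 50.8% and 46.5% reported for temperature and precipitation.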
Procedia PDF Downloads 133
24019 Integrating Artificial Neural Network and Taguchi Method on Constructing the Real Estate Appraisal Model
Authors: Mu-Yen Chen, Min-Hsuan Fan, Chia-Chen Chen, Siang-Yu Jhong
Abstract:
In recent years, real estate prediction and valuation have been topics of discussion in many developed countries. Improper hype created by investors leads to fluctuating real estate prices, affecting many consumers purchasing their own homes. Therefore, scholars from various countries have conducted research on real estate valuation and prediction. Combining the back-propagation neural network, which has been popular in recent years, with the orthogonal array of the Taguchi method, this study aimed to find the optimal parameter combination across the levels of the orthogonal array, so that the artificial neural network obtained the most accurate results. The experimental results demonstrated that the method presented in this study outperformed traditional machine learning. They also showed that the proposed model had the best predictive effect and could significantly reduce simulation time: the best predictive results could be found more efficiently with fewer experiments. Thus, users could predict a real estate transaction price that is not far from current actual prices. Keywords: artificial neural network, Taguchi method, real estate valuation model, investors
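The Taguchi idea of testing only a balanced subset of parameter combinations can be sketched with an L4(2³) orthogonal array: four runs cover three two-level factors instead of the full 2³ = 8 grid. The hyperparameter names, level values, and the stand-in error function below are illustrative assumptions, not the study's settings:

```python
# L4(2^3) orthogonal array: each column is balanced across the 4 runs.
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]

# Hypothetical network hyperparameters at each level (illustrative only).
LEVELS = {
    "learning_rate": {1: 0.01, 2: 0.1},
    "hidden_units":  {1: 8,    2: 32},
    "momentum":      {1: 0.5,  2: 0.9},
}

def run_to_params(run):
    names = list(LEVELS)
    return {names[i]: LEVELS[names[i]][run[i]] for i in range(len(names))}

def validation_error(params):
    """Stand-in for training the back-propagation network and measuring
    validation error; a real study would train the net here."""
    return (abs(params["learning_rate"] - 0.01) * 5
            + abs(params["hidden_units"] - 32) / 100
            + abs(params["momentum"] - 0.9))

# Evaluate only the 4 orthogonal-array runs and keep the best.
results = [(validation_error(run_to_params(r)), r) for r in L4]
best_error, best_run = min(results)
```

Half the experiments of a full grid still identify the best of the tested combinations, which is the efficiency gain the abstract reports.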
Procedia PDF Downloads 489
24018 Efficient Principal Components Estimation of Large Factor Models
Authors: Rachida Ouysse
Abstract:
This paper proposes a constrained principal components (CnPC) estimator for efficient estimation of large-dimensional factor models when errors are cross-sectionally correlated and the number of cross-sections (N) may be larger than the number of observations (T). Although the principal components (PC) method is consistent for any path of the panel dimensions, it is inefficient because the errors are treated as homoskedastic and uncorrelated. The new CnPC exploits the assumption of bounded cross-sectional dependence, which defines Chamberlain and Rothschild's (1983) approximate factor structure, as an explicit constraint and solves a constrained PC problem. The CnPC method is computationally equivalent to the PC method applied to a regularized form of the data covariance matrix. Unlike maximum likelihood type methods, the CnPC method does not require inverting a large covariance matrix and thus is valid for panels with N ≥ T. The paper derives a convergence rate and an asymptotic normality result for the CnPC estimators of the common factors. We provide feasible estimators and show in a simulation study that they are more accurate than the PC estimator, especially for panels with N larger than T, and than generalized PC type estimators, especially for panels with N almost as large as T. Keywords: high dimensionality, unknown factors, principal components, cross-sectional correlation, shrinkage regression, regularization, pseudo-out-of-sample forecasting
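The core PC step — extracting the leading eigenvector of a covariance matrix — can be sketched with power iteration (CnPC would apply the same step to a regularized covariance matrix; that regularization is not shown here):

```python
import math

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def norm(x):
    return math.sqrt(sum(xi * xi for xi in x))

def first_principal_component(cov, iters=200):
    """Power iteration: repeatedly multiply by the covariance matrix and
    renormalize; converges to the leading eigenvector (the first PC)
    and its eigenvalue. Avoids inverting the matrix entirely."""
    x = [1.0] + [0.0] * (len(cov) - 1)
    for _ in range(iters):
        y = matvec(cov, x)
        x = [yi / norm(y) for yi in y]
    eigenvalue = norm(matvec(cov, x))
    return eigenvalue, x

# Toy 2x2 covariance: leading eigenvalue 3, eigenvector (1, 1)/sqrt(2).
lam, vec = first_principal_component([[2.0, 1.0], [1.0, 2.0]])
```

Because only matrix-vector products are needed, the approach remains usable when N ≥ T and the covariance matrix is singular, which is the regime the abstract emphasizes.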
Procedia PDF Downloads 150
24017 Sparse Signal Restoration Algorithm Based on Piecewise Adaptive Backtracking Orthogonal Least Squares
Authors: Linyu Wang, Jiahui Ma, Jianhong Xiang, Hanyu Jiang
Abstract:
Traditional greedy compressed sensing algorithms need to know the signal sparsity when recovering the signal, but in practical applications the sparsity cannot be obtained as a priori information, and recovery accuracy is low, which does not meet practical needs. To solve this problem, this paper puts forward a piecewise adaptive backtracking orthogonal least squares algorithm. The algorithm is divided into two stages. In the first stage, a sparsity pre-estimation strategy is adopted, which can quickly approach the real sparsity and reduce time consumption. In the second-stage iteration, a correction strategy and an adaptive step size are used to accurately estimate the sparsity, and the backtracking idea is introduced to improve the accuracy of signal recovery. Experimental simulations show that the algorithm can accurately recover the estimated signal with fewer iterations when the sparsity is unknown. Keywords: compressed sensing, greedy algorithm, least square method, adaptive reconstruction
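The baseline idea — greedy atom selection with a residual-norm stopping rule instead of a known sparsity — can be sketched for an orthonormal dictionary, where each coefficient is just an inner product (this is not the paper's two-stage algorithm; general dictionaries need a least-squares refit at every step, and the pre-estimation, correction, and backtracking stages are omitted):

```python
import math

def omp_orthonormal(y, atoms, tol=1e-6, max_iter=None):
    """Greedy recovery against an orthonormal dictionary: at each step pick
    the atom most correlated with the residual, record its coefficient, and
    stop when the residual norm falls below tol, so the sparsity does not
    need to be known in advance."""
    residual = list(y)
    coeffs = {}
    max_iter = max_iter or len(atoms)
    for _ in range(max_iter):
        if math.sqrt(sum(r * r for r in residual)) < tol:
            break
        corr = [sum(r * a for r, a in zip(residual, atom)) for atom in atoms]
        j = max(range(len(atoms)), key=lambda k: abs(corr[k]))
        coeffs[j] = coeffs.get(j, 0.0) + corr[j]
        residual = [r - corr[j] * a for r, a in zip(residual, atoms[j])]
    return coeffs

# 4-dimensional identity dictionary, 2-sparse signal y = 3*e0 - 2*e2.
atoms = [[1.0 if i == j else 0.0 for i in range(4)] for j in range(4)]
recovered = omp_orthonormal([3.0, 0.0, -2.0, 0.0], atoms)
```

The stopping rule terminates after exactly two selections here, recovering the true support and coefficients without being told the sparsity was 2.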
Procedia PDF Downloads 148
24016 The Economic Valuation of Public Support Ecosystem: A Contingent Valuation Study in Setiu Wetland, Terengganu Malaysia
Authors: Elmira Shamshity
Abstract:
This study explores an economic approach to valuing the Setiu wetland as a future protection strategy. A questionnaire survey based on the single-bounded dichotomous-choice contingent valuation method was used to elicit individuals' willingness to pay (WTP) for the conservation of the Setiu wetland. The study was located in Terengganu province, Malaysia. The results of the random questionnaire survey showed that protection of the Setiu ecosystem is important to the indigenous community. The mean WTP for protection of the Setiu wetland ecosystem was 12.985 Ringgit per month per household for 10 years. There was significant variation in the stated WTP amounts based on the respondents' knowledge, household income, educational level, and the bid amounts. The findings of this study may help improve understanding of indigenous people's WTP for wetland protection and provide useful information for policy makers designing an effective ecosystem protection program. Keywords: willingness to pay, ecosystem, Setiu wetland, Terengganu Malaysia
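In a single-bounded dichotomous-choice design, each respondent answers yes/no to one bid, acceptance is modeled with a logit in the bid, and mean WTP follows from the fitted coefficients. The coefficients below are hypothetical, chosen only so the implied mean matches the 12.985 Ringgit reported above; they are not the study's estimates:

```python
import math

def prob_yes(bid, a, b):
    """Logit acceptance probability for a single-bounded dichotomous-choice
    question: P(yes | bid) = 1 / (1 + exp(-(a - b * bid)))."""
    return 1.0 / (1.0 + math.exp(-(a - b * bid)))

def mean_wtp(a, b):
    """For the linear logit model, mean (= median) WTP equals a / b."""
    return a / b

# Hypothetical fitted coefficients (illustrative, not the study's output).
a, b = 1.2985, 0.1
wtp = mean_wtp(a, b)        # implied mean WTP per month per household
p10 = prob_yes(10.0, a, b)  # predicted acceptance probability at a 10-Ringgit bid
```

At the mean WTP the model predicts a 50% acceptance rate, which is the defining property of the median in this symmetric specification.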
Procedia PDF Downloads 605
24015 Use of In-line Data Analytics and Empirical Model for Early Fault Detection
Authors: Hyun-Woo Cho
Abstract:
Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in industrial processes, including multivariate statistical methods, representations in reduced spaces, kernel-based nonlinear techniques, etc. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added in order to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data. The results showed that monitoring performance improved significantly in terms of the detection success rate for process faults. Keywords: batch process, monitoring, measurement, kernel method
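The filter-then-alarm pattern the abstract describes can be sketched with an exponentially weighted moving average and a control band estimated from normal operation data (a generic illustration only; the paper's kernel-based nonlinear scheme is not reproduced here):

```python
def ewma(signal, alpha=0.3):
    """Exponentially weighted moving average as a simple noise filter."""
    out, s = [], signal[0]
    for x in signal:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out

def detect_fault(signal, mean, sd, k=3.0, alpha=0.3):
    """Return the index of the first filtered sample outside the
    mean +/- k*sd band estimated from normal operation data,
    or None if the batch never leaves the band."""
    for i, s in enumerate(ewma(signal, alpha)):
        if abs(s - mean) > k * sd:
            return i
    return None

# Normal noise around 0 (sd ~ 0.1), then a sustained shift to ~1.0 at index 5.
batch = [0.05, -0.08, 0.02, -0.04, 0.07, 1.0, 1.1, 0.95, 1.05, 1.0]
alarm_at = detect_fault(batch, mean=0.0, sd=0.1, k=3.0)
```

Filtering first suppresses isolated noise spikes that would otherwise trip the limit, which is the mechanism behind the improved detection success rate reported above.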
Procedia PDF Downloads 323
24014 A Survey of Semantic Integration Approaches in Bioinformatics
Authors: Chaimaa Messaoudi, Rachida Fissoune, Hassan Badir
Abstract:
Technological advances in computer science and data analysis are helping to provide continuously huge volumes of biological data, which are available on the web. Such advances involve and require powerful techniques for data integration to extract pertinent knowledge and information for a specific question. Biomedical exploration of these big data often requires the use of complex queries across multiple autonomous, heterogeneous, and distributed data sources. Semantic integration is an active area of research in several disciplines, such as databases, information integration, and ontology. We provide a survey of approaches and techniques for integrating biological data, focusing on those developed in the ontology community. Keywords: biological ontology, linked data, semantic data integration, semantic web
Procedia PDF Downloads 449