Search results for: computerized scoring system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 155

65 Computerized Analysis of Phonological Structure of 10,400 Brazilian Sign Language Signs

Authors: Wanessa G. Oliveira, Fernando C. Capovilla

Abstract:

Capovilla and Raphael’s Libras Dictionary documents a corpus of 4,200 Brazilian Sign Language (Libras) signs. Duduchi and Capovilla’s software SignTracking permits users to retrieve signs even when the corresponding gloss is unknown, and to discover the meaning of any of the 4,200 signs simply by clicking on graphic menus of the sign characteristics (phonemes). Duduchi and Capovilla discovered that the ease with which any given sign can be retrieved is an inverse function of the average popularity of its component phonemes. Thus, signs composed of rare (distinct) phonemes are easier to retrieve than those composed of common phonemes. SignTracking offers a means of computing the average popularity of the phonemes that make up each of the 4,200 signs. It provides a precise measure of the degree of ease with which signs can be retrieved and sign meanings can be discovered. Duduchi and Capovilla’s logarithmic model proved valid: the ease with which any given sign can be retrieved is an inverse function of the arithmetic mean of the logarithm of the popularity of each component phoneme. Capovilla, Raphael and Mauricio’s New Libras Dictionary documents a corpus of 10,400 Libras signs. The present analysis revealed the Libras DNA structure by mapping the incidence of 501 sign phonemes resulting from the layered distribution of five parameters: 163 handshape phonemes (CherEmes-ManusIculi); 34 finger shape phonemes (DactilEmes-DigitumIculi); 55 hand placement phonemes (ArtrotoToposEmes-ArticulatiLocusIculi); 173 movement dimension phonemes (CinesEmes-MotusIculi) pertaining to direction, frequency, and type; and 76 facial expression phonemes (MascarEmes-PersonalIculi).
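The logarithmic retrieval model lends itself to a direct computation. Below is a minimal sketch, assuming invented phoneme popularity counts (the dictionary here is not the actual SignTracking data): a sign's retrieval difficulty is taken as the arithmetic mean of the log-popularity of its component phonemes, so signs built from rarer phonemes score lower and are easier to retrieve.

```python
# Sketch of the logarithmic retrieval-ease model; popularity counts are invented.
import math

def retrieval_difficulty(sign_phonemes, phoneme_popularity):
    """Arithmetic mean of the log-popularity of a sign's component phonemes.

    Under the model, retrieval ease is an inverse function of this value:
    signs built from rare (low-popularity) phonemes score lower and are
    easier to retrieve via the graphic phoneme menus.
    """
    logs = [math.log(phoneme_popularity[p]) for p in sign_phonemes]
    return sum(logs) / len(logs)

# Toy popularity counts: how many signs in the corpus use each phoneme.
popularity = {"flat_hand": 1800, "index_extended": 950, "chin_location": 120,
              "circular_motion": 400, "rare_handshape": 15}

common_sign = ["flat_hand", "index_extended", "circular_motion"]
distinct_sign = ["rare_handshape", "chin_location", "circular_motion"]

print(retrieval_difficulty(common_sign, popularity))    # higher -> harder to retrieve
print(retrieval_difficulty(distinct_sign, popularity))  # lower  -> easier to retrieve
```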

Keywords: Brazilian sign language, lexical retrieval, libras sign, sign phonology

Procedia PDF Downloads 308
64 Reversible Information Hiding in Encrypted JPEG Bitstream by LSB Based on Inherent Algorithm

Authors: Vaibhav Barve

Abstract:

Reversible information hiding has drawn a lot of interest recently. Being reversible, the original digital data can be restored completely. It is a scheme in which secret data is stored in digital media such as images, video, or audio, to prevent unauthorized access and for security purposes. Generally, a JPEG bitstream is used to store this key data: the JPEG bitstream is first encrypted into a well-organized structure, and the secret information is then embedded into this encrypted region by slightly modifying the bitstream. Pixels suitable for data embedding are computed, and the key details are embedded accordingly. In our proposed framework, we use the RC4 algorithm for encrypting the JPEG bitstream. The encryption key is supplied by the system user and is also used at the time of decryption. We implement enhanced least-significant-bit (LSB) substitution steganography using a genetic algorithm. The number of bits that can be embedded in a given coefficient is adaptive; by choosing proper parameters, high capacity can be achieved while ensuring high security. A logistic map is used for shuffling the bits, and a genetic algorithm (GA) is used to find the right parameters for the logistic map. A data-embedding key is used at the time of data embedding. Using the correct image-encryption and data-embedding keys, the receiver can easily extract the embedded secure data and completely recover both the original image and the original secret information. When the embedding key is absent, the original image can still be recovered approximately, with sufficient quality, without access to the embedded data.
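Two of the building blocks named above, the logistic-map shuffle and LSB embedding, can be illustrated compactly. The sketch below uses assumed parameter values (r, the seed, the toy coefficient list) rather than the paper's GA-optimized ones, and omits real JPEG bitstream parsing and the RC4 encryption step.

```python
# Minimal sketch: logistic-map bit shuffling plus LSB embedding into integer
# coefficients. All parameter values and data here are illustrative assumptions.
def logistic_permutation(n, r=3.99, x=0.5431):
    """Order indices 0..n-1 by iterates of the logistic map x <- r*x*(1-x)."""
    xs = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return sorted(range(n), key=lambda i: xs[i])

def embed_lsb(coeffs, secret_bits, perm):
    """Replace the LSB at shuffled coefficient positions with secret bits."""
    out = list(coeffs)
    for bit, pos in zip(secret_bits, perm):
        out[pos] = (out[pos] & ~1) | bit
    return out

def extract_lsb(coeffs, n_bits, perm):
    return [coeffs[pos] & 1 for pos in perm[:n_bits]]

coeffs = [52, 61, 38, 47, 90, 73, 14, 25]   # stand-in for usable coefficients
secret = [1, 0, 1, 1]
perm = logistic_permutation(len(coeffs))
stego = embed_lsb(coeffs, secret, perm)
assert extract_lsb(stego, len(secret), perm) == secret  # recoverable with the key
```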

Keywords: data embedding, decryption, encryption, reversible data hiding, steganography

Procedia PDF Downloads 264
63 Overview of Pre-Analytical Lab Errors in a Tertiary Care Hospital at Rawalpindi, Pakistan

Authors: S. Saeed, T. Butt, M. Rehan, S. Khaliq

Abstract:

Objective: To determine the frequency of pre-analytical errors in samples taken from patients for various lab tests at Fauji Foundation Hospital, Rawalpindi. Material and Methods: All lab specimens received for diagnostic purposes from indoor and outdoor patients of Fauji Foundation Hospital, Rawalpindi were included. The total number of samples received in the lab is recorded in the computerized program made for the hospital. All errors observed in the pre-analytical process, including patient identification, sampling techniques, test collection procedures, and specimen transport/processing and storage, were recorded in the log book kept for the purpose. Results: A total of 476,616 specimens were received in the lab during the period of study, including 237,931 from outdoor and 238,685 from indoor patients. Forty-one percent of the samples (n=197,976) revealed pre-analytical discrepancies. The discrepancies included hemolyzed samples (34.8%), clotted blood (27.8%), incorrect samples (17.4%), unlabeled samples (8.9%), insufficient specimens (3.9%), request forms without authorized signature (2.9%), empty containers (3.9%), and tube breakage during centrifugation (0.8%). Most of these pre-analytical discrepancies were observed in samples received from the wards, indicating inappropriate sample collection by the medical staff of the wards; most outdoor samples, in contrast, are collected by lab staff who are properly trained in sample collection. Conclusion: It is mandatory to educate phlebotomists and paramedical staff, particularly those performing duties in the wards, regarding the timing and techniques of sampling, the appropriate container to use, and early delivery of samples to the lab, to reduce pre-analytical errors.

Keywords: pre analytical lab errors, tertiary care hospital, hemolyzed, paramedical staff

Procedia PDF Downloads 183
62 Impact of Climate Change on Sea Level Rise along the Coastline of Mumbai City, India

Authors: Chakraborty Sudipta, A. R. Kambekar, Sarma Arnab

Abstract:

Sea-level rise is one of the most important impacts of anthropogenically induced climate change, resulting from global warming and the melting of icebergs at the Arctic and Antarctic. The investigations carried out by various researchers, both on the Indian coast and elsewhere, during the last decade are reviewed in this paper. The paper aims to ascertain the consistency of different suggested methods for predicting near-accurate future sea level rise along the coast of Mumbai. Case studies on the east coast, the southern tip, and the west and south-west coasts of India are reviewed. The Coastal Vulnerability Index of several important international locations is compared and found to match Intergovernmental Panel on Climate Change forecasts. The application of Geographic Information System mapping and remote sensing technology, with both Multi Spectral Scanner and Thematic Mapper data from Landsat classified through the Iterative Self-Organizing Data Analysis Technique, for arriving at high, moderate, and low Coastal Vulnerability Index values at various important coastal cities, is examined. In place of purely data-driven, hindcast-based forecasts of significant wave height, accounting for the additional impact of sea level rise is suggested. The efficacy and limitations of numerical methods vis-à-vis Artificial Neural Networks are assessed, and the importance of root mean square error in numerical results is discussed. Among the computerized methods compared, forecast results obtained from MIKE 21 are opined to be more reliable than those from the Delft 3D model.

Keywords: climate change, Coastal Vulnerability Index, global warming, sea level rise

Procedia PDF Downloads 107
61 Harmonic Assessment and Mitigation in Medical Diagnosis Equipment

Authors: S. S. Adamu, H. S. Muhammad, D. S. Shuaibu

Abstract:

Poor power quality in electrical power systems can cause medical equipment at healthcare centres to malfunction and present wrong medical diagnoses. Equipment such as X-ray machines and computerized axial tomography scanners can pollute the system due to their high level of harmonic production, which may cause a number of undesirable effects such as heating, equipment damage, and electromagnetic interference. The conventional approach to mitigation uses passive inductor/capacitor (LC) filters, which have drawbacks such as large size, resonance problems, and fixed compensation behaviour. Current solutions generally employ active power filters using suitable control algorithms. This work focuses on assessing the level of Total Harmonic Distortion (THD) in medical facilities and various ways of mitigating it, using the radiology unit of an existing hospital as a case study. The measurement of the harmonics is conducted with a power quality analyzer at the point of common coupling (PCC). The levels of measured THD are found to be higher than the IEEE 519-1992 standard limits. The system is then modelled as a harmonic current source using MATLAB/SIMULINK. To mitigate the unwanted harmonic currents, a shunt active filter is developed using a synchronous detection algorithm to extract the fundamental component of the source currents. A fuzzy logic controller is then developed to control the filter. The THD values without the active power filter are validated using the measured values. The THD values with the developed filter show that the harmonics are now within the recommended limits.
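As a point of reference for the measurement step, THD can be computed from a sampled waveform's spectrum as the ratio of the root-sum-square of the harmonic magnitudes to the fundamental magnitude. The sketch below is illustrative only: it uses a synthetic 50 Hz current with injected 3rd and 5th harmonics rather than a waveform measured at the PCC.

```python
# THD of a synthetic distorted current waveform via the FFT.
import numpy as np

fs, f0 = 10_000, 50                       # sampling rate (Hz), fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)             # 10 cycles of the fundamental
i = (10 * np.sin(2 * np.pi * f0 * t)
     + 2.0 * np.sin(2 * np.pi * 3 * f0 * t)   # injected 3rd harmonic
     + 1.0 * np.sin(2 * np.pi * 5 * f0 * t))  # injected 5th harmonic

spec = np.abs(np.fft.rfft(i)) / len(i)
freqs = np.fft.rfftfreq(len(i), 1 / fs)

def amp(f):
    """Spectral magnitude at (or nearest to) frequency f."""
    return spec[np.argmin(np.abs(freqs - f))]

fund = amp(f0)
harmonics = [amp(k * f0) for k in range(2, 40)]
thd = np.sqrt(sum(a**2 for a in harmonics)) / fund
print(f"THD = {100 * thd:.1f}%")          # ~22% here; IEEE 519 limits depend on the bus
```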

Keywords: power quality, total harmonics distortion, shunt active filters, fuzzy logic

Procedia PDF Downloads 451
60 Low-Cost Image Processing System for Evaluating Pavement Surface Distress

Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa

Abstract:

Most asphalt pavement condition evaluations use rating frameworks in which asphalt pavement distress is estimated by type, extent, and severity. Rating is carried out via the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks, and explores the application of image processing tools for their detection. Longitudinal cracking and potholes are detected using Fuzzy C-Means (FCM) clustering followed by a spectral theory algorithm. The framework comprises three phases: image acquisition, processing, and extraction of features. A digital camera (GoPro) with a holder is used to capture pavement distress images from a moving vehicle. The FCM classifier and spectral theory algorithms are used to compute features and classify the longitudinal cracking and potholes. The MATLAB 2016Ra image processing toolkit is used for performance analysis to identify pavement distress on selected urban stretches of Bengaluru city, India. The outcomes of image evaluation with the semi-computerized image processing framework represented the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images are validated against the actual dimensions, and the dimension variability is about 0.46. The linear regression model y=1.171x-0.155 is obtained from the existing and the experimental (image processing) areas. The R² value obtained from the best-fit line is 0.807, which in the linear regression model is considered a ‘large positive linear association’.
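The FCM step can be sketched on one-dimensional pixel intensities: dark crack/pothole pixels separate from the brighter pavement background as two fuzzy clusters. The following minimal implementation uses synthetic intensity data and does not reproduce the paper's pipeline (GoPro capture, spectral-theory stage, MATLAB toolkit).

```python
# Minimal fuzzy c-means on 1-D pixel intensities; data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
pavement = rng.normal(170, 12, 900)   # bright background pixels
distress = rng.normal(60, 15, 100)    # dark crack/pothole pixels
x = np.concatenate([pavement, distress])[None, :]   # shape (1, N)

def fcm(x, c=2, m=2.0, iters=100):
    n = x.shape[1]
    u = rng.random((c, n))
    u /= u.sum(axis=0)                                # random fuzzy memberships
    p = 2.0 / (m - 1.0)
    for _ in range(iters):
        um = u ** m
        centers = (um @ x.T).ravel() / um.sum(axis=1) # weighted cluster centers
        d = np.abs(x - centers[:, None]) + 1e-9       # distances, shape (c, N)
        u = 1.0 / (d ** p * (d ** -p).sum(axis=0))    # standard FCM update
    return centers, u

centers, u = fcm(x)
dark = np.argmin(centers)
print("cluster centers:", np.sort(centers).round(1))
print("pixels labeled distress:", int((u.argmax(axis=0) == dark).sum()))
```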

Keywords: crack detection, pothole detection, spectral clustering, fuzzy-c-means

Procedia PDF Downloads 152
59 Analysis of Motor Nerve Conduction Velocity (MNCV) of Selected Nerves in Athletics

Authors: Jogbinder Singh Soodan, Ashok Kumar, Gobind Singh

Abstract:

Background: This study aims to describe the motor nerve conduction velocity of selected nerves of both the upper and lower extremities in athletes. Thirty high-level sprinters (100 m and 200 m) and thirty high-level distance runners (3000 m) volunteered to participate in the study. Method: Motor nerve conduction velocities (MNCV) of the radial and sural nerves were recorded with computerized equipment, NEUROPERFECT (MEDICAID SYSTEMS, India), using standard techniques of supramaximal percutaneous stimulation. The anthropometric measurements taken were body height (cm), age (years), and body weight (kg). The neurophysiological parameters taken were the MNCV of the radial nerve (upper extremity) and the sural nerve (lower extremity) on both sides (i.e., dominant and non-dominant) of the body. The room temperature was maintained at 37 degrees Celsius. Results: Significant differences in motor nerve conduction velocities were found between dominant and non-dominant limbs in each group. The MNCV of the radial nerve was significantly higher in the sprinters than in the long-distance runners, and the MNCV of the sural nerve was likewise significantly higher in sprinters than in distance runners. Conclusion: The MNCV of both the radial and sural nerves was higher in sprinters than in distance runners. In sprinters, the MNCV of the radial and sural nerves was higher in the dominant limbs (i.e., arms and legs), whereas in distance runners, the MNCV of the radial and sural nerves was higher in the non-dominant limbs.

Keywords: motor nerve conduction velocity, radial nerve, sural nerve, sprinters

Procedia PDF Downloads 534
58 Demographic Characteristics and Factors Affecting Mortality in Pediatric Trauma Patients Who Are Admitted to Emergency Service

Authors: Latif Duran, Erdem Aydin, Ahmet Baydin, Ali Kemal Erenler, Iskender Aksoy

Abstract:

Aim: In this retrospective study, we aim to contribute to the literature by examining the demographic characteristics of pediatric trauma patients and the factors that may cause mortality, and by presenting proposals for measures to reduce mortality. Material and Method: This study was performed by retrospectively investigating data obtained from patient files and the hospital automation registration system for pediatric trauma patients who presented to the Adult Emergency Department of the Ondokuz Mayıs University Medical Faculty between January 1, 2016, and December 31, 2016. Results: 289 of the 415 patients involved in our study were males. The median age was 11.3 years. The most common trauma mechanism was falling from a height. A statistically significant association was found between trauma mechanism and gender. An increase in the number of trauma cases was found especially in the summer months. The study showed that thoracic and abdominal trauma was associated with increased mortality. Computerized tomography was the most common diagnostic imaging modality. The presence of subarachnoid hemorrhage increased the risk of mortality 62.3-fold. Eight of the patients (1.9%) died. Scoring systems were statistically significant in predicting mortality. Conclusion: Children are vulnerable to trauma because of their unique anatomical and physiological differences compared to adult patient groups. Mortality rates and post-traumatic recovery can be improved by fast and appropriate prehospital triage to trauma centers, management of critical patients with scoring systems, and management with standard treatment protocols.

Keywords: emergency service, pediatric patients, scoring systems, trauma, age groups

Procedia PDF Downloads 165
57 Implementation Status of Industrial Training for the Production Engineering Technology Diploma in Universiti Kuala Lumpur Malaysian Spanish Institute (UniKL MSI)

Authors: M. Sazali Said, Rahim Jamian, Shahrizan Yusoff, Shahruzaman Sulaiman, Jum'Azulhisham Abdul Shukor

Abstract:

This case study focuses on the role of Universiti Kuala Lumpur Malaysian Spanish Institute (UniKL MSI) in producing technologists to reduce the shortage of skilled workers, especially in the automotive industry. The purpose of the study is therefore to examine the effectiveness of the Technical Education and Vocational Training (TEVT) curriculum of UniKL MSI in producing graduates who can immediately be productively employed by the automotive industry. The approach used in this study is performance evaluation of students attending the Industrial Training Attachment (INTRA). The sample comprises 37 students, 16 university supervisors, and 26 industrial supervisors. The research methodology involves quantitative and qualitative methods of data collection through a triangulation approach. The quantitative data was gathered from the students, university supervisors, and industrial supervisors through a questionnaire, while the qualitative data was obtained from the students and university supervisors through interviews and observation. Both types of data were processed and analyzed to summarize the results in terms of frequency and percentage using a computerized spreadsheet. The results show that industrial supervisors were satisfied with the students’ performance, while university supervisors rated the UniKL MSI curriculum as moderately effective in producing graduates with appropriate skills and in meeting industrial needs. During the period of study, several weaknesses in the curriculum were identified for further continuous improvement. Recommendations for curriculum improvement include enhancing the technical skills and competences of students towards fulfilling the needs and demands of the automotive industries.

Keywords: technical education and vocational training (TEVT), industrial training attachment (INTRA), curriculum improvement, automotive industry

Procedia PDF Downloads 341
56 SARS-CoV-2 Transmission Risk Factors among Patients from a Metropolitan Community Health Center, Puerto Rico, July 2020 to March 2022

Authors: Juan C. Reyes, Linnette Rodríguez, Héctor Villanueva, Jorge Vázquez, Ivonne Rivera

Abstract:

In July 2020, a private non-profit community health center (HealthProMed) that serves people without a medical insurance plan or with limited resources in one of the most populated areas of San Juan, Puerto Rico, implemented a COVID-19 case investigation and contact-tracing surveillance system. Nursing personnel at the health center completed a computerized case investigation form that was translated, adapted, and modified from CDC’s Patient Under Investigation (PUI) form. Between July 13, 2020, and March 17, 2022, a total of 9,233 SARS-CoV-2 tests were conducted at the health center, 16.9% of which were classified as confirmed cases (positive molecular test) and 27.7% as probable cases (positive serologic test). Most of the confirmed cases were females (60.0%), under 20 years old (29.1%), and living in their homes (59.1%). In the 14 days before the onset of symptoms, 26.3% of confirmed cases reported going to the supermarket, 22.4% had contact with a known COVID-19 case, and 20.7% went to work. The symptoms most commonly reported were sore throat (33.4%), runny nose (33.3%), cough (24.9%), and headache (23.2%). The most common preexisting medical conditions among confirmed cases were hypertension (19.3%), chronic lung disease including asthma, emphysema, and COPD (13.3%), and diabetes mellitus (12.8%). Multiple logistic regression analysis revealed that patients who used alcohol frequently during the last two weeks (OR=1.43; 95%CI: 1.15-1.77), those who were in contact with a positive case (OR=1.58; 95%CI: 1.33-1.88), and those who were obese (OR=1.82; 95%CI: 1.24-2.69) were significantly more likely to be confirmed cases after controlling for sociodemographic variables. Implementing a case investigation and contact-tracing component at community health centers can be of great value in the prevention and control of COVID-19 at the community level and could be used in future outbreaks.
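For readers unfamiliar with how adjusted odds ratios of this kind are produced, the sketch below fits a multiple logistic regression on simulated data and exponentiates the coefficients. The variable names mirror the reported predictors, but the data, coefficients, and resulting ORs are fabricated for illustration.

```python
# Adjusted odds ratios from a multiple logistic regression on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.integers(0, 2, n),   # frequent alcohol use, last two weeks
    rng.integers(0, 2, n),   # contact with a positive case
    rng.integers(0, 2, n),   # obesity
    rng.normal(45, 18, n),   # age (stand-in sociodemographic control)
])
# Fabricated true effects used only to generate the outcome.
logit = -2.0 + 0.36 * X[:, 0] + 0.46 * X[:, 1] + 0.60 * X[:, 2] + 0.004 * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

res = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
odds_ratios = np.exp(res.params)           # adjusted OR per predictor
ci = np.exp(res.conf_int())                # 95% confidence intervals
print(np.column_stack([odds_ratios, ci]).round(2))
```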

Keywords: community health center, Puerto Rico, risk factors, SARS-CoV-2

Procedia PDF Downloads 80
55 Establishment of an Information Platform Increases Spontaneous Reporting of Adverse Drug Reactions

Authors: Pei-Chun Chen, Chi-Ting Tseng, Lih-Chi Chen, Kai-Hsiang Yang

Abstract:

Introduction: The pharmacist is responsible for encouraging adverse drug reaction (ADR) reporting. In a local center in Northern Taiwan, promotion and rewarding of ADR reporting had continued for over six years but failed to bring significant changes. This study aims to find a solution to increase ADR reporting. Research question or hypothesis: We hypothesized that under-reporting was due to the inconvenience of the reporting system, as reports were made conventionally through printed sheets. We proposed that the number of reports made per month would increase if reporting were computerized. Study design: An ADR reporting platform was established in April 2015; the period before this was defined as the first stage of the study (January-March 2015) and the period after it as the second stage. The third stage commenced in November 2015, after a reporting module was added to the physicians' prescription system, so that ADRs could be reported simultaneously when documenting drug allergies. Methods: ADR report rates during the three stages of the study were compared, and the effects of the information platform on reporting were analyzed. Results: During the first stage, the number of ADR reports averaged 6 per month. In the second stage, the number of reports per month averaged 1.86. Introducing the information platform alone had little effect on the monthly number of ADR reports. The average number of reports per month during the third stage of the study was 11±3.06, with 70.43% made electronically. Reports per month increased significantly after installing the reporting module in November 2015 (P<0.001, t-test). In the first two stages, 29.03% of ADR reports were made by physicians, as compared to 70.42% of cases in the third stage of the study; increased physician reporting possibly accounts for these differences. Conclusion: Adding a reporting module to the prescription system significantly increased ADR reporting; improved accessibility is likely the cause. The addition of similar modules to the computer systems of other healthcare professions may be considered to encourage spontaneous ADR reporting.
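The stage comparison reported above rests on a t-test over monthly report counts. As a format illustration only, the sketch below runs Welch's two-sample t-test on invented monthly counts; the study's actual monthly data are summarized in the abstract only as means.

```python
# Two-sample t-test on monthly ADR report counts (counts are invented).
from scipy import stats

before_module = [6, 2, 1, 2, 3, 1, 2]     # monthly reports, stages 1-2 (fabricated)
after_module = [9, 11, 14, 8, 12, 15]     # monthly reports, stage 3 (fabricated)

t, p = stats.ttest_ind(after_module, before_module, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")        # Welch's t-test, unequal variances
```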

Keywords: adverse drug reactions, adverse drug reaction reporting systems, regional hospital, prescription system

Procedia PDF Downloads 308
54 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification; because of it, the diffusion of eco-design and LCA methods in the manufacturing sectors may prove impossible. This article addresses the research question: how can the LCA method be adapted so that it can be generalized massively, with improved performance? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, we analyzed the literature to identify existing automation methods; given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating the process of data mapping and hence product modeling. The method is also able to complete the LCA process on its own within minutes; the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and thus to meet regulatory requirements. Moreover, this approach demonstrates the potential of the proposed method for a wide range of applications.
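To make the feature-based idea concrete, here is a deliberately toy sketch in which product features (a bill of materials plus a transport quantity) are mapped to per-unit impact factors and summed per life-cycle stage. All names and factor values are hypothetical; no real LCI database (e.g., ecoinvent) or the authors' tool is involved.

```python
# Hypothetical impact factors: kg CO2-eq per kg of material or per process unit.
IMPACT_FACTORS = {"PVC": 2.6, "steel": 1.9, "extrusion_per_kg": 0.45,
                  "road_freight_per_tkm": 0.11}

def assess(bill_of_materials, transport_tkm):
    """Sum climate-change impacts over materials, processing, and transport."""
    materials = sum(mass * IMPACT_FACTORS[mat] for mat, mass in bill_of_materials)
    processing = sum(mass * IMPACT_FACTORS["extrusion_per_kg"]
                     for _, mass in bill_of_materials)
    transport = transport_tkm * IMPACT_FACTORS["road_freight_per_tkm"]
    return {"materials": round(materials, 3), "processing": round(processing, 3),
            "transport": round(transport, 3),
            "total": round(materials + processing + transport, 3)}

# A conduit-like product: 1.2 kg PVC body, 0.1 kg steel fittings, 0.6 t.km haulage.
print(assess([("PVC", 1.2), ("steel", 0.1)], transport_tkm=0.6))
```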

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 54
53 The Impact of a Model's Skin Tone and Ethnic Identification on Consumer Decision Making

Authors: Shanika Y. Koreshi

Abstract:

Sri Lanka houses lingerie product development and manufacturing subsidiaries for renowned brands such as La Senza, Marks & Spencer, H&M, Etam, Lane Bryant, and George. Over the last few years, these have produced local brands such as Amante to cater to local and regional customers. Past research has identified factors such as quality, price, and design as vital when marketing lingerie to consumers. However, there has been minimal research into ethnically targeted markets and skin colour within the Asian population. Therefore, the main aim of the research was to identify whether consumer preference for lingerie is influenced by the skin tone of the model wearing it. The secondary aim was to investigate whether consumer preference for lingerie is influenced by the consumer’s ethnic identification with the skin tone of the model. An experimental design was used to explore these aims. The participants were 66 females residing in the Western Province of Sri Lanka, gathered via convenience sampling. Six computerized images of a real model were used in the study, and her skin tone was digitally manipulated to express three different skin tones (light, tan, and dark). Consumer preferences were measured through a rank-order scale constructed via a focus group discussion, and ethnic identity was measured by the Multigroup Ethnic Identity Measure-Revised. The Wilcoxon signed-rank test, Friedman test, and chi-square test of independence were carried out using SPSS version 20. The results indicated that the majority of consumers ethnically identified with and preferred the tan skin tone over the light and dark skin tones. The findings support the existing literature stating that consumers prefer models with a medium skin tone over a lighter skin tone. The preference for a tan skin tone in a model is consistent with the ethnic identification of the Sri Lankan sample. The study implies that lingerie brands should consider models' skin tones when marketing the brand to different ethnic backgrounds.

Keywords: consumer preference, ethnic identification, lingerie, skin tone

Procedia PDF Downloads 232
52 Quantification and Evaluation of Tumor Heterogeneity Utilizing Multimodality Imaging

Authors: Ramin Ghasemi Shayan, Morteza Janebifam

Abstract:

Tumors are usually inhomogeneous: regional variations in necrosis, metabolic activity, proliferation, and composition are observed. There is increasing evidence that solid tumors may contain subpopulations of cells with different genotypes and phenotypes. These distinct populations of cancer cells can interact in complex ways and may differ in sensitivity to drugs. Most tumors show biological heterogeneity [1-3], including heterogeneity in genomic subtypes, variations in the expression of growth factors and pro- and anti-angiogenic factors [4-9], and variations within the tumoural microenvironment. These can present as differences between the tumors of different individuals. For instance, O6-methylguanine-DNA methyltransferase, a DNA repair enzyme, is silenced by methylation of the gene promoter in half of glioblastomas (GBM), contributing to chemosensitivity and improved survival. There has been particular interest in the use of diffusion-weighted imaging (DWI) and dynamic contrast-enhanced MRI (DCE-MRI). DWI sensitizes MRI to water diffusion within the extravascular extracellular space (EES) and reflects the size and configuration of the cell population. DCE-MRI uses dynamic acquisition of images during and after the injection of an intravenous contrast agent; signal changes are then converted to absolute concentrations of contrast, allowing analysis with pharmacokinetic models. The PET modality provides unique biological specificity, permitting dynamic or static imaging of biological molecules labelled with positron-emitting isotopes (for example, 15O, 18F, 11C). The technique involves a large radiation dose, which limits repeated measurements, particularly when used together with computed tomography (CT). Finally, it is of great interest to measure regional hemoglobin state, which could be combined with DCE-CT vascular physiology measurements to generate significant insights for understanding tumor hypoxia.

Keywords: heterogeneity, computerized tomography scan, magnetic resonance imaging, PET

Procedia PDF Downloads 104
51 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOP). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes in detail the GIS data submission process and the journey to develop the current process. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, owing to this process, GIS has been, and is ready to be, integrated with other systems, and serves as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made based on GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data systems update, data release/reporting, and data alterations also helped to reduce the missing attributes of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2021. Overall, it is concluded that, through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, geodatabase, NOC, CMMS

Procedia PDF Downloads 172
50 Evaluation of Compatibility between Produced and Injected Waters and Identification of the Causes of Well Plugging in a Southern Tunisian Oilfield

Authors: Sonia Barbouchi, Meriem Samcha

Abstract:

Scale deposition during water injection into the aquifer of oil reservoirs is a serious problem experienced in the oil production industry. One of the primary causes of scale formation and injection well plugging is the mixing of two waters that are incompatible. Considered individually, the waters may be quite stable at system conditions and present no scale problems; however, once they are mixed, reactions between ions dissolved in the individual waters may form insoluble products. The purpose of this study is to identify the causes of well plugging in a southern Tunisian oilfield, where fresh water has been injected into the producing wells to counteract the salinity of the formation waters and inhibit the deposition of halite. X-ray diffraction (XRD) mineralogical analysis has been carried out on scale samples collected from the blocked well. Two samples, collected from the formation water and the injected water, were analysed using inductively coupled plasma atomic emission spectroscopy, ion chromatography, and other standard laboratory techniques. The results of the complete water analyses were the typical input parameters for determining scaling tendency. Saturation index values for CaCO3, CaSO4, BaSO4, and SrSO4 scales were calculated for the water mixtures at different shares, under various temperature conditions, using a computerized scale prediction model. The compatibility study results showed that mixing the two waters tends to increase the probability of barite deposition. XRD analysis confirmed the compatibility study results, since it proved that the analysed deposits consisted predominantly of barite with minor galena. At the studied temperature conditions, the tendency for barite scale increases significantly with the increase of the fresh water share in the mixture. The future scale inhibition and removal strategies to be implemented in the concerned oilfield are derived in large part from the results of the present study.
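The scaling-tendency calculation behind such a study can be illustrated with a simplified saturation index for barite, the dominant scale identified: SI = log10(IAP/Ksp), with SI > 0 indicating supersaturation. In the sketch below, the end-member concentrations, the Ksp value, and the ideal-solution assumption (activities taken equal to concentrations, with no ionic-strength or temperature correction) are illustrative simplifications of what a full scale-prediction model does.

```python
# Simplified barite (BaSO4) saturation index across mixing ratios.
import math

KSP_BARITE = 10 ** -9.97          # approximate Ksp of BaSO4 at 25 C

def barite_si(ba_mol_per_l, so4_mol_per_l):
    iap = ba_mol_per_l * so4_mol_per_l          # ion activity product (idealized)
    return math.log10(iap / KSP_BARITE)

def mix(c_formation, c_injection, injection_share):
    """Conservative mixing of one ion's concentration at a given water share."""
    return injection_share * c_injection + (1 - injection_share) * c_formation

# Toy end-members: formation water rich in Ba2+, fresh injection water in SO4^2-.
ba_fw, so4_fw = 2e-4, 1e-6
ba_inj, so4_inj = 1e-8, 3e-4

for share in (0.0, 0.25, 0.5, 0.75, 1.0):
    si = barite_si(mix(ba_fw, ba_inj, share), mix(so4_fw, so4_inj, share))
    print(f"injection share {share:.2f}: SI(barite) = {si:+.2f}")
# Intermediate mixtures show the highest SI: mixing drives barite supersaturation.
```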

Keywords: compatibility study, produced water, scaling, water injection

Procedia PDF Downloads 141
49 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references: they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Owing to the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set with highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but careful, i.e., it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g., in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
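The core mechanics, scoring candidate pairs with metadata rules plus string similarity and then forming connected components by single linkage, can be condensed as follows. The rules, weights, and threshold here are invented stand-ins for the paper's expert-tuned values, and the brute-force pair loop stands in for a proper blocking strategy.

```python
# Rule-based pair scoring plus single-linkage clustering via union-find.
from difflib import SequenceMatcher

refs = [
    {"title": "deep learning for patents", "year": "2015", "journal": "scientometrics"},
    {"title": "deep learning for patens",  "year": "2015", "journal": "scientometr."},
    {"title": "polymer synthesis methods", "year": "1998", "journal": "macromolecules"},
]

def pair_score(a, b):
    score = 5.0 * SequenceMatcher(None, a["title"], b["title"]).ratio()
    if a["year"] and a["year"] == b["year"]:
        score += 2.0                       # matching years reinforce the title rule
    if a["journal"][:8] == b["journal"][:8]:
        score += 1.0                       # crude journal-prefix rule
    return score

THRESHOLD = 6.0
parent = list(range(len(refs)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]      # path halving
        i = parent[i]
    return i

for i in range(len(refs)):
    for j in range(i + 1, len(refs)):      # blocking would prune this O(n^2) loop
        if pair_score(refs[i], refs[j]) >= THRESHOLD:
            parent[find(i)] = find(j)      # single linkage: merge components

print([find(i) for i in range(len(refs))])  # cluster label per reference
```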

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 169
46 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Saleh Hussein Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOP). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, owing to this process, GIS has been, and is ready to be, integrated with other systems, and serves as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) for excavation permits and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made based on GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data systems update, data release/reporting, and data alterations has also contributed to reducing missing attributes and enhancing the data quality index of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, it is concluded that, through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, CMMS

Procedia PDF Downloads 93
47 The Functional Roles of Right Dorsolateral Prefrontal Cortex and Ventromedial Prefrontal Cortex in Risk-Taking Behavior

Authors: Aline M. Dantas, Alexander T. Sack, Elisabeth Bruggen, Peiran Jiao, Teresa Schuhmann

Abstract:

Risk-taking behavior has been associated with the activity of specific prefrontal regions of the brain, namely the right dorsolateral prefrontal cortex (rDLPFC) and the ventromedial prefrontal cortex (VMPFC). While deactivation of the rDLPFC has been shown to lead to increased risk-taking behavior, the functional relationship between VMPFC activity and risk-taking behavior is yet to be clarified. Correlational evidence suggests that the VMPFC is involved in valuation processes underlying risky choices, but evidence on the functional relationship is lacking. Therefore, this study uses brain stimulation to investigate the role of the VMPFC during risk-taking behavior and to replicate current findings regarding the role of the rDLPFC in the same phenomenon. We used continuous theta-burst stimulation (cTBS) to inhibit either the VMPFC or the rDLPFC during the execution of the computerized Maastricht Gambling Task (MGT) in a within-subject design with 30 participants. We analyzed the effects of such stimulation on risk-taking behavior, participants’ choices of probabilities and average values, and response time. We hypothesized that, compared to sham stimulation, VMPFC inhibition leads to a reduction in risk-taking behavior by reducing the appeal of higher-value options and, consequently, the attractiveness of riskier options. rDLPFC inhibition, on the other hand, should lead to an increase in risk-taking due to a reduction in cognitive control, confirming existing findings. Stimulation of both the rDLPFC and the VMPFC led to an increase in risk-taking behavior and an increase in the average value chosen, compared to sham. No significant effect on chosen probabilities was found. A significant increase in response time was observed exclusively after rDLPFC stimulation. Our results indicate that inhibiting the rDLPFC and the VMPFC separately leads to similar effects, increasing both risk-taking behavior and average value choices, which is likely due to the strong anatomical and functional interconnection of the VMPFC and rDLPFC.

Keywords: decision-making, risk-taking behavior, brain stimulation, TMS

Procedia PDF Downloads 74
46 Exploring the Food Environments and Their Influence on Food Choices of Working Adults

Authors: Deepa Shokeen, Bani Tamber Aeri

Abstract:

Food environments are believed to play a significant role in the obesity epidemic, and robust research methods are required to establish which factors or aspects of the food environment are relevant to food choice and to adiposity. The relationship between the food environment and obesity is complex. While there is little research linking food access with obesity as an outcome measure in any age group, this article tries to clarify the relationship between what we eat and the environmental context in which these food choices are made. Methods: A literature search of studies published between January 2000 and December 2013 was undertaken on computerized medical, social science, health, nutrition, and education databases, including Google and PubMed. Reports of organisations such as the World Health Organisation (WHO) and the Centre for Chronic Disease Control (CCDC) were studied to project the data. Results: Studies show that food environments play a significant role in the obesity epidemic, and evidence indicates that the food environment may help explain the obesity and cardio-metabolic risk factors seen among young adults. Conclusion: Cardiovascular disease is an ever-growing chronic disease whose incidence will increase markedly in the coming decades. It is therefore necessary to assess the prevalence of the various risk factors that contribute to the incidence of cardiovascular diseases, especially in the work environment. Research is required to establish how different environments affect different individuals, as individuals interact with the environment on a number of levels. We need to ascertain the impact of selected food and nutrition environments (information, organization, community, consumer) on the food choice and dietary intake of working adults, as it is important to learn how these food environments influence the eating perceptions and health behaviour of adults.

Keywords: food environment, prevalence, cardiovascular disease, India, worksite, risk factors

Procedia PDF Downloads 379
45 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of enterprises is changing rapidly, driven mainly by global competition, cost-reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate a realistic production schedule and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of the production scheduling software system using process mining techniques, since every software system generates event logs for further use such as security investigation, auditing, and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems. Using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of generated production schedules, the quality of production schedules in manufacturing enterprises can be improved.
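One of the named criteria, workstation utilization, reduces to a simple aggregation over the event log. The sketch below hand-codes a toy log; a real study would export the log from the scheduling system (or analyze it with a process-mining library such as pm4py) rather than embed it.

```python
# Workstation utilization mined from a toy scheduling event log.
from collections import defaultdict

# (case_id, workstation, start_hour, end_hour) within one 24 h scheduling horizon
event_log = [
    ("J1", "WS-A", 0, 4), ("J1", "WS-B", 4, 9),
    ("J2", "WS-A", 4, 10), ("J2", "WS-B", 10, 12),
    ("J3", "WS-A", 10, 20), ("J3", "WS-B", 20, 22),
]
HORIZON = 24.0

busy = defaultdict(float)
for _, ws, start, end in event_log:
    busy[ws] += end - start               # accumulate busy hours per workstation

for ws, hours in sorted(busy.items()):
    print(f"{ws}: utilization {hours / HORIZON:.0%}")   # WS-A 83%, WS-B 38%

# A large utilization spread like this flags WS-A as a potential bottleneck.
```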

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 238
44 Effects of Partial Sleep Deprivation on Prefrontal Cognitive Functions in Adolescents

Authors: Nurcihan Kiris

Abstract:

Restricted sleep is common in young adults and adolescents, yet the few objective studies of the effects of sleep deprivation on cognitive performance have yielded unclear results. In particular, the effect of sleep deprivation on cognitive functions associated with the frontal lobe, such as attention, executive functions, and working memory, is not well known. The aim of this study is to investigate experimentally the effect of partial sleep deprivation in adolescents on frontal lobe cognitive tasks, including working memory, strategic thinking, simple attention, continuous attention, executive functions, and cognitive flexibility. Subjects were recruited from volunteer students of Cukurova University. Eighteen adolescents underwent four consecutive nights of monitored sleep restriction (6–6.5 hr/night) and four nights of sleep extension (10–10.5 hr/night), in counterbalanced order and separated by a washout period. Following each sleep period, cognitive performance was assessed, at a fixed morning time, using a computerized neuropsychological battery based on frontal lobe function tasks, a timed test providing both accuracy and reaction-time outcome measures. Of the cognitive tasks, only spatial working memory performance was found to be significantly lower in the restricted sleep condition than in the extended sleep condition. There was no significant difference in performance on the tasks evaluating simple attention, continuous attention, executive functions, and cognitive flexibility. It appears that the spatial working memory and strategic thinking skills of adolescents may be especially susceptible to sleep deprivation. Conversely, adolescents are predicted to perform optimally under ideal sleep conditions, especially in circumstances requiring short-term storage of visual information, processing of stored information, and strategic thinking. These findings may also point to possible negative functional effects of partial sleep deprivation on the processing of academic, social, and emotional inputs in adolescents. Acknowledgment: This research was supported by the Cukurova University Scientific Research Projects Unit.

Keywords: attention, cognitive functions, sleep deprivation, working memory

Procedia PDF Downloads 115
43 A Computerized Tool for Predicting Future Reading Abilities in Pre-Readers Children

Authors: Stephanie Ducrot, Marie Vernet, Eve Meiss, Yves Chaix

Abstract:

Learning to read is a key topic of debate today, both in terms of its implications for school failure and illiteracy and regarding the best teaching methods to develop. It is estimated that four to six percent of school-age children suffer from specific developmental disorders that impair learning. Findings from people with dyslexia and from typically developing readers suggest that the problems children experience in learning to read are related to the preliteracy skills they bring with them from kindergarten. Most tools available to professionals are designed for the evaluation of child language problems; in comparison, there are very few tools for assessing the relations between visual skills and the process of learning to read. Recent literature reports that visual-motor skills and visual-spatial attention in preschoolers are important predictors of reading development. The main goal of this study was therefore to improve screening for future reading difficulties in preschool children. We used a prospective, longitudinal approach in which oculomotor processes (assessed with the DiagLECT test) were measured in pre-readers, and the impact of these skills on future reading development was explored. The DiagLECT test specifically measures the time taken to name numbers arranged irregularly in horizontal rows (horizontal time, HT) and the time taken to name numbers arranged in vertical columns (vertical time, VT). A total of 131 preschoolers took part in this study. At Time 0 (kindergarten), the mean VT, HT, and errors were recorded; one year later, at Time 1, the reading level of the same children was evaluated. First, this study allowed us to provide normative data for a standardized evaluation of oculomotor skills in 5- and 6-year-old children. The data also revealed that 25% of our sample of preschoolers showed oculomotor impairments (without any clinical complaints). Finally, the results assessed the validity of the DiagLECT test for predicting reading outcomes: the better a child's oculomotor skills, the better his or her reading abilities will be.

Keywords: vision, attention, oculomotor processes, reading, preschoolers

Procedia PDF Downloads 120
42 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market

Authors: Cristian Păuna

Abstract:

In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investment has become the primary tool for making a profit by speculation in financial markets. A significant number of traders and private or institutional investors participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed for building a reliable trend line, which is the basis for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals and limit conditions that build a mathematical filter for investment opportunities, and presents the methodology to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a risk-to-reward ratio of 1:6.12 was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the presented Price Prediction Line model is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
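For illustration of the general mechanism, not the paper's proprietary formula, the sketch below fits a least-squares trend line to recent closes, builds a limit band around it, and derives a BUY/SELL/HOLD signal. The window, band width, and synthetic price series are arbitrary assumptions.

```python
# Generic trend-line signal sketch (illustrative stand-in, not the paper's model).
import numpy as np

closes = np.array([12850, 12910, 12890, 12960, 13010, 12980, 13060,
                   13120, 13090, 13150], dtype=float)   # synthetic DAX-like series

t = np.arange(len(closes))
slope, intercept = np.polyfit(t, closes, 1)     # fitted trend line
trend = slope * t + intercept
band = 1.5 * np.std(closes - trend)             # limit condition around the line

next_trend = slope * len(closes) + intercept    # one-step trend projection
last = closes[-1]

if slope > 0 and last < next_trend - band:
    signal = "BUY"        # uptrend and price stretched below the line
elif slope < 0 and last > next_trend + band:
    signal = "SELL"       # downtrend and price stretched above the line
else:
    signal = "HOLD"       # limit conditions filter out weak opportunities
print(f"slope={slope:.1f}/bar, band=+/-{band:.1f}, signal={signal}")
```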

Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex

Procedia PDF Downloads 103
41 A Normalized Non-Stationary Wavelet Based Analysis Approach for a Computer Assisted Classification of Laryngoscopic High-Speed Video Recordings

Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller

Abstract:

Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach allowing a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet-based analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work, it is shown that, using a machine learning approach, the derived measures are suitable for automatically distinguishing between healthy and pathological voices. Within the approach, the formation of the PCA space, and consequently the extracted quantitative measures, depend on the clinical data used to compute the principal components. Therefore, in the second part of the work, we propose a strategy to achieve a normalization of the PCA space by registering it to a coordinate system using a set of synthetically generated vibration patterns. The results show that, owing to the normalization step, potential ambiguity of the parameter space can be eliminated. The normalization further allows a direct comparison of research results based on PCA spaces obtained from different clinical subjects.
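The classification stage maps naturally onto a standard PCA-plus-classifier pipeline. In the sketch below, random feature matrices stand in for the wavelet/PVG measures, and the SVM choice, component count, and class shift are illustrative assumptions rather than the paper's configuration.

```python
# PCA projection followed by classification of healthy vs. pathological voices.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(7)
healthy = rng.normal(0.0, 1.0, (40, 200))       # 40 recordings x 200 features
pathological = rng.normal(0.6, 1.3, (40, 200))  # shifted vibration statistics
X = np.vstack([healthy, pathological])
y = np.array([0] * 40 + [1] * 40)

model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```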

Keywords: wavelet-based analysis, multiscale product, normalization, computer-assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram

Procedia PDF Downloads 236
40 Method for Improving Antidepressant Adherence in Patients with Depressive Disorder: Systematic Review and Meta-Analysis

Authors: Juntip Kanjanasilp, Ratree Sawangjit, Kanokporn Meelap, Kwanchanok Kruthakool

Abstract:

Depression is a common mental health disorder. Antidepressants are effective pharmacological treatments, but most patients have low medication adherence. This study aims to systematically review and meta-analyze which methods efficiently increase antidepressant adherence and improve clinical outcomes. We systematically reviewed articles on randomized controlled trials obtained by a computerized literature search of the Cochrane Library, PubMed, Embase, PsycINFO, CINAHL, Education Search, Web of Science, and ThaiLIS (28 December 2017). Twenty-three studies were included, and the quality of the research was assessed with RoB 2.0. The results showed that printed media significantly improved the number of people with medication adherence (p=0.018), but education, phone calls, and program utilization did not (p=0.172, p=0.127, p=0.659). There was no significant difference between the pharmacist, health care team, and physician groups (p=0.329, p=0.070, p=0.040). Intervention durations of 1 month and 6 months improved medication adherence significantly (p=0.0001, p=0.013). Adherence improved significantly with a single intervention (p=0.027) but not with multiple interventions (p=0.154). When we analyzed medication adherence as a mean score, no improvement in adherence was found, irrespective of who delivered the intervention or its timing; however, the multiple-interventions group showed statistically significant improvement in medication adherence (p=0.040). Phone calls and the physician group significantly improved clinical outcomes in terms of the number of improved patients (p=0.025 and p=0.020, respectively), but in the pharmacist and physician groups, no difference was found in the mean score of clinical outcomes (p=0.993 and p=0.120, respectively). Timing and number of interventions were not significantly different from usual care. Overall, the interventions can increase antidepressant adherence, especially printed media, and the appropriate duration of intervention is at least 6 months. For effective treatment, the provider should have experience and expertise in caring for patients with depressive disorders, such as a psychiatrist, and medical personnel should also be knowledgeable in caring for these patients.

Keywords: depression, medication adherence, clinical outcomes, systematic review, meta-analysis

Procedia PDF Downloads 111
39 Effect of Perceived Importance of a Task on the Prospective Memory Task

Authors: Kazushige Wada, Mayuko Ueda

Abstract:

In the present study, we reanalyzed lapse errors in the last phase of a job by re-counting near-lapse errors and increasing the number of participants. We also examined the results from the perspective of prospective memory (PM), which concerns remembering to perform future actions. The study was designed to investigate whether the perceived importance of a PM task causes lapse errors in the last phase of a job and whether such errors can be explained in terms of PM processing. Participants (N = 34) performed a computerized clicking task, in which they clicked on 10 figures that they had learned in advance, in 8 blocks of 10 trials. Participants were requested to click a check box in the start display of each block and to click a checking-off box in the finishing display; this constituted the PM task. As a measure of PM performance, we counted the number of omission errors caused by forgetting to check off in the finishing display, which was defined as a lapse error. Perceived importance was manipulated through different instructions: half the participants (high-importance condition) were told that checking off was very important because equipment would be overloaded if it were not done, whereas the other half (low-importance condition) were instructed only about the location and procedure for checking off. Furthermore, we controlled workload and the emotion of surprise to confirm the effects of demanded capacity and attention. To manipulate emotion during the clicking task, we suddenly presented a photo of a traffic accident together with the sound of a skidding car followed by an explosion. Workload was manipulated by requesting participants to press the 0 key in response to a beep. Results indicated too few forgetting-induced lapse errors to be analyzed. However, there was a weak main effect of the perceived importance of the check task on near-lapse errors, in which the mouse moved to the "END" button before moving to the check box in the finishing display; in particular, the high-importance group showed more such near-lapse errors than the low-importance group. Neither surprise nor workload affected the occurrence of near-lapse errors. These results imply that high perceived importance of a PM task impairs task performance. On the basis of the multiprocess framework of PM theory, we suggest that PM task performance in this experiment relied not on monitoring of the PM task but on spontaneous retrieval.

Keywords: prospective memory, perceived importance, lapse errors, multiprocess framework of prospective memory

Procedia PDF Downloads 419
38 Pervasive Computing: Model to Increase Arable Crop Yield through an Intrusion Detection System (IDS)

Authors: Idowu Olugbenga Adewumi, Foluke Iyabo Oluwatoyinbo

Abstract:

Presently, there is considerable discussion of food security and of increasing arable crop yields throughout the world. This article briefly presents research efforts to create digital interfaces to nature, in particular in the area of crop production, with the aim of increasing yield through pervasive computing. The approach goes beyond the use of sensor networks for environmental monitoring by emphasizing the development of a system architecture that detects intruders (the intrusion process) that reduce the farmer's yield during the planting and harvesting period. The objective of the work is to define a model for a handheld or portable device that improves the quality and quantity of arable crops. The system incorporates an infrared motion image sensor with a security alarm that can direct a noise signal at an intruder on the farm; a sketch of this sensing-and-alarm loop is given below. This model of a portable image-sensing device for monitoring or scaring off humans, rodents, birds, and even pests will reduce post-harvest loss and thereby increase farm yield. Nano-intelligence technology is proposed to combat and minimize the intrusion process that usually leads to low quality and quantity of farm produce. An intranet will be put in place comprising a wireless local area network (WLAN) with a router, a server, and client computer systems or handheld devices such as PDAs or mobile phones. This approach enables the development of hybrid systems that are effective as an on-farm security measure. Precision agriculture has developed with the computerization of agricultural production systems and the networking of computerized control systems; in the intelligent plant production systems of controlled greenhouses, information on plant responses, measured by sensors, is used to optimize the system. Further work must be carried out on modeling in a pervasive computing environment to solve problems in agriculture, as the use of electronics in agriculture will attract more youth involvement in the industry.
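
The sensing-and-alarm loop implied by this architecture can be sketched minimally as follows. Here read_motion_sensor and sound_alarm are hypothetical stand-ins for the sensor and alarm drivers, not functions from the authors' system.

```python
# Minimal sketch of the intrusion-detection loop implied by the architecture:
# poll an infrared motion sensor and trigger a noise alarm on detection.
# read_motion_sensor() and sound_alarm() are hypothetical hardware stubs.
import time
import logging

logging.basicConfig(level=logging.INFO)

def read_motion_sensor() -> bool:
    """Stub for the infrared motion image sensor driver."""
    return False  # replace with an actual GPIO/sensor read

def sound_alarm(duration_s: float = 3.0) -> None:
    """Stub for the security alarm (noise signal) driver."""
    logging.info("ALARM: intruder detected, sounding for %.1fs", duration_s)

def monitor(poll_interval_s: float = 0.5) -> None:
    while True:
        if read_motion_sensor():
            sound_alarm()
            # In the full system this event would also be pushed over the
            # farm WLAN to the server and any registered handheld clients.
        time.sleep(poll_interval_s)

if __name__ == "__main__":
    monitor()
```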

Keywords: pervasive computing, intrusion detection, precision agriculture, security, arable crop

Procedia PDF Downloads 376
37 Building Information Modeling-Based Information Exchange to Support Facilities Management Systems

Authors: Sandra T. Matarneh, Mark Danso-Amoako, Salam Al-Bizri, Mark Gaterell

Abstract:

Today’s facilities are ever more sophisticated, and the need for available and reliable information for operation and maintenance activities is vital. The key challenge for facilities managers is to have real-time, accurate, and complete information with which to perform their day-to-day activities and to provide their senior management with accurate information for decision-making. Currently, various technology platforms, data repositories, and database systems, such as Computer-Aided Facility Management (CAFM) systems, are used for these purposes in different facilities. In most current practice, the data is extracted from paper construction documents and re-entered manually into one of these computerized information systems. Construction Operations Building information exchange (COBie) is a non-proprietary data format that contains the non-geometric asset data captured and collected during the design and construction phases for use by owners and facility managers. Recently, software vendors have developed add-in applications to generate the COBie spreadsheet automatically. However, most of these add-ins can generate only a limited amount of the COBie data, so considerable time is still required to enter the remaining data manually and complete the spreadsheet. Some of the data that cannot be generated by these COBie add-ins is essential for facilities managers' day-to-day activities, such as job sheets that include preventive maintenance schedules. To facilitate seamless data transfer between BIM models and facilities management systems, we developed a framework that automatically populates an external web database with data extracted directly from BIM models and then enables different stakeholders to access that database and enter the remaining required asset data, producing a rich COBie spreadsheet that contains most of the asset data needed for efficient facilities management operations. The proposed framework is part of ongoing research and will be demonstrated and validated on a typical university building. Moreover, it supplements the existing body of knowledge in the facilities management domain by providing a novel framework that facilitates seamless data transfer between BIM models and facilities management systems.
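
The core mapping step, flattening BIM-extracted component records into COBie-style rows, can be sketched as below. The input dictionaries and the extraction step are hypothetical placeholders (a real pipeline would pull these fields from the BIM model, e.g., via IFC), and the column headings merely resemble the COBie Component worksheet; this is not the authors' framework code.

```python
# Minimal sketch of the BIM-to-COBie mapping step: flatten extracted BIM
# component records into rows resembling the COBie "Component" worksheet.
# The input records and extraction step are hypothetical.
import csv

bim_components = [  # placeholder records standing in for BIM-extracted data
    {"name": "AHU-01", "type": "Air Handling Unit", "space": "Plant Room 1",
     "serial": "SN-0001", "installed": "2017-05-12"},
    {"name": "P-02", "type": "Circulation Pump", "space": "Plant Room 1",
     "serial": "SN-0002", "installed": "2017-06-03"},
]

with open("cobie_component.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Name", "TypeName", "Space", "SerialNumber", "InstallationDate"])
    for c in bim_components:
        writer.writerow([c["name"], c["type"], c["space"], c["serial"], c["installed"]])
```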

Keywords: building information modeling, BIM, facilities management systems, interoperability, information management

Procedia PDF Downloads 87
36 Efficacy and Safety of Electrical Vestibular Stimulation on Adults with Symptoms of Insomnia: A Double-Blind, Randomized, Sham-Controlled Trial

Authors: Teris Cheung, Joyce Yuen Ting Lam, Kwan Hin Fong, Calvin Pak-Wing Cheng, Julie Sittlington, Yu-Tao Xiang, Tim Man Ho Li

Abstract:

Insomnia is one of the most common health problems in the general population. Insomnia can be acute or intermittent and can become chronic, often owing to comorbidity with other physical and mental health conditions. Although conventional pharmaceutical and psychotherapeutic treatments exist for symptoms of insomnia, there has been no robust randomized controlled trial (RCT) of transdermal neurostimulation in individuals with insomnia symptoms. This gave us the impetus to execute the first nationwide RCT. Aim: To evaluate the efficacy of electrical vestibular stimulation (VeNS) on individuals with insomnia in Hong Kong. Design: This study was a two-armed, double-blind, randomized, sham-controlled trial. Sampling: 60 community-dwelling adults aged between 18 and 60 years with at least moderate insomnia symptoms (Insomnia Severity Index > 14) were recruited. All subjects were randomized by computer into either the active VeNS group or the sham VeNS group in a 1:1 ratio. Intervention: All participants received a home-use VeNS device and completed 30-minute VeNS sessions on five consecutive days per week across a 4-week period (total treatment hours: 10). Baseline measurements and post-VeNS evaluations of the psychological outcomes, including (1) insomnia severity, (2) sleep quality, and (3) quality of life, were investigated. The short- and long-term sustainability of the VeNS intervention was assessed immediately post-stimulation and at 1-month and 3-month follow-ups. Data analysis: A generalized estimating equations (GEE) model was used to analyze the repeated-measures data, and missing data were handled by multiple imputation. The level of significance was set to p < 0.05. Significance of the study: This is the first trial to examine the efficacy and safety of VeNS among adults with insomnia symptoms in Hong Kong. The findings will be used to determine whether this VeNS device can be considered a self-help technological device for reducing the severity of insomnia in community settings and thereby reducing the global disease burden. Clinical Trial Registration: ClinicalTrials.gov, identifier: NCT04452981.
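
A GEE analysis of such repeated-measures data can be sketched as below using statsmodels. The data frame, file name, and column names (subject, group, time, isi) are hypothetical placeholders, not the trial's dataset or analysis script.

```python
# Generic sketch of a GEE analysis of repeated ISI scores (baseline, post,
# 1-month, 3-month) by treatment arm. Column names and the input file are
# hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: one row per subject per time point, with columns
#   subject (id), group ("active"/"sham"), time (weeks), isi (score)
df = pd.read_csv("vens_trial_long.csv")  # hypothetical long-format file

model = smf.gee(
    "isi ~ time * C(group)",           # group-by-time interaction = treatment effect
    groups="subject",                  # repeated measures clustered within subject
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
result = model.fit()
print(result.summary())
```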

Keywords: adults, insomnia, neuromodulation, rct, vestibular stimulation

Procedia PDF Downloads 50