Search results for: incomplete data
25185 An Integrated Cognitive Performance Evaluation Framework for Urban Search and Rescue Applications
Authors: Antonio D. Lee, Steven X. Jiang
Abstract:
A variety of techniques and methods are available to evaluate cognitive performance in Urban Search and Rescue (USAR) applications. However, traditional cognitive performance evaluation techniques typically incorporate either the conscious or systematic aspect, failing to take into consideration the subconscious or intuitive aspect. This leads to incomplete measures and produces ineffective designs. In order to fill the gaps in past research, this study developed a theoretical framework to facilitate the integration of situation awareness (SA) and intuitive pattern recognition (IPR) to enhance the cognitive performance representation in USAR applications. This framework provides guidance to integrate both SA and IPR in order to evaluate the cognitive performance of the USAR responders. The application of this framework will help improve the system design.
Keywords: cognitive performance, intuitive pattern recognition, situation awareness, urban search and rescue
Procedia PDF Downloads 328
25184 Intelligent Control of Bioprocesses: A Software Application
Authors: Mihai Caramihai, Dan Vasilescu
Abstract:
The main research objective of the experimental bioprocess analyzed in this paper was to obtain large biomass quantities. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract and sodium chloride. The reactor was equipped with pH, temperature, dissolved oxygen, and agitation controllers. The operating parameters were 37 °C, 1.2 atm, 250 rpm and an air flow rate of 15 L/min. The main objective of this paper is to present a case study demonstrating that intelligent control, which describes the complexity of the biological process in a qualitative and subjective manner as perceived by the human operator, is an efficient control strategy for this kind of bioprocess. In order to simulate the bioprocess evolution, an intelligent control structure based on fuzzy logic has been designed. The specific objective is to present a fuzzy control approach based on human experts' rules versus a modeling approach of cell growth based on bioprocess experimental data. Kinetic modeling can represent only a small part of overall biosystem behavior, whereas a fuzzy control system (FCS) can handle incomplete and uncertain information about the process, assuring high control performance, and provides an alternative solution to non-linear control as it is closer to the real world. Due to the high degree of non-linearity and time variance of bioprocesses, the need for a control mechanism arises. BIOSIM, an originally developed software package, implements such a control structure. The simulation study showed that the fuzzy technique is quite appropriate for this non-linear, time-varying system compared with the classical control method based on an a priori model.
Keywords: intelligent, control, fuzzy model, bioprocess optimization
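For illustration only, a minimal Python sketch of a rule-based fuzzy controller of the kind described above; the membership functions, rule base, and variable names are assumptions for demonstration, not the authors' BIOSIM design.

```python
# Illustrative Sugeno-style fuzzy controller for a bioprocess variable.
# Membership functions and rules are assumptions for demonstration only.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_feed_rate(do_error):
    """Map dissolved-oxygen error (% of setpoint) to a feed-rate correction (L/h)."""
    # Fuzzify the input into three linguistic terms.
    negative = tri(do_error, -30.0, -15.0, 0.0)
    zero     = tri(do_error, -10.0, 0.0, 10.0)
    positive = tri(do_error, 0.0, 15.0, 30.0)

    # Rule base (expert rules): each rule maps a term to a crisp correction.
    rules = [(negative, -0.5),   # DO below setpoint -> reduce feed
             (zero,      0.0),   # DO on target      -> hold feed
             (positive, +0.5)]   # DO above setpoint -> increase feed

    # Weighted-average defuzzification.
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

if __name__ == "__main__":
    for e in (-20.0, -5.0, 0.0, 8.0, 25.0):
        print(f"DO error {e:+6.1f}%  ->  feed correction {fuzzy_feed_rate(e):+.3f} L/h")
```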
Procedia PDF Downloads 326
25183 A Case Study of Alkali-Silica Reaction Induced Consistent Damage and Strength Degradation Evaluation in a Textile Mill Building Due to Slow-Reactive Aggregates
Authors: Ahsan R. Khokhar, Fizza Hassan
Abstract:
Alkali-Silica Reaction (ASR) has been recognized as a potential cause of concrete degradation around the world since the 1940s. In Pakistan, mega hydropower structures such as dams and weirs constructed from aggregates extracted from a local riverbed exhibited different levels of alkali-silica reactivity over an extended service period. The concrete expansion potential of such aggregates has been categorized as slow-reactive. Apart from hydropower structures, ASR has been identified in the concrete structural elements of a textile mill building which used aggregates extracted from the nearby riverbed. The original structure of the textile mill was erected in the 80s, with a textile ‘sizing and wrapping’ hall added in the 90s. In the years that followed, intensive spalling was observed in the structural members of the subject hall; enough to threaten the overall stability of the building. Limitations such as incomplete building data posed hurdles during the detailed structural investigation. The paper lists observations made while assessing the extent of damage and its effect on the building hall structure. Core testing and petrographic tests were carried out as per the ASTM standards for strength degradation analysis, followed by identification of its root cause. Results confirmed significant structural strength reduction because of ASR, which necessitated the formulation of an immediate re-strengthening solution. The paper also discusses the possible tracks of rehabilitative measures which are being adopted to stabilize the structure and arrest further concrete expansion.
Keywords: Alkali-Silica Reaction (ASR), concrete strength degradation, damage assessment, damage evaluation
Procedia PDF Downloads 129
25182 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analysing DNA microarray data sets is a great challenge for bioinformaticians due to the complexity of the statistical and machine learning techniques involved. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing data. One of the most important data analysis processes on a microarray data set is feature selection. This process finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
Keywords: DNA microarray, feature selection, missing data, bioinformatics
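As an illustrative sketch of such a workflow (not the paper's specific technique), missing expression values can be imputed before univariate gene selection; the KNN imputer, ANOVA F-test selector, and synthetic data below are generic stand-ins.

```python
# Illustrative pipeline: impute missing microarray values, then select top genes.
# KNNImputer and ANOVA F-test selection are generic stand-ins, not the paper's exact method.
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))          # 60 samples x 500 genes (synthetic)
y = rng.integers(0, 2, size=60)         # disease / control labels
X[rng.random(X.shape) < 0.05] = np.nan  # ~5% missing expression values

X_imputed = KNNImputer(n_neighbors=5).fit_transform(X)     # fill missing values
selector = SelectKBest(score_func=f_classif, k=20).fit(X_imputed, y)
top_genes = selector.get_support(indices=True)             # indices of selected genes
print("Selected gene indices:", top_genes)
```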
Procedia PDF Downloads 574
25181 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework
Authors: Lutful Karim, Mohammed S. Al-kahtani
Abstract:
Sensors are being used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data is significantly important in designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
Keywords: big data, clustering, tree topology, data aggregation, sensor networks
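A minimal Python sketch of priority-based aggregation at the sensor layer; the priority classes, threshold, and averaging rule are illustrative assumptions, not the PDDA specification.

```python
# Illustrative priority-based aggregation at the lowest (sensor) layer.
# High-priority readings are forwarded immediately; low-priority ones are averaged.
# Priority classes and the averaging rule are assumptions for demonstration only.
from statistics import mean

def aggregate(readings, high_threshold=50.0):
    """readings: list of (sensor_id, value). Returns packets to transmit."""
    urgent = [(sid, v) for sid, v in readings if v >= high_threshold]
    routine = [v for sid, v in readings if v < high_threshold]

    packets = [{"type": "alert", "sensor": sid, "value": v} for sid, v in urgent]
    if routine:  # many routine readings collapse into one summary packet
        packets.append({"type": "summary", "mean": round(mean(routine), 2),
                        "count": len(routine)})
    return packets

if __name__ == "__main__":
    window = [("s1", 21.4), ("s2", 22.0), ("s3", 63.8), ("s4", 20.9)]
    for p in aggregate(window):
        print(p)
```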
Procedia PDF Downloads 346
25180 A Study on the Measurement of Spatial Mismatch and the Influencing Factors of “Job-Housing” in Affordable Housing from the Perspective of Commuting
Authors: Daijun Chen
Abstract:
Affordable housing is subsidized by the government to meet the housing demand of low- and middle-income urban residents in the process of urbanization and to alleviate the housing inequality caused by market-based housing reforms. It is recognized that the construction of subsidized housing has improved the living conditions of its beneficiaries. However, affordable housing is mostly located in the suburbs, where the surrounding urban functions and infrastructure are incomplete, resulting in a "jobs-housing" spatial mismatch. The main reason for this problem is that residents of affordable housing are more sensitive to the spatial location of their residence, yet their ability to choose and control that location is relatively weak, which leads to higher commuting costs; their real cost of living has not been effectively reduced. In this regard, 92 subsidized housing communities in Nanjing, China, are selected as the research sample in this paper. The residents of the affordable housing and their spatio-temporal commuting behavior are identified based on LBS (location-based service) data. Based on spatial mismatch theory, spatial mismatch indicators such as commuting distance and commuting time are established to measure the degree of spatial mismatch of subsidized housing in different districts of Nanjing. Furthermore, a geographically weighted regression model is used to analyze the influencing factors of the spatial mismatch of affordable housing in terms of the provision of employment opportunities, traffic accessibility and supporting service facilities, using spatial, functional and other multi-source spatio-temporal big data. The results show that the spatial mismatch of affordable housing in Nanjing generally presents a "concentric circle" pattern, decreasing from the central urban area to the periphery. The factors affecting the spatial mismatch of affordable housing differ across spatial zones; the main ones are the number of enterprises within 1 km of the affordable housing district and the shortest distance to a subway station, while low spatial mismatch is associated with a diversity of services and facilities. Based on this, a spatial optimization strategy for different levels of spatial mismatch in subsidized housing is proposed, along with feasible suggestions for the future site selection of subsidized housing. The study hopes to avoid or mitigate the impact of "spatial mismatch," promote the "spatial adaptation" of "jobs-housing," and truly improve the overall welfare level of affordable housing residents.
Keywords: affordable housing, spatial mismatch, commuting characteristics, spatial adaptation, welfare benefits
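A simplified Python sketch of a commuting-based spatial mismatch indicator; the column names, weights, and min-max normalisation are assumptions, and the geographically weighted regression stage used in the paper is not reproduced here.

```python
# Illustrative computation of a commuting-based spatial mismatch indicator per community.
# Column names, weights, and the min-max normalisation are assumptions; the GWR stage is omitted.
import pandas as pd

communities = pd.DataFrame({
    "community": ["A", "B", "C"],
    "mean_commute_km": [6.2, 14.8, 22.5],     # from LBS trajectories (synthetic)
    "mean_commute_min": [24.0, 47.0, 68.0],
})

def minmax(s):
    return (s - s.min()) / (s.max() - s.min())

# Equal-weight composite of normalised distance and time; higher = worse mismatch.
communities["mismatch_index"] = 0.5 * minmax(communities["mean_commute_km"]) \
                              + 0.5 * minmax(communities["mean_commute_min"])
print(communities.sort_values("mismatch_index", ascending=False))
```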
Procedia PDF Downloads 109
25179 Challenges and Future Prospects of Teaching English in Secondary Schools of Jharkhand Board: An Extensive Survey of the Present Status
Authors: Neha Toppo
Abstract:
Plans and programs for successful secondary education are incomplete without the inclusion of teaching English as an important area. Even after sixteen years of the formation of Jharkhand as a separate state, students are still struggling to achieve quality education in English. This paper intends to account for the present condition of teaching English in Jharkhand board secondary level schools through discussion of various issues of English language teaching, the language needs and the learning challenges of its students. The study analyzes whether the learning environment, teaching methods and materials, teaching resources, and goals of the language curriculum are appropriately convincing for the students of the board or need to be reanalyzed, and also provides appropriate suggestions for improvement. Immediate attention must be drawn towards the problem for the benefit of those students who, despite their knowledge and talent, are lagging behind in numerous fields only due to a lack of proficiency in English. The data and discussion provided are based on a survey in which semi-structured interviews were conducted with teachers, students and administrators in several schools covering both rural and urban areas. A questionnaire, observation and testing were used as important tools. The survey was conducted in Ranchi district, as it covers a large geographical area that includes a number of villages as well as several towns. The district is home primarily to tribal communities as well as people of different social and economic strata, including immigrants from across and outside Jharkhand. The observation makes it clear that English language teaching at the state board is not complementing its context, and the whole language teaching system should be re-examined to establish a learner-oriented environment.
Keywords: material, method, secondary level, teaching resources
Procedia PDF Downloads 562
25178 External Validation of Risk Prediction Score for Candidemia in Critically Ill Patients: A Retrospective Observational Study
Authors: Nurul Mazni Abdullah, Saw Kian Cheah, Raha Abdul Rahman, Qurratu 'Aini Musthafa
Abstract:
Purpose: Candidemia is associated with high mortality in critically ill patients. Early candidemia prediction is imperative for preemptive antifungal treatment. This study aimed to externally validate the candidemia risk prediction score by Jameran et al. (2021), which is based on the risk factors of acute kidney injury, renal replacement therapy, parenteral nutrition, and multifocal candida colonization. Methods: This single-center, retrospective observational study included all critically ill patients admitted to the intensive care unit (ICU) in a tertiary referral center from January 2018 to December 2023. The study evaluated the performance of the candidemia risk prediction score by analyzing the occurrence of candidemia within the study period. Patients' demographic characteristics, comorbidities, SOFA scores, and ICU outcomes were analyzed. Patients who were diagnosed with candidemia before ICU admission were excluded. Results: A total of 500 patients were analyzed, with 2 dropouts due to incomplete data. Validation analysis showed that the candidemia risk prediction score has a sensitivity of 75.00% (95% CI: 59.66-86.81), specificity of 65.35% (95% CI: 60.78-69.72), positive predictive value of 17.28%, and negative predictive value of 96.44%. The incidence of candidemia was 8.86%, with no significant differences in demographics and comorbidities except higher SOFA scores in the candidemia group. The candidemia group showed significantly longer ICU and hospital length of stay and higher ICU and in-hospital mortality. Conclusion: This study concluded that the candidemia risk prediction score by Jameran et al. (2021) had good sensitivity and a high negative predictive value.
Keywords: candidemia, intensive care, clinical prediction rule, incidence
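The reported measures follow directly from a 2x2 table of score result versus observed candidemia; the Python sketch below shows the standard derivations, with cell counts back-calculated only approximately from the abstract and therefore purely illustrative.

```python
# How the validation metrics are derived from a 2x2 table of score vs. outcome.
# The cell counts below are approximate back-calculations for illustration only,
# not the study's exact data.
tp, fn = 33, 11      # candidemia patients flagged / missed by the score
fp, tn = 157, 297    # non-candidemia patients flagged / correctly cleared

sensitivity = tp / (tp + fn)          # true positive rate
specificity = tn / (tn + fp)          # true negative rate
ppv = tp / (tp + fp)                  # positive predictive value
npv = tn / (tn + fn)                  # negative predictive value

print(f"Sensitivity: {sensitivity:.2%}")
print(f"Specificity: {specificity:.2%}")
print(f"PPV: {ppv:.2%}  NPV: {npv:.2%}")
```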
Procedia PDF Downloads 8
25177 Topic-to-Essay Generation with Event Element Constraints
Authors: Yufen Qin
Abstract:
Topic-to-essay generation is a challenging task in natural language processing which aims to generate novel, diverse, and topic-related text based on user input. Previous research has overlooked the generation of articles under the constraints of event elements, resulting in issues such as incomplete event elements and logical inconsistencies in the generated results. To fill this gap, this paper proposes an event-constrained approach to topic-to-essay generation that enforces the completeness of event elements during the generation process. Additionally, a language model is employed to verify the logical consistency of the generated results. Experimental results demonstrate that the proposed model achieves a better BLEU-2 score and performs better than the baseline in terms of subjective evaluation on a real dataset, indicating its capability to generate higher-quality topic-related text.
Keywords: event element, language model, natural language processing, topic-to-essay generation
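For reference, the BLEU-2 metric cited above can be computed as in the Python sketch below, which uses NLTK's implementation with bigram weighting; the example sentences are invented and the paper's own evaluation pipeline is not reproduced.

```python
# Computing BLEU-2 (unigram + bigram precision) for a generated essay fragment.
# Sentences are invented; the paper's own evaluation pipeline is not reproduced.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["the", "flood", "forced", "residents", "to", "evacuate", "the", "town"]]
candidate = ["the", "flood", "forced", "many", "residents", "to", "leave", "the", "town"]

bleu2 = sentence_bleu(reference, candidate,
                      weights=(0.5, 0.5),   # BLEU-2: geometric mean of 1- and 2-gram precision
                      smoothing_function=SmoothingFunction().method1)
print(f"BLEU-2: {bleu2:.3f}")
```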
Procedia PDF Downloads 236
25176 From the Recursive Definition of Refutability to the Invalidity of Gödel’s 1931 Incompleteness
Authors: Paola Cattabriga
Abstract:
According to Gödel’s first incompleteness argument, it is possible to construct a formally undecidable proposition in Principia Mathematica, a statement that, although true, turns out to be neither provable nor refutable in the system, thereby making any formal system suitable for the arithmetic of integers incomplete. Its features and limiting effects are today widespread basics throughout scientific thought. This article brings Gödel’s achievement into question through the definition of the refutability predicate as a number-theoretical statement. We develop a proof of the invalidity of Theorem VI in Gödel’s 1931 paper, the so-called first incompleteness theorem, in two steps: defining refutability within the same recursive status as provability, and showing that, as a consequence, propositions (15) and (16), derived from definition 8.1 in Gödel’s 1931 paper, are false and unacceptable for the system. Establishing their falsity blocks the derivation of Theorem VI, which therefore turns out to be invalid, together with all the theorems depending on it. This article thus opens up new perspectives for mathematical research and for overall scientific reasoning.
Keywords: Gödel numbering, incompleteness, provability predicate, refutability predicate
Procedia PDF Downloads 188
25175 Generating Music with More Refined Emotions
Authors: Shao-Di Feng, Von-Wun Soo
Abstract:
To generate symbolic music with specific emotions is a challenging task because symbolic music datasets with emotion labels are scarce and incomplete. This research aims to generate more refined emotions based on training datasets that are only labeled with the four quadrants of Russell's 2D emotion model. We draw on the theory of Music Fadernet, map arousal and valence to low-level attributes, and build a symbolic music generation model by combining a transformer and a GM-VAE. We adopt an in-attention mechanism for the model and improve it by allowing modulation by conditional information. We show that the music generation model can control the generation of music according to the emotions specified by users in terms of high-level linguistic expressions, by manipulating the corresponding low-level musical attributes. Finally, we evaluate the model performance using a pre-trained emotion classifier against a pop piano MIDI dataset called EMOPIA, and by subjective listening evaluation we demonstrate that the model can generate music with more refined emotions correctly.
Keywords: music generation, music emotion controlling, deep learning, semi-supervised learning
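As a purely illustrative sketch of the control idea (not the authors' learned model), mapping a Russell-quadrant emotion label to low-level attribute "faders" might look like this in Python; the attribute names and values are assumptions.

```python
# Illustrative mapping from Russell 2D quadrant labels to low-level music attributes
# ("faders"). Attribute names and values are assumptions, not the paper's learned mapping.
QUADRANT_FADERS = {
    "high_valence_high_arousal": {"tempo": 0.9, "note_density": 0.8, "mode_major": 1.0},
    "high_valence_low_arousal":  {"tempo": 0.4, "note_density": 0.3, "mode_major": 1.0},
    "low_valence_high_arousal":  {"tempo": 0.8, "note_density": 0.7, "mode_major": 0.0},
    "low_valence_low_arousal":   {"tempo": 0.3, "note_density": 0.2, "mode_major": 0.0},
}

def faders_for(valence: float, arousal: float) -> dict:
    """Map continuous valence/arousal in [0, 1] to the nearest quadrant's fader settings."""
    key = (("high" if valence >= 0.5 else "low") + "_valence_" +
           ("high" if arousal >= 0.5 else "low") + "_arousal")
    return QUADRANT_FADERS[key]

print(faders_for(valence=0.8, arousal=0.2))   # e.g. calm, happy music settings
```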
Procedia PDF Downloads 89
25174 Research on the Construction of Fair Use of Copyright and Compensation System for Artificial Intelligence Creation
Authors: Shen Xiaoyun
Abstract:
AI-generated works inevitably intersect with right holders' works and thus have a certain impact on the rights and interests attached to those works. The law needs to explore and improve the regulation of the fair use of AI creations and build a compensation system adapted to the development of the times. The development of AI technology has brought about problems such as the unclear relationship between fair use and infringement of copyright, unclear general terms and conditions of application, and incomplete criteria for judging at different stages. Through different theoretical methods, the legitimacy of the fair use system can be demonstrated. The compensation standard for fair use of copyright in AI creation can refer to the market pricing of the right holder's work; a formula for the amount of damages for AI copyright infringement can then be constructed, with the compensation standard based on the main factors affecting the market value of the work, so as to provide a reference for the construction of a compensation system for fair use of works generated by AI.
Keywords: artificial intelligence, creative acts, fair use of copyright, copyright compensation system
Procedia PDF Downloads 23
25173 Long-Term Cohort of Patients with Beta Thalassemia; Prevailing Role of Serum Ferritin Levels in Hypocalcemia and Growth Retardation
Authors: Shervin Rashidinia, Sara Shahmoradi, Seyyed Shahin Eftekhari, Mohsen Talebizadeh, Mohammad Saleh Sadeghi
Abstract:
Background: Beta-thalassemia Major (BTM) is a hereditary hemolytic anemia that depends on regular monthly blood transfusion. However, iron deposition in the organs leads to multi-organ damage. The present study is the first to evaluate the average five-year serum ferritin level and compare it with the prevalence of short stature and hypocalcemia. Materials/Methods: In this cross-sectional retrospective study, a total of 140 patients with beta-thalassemia who were referred to Qom Thalassemia Clinic between February 2011 and July 2016 were reviewed. The exclusion criteria were incomplete medical records, diagnosis less than two years earlier, and blood transfusion less frequently than every 4 weeks. Data including age, gender, weight, height, age of initial blood transfusion, age of initial chelation therapy, ferritin, and calcium were collected and analyzed with SPSS version 24. Results: A total of 140 patients were enrolled. Of them, 75 (53.4%) were female. The mean age of the patients was 13.4±4.6 years. The mean age at initial diagnosis was 20.2±7.4 months. Hypocalcemia and short stature occurred in 41 (29.3%) and 37 (26.4%) patients, respectively. The mean five-year serum ferritin level was significantly higher in the patients with short stature and hypocalcemia (P<0.0001). Moreover, a rise in serum ferritin level significantly increases the risk of short stature and hypocalcemia (1.0004- and 1.0029-fold, respectively). Conclusion: We demonstrated that the prevalence of short stature and hypocalcemia was high in BTM patients. Moreover, ferritin significantly increases the risk of short stature and hypocalcemia.
Keywords: beta-thalassemia, ferritin, growth retardation, hypocalcemia
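The reported per-unit risk increases correspond to odds ratios from a logistic regression of each outcome on serum ferritin; the Python sketch below uses statsmodels with synthetic data and is a generic illustration, not the study's SPSS analysis.

```python
# Illustrative logistic regression of hypocalcemia on mean serum ferritin.
# Data are synthetic; the odds ratio plays the role of the "per-unit increase in risk".
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
ferritin = rng.normal(2500, 900, size=140)                 # ng/mL (synthetic)
logit_p = -4.0 + 0.0012 * ferritin                         # assumed true relationship
hypocalcemia = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = sm.Logit(hypocalcemia, sm.add_constant(ferritin)).fit(disp=False)
odds_ratio = np.exp(model.params[1])                       # per 1 ng/mL increase in ferritin
print(f"Odds ratio per unit ferritin: {odds_ratio:.4f}")
```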
Procedia PDF Downloads 328
25172 Control the Flow of Big Data
Authors: Shizra Waris, Saleem Akhtar
Abstract:
Big data is a research area receiving attention from academia and the IT community. In the digital world, the amount of data produced and stored has grown enormously within a short period of time. Consequently, this fast-increasing rate of data creation has led to many challenges. In this paper, we use the functionalism and structuralism paradigms to analyze the genesis of big data applications and their current trends. This paper presents a complete discussion of state-of-the-art big data technologies based on group and stream data processing. Moreover, the strengths and weaknesses of these technologies are analyzed. This study also covers big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies, based on important limitations, are also investigated. Emerging technologies are suggested as a solution for big data problems.
Keywords: computer, it community, industry, big data
Procedia PDF Downloads 194
25171 Evaluating the Implementation of a Quality Management System in the COVID-19 Diagnostic Laboratory of a Tertiary Care Hospital in Delhi
Authors: Sukriti Sabharwal, Sonali Bhattar, Shikhar Saxena
Abstract:
Introduction: The COVID-19 molecular diagnostic laboratory is the cornerstone of COVID-19 diagnosis, as the patient's treatment and management protocol depends on the molecular results. For this purpose, it is extremely important that the laboratory producing these results adheres to quality management processes to increase the accuracy and validity of the reports generated. We started our own molecular diagnostic setup at the onset of the pandemic and therefore conducted this study to generate our quality management data to help us improve on our weak points. Materials and Methods: A total of 14561 samples were evaluated by the retrospective observational method. The quality variables analysed were classified into pre-analytical, analytical, and post-analytical variables, and the results are presented as percentages. Results: Among the pre-analytical variables, sample leaking was the most common cause of rejection of samples (134/14561, 0.92%), followed by non-generation of SRF ID (76/14561, 0.52%) and non-compliance with triple packaging (44/14561, 0.3%). The other pre-analytical aspects assessed were incomplete patient identification (17/14561, 0.11%), insufficient quantity of sample (12/14561, 0.08%), missing forms/samples (7/14561, 0.04%), samples in the wrong vials/empty VTM tubes (5/14561, 0.03%) and LIMS entry not done (2/14561, 0.01%). We were unable to obtain internal quality control in 0.37% of samples (55/14561). We also experienced two incidents of cross-contamination among the samples, resulting in false-positive results. Among the post-analytical factors, a total of 0.07% of samples (11/14561) could not be dispatched within the stipulated time frame. Conclusion: Adherence to quality control processes is paramount for the smooth running of any diagnostic laboratory, especially those involved in critical reporting. Not only do the indicators help keep laboratory parameters in check, but they also allow comparison with other laboratories.
Keywords: laboratory quality management, COVID-19, molecular diagnostics, healthcare
Procedia PDF Downloads 163
25170 Investigation into Shopping Tourist Satisfaction: An Application of Shopping Values
Authors: Miju Choi
Abstract:
Shopping tourism is an emerging concept in tourism research, even though shopping itself is not a novel idea. Tourists have long performed shopping activities, such as purchasing authentic handicrafts and souvenirs, as part of a pleasant tourism experience. Some scholars regard shopping as one of the oldest tourist activities and stress that a trip is incomplete without shopping. Others assert that shopping is inseparable from other activities in tourist destinations and may in fact be considered a main purpose for travel. In other words, shopping is no longer regarded as merely an incidental tourist activity, indicating its potential as a primary travel motivation. The current study investigates the personal shopping values of tourists and their satisfaction levels. Via convenience sampling, 230 samples were collected. The software packages SPSS Statistics 20.0 and AMOS 20.0 were used for statistical analysis. Findings showed that both hedonic and utilitarian values positively influence tourist satisfaction and positive word of mouth. Therefore, this research deepens understanding of tourist behavior in the context of shopping tourism research.
Keywords: shopping tourism, hedonic value, utilitarian value, tourist satisfaction
Procedia PDF Downloads 433
25169 High Performance Computing and Big Data Analytics
Authors: Branci Sarra, Branci Saadia
Abstract:
Because of the rapid growth of data, many computer science tools have been developed to process and analyze these Big Data. High-performance computing architectures have been designed to meet the processing needs of Big Data (from the standpoint of transaction processing as well as strategic and tactical analytics). The purpose of this article is to provide a historical and global perspective on the recent trend of high-performance computing architectures, especially as it relates to analytics and data mining.
Keywords: high performance computing, HPC, big data, data analysis
Procedia PDF Downloads 520
25168 A Landscape of Research Data Repositories in Re3data.org Registry: A Case Study of Indian Repositories
Authors: Prashant Shrivastava
Abstract:
The purpose of this study is to explore the re3data.org registry to identify the registration workflow process for research data repositories. A further objective is to depict the present development of research data repositories in India. The study begins with an approach to understanding the re3data.org registry framework and schema design and then proceeds to explore the status of Indian research data repositories in the re3data.org registry. Research data repositories are gaining wider relevance due to e-research concepts. The now-available registry re3data.org is a good tool for users and researchers to identify appropriate research data repositories as per their research requirements. In the Indian environment, a compatible National Research Data Policy is the need of the hour to boost the management of research data. A registry for research data repositories is a crucial tool to discover specific information in a specific domain. Research data repositories in India have not previously been studied; both the re3data.org registry and the status of Indian research data repositories are discussed in this study.
Keywords: research data, research data repositories, research data registry, re3data.org
Procedia PDF Downloads 324
25167 Effect of Globalization on Flow Performance in Godean Jathilan Pranesa Yogyakarta
Authors: Maria Armalita Tumimbang
Abstract:
Jathilan or Kuda Lumping is a dance-drama with warfare as its main theme, the dancers mimicking mighty horsemen armed with swords in the middle of the battlefield. However, to most people this dance-drama is more closely identified with magically nuanced dance and trance, besides the attractive and even dangerous acts of the dancers, such as eating shards of broken glass in a state of trance. Several musicians play the accompaniment, made up of an incomplete gamelan set that includes saron, kendang, gong, and kempul. In general, it remains unchanged with regard to the seemingly monotonous beat and occasional “bumps” that may lead the dancers into a trance state. The dances performed also tend to follow repetitive patterns. The development of Jathilan and other traditional art performances in this era of globalization and industrialization can be divided into two: firstly, they are subjected to the power of industrialization, which means their performances are recorded for commercial purposes, and secondly, they are presented in live performances. To some people, live performances are preferable, and for some, they represent a form of cultural resistance to globalization and industrialization. The present study is qualitative in nature. It aims to describe the music and performance of Jathilan in the era of globalization in Indonesia. The subject of this study is a traditional art group, Jathilan Kuda Pranesa of Godean, Yogyakarta. Data collection was conducted through interviews with the leader of the group, the dancers and music players, as well as the audience. The wave of globalization has brought strong capitalistic industrialization that renders traditional arts into mere industrial commodities tailored to the needs of the era. This very fact has made the repositioning of the traditional art performance of Jathilan a necessity; by repositioning, we mean that Jathilan should be returned to its traditional forms and functions.
Keywords: Jathilan, globalization, industrialization, music, performance
Procedia PDF Downloads 306
25166 A Study of Cloud Computing Solution for Transportation Big Data Processing
Authors: Ilgin Gökaşar, Saman Ghaffarian
Abstract:
The need for fast processing of big data on transportation ridership (e.g., smartcard data) and traffic operations (e.g., traffic detector data), which requires a lot of computational power, is incontrovertible in Intelligent Transportation Systems. Nowadays, cloud computing is an important and popular information technology solution for data processing. It enables users to process enormous amounts of data without having their own computing power. Thus, it can be a good choice for transportation big data processing as well. This paper intends to examine how cloud computing can enhance transportation big data processing by contrasting its advantages and disadvantages and discussing cloud computing features.
Keywords: big data, cloud computing, Intelligent Transportation Systems, ITS, traffic data processing
Procedia PDF Downloads 468
25165 Harmonic Data Preparation for Clustering and Classification
Authors: Ali Asheibi
Abstract:
The rapid increase in the size of databases required to store power quality monitoring data has demanded new techniques for analysing and understanding the data. One suggested technique to assist in analysis is data mining. Preparing raw data to be ready for data mining exploration takes up most of the effort and time spent in the whole data mining process. Clustering is an important technique in data mining and machine learning in which underlying and meaningful groups of data are discovered. Large amounts of harmonic data have been collected from an actual harmonic monitoring system in a distribution system in Australia over three years. This amount of acquired data makes it difficult to identify operational events that significantly impact the harmonics generated on the system. In this paper, harmonic data preparation processes for better understanding of the data are presented. Underlying classes in these data have then been identified using a clustering technique based on the Minimum Message Length (MML) method. The underlying operational information contained within the clusters can be rapidly visualised by engineers. The C5.0 algorithm was used for classification and interpretation of the generated clusters.
Keywords: data mining, harmonic data, clustering, classification
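As an illustration of the clustering stage only, prepared harmonic measurements could be grouped as in the Python sketch below; a Gaussian mixture model from scikit-learn is used as a stand-in for the MML-based method, and the feature names and data are invented.

```python
# Illustrative clustering of prepared harmonic monitoring data.
# GaussianMixture is a stand-in for the MML-based method; features and data are invented.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Synthetic features: [THD %, 5th harmonic %, 7th harmonic %] per measurement interval.
normal_ops = rng.normal([2.0, 1.2, 0.8], 0.3, size=(300, 3))
event_ops  = rng.normal([6.5, 4.0, 2.5], 0.6, size=(40, 3))
X = StandardScaler().fit_transform(np.vstack([normal_ops, event_ops]))

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)
print("Measurements per cluster:", np.bincount(labels))
```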
Procedia PDF Downloads 248
25164 Linguistic Summarization of Structured Patent Data
Authors: E. Y. Igde, S. Aydogan, F. E. Boran, D. Akay
Abstract:
Patent data have an increasingly important role in economic growth, innovation, technical advantage and business strategy, and even in competition between countries. Analyzing patent data is crucial since patents cover a large part of all the technological information of the world. In this paper, we have used the linguistic summarization technique to prove the validity of the hypotheses related to patent data stated in the literature.
Keywords: data mining, fuzzy sets, linguistic summarization, patent data
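A minimal sketch of how the degree of truth of one linguistic summary (e.g. "most patents have few claims") can be computed with a fuzzy quantifier, in the general spirit of linguistic summarization; the membership functions and data below are invented, not the paper's.

```python
# Degree of truth of the linguistic summary "most patents have few claims"
# using a fuzzy quantifier. Membership functions and data are invented.

def few_claims(n_claims):
    """Fuzzy membership of 'few claims' (1 below 10 claims, 0 above 25, linear between)."""
    if n_claims <= 10:
        return 1.0
    if n_claims >= 25:
        return 0.0
    return (25 - n_claims) / 15

def most(proportion):
    """Fuzzy quantifier 'most' (0 below 0.3, 1 above 0.8, linear between)."""
    return max(0.0, min(1.0, (proportion - 0.3) / 0.5))

claims_per_patent = [6, 9, 12, 8, 30, 11, 7, 22, 9, 10]   # synthetic patent records
proportion = sum(few_claims(c) for c in claims_per_patent) / len(claims_per_patent)
truth = most(proportion)
print(f"Truth degree of 'most patents have few claims': {truth:.2f}")
```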
Procedia PDF Downloads 272
25163 Proposal of Data Collection from Probes
Authors: M. Kebisek, L. Spendla, M. Kopcek, T. Skulavik
Abstract:
In our paper, we describe the security capabilities of data collection. Data are collected with probes located in the near and distant surroundings of the company. Considering the numerous obstacles, e.g., forests, hills and urban areas, the data collection is realized in several ways. The collection of data uses connections via wireless communication, a LAN network and a GSM network, and in certain areas data are collected by using vehicles. In order to ensure the connection to the server, most of the probes have the ability to communicate in several ways. Collected data are archived and subsequently used in supervisory applications. To ensure the collection of the required data, it is necessary to propose algorithms that will allow the probes to select a suitable communication channel.
Keywords: communication, computer network, data collection, probe
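A simple Python sketch of the kind of channel-selection logic a probe could use; the channel names, preference order, and health checks are assumptions rather than the authors' algorithm.

```python
# Illustrative fallback channel selection for a probe with several uplinks.
# Channel names, preference order, and health checks are assumptions for demonstration.
from typing import Callable, Optional

CHANNEL_PREFERENCE = ["lan", "wireless", "gsm", "vehicle_pickup"]

def select_channel(is_available: Callable[[str], bool]) -> Optional[str]:
    """Return the first healthy channel in preference order, or None if all fail."""
    for channel in CHANNEL_PREFERENCE:
        if is_available(channel):
            return channel
    return None   # data stay queued locally until a channel recovers

if __name__ == "__main__":
    # Simulated health check: LAN is down, other links are up.
    status = {"lan": False, "wireless": True, "gsm": True, "vehicle_pickup": True}
    print("Transmitting via:", select_channel(lambda ch: status[ch]))
```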
Procedia PDF Downloads 360
25162 Effects of Using a Recurrent Adverse Drug Reaction Prevention Program on Safe Use of Medicine among Patients Receiving Services at the Accident and Emergency Department of Songkhla Hospital Thailand
Authors: Thippharat Wongsilarat, Parichat tuntilanon, Chonlakan Prataksitorn
Abstract:
Recurrent adverse drug reactions cause mild to fatal harm to patients and affect not only patients but also their relatives and organizations. The objective was to compare safe use of medicine among patients before and after using the recurrent adverse drug reaction prevention program. This was quasi-experimental research with a target population of 598 patients with a drug allergy history. Data were collected through an observation form tested for its validity by three experts (IOC = 0.87) and analyzed with a descriptive statistic (percentage). The research was conducted jointly with a multidisciplinary team to analyze and determine the weak points and strong points in the recurrent adverse drug reaction prevention system during the past three years, and 546, 329, and 498 incidences, respectively, were found. Of these, 379, 279, and 302 incidences, or 69.4, 84.80, and 60.64 percent of the patients with a drug allergy history, respectively, were found to have been caused by an incomplete warning system. In addition, differences in practice in caring for patients with a drug allergy history were found that did not cover all the steps of the patient care process, especially a lack of repeated checking and a lack of communication between the multidisciplinary team members. Therefore, the recurrent adverse drug reaction prevention program was developed with complete warning points in the information technology system, a repeated checking step, and communication among related multidisciplinary team members, starting from the hospital identity card room, patient history recording officers, nurses, physicians who prescribe the drugs, and pharmacists. Included in the system were surveillance, nursing, recording, and linking the data to referring units. There were also training sessions concerning adverse drug reactions by pharmacists, monthly meetings to explain the process to practice personnel, creating a safety culture, random checking of practice, motivational encouragement, supervising, controlling, following up, and evaluating the practice. The rate of prescribing drugs to which patients were allergic was 0.08 per 1,000 prescriptions, and the incidence rate of recurrent drug reactions was 0 per 1,000 prescriptions. Surveillance of recurrent adverse drug reactions covering all service-providing points can ensure safe use of medicine for patients.
Keywords: recurrent drug, adverse reaction, safety, use of medicine
Procedia PDF Downloads 456
25161 A Review on Big Data Movement with Different Approaches
Authors: Nay Myo Sandar
Abstract:
With the growth of technologies and applications, a large amount of data has been produced at an increasing rate from various sources such as social media networks, sensor devices, and other information-serving devices. This massive, complex, and exponentially growing collection of data is called big data. Traditional database systems cannot store and process such data due to its volume and complexity. Consequently, cloud computing is a potential solution for data storage and processing since it can provide a pool of resources for servers and storage. However, moving large amounts of data to and from the cloud is a challenging issue since it can encounter high latency due to the large data size. With respect to the big data movement problem, this paper reviews the literature of previous works, discusses research issues, and identifies approaches for dealing with the problem.
Keywords: Big Data, Cloud Computing, Big Data Movement, Network Techniques
Procedia PDF Downloads 86
25160 Optimized Approach for Secure Data Sharing in Distributed Database
Authors: Ahmed Mateen, Zhu Qingsheng, Ahmad Bilal
Abstract:
In the current age of technology, information is the most precious asset of a company. Today, companies hold large amounts of data, and as the data become larger, access to the data for particular information becomes slower day by day. Faster data processing to shape data into information is the biggest issue. The major problems in distributed databases are the efficiency of data distribution and the response time of data distribution. The security of data distribution is also a big issue. For these problems, we propose a strategy that can maximize the efficiency of data distribution and also improve its response time. This technique gives better results for secure data distribution from multiple heterogeneous sources. The newly proposed technique enables companies to share data securely, efficiently and quickly.
Keywords: ER-schema, electronic record, P2P framework, API, query formulation
Procedia PDF Downloads 333
25159 Effect of Internet Addiction on Dietary Behavior and Lifestyle Characteristics among University Students
Authors: Hafsa Kamran, Asma Afreen, Zaheer Ahmed
Abstract:
Internet addiction, an emerging mental health disorder of the last two decades, is manifested by an inability to control internet use, leading to academic, social, physiological and/or psychological difficulties. The present study aimed to assess the levels of internet addiction among university students in Lahore and to explore the effects of internet addiction on their dietary behavior and lifestyle. It was an analytical cross-sectional study. Data were collected from October to December 2016 from students of four universities selected through a two-stage sampling method. The number of participants was 500, and 13 questionnaires were rejected due to incomplete information. Levels of Internet Addiction (IA) were calculated using the Young Internet Addiction Test (YIAT). Data were also collected on students' demographics, lifestyle factors and dietary behavior using a self-reported questionnaire. Data were analyzed using SPSS (version 21). The chi-square test was applied to evaluate the relationships between variables. Results of the study revealed that 10% of the population had severe internet addiction, while moderate internet addiction was present in 42%. Higher prevalence was found among males (11% vs. 8%), private sector university students (p = 0.008) and engineering students (p = 0.000). The lifestyle habits of internet addicts were of significantly poorer quality than those of normal users (p = 0.05). Internet addiction was found to be associated with less physical activity (p = 0.025), shorter duration of physical activity (p = 0.016), more disorganized sleep patterns (p = 0.023), shorter duration of sleep (p = 0.019), being more tired and sleepy in class (p = 0.033) and spending more time on the internet compared to normal users. Severe and moderate internet addicts were also found to be more overweight and obese than normal users (p = 0.000). The dietary behavior of internet addicts was significantly poorer than that of normal users. Internet addicts were found to skip breakfast more often than normal users (p = 0.039). Common reasons for meal skipping were lack of time and snacking between meals (p = 0.000). They also had increased meal sizes (p = 0.05) and a habit of snacking while using the internet (p = 0.027). Fast food (p = 0.016) and fried items (p = 0.05) were the most consumed snacks, while carbonated beverages (p = 0.019) were the most consumed beverages among internet addicts. Internet addicts were found to consume fewer than the recommended daily servings of dairy (p = 0.008) and fruits (p = 0.000) and more servings of the meat group (p = 0.025) than their non-addicted counterparts. In conclusion, this study demonstrated that internet addicts have unhealthy dietary behavior and inappropriate lifestyle habits. University students should be educated regarding the importance of a balanced diet and healthy lifestyle, which are critical for effective primary prevention of numerous chronic degenerative diseases. Furthermore, it is necessary to raise awareness concerning the adverse effects of internet addiction among youth and their parents.
Keywords: dietary behavior, internet addiction, lifestyle, university students
Procedia PDF Downloads 201
25158 Valorization of Surveillance Data and Assessment of the Sensitivity of a Surveillance System for an Infectious Disease Using a Capture-Recapture Model
Authors: Jean-Philippe Amat, Timothée Vergne, Aymeric Hans, Bénédicte Ferry, Pascal Hendrikx, Jackie Tapprest, Barbara Dufour, Agnès Leblond
Abstract:
The surveillance of infectious diseases is necessary to describe their occurrence and to help the planning, implementation and evaluation of risk mitigation activities. However, the exact number of detected cases may remain unknown when surveillance is based on serological tests, because identifying seroconversion may be difficult. Moreover, incomplete detection of cases or outbreaks is a recurrent issue in the field of disease surveillance. This study addresses these two issues. Using a viral animal disease as an example (equine viral arteritis), the goals were to establish suitable rules for identifying seroconversion in order to estimate the number of cases and outbreaks detected by a surveillance system in France between 2006 and 2013, and to assess the sensitivity of this system by estimating the total number of outbreaks that occurred during this period (including unreported outbreaks) using a capture-recapture model. Data from horses which exhibited at least one positive serological result using the viral neutralization test between 2006 and 2013 were used for analysis (n=1,645). Data consisted of the annual antibody titers and the location of the subjects (towns). A consensus among multidisciplinary experts (specialists in the disease and its laboratory diagnosis, epidemiologists) was reached to consider seroconversion as a change in antibody titer from negative to at least 32, or as a three-fold or greater increase. The number of seroconversions was counted for each town and modeled using a unilist zero-truncated binomial (ZTB) capture-recapture model with R software. The binomial denominator was the number of horses tested in each infected town. Using the defined rules, 239 cases located in 177 towns (outbreaks) were identified from 2006 to 2013. Subsequently, the sensitivity of the surveillance system was estimated as the ratio of the number of detected outbreaks to the total number of outbreaks that occurred (including unreported outbreaks) estimated using the ZTB model. The total number of outbreaks was estimated at 215 (95% credible interval CrI95%: 195-249) and the surveillance sensitivity at 82% (CrI95%: 71-91). The rules proposed for identifying seroconversion may serve future research. Such rules, adjusted to the local environment, could conceivably be applied in other countries with surveillance programs dedicated to this disease. More generally, defining ad hoc algorithms for interpreting antibody titers could be useful for other human and animal diseases and zoonoses when there is a lack of accurate information in the literature about the serological response in naturally infected subjects. This study shows how capture-recapture methods may help to estimate the sensitivity of an imperfect surveillance system and to valorize surveillance data. The sensitivity of the surveillance system for equine viral arteritis is relatively high and supports its relevance in preventing the disease from spreading.
Keywords: Bayesian inference, capture-recapture, epidemiology, equine viral arteritis, infectious disease, seroconversion, surveillance
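The sensitivity estimate rests on the unilist zero-truncated binomial model; the Python sketch below, with synthetic counts, is a maximum-likelihood analogue of the Bayesian model the authors fitted in R. It illustrates the core logic: fit the detection probability on the observed (detected) towns only, then estimate the total number of infected towns and derive sensitivity as detected over estimated total.

```python
# Maximum-likelihood sketch of a unilist zero-truncated binomial capture-recapture model.
# Synthetic data; the study itself used a Bayesian version fitted in R.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

# For each *detected* infected town: horses tested (n_i) and seroconversions found (y_i > 0).
n_tested = np.array([12, 8, 20, 15, 6, 10, 9, 25])
y_seroconv = np.array([2, 1, 3, 1, 1, 2, 1, 4])

def neg_loglik(p):
    """Zero-truncated binomial log-likelihood: condition on at least one detection per town."""
    ll = binom.logpmf(y_seroconv, n_tested, p) - np.log(1 - binom.pmf(0, n_tested, p))
    return -ll.sum()

p_hat = minimize_scalar(neg_loglik, bounds=(1e-4, 0.999), method="bounded").x

# Horvitz-Thompson estimate of the total number of infected towns (detected + undetected).
detection_prob = 1 - (1 - p_hat) ** n_tested
N_hat = np.sum(1 / detection_prob)
sensitivity = len(n_tested) / N_hat
print(f"p_hat = {p_hat:.3f}, estimated total outbreaks = {N_hat:.1f}, "
      f"surveillance sensitivity = {sensitivity:.0%}")
```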
Procedia PDF Downloads 297
25157 A Review of Current Practices in Tattooing of Colonic Lesion at Endoscopy
Authors: Dhanashree Moghe, Roberta Bullingham, Rizwan Ahmed, Tarun Singhal
Abstract:
Aim: The NHS Bowel Screening Programme recommends the use of endoscopic tattooing for suspected malignant lesions that later require surgical or endoscopic localisation, using local protocols as guidance. This is in accordance with guidance from the BSG (British Society of Gastroenterology). We used a well-recognised local protocol as a standard to audit current tattooing practice in a large district general hospital with no current local guidelines. Method: A retrospective quantitative analysis of 50 patients who underwent segmental colonic resection for cancer over a 6-month period in 2021. We reviewed historic electronic endoscopy reports, recording relevant data on tattoo indication and placement. Secondly, we carried out an anonymous survey of 16 independent lower GI endoscopists on self-reported details of their practice. Results: In our study, 28 patients (56%) had a tattoo placed at the time of their colonoscopy. Of these, only 53% (n=15) had the tattoo distal to the lesion, with the measured distance of the tattoo from the lesion documented in only 8 reports. Only seven patients (25%) had a circumferential (4-quadrant) placement of the tattoo. 13 patients had lesions in either the caecum or the rectum, locations where tattooing is deemed unnecessary as per BSG guidelines. Among the survey responses collected, four different protocols were being used to guide practice. Only 50% of respondents placed tattoos at the correct distance from the lesion, and 83% placed the correct number of tattoos. Conclusion: Our study demonstrates a lack of standardisation of practice in colonic tattooing, with incomplete compliance with our standard. Inadequate documentation of tattoo location can contribute to confusion and inaccuracy in the intraoperative localisation of lesions. This has the potential to increase operation length and morbidity. There is a need to standardise both technique and documentation in colonoscopic tattooing practice.
Keywords: colorectal cancer, endoscopic tattooing, colonoscopy, NHS BSCP
Procedia PDF Downloads 120
25156 Cadaveric Assessment of Kidney Dimensions Among Nigerians - A Preliminary Report
Authors: Rotimi Sunday Ajani, Omowumi Femi-Akinlosotu
Abstract:
Background: The usually paired human kidneys are retroperitoneal urinary organs with some endocrine functions. Standard textbooks of anatomy ascribe a single value to each of the dimensions of length, width and thickness. Research questions: These values do not take into consideration racial and genetic variability in human morphology. They may thus be misleading to students and clinicians working on Nigerians. Objectives: The study aimed at establishing reference values for kidney length, width and thickness for Nigerians using the cadaveric model. Methodology: The length, width, thickness and weight of sixty kidneys harvested from cadavers of thirty adult Nigerians (male:female, 27:3) were measured. The respective volumes were calculated using the ellipsoid formula. Results: The mean length of the kidney was 9.84±0.89 cm (9.63±0.88 {right}; 10.06±0.86 {left}), width 5.18±0.70 cm (5.21±0.72 {right}; 5.14±0.70 {left}), thickness 3.45±0.56 cm (3.36±0.58 {right}; 3.53±0.55 {left}), weight 125.06±22.34 g (122.36±21.70 {right}; 127.76±24.02 {left}) and volume 95.45±24.40 cm3 (91.73±26.84 {right}; 99.17±25.75 {left}). Discussion: Though the values of the parameters measured were higher for the left kidney (except for the width), the differences were not statistically significant. The various parameters obtained by this study differ from those of similar studies from other continents. Conclusion: Stating a single value for each of the parameters of length, width and thickness of the kidney, as currently found in textbooks of anatomy, may be incomplete information and hence misleading. Thus, there is a need to emphasize racial differences when stating the normal values of kidney dimensions in textbooks of anatomy. Implication for Research and Innovation: The results of the study showed that the dimensions of the kidney (length, width and thickness) vary between races, as they were different from those of similar studies and from the values stated in standard textbooks of human anatomy. Future direction: This is a preliminary report, and the study will continue so that more data can be obtained.
Keywords: kidney dimensions, cadaveric estimation, adult nigerians, racial differences
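The ellipsoid formula referred to above approximates each kidney as an ellipsoid whose axes are its length, width and thickness, giving V = (pi/6) x L x W x T. A small Python check against the reported mean dimensions is shown below; since a mean of products differs slightly from the product of means, the value is only approximate.

```python
# Kidney volume from the ellipsoid approximation: V = (pi/6) * length * width * thickness.
import math

def ellipsoid_volume(length_cm, width_cm, thickness_cm):
    return math.pi / 6 * length_cm * width_cm * thickness_cm

# Check against the reported mean dimensions (9.84 x 5.18 x 3.45 cm).
v = ellipsoid_volume(9.84, 5.18, 3.45)
print(f"Volume from mean dimensions: {v:.1f} cm^3 (reported mean volume: 95.45 cm^3)")
```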
Procedia PDF Downloads 99