Search results for: data standardization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25247

25127 Challenges influencing Nurse Initiated Management of Retroviral Therapy (NIMART) Implementation in Ngaka Modiri Molema District, North West Province, South Africa

Authors: Sheillah Hlamalani Mboweni, Lufuno Makhado

Abstract:

Background: The increasing number of people who tested HIV positive and who demand antiretroviral therapy (ART) prompted the National Department of Health to adopt the WHO recommendation of task shifting, under which professional nurses (PNs) initiate ART rather than doctors in hospital. This resulted in the decentralization of services to primary health care (PHC), generating a need to capacitate PNs in NIMART. After years of training, the impact of NIMART was assessed, and it was established that even though an increased number of people accessed ART, the quality of care remained a serious concern. The study aims to answer the following question: What are the challenges influencing NIMART implementation in primary health care? Objectives: This study explores challenges influencing NIMART training and implementation and makes recommendations to improve patient and HIV program outcomes. Methods: A qualitative, explorative program-evaluation research design was used. The study was conducted in the rural districts of North West province. Purposive sampling was used to sample PNs trained in NIMART. Focus group discussions (FGDs) of 6-9 participants were used to collect data, and data were analysed using ATLAS.ti. Results: Five FGDs (n = 28 PNs) and three program managers were interviewed. The study results revealed two themes: inadequacy in NIMART training and health care system challenges. Conclusion: The deficiencies in NIMART training and the health care system challenges are a public health concern, as they compromise the quality of HIV management, resulting in poor patient outcomes, and retard the goal of ending the HIV epidemic. These should be dealt with decisively by all stakeholders.
Recommendations: The National Department of Health should improve NIMART training and HIV management by standardizing the NIMART training curriculum through the involvement of all relevant stakeholders and skilled facilitators, introducing pre-service NIMART training in institutions of higher learning, supporting PNs through district and program managers, and planning how to deal with staff shortages and negative attitudes so as to ensure compliance with guidelines. There is a need to develop a conceptual framework that provides guidance and strengthens NIMART implementation in PHC facilities.

Keywords: antiretroviral therapy, nurse initiated management of retroviral therapy, primary health care, professional nurses

Procedia PDF Downloads 158
25126 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a great challenge for bioinformaticians because of the complications of applying statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection, which finds the most important genes associated with a given disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
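The general pipeline described here, filling in missing values before ranking genes, can be sketched in outline. The following is a minimal illustration using mean imputation and variance-based ranking, not the authors' exact technique; the toy matrix and the choice of ranking criterion are assumptions for illustration only:

```python
from statistics import mean, variance

def impute_mean(matrix):
    """Replace missing entries (None) in each column (gene) with the column mean."""
    cols = len(matrix[0])
    result = [row[:] for row in matrix]
    for j in range(cols):
        observed = [row[j] for row in matrix if row[j] is not None]
        col_mean = mean(observed)
        for row in result:
            if row[j] is None:
                row[j] = col_mean
    return result

def rank_features_by_variance(matrix, top_k):
    """Rank genes (columns) by variance across samples; return indices of the top_k."""
    cols = len(matrix[0])
    scores = [(variance([row[j] for row in matrix]), j) for j in range(cols)]
    scores.sort(reverse=True)
    return [j for _, j in scores[:top_k]]

# Toy expression matrix: rows = samples, columns = genes, None = missing value.
data = [[1.0, None, 5.0],
        [2.0, 0.1, 9.0],
        [3.0, 0.2, None]]
filled = impute_mean(data)       # missing entries replaced by column means
top = rank_features_by_variance(filled, 2)   # indices of the two most variable genes
```

In a real microarray setting, the imputation step would typically use a more robust method (e.g., nearest-neighbour based) and the selection criterion would be tied to the disease labels.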

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 574
25125 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework

Authors: Lutful Karim, Mohammed S. Al-kahtani

Abstract:

Sensors are used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data is significantly important when designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
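The priority-based idea can be sketched in outline: urgent readings bypass aggregation, while routine readings are buffered and averaged into a single message. This is an illustrative sketch of the general principle only, not the PDDA protocol itself; the batch size, priority labels, and sensor values are assumptions:

```python
from statistics import mean

HIGH, LOW = "high", "low"

class SensorAggregator:
    """Forward high-priority readings at once; buffer and average low-priority ones."""
    def __init__(self, batch_size):
        self.batch_size = batch_size
        self.buffer = []
        self.sent = []          # messages actually transmitted upstream

    def report(self, value, priority):
        if priority == HIGH:
            self.sent.append(value)          # urgent data bypasses aggregation
        else:
            self.buffer.append(value)
            if len(self.buffer) >= self.batch_size:
                self.sent.append(mean(self.buffer))  # one aggregated message
                self.buffer.clear()

agg = SensorAggregator(batch_size=3)
for v, p in [(20.1, LOW), (20.3, LOW), (95.0, HIGH), (20.5, LOW)]:
    agg.report(v, p)
# agg.sent now holds one urgent reading plus one averaged batch (about 20.3),
# i.e., four readings were reduced to two transmissions.
```

The energy saving in such schemes comes from the reduced number of radio transmissions; the dynamic part of PDDA (adapting the aggregation to conditions) is not modelled here.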

Keywords: big data, clustering, tree topology, data aggregation, sensor networks

Procedia PDF Downloads 345
25124 Impact Analysis of a School-Based Oral Health Program in Brazil

Authors: Fabio L. Vieira, Micaelle F. C. Lemos, Luciano C. Lemos, Rafaela S. Oliveira, Ian A. Cunha

Abstract:

Brazil has some challenges ahead related to population oral health, most of them associated with the need to expand promotion and prevention activities to the local level, offer equal access to services, and promote changes in the lifestyle of the population. The program implemented an oral health initiative in public schools in the city of Salvador, Bahia. The mission was to improve oral health among students in primary and secondary education, from 2 to 15 years old, using the school as a pathway to increase access to healthcare. The main actions consisted of team visits to the schools, with educational sessions for dental cavity prevention and individual assessment. The program incorporated a clinical surveillance component through a dental evaluation of every student, searching for dental disease and caries; standardization of the dentists' team to reach a uniform classification in the assessments; and the use of an online platform to register data directly from the schools. Subsequently, students with caries were referred for free clinical treatment at the program's health centre. The primary purpose of this study was to analyze the effects and outcomes of this school-based oral health program. The study sample comprised data from a period of 3 years (2015 to 2017) from 13 public schools on the outskirts of the city of Salvador, with a total of 9,278 assessments in this period. From the data collected, the prevalence of children with decay in permanent teeth was chosen as the most reliable indicator. The prevalence was calculated for each of the 13 schools as the number of children with 1 or more dental caries in permanent teeth divided by the total number of students assessed in that school each year. The percentage change per year was then calculated for each school.
Some schools presented a higher variation in the total number of assessments in one of the three years, so for these, the percentage change was calculated using the two years with less variation. The results show that 10 of the 13 schools presented significant improvements in the indicator of caries in permanent teeth. The mean percentage reduction in the number of students with caries across the 13 schools was 26.8%, and the median was 32.2%. The highest improvement reached a 65.6% decrease in the indicator. Three schools presented a rise in caries prevalence (increases of 8.9%, 18.9% and 37.2%) that, on an initial analysis, seems to be explained by students' cohort rotation among other schools, as well as absenteeism from treatment. In conclusion, the program shows a relevant impact on the reduction of caries in permanent teeth among students and indicates the need for the continuity and expansion of this integrated healthcare approach. The significance of the articulation between the health and educational systems has also become evident, representing a fundamental approach to improving healthcare access for children, especially in scenarios such as Brazil's.
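The indicator described above reduces to simple arithmetic; the following sketch reproduces the two calculations with illustrative counts, not the study's actual data:

```python
def prevalence(children_with_caries, total_assessed):
    """Share of assessed students with at least one carious permanent tooth."""
    return children_with_caries / total_assessed

def percent_change(before, after):
    """Relative change between two yearly prevalence values, in percent."""
    return (after - before) / before * 100

# Hypothetical school: 120 of 400 students with caries in 2015, 81 of 380 in 2017.
p_2015 = prevalence(120, 400)            # 0.30
p_2017 = prevalence(81, 380)             # about 0.213
change = percent_change(p_2015, p_2017)  # about -28.9%, i.e., a reduction
```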

Keywords: primary care, public health, oral health, school-based oral health, data management

Procedia PDF Downloads 134
25123 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements

Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker

Abstract:

Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts; complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require individualized production of complex-shaped workpieces. Due to variations between the nominal model and the actual geometry, this can lead to changes of operations in computer-aided process planning (CAPP) to keep CAPP manageable for an adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner with objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, such decisions depend on the experience-based knowledge of humans (e.g., process planners) and are therefore subjective, leading to variability in workpiece quality and potential failures in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining, and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known method for designing standardized processes; especially in domains such as the aerospace industry, standardization and certification of processes are important. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, are used for modeling and executing inspection workflows.
Each analysis step of the inspection, such as positioning of measurement data or checking of geometrical criteria, is carried out by a function block. One advantage of this approach is its flexibility to design workflows and to adapt algorithms to the application domain. In general, it is checked whether a geometrical adaptation is possible within the specified tolerance range. The development of particular function blocks is predicated on workpiece-specific information, e.g., design data. Furthermore, appropriate logics and decision criteria have to be considered for different product lifecycle phases. For example, tolerances for geometric deviations differ in type and size between new-part production and repair processes. In addition to function blocks, appropriate referencing systems are important. They need to support exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, comprising new-part production and maintenance. In both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides information for deciding between different potential adaptive machining processes.
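The event-driven function-block idea, each inspection step consuming data and forwarding a result to the next block, can be illustrated with a small sketch. This is a simplified stand-in for IEC-61499-style function blocks, not the authors' implementation; the block names, the alignment step, and the tolerance value are hypothetical:

```python
class FunctionBlock:
    """A minimal event-driven block: receives data, runs its algorithm, forwards the result."""
    def __init__(self, name, algorithm):
        self.name = name
        self.algorithm = algorithm
        self.next_block = None

    def connect(self, other):
        """Chain this block's output event to another block's input."""
        self.next_block = other
        return other

    def trigger(self, data):
        """Fire the block on incoming data and propagate the result downstream."""
        result = self.algorithm(data)
        if self.next_block is not None:
            return self.next_block.trigger(result)
        return result

# Hypothetical inspection workflow: align measurement points, then check a tolerance.
align = FunctionBlock("align", lambda pts: [p - min(pts) for p in pts])
check = FunctionBlock("check_tolerance", lambda pts: max(pts) <= 0.5)
align.connect(check)

ok = align.trigger([10.1, 10.3, 10.4])   # deviations 0.0-0.3 -> within tolerance
```

The flexibility claimed in the text corresponds here to swapping the `algorithm` of a block or rewiring `connect` without touching the rest of the workflow.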

Keywords: adaptive, CAx, function blocks, turbomachinery

Procedia PDF Downloads 297
25122 Revisiting the Donning and Doffing Procedure: Ensuring a Coordinated Practice

Authors: Deanna Ruano-Meas, Laura Shenkman

Abstract:

Variances are seen in the way healthcare personnel (HCP) don and doff PPE, risking contamination of themselves and others. By standardizing practice, variances in technique decrease, and so does the risk of contamination. To implement this change, the Model for Improvement will be used. A system change will be developed that outlines the role of organizational leaders in supporting HCP in the proper donning and doffing of PPE. Interventions will include environmental surveys to assess the safety and work situation, ensuring a permissible environment; planned audits to confirm consistency; and the assessment of PPE wear for standardization. The change will also include an educational plan involving instruction in the current guidelines recommended by the Centers for Disease Control and Prevention (CDC) for all pertinent HCP, and the incorporation of PPE education into yearly educational training. The goal is a standardized practice and a reduced risk of contamination through education and organizational support. Personal protective equipment has received renewed attention with the emergence of SARS-CoV-2; the realization that proper technique is important for decreasing the transmission of pathogens has led to the revision of current processes.

Keywords: donning and doffing, HAI, infection control, PPE

Procedia PDF Downloads 204
25121 Control the Flow of Big Data

Authors: Shizra Waris, Saleem Akhtar

Abstract:

Big data is a research area receiving attention from academia and the IT community. In the digital world, the amounts of data produced and stored have grown sharply within a short period of time. Consequently, this fast-increasing rate of data has created many challenges. In this paper, we use functionalist and structuralist paradigms to analyze the genesis of big data applications and their current trends. The paper presents a complete discussion of state-of-the-art big data technologies based on group and stream data processing, and analyzes the strengths and weaknesses of these technologies. The study also covers big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies with respect to important limitations are also investigated. Emerging technologies are suggested as a solution for big data problems.

Keywords: computer, IT community, industry, big data

Procedia PDF Downloads 194
25120 High Performance Computing and Big Data Analytics

Authors: Branci Sarra, Branci Saadia

Abstract:

Because of the explosive growth of data, many computer science tools have been developed to process and analyze these Big Data. High-performance computing architectures have been designed to meet the processing needs of Big Data, from the standpoints of transaction processing and of strategic and tactical analytics. The purpose of this article is to provide a historical and global perspective on the recent trend of high-performance computing architectures, especially as they relate to analytics and data mining.

Keywords: high performance computing, HPC, big data, data analysis

Procedia PDF Downloads 520
25119 Basic Examination of Easily Distinguishable Tactile Symbols Attached to Containers and Packaging

Authors: T. Nishimura, K. Doi, H. Fujimoto, Y. Hoshikawa, T. Wada

Abstract:

In Japan, reasonable accommodation for persons with disabilities is expected to progress further. In particular, there is an urgent need to enhance information support for visually impaired persons, who have difficulty accessing information. Recently, tactile symbols have been attached to various surfaces, such as the content labels of containers and packaging of various everyday products. The advantage of tactile symbols is that they are useful for visually impaired persons who cannot read Braille. The method of displaying tactile symbols is prescribed by the International Organization for Standardization (ISO); however, quantitative data on the shapes and dimensions of tactile symbols are insufficient. In this study, through evaluation experiments, we examine easy-to-distinguish shapes and dimensions of tactile symbols used for various applications, including the content labels on containers and packaging. Visually impaired persons who use tactile symbols on a daily basis participated in the experiments. The details and processes of the experiments were orally explained to the participants beforehand, and their informed consent was obtained. They were instructed to touch the test pieces of tactile symbols freely with both hands. These tactile symbols were selected as likely to be easily distinguishable on the content labels of the top surfaces of containers and packaging, based on a hearing survey that involved employees of an organization of visually impaired persons and a social welfare corporation, as well as academic experts in assistive technology for visually impaired persons. The participants then rated the ease of distinguishing the tactile symbols on a scale of 1 to 5 (where 1 corresponded to 'difficult to distinguish' and 5 to 'easy to distinguish'). Open-ended oral interviews were also conducted with the participants after the experiments.
This study revealed which shapes and dimensions make tactile symbols attached to containers and packaging easy to distinguish. We expect this knowledge to contribute to improving the quality of life of visually impaired persons.

Keywords: visual impairment, accessible design, tactile symbol, containers and packaging

Procedia PDF Downloads 219
25118 Bible of Hospitality: Considering the Hotel Business through the Prism of the Evangelical Approach

Authors: Rimma Kiseleva

Abstract:

The hotel business has a long history. The basis of service in hospitality-industry enterprises is the service, attitude, and consciousness of employees as hospitable 'hosts of the house'. It is generally accepted that the founder and main expert of quality service is César Ritz, 'the king of hoteliers and the hotelier of kings'. However, when one delves deeply into history, it turns out that the very first book about hospitality, about the standardization of guest-reception processes and the foundations of better service, is nothing other than the Bible. This study, which considers the Church as a hotel and the hotel business itself as the most gracious work of Jesus Christ Himself, as confirmed by verses from the Gospel, combines analytical, comparative, and empirical approaches. The study shows that it was Jesus Christ who established the rules of the most sacrificial service: real service to people, filled with brotherly love, humility, and love for strangers, the qualities that are the foundation, the 'three pillars', of the hospitality industry. It also shows that keeping an inn is a most charitable cause, one that remains relevant today.

Keywords: Augustine Aurelius, Bible, Gospel, guest house, hospitality, hotel, humility, inn, Jesus Christ, Joseph Fletcher, New Testament, Paul Tillich, service, strangeness

Procedia PDF Downloads 51
25117 A Landscape of Research Data Repositories in Re3data.org Registry: A Case Study of Indian Repositories

Authors: Prashant Shrivastava

Abstract:

The purpose of this study is to explore the re3data.org registry to identify the workflow of the research data repository registration process, and further to depict the present development of research data repositories in India. The study starts from the framework and schema design of the re3data.org registry and then proceeds to explore the status of India's research data repositories in the registry. Research data repositories are gaining wider relevance due to e-research concepts, and the re3data.org registry is a good tool for users and researchers to identify appropriate research data repositories for their research requirements. In the Indian environment, a compatible national research data policy is needed to boost the management of research data. A registry of research data repositories is a crucial tool for discovering specific information in a specific domain, yet research data repositories in India have not been studied. Both the re3data.org registry and the status of Indian research data repositories are discussed in this study.

Keywords: research data, research data repositories, research data registry, re3data.org

Procedia PDF Downloads 324
25116 A Study of Cloud Computing Solution for Transportation Big Data Processing

Authors: Ilgin Gökaşar, Saman Ghaffarian

Abstract:

The need to process big data quickly, whether transportation ridership data (e.g., smart card data) or traffic operation data (e.g., traffic detector data), which requires a lot of computational power, is incontrovertible in Intelligent Transportation Systems. Nowadays, cloud computing is one of the most important and popular information technology solutions for data processing. It enables users to process enormous amounts of data without having their own computing infrastructure. Thus, it can also be a good choice for transportation big data processing. This paper examines how cloud computing can enhance transportation big data processing by contrasting its advantages and disadvantages and discussing cloud computing features.

Keywords: big data, cloud computing, Intelligent Transportation Systems, ITS, traffic data processing

Procedia PDF Downloads 467
25115 Harmonic Data Preparation for Clustering and Classification

Authors: Ali Asheibi

Abstract:

The rapid increase in the size of the databases required to store power quality monitoring data has demanded new techniques for analysing and understanding the data. One suggested technique to assist in analysis is data mining. Preparing raw data to be ready for data mining exploration takes up most of the effort and time spent in the whole data mining process. Clustering is an important technique in data mining and machine learning in which underlying and meaningful groups of data are discovered. Large amounts of harmonic data have been collected from an actual harmonic monitoring system in a distribution system in Australia over three years. This amount of acquired data makes it difficult to identify operational events that significantly impact the harmonics generated in the system. In this paper, harmonic data preparation processes for a better understanding of the data are presented. Underlying classes in the data have then been identified using a clustering technique based on the Minimum Message Length (MML) method. The underlying operational information contained within the clusters can be rapidly visualised by engineers. The C5.0 algorithm was used for classification and interpretation of the generated clusters.
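The cluster-then-interpret pipeline can be sketched in miniature. The following uses a simple 1-D two-means clustering in place of the MML-based method and a single threshold rule in place of C5.0, so it illustrates only the shape of the pipeline, not the paper's algorithms; the distortion readings are invented:

```python
from statistics import mean

def kmeans_1d(values, iterations=20):
    """Two-cluster 1-D k-means: a simple stand-in for the MML-based clustering."""
    c0, c1 = min(values), max(values)           # initial centroids
    for _ in range(iterations):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0, c1 = mean(g0), mean(g1)             # recompute centroids
    return c0, c1

# Hypothetical total harmonic distortion (THD, %) readings from a monitor.
thd = [1.1, 1.3, 1.2, 4.8, 5.1, 1.0, 5.3]
low_centroid, high_centroid = kmeans_1d(thd)
# A threshold halfway between centroids acts as a one-rule "classifier",
# the role C5.0 plays (far more richly) in the paper.
threshold = (low_centroid + high_centroid) / 2
labels = ["high" if v > threshold else "normal" for v in thd]
```

The point of the interpretable second stage is exactly what the rule shows: the cluster structure is turned into a condition an engineer can read.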

Keywords: data mining, harmonic data, clustering, classification

Procedia PDF Downloads 247
25114 Linguistic Summarization of Structured Patent Data

Authors: E. Y. Igde, S. Aydogan, F. E. Boran, D. Akay

Abstract:

Patent data play an increasingly important role in economic growth, innovation, technical advantage, and business strategy, and even in competition between countries. Analyzing patent data is crucial, since patents cover a large part of all the technological information of the world. In this paper, we use the linguistic summarization technique to verify the validity of hypotheses about patent data stated in the literature.
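Linguistic summarization assigns a degree of truth to statements such as "most patents in the set are recent" using fuzzy quantifiers. The following is a minimal Yager-style sketch; the membership functions, the year thresholds, and the patent years are illustrative assumptions, not taken from the paper:

```python
def mu_recent(year, start=2010, full=2015):
    """Fuzzy membership in 'recent': 0 before start, 1 after full, linear between."""
    if year <= start:
        return 0.0
    if year >= full:
        return 1.0
    return (year - start) / (full - start)

def mu_most(proportion):
    """Fuzzy quantifier 'most': 0 below 0.3, 1 above 0.8, linear between."""
    if proportion <= 0.3:
        return 0.0
    if proportion >= 0.8:
        return 1.0
    return (proportion - 0.3) / 0.5

def truth_of_summary(years):
    """Degree of truth of 'most patents are recent': quantifier of mean membership."""
    avg = sum(mu_recent(y) for y in years) / len(years)
    return mu_most(avg)

patent_years = [2016, 2017, 2012, 2011, 2018, 2019]
t = truth_of_summary(patent_years)   # a value in [0, 1], close to 1 here
```

A hypothesis from the literature would then be accepted or rejected by comparing such truth degrees against a chosen threshold.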

Keywords: data mining, fuzzy sets, linguistic summarization, patent data

Procedia PDF Downloads 272
25113 Proposal of Data Collection from Probes

Authors: M. Kebisek, L. Spendla, M. Kopcek, T. Skulavik

Abstract:

In this paper, we describe the security capabilities of data collection. Data are collected by probes located in the near and distant surroundings of the company. Considering the numerous obstacles, e.g., forests, hills, and urban areas, data collection is realized in several ways: via wireless communication, the LAN network, the GSM network, and, in certain areas, by vehicles. To ensure the connection to the server, most of the probes are able to communicate in several ways. Collected data are archived and subsequently used in supervisory applications. To ensure the collection of the required data, it is necessary to propose algorithms that allow the probes to select a suitable communication channel.
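The channel-selection idea, a probe trying its communication options in order of preference, can be sketched as follows. The preference order and the availability checks are illustrative assumptions, not the authors' algorithm:

```python
# Assumed preference order for a probe's uplinks, from most to least preferred.
CHANNEL_PREFERENCE = ["wireless", "lan", "gsm", "vehicle_pickup"]

def select_channel(available):
    """Return the most preferred channel that is currently reachable."""
    for channel in CHANNEL_PREFERENCE:
        if available.get(channel, False):
            return channel
    return None   # no uplink: data stays buffered on the probe

# A probe behind a hill: wireless and LAN links are down, GSM still works.
status = {"wireless": False, "lan": False, "gsm": True, "vehicle_pickup": True}
chosen = select_channel(status)   # falls back to "gsm"
```

A real selection algorithm would also weigh cost, latency, and the security properties the paper emphasizes, not just reachability.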

Keywords: communication, computer network, data collection, probe

Procedia PDF Downloads 360
25112 A Review on Big Data Movement with Different Approaches

Authors: Nay Myo Sandar

Abstract:

With the growth of technologies and applications, a large amount of data is being produced at an increasing rate by various resources such as social media networks, sensor devices, and other information-serving devices. This large collection of massive, complex, and exponentially growing datasets is called big data. Traditional database systems cannot store and process such data because of its size and complexity. Consequently, cloud computing is a potential solution for data storage and processing, since it can provide a pool of resources for servers and storage. However, moving large amounts of data to and from the cloud is a challenging issue, since it can incur high latency. With respect to the big data movement problem, this paper reviews previous work, discusses research issues, and identifies approaches for dealing with the problem.

Keywords: big data, cloud computing, big data movement, network techniques

Procedia PDF Downloads 86
25111 Optimized Approach for Secure Data Sharing in Distributed Database

Authors: Ahmed Mateen, Zhu Qingsheng, Ahmad Bilal

Abstract:

In the current age of technology, information is the most precious asset of a company. Today, companies hold large amounts of data, and as the data grow larger, access to particular information becomes slower day by day. Processing data fast enough to shape it into information is the biggest issue. The major problems in distributed databases are the efficiency of data distribution and the response time of data distribution; the security of data distribution is also a big issue. For these problems, we propose a strategy that can maximize the efficiency of data distribution and also improve its response time. The technique gives better results for secure data distribution from multiple heterogeneous sources, and facilitates companies in sharing data securely, efficiently, and quickly.

Keywords: ER-schema, electronic record, P2P framework, API, query formulation

Procedia PDF Downloads 333
25110 5G Future Hyper-Dense Networks: An Empirical Study and Standardization Challenges

Authors: W. Hashim, H. Burok, N. Ghazaly, H. Ahmad Nasir, N. Mohamad Anas, A. F. Ismail, K. L. Yau

Abstract:

Future communication networks require devices that are able to work on a single platform but support heterogeneous operations, which leads to service diversity and functional flexibility. This paper proposes two cognitive mechanisms, termed cognitive hybrid functions, which are applied in multiple broadband user terminals in order to maintain reliable connectivity and prevent unnecessary interference. By employing such mechanisms, especially in future hyper-dense networks, we can observe their performance in terms of optimized speed and power-saving efficiency. Results were obtained from several empirical laboratory studies. It was found that selecting a reliable network showed better optimized speed performance, up to a 37% improvement compared with operation without such functions. In terms of power adjustment, this mechanism can reduce transmit power by 5 dB while maintaining the same level of throughput achieved at higher power. We also discuss the issues impacting future telecommunication standards as such devices are deployed.

Keywords: dense network, intelligent network selection, multiple networks, transmit power adjustment

Procedia PDF Downloads 376
25109 Cultural Statistics in Governance: A Comparative Analysis between the UK and Finland

Authors: Sandra Toledo

Abstract:

There is an increasing tendency in governments towards more evidence-based policy-making and stricter auditing of the public sphere. Especially when budgets are tight and taxpayers demand greater scrutiny over the use of available resources, statistics appear to be an effective tool for producing data that supports investments made, as well as for evaluating public policy performance. This pressure has not exempted the cultural and artistic fields. Finland, like the rest of the Nordic countries, has kept its welfare state principles, whilst the UK seems to be moving in the opposite direction, relying more and more on the private sector and on foundations as the state folds back. The boom of the creative industries, along with a managerial trend introduced by Thatcher in the UK, brought as a result a commodification of the arts within a market logic, where sponsorship and commercial viability were the keynotes. Finland, for its part, despite following a more protectionist approach to the arts, seems to be heading in a similar direction. Additionally, there is growing international interest in the application of cultural participation studies and in the comparability of results between countries. Nonetheless, the standardization of cultural surveys has not yet happened: there are differences in the application of these surveys not only in terms of timing and frequency, but also regarding who conducts them. Therefore, one hypothesis considered in this research is that behind the differences between countries in the application of cultural surveys and the production and utilization of cultural statistics lies the cultural policy model adopted by the government. In other words, the main goal of this research is to answer the following: What are the differences and similarities between Finland and the UK regarding the role cultural surveys have in cultural policy making?
Secondary questions include: How does the cultural policy model followed by each country influence the role of cultural surveys in cultural policy making? And what are the differences at the local level? To answer these questions, strategic cultural policy documents and interviews with key informants will be used as source data and analyzed using content analysis methods. Cultural statistics per se will not be compared; instead, their use as instruments of governing, and its relation to the cultural policy model, will be. Aspects such as the execution of cultural surveys, funding, periodicity, and the use of statistics in formal reports and publications will be studied in the written documents, while the interviews will target other elements, such as the perceptions of those involved in collecting cultural statistics or in policy making, the distribution of tasks and hierarchies among cultural and statistical institutions, and a general overview. A limitation identified beforehand, which the study expects to encounter throughout the process, is the language barrier in the case of Finland when it comes to official documents; this will be tackled by interviewing the authors of such papers and choosing key extracts of them for translation.

Keywords: Finland, cultural statistics, cultural surveys, United Kingdom

Procedia PDF Downloads 234
25108 Standardization of the Behavior Assessment System for Children-2, Parent Rating Scales - Adolescent Form (K BASC-2, PRS-A) among Korean Sample

Authors: Christine Myunghee Ahn, Sung Eun Baek, Sun Young Park

Abstract:

The purpose of this study was to evaluate the cross-cultural validity of the Korean version of the Behavioral Assessment System for Children, 2nd Edition, Parent Rating Scales - Adolescent Form (K BASC-2, PRS-A). The 150-item K BASC-2, PRS-A questionnaire was administered to a total of 690 Korean parents or caregivers (N=690) of adolescent children in middle school and high school. Results from the confirmatory and exploratory factor analyses indicate that the K BASC-2, PRS-A yielded a 3-factor solution similar to the factor structure found in the original version of the BASC-2. The internal consistencies of the composite scale scores, using Cronbach's alpha, were in the .92 to .98 range. The overall reliability and validity of the K BASC-2, PRS-A seem adequate. Structural equation modeling was used to verify the theoretical relationship among the scales of Adaptability, Withdrawal, Somatization, Depression, and Anxiety, rendering additional support for internal validity. Other relevant findings, practical implications regarding the use of the K BASC-2, PRS-A, and suggestions for future research are discussed.
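The internal-consistency figures reported above come from Cronbach's alpha, which compares the summed variance of individual items against the variance of total scores. A minimal sketch of the computation, using hypothetical item scores rather than the K BASC-2 item set:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items: list of equal-length lists, one per item,
    each holding one score per respondent.
    """
    k = len(items)

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(var(col) for col in items)          # sum of item variances
    totals = [sum(scores) for scores in zip(*items)]   # total score per respondent
    return k / (k - 1) * (1 - item_var / var(totals))

# hypothetical 4-item scale rated by 5 respondents
scores = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 5],
    [3, 4, 2, 5, 3],
]
alpha = cronbach_alpha(scores)  # high alpha: items move together
```

Values above roughly .9, as in the study, indicate that the items of a composite scale measure the same underlying construct.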

Keywords: behavioral assessment system, cross-cultural validity, parent report, screening

Procedia PDF Downloads 489
25107 Data Mining Algorithms Analysis: Case Study of Price Predictions of Lands

Authors: Julio Albuja, David Zaldumbide

Abstract:

Data analysis is an important step before making financial decisions. The aim of this work is to analyze, through data mining algorithms, the factors that influence the final price of houses. To the best of our knowledge, previous work focused only on comparing results. Before using the data set, a Z-transformation was applied to standardize the data to the same range. The data was then classified into two groups to visualize it in a readable format. A decision tree was built, and graphics are displayed in which it is easy to see the results and each factor's influence. The definitions of these methods are described, as well as descriptions of the results. Finally, conclusions and recommendations are presented related to the results our research produced, making it easier to apply these algorithms using a customized data set.
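The Z-transformation mentioned above rescales each feature to zero mean and unit standard deviation, so that attributes measured on different scales become comparable before mining. A minimal sketch using hypothetical feature values, not the paper's data set:

```python
import statistics

def z_transform(values):
    """Standardize a feature column to mean 0 and standard deviation 1."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

# hypothetical land areas in square metres
areas = [120.0, 80.0, 200.0, 150.0, 95.0]
z = z_transform(areas)  # now centred on 0 with unit spread
```

After this step, a feature like area (hundreds of m²) no longer dominates a feature like room count purely because of its larger numeric range.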

Keywords: algorithms, data, decision tree, transformation

Procedia PDF Downloads 374
25106 Application of Blockchain Technology in Geological Field

Authors: Mengdi Zhang, Zhenji Gao, Ning Kang, Rongmei Liu

Abstract:

Management and application of geological big data is an important part of China's national big data strategy. With the implementation of the national big data strategy, geological big data management becomes more and more critical. At present, there are still many technical barriers, as well as conceptual confusion, in many aspects of geological big data management and application, such as data sharing, intellectual property protection, and application technology. Therefore, it is a key task to make better use of new technologies for deeper exploration and wider application of geological big data. In this paper, we briefly introduce the basic principle of blockchain technology and then analyze the application dilemmas of geological data. Based on this analysis, we bring forward some feasible patterns and scenarios for applying blockchain to geological big data and put forward several suggestions for future work in geological big data management.

Keywords: blockchain, intellectual property protection, geological data, big data management

Procedia PDF Downloads 88
25105 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs

Authors: Junaid Mahmood Alam

Abstract:

Revalidating and harmonizing clinical chemistry analytical principles, and optimizing methods through quality control programs and assessments, is the preeminent means of attaining optimal outcomes in clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, through optimization of precision analyses and processing through external and internal quality assessments and regression analysis. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I), and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunology analyzers, namely the Hitachi 912, Cobas 6000 e601, Cobas c501, and Cobas e411, using UV kinetic and colorimetric dry chemistry principles and electrochemiluminescence immunoassay (ECLi) techniques. Validation and revalidation were completed by evaluating the precision-analyzed Preci-control data of the various instruments, plotted against each other, with regression analysis (R2). Results showed that revalidation and optimization of the respective parameters, accredited through CAP, CLSI, and NEQAPP assessments, achieved 99.0% to 99.8% optimization of the methodology and instruments used for analyses. The regression R2 for total bilirubin was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R2 values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of analytical methods and instrumentation, revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, guaranteeing maximum patient confidence.
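The R2 figures above quantify agreement between instruments when the same control material is analyzed on each. A minimal sketch of computing the coefficient of determination from paired quality-control measurements, using hypothetical values rather than the study's Preci-control data:

```python
def r_squared(x, y):
    """Coefficient of determination for a least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    syy = sum((b - my) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)  # equals the square of Pearson's r

# hypothetical ALT control results (U/L) from two analyzers
analyzer_a = [31.0, 45.2, 60.1, 75.4, 90.3]
analyzer_b = [30.5, 45.9, 59.8, 76.0, 89.7]
r2 = r_squared(analyzer_a, analyzer_b)  # close to 1: instruments agree
```

R2 values near 1, like those reported in the abstract, indicate that the two instruments would yield interchangeable results on patient samples.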

Keywords: revalidation, standardized, IFCC, CAP, harmonized

Procedia PDF Downloads 269
25104 Standardization of Propagation Techniques for Celastrus paniculata: An Endangered Medicinal Plant of Western Ghats

Authors: Raviraja Shetty G., K. G. Poojitha

Abstract:

An experiment was conducted at the College of Horticulture, Mudigere, to study the effect of different growth regulators on seed germination and on vegetative propagation by cuttings of Celastrus paniculata, an endangered medicinal plant. The extracted seeds were subjected to 11 different pre-soaking treatments: control, GA3 at 300, 350, and 400 ppm, KNO3 at 1.0%, 1.5%, and 2.0%, H2SO4 at 0.5% and 1.0%, and HCl at 0.5% and 1.0%, with 100 seeds per treatment. Among the germination-inducing treatments, seeds treated with gibberellins responded well, with high seed germination and vigorous seedling growth. Seeds treated with GA3 at 400 ppm recorded the maximum germination and growth parameters, including rate of germination, shoot length, root length, plant vigour, and fresh and dry weight, followed by GA3 at 350 ppm. The commencement of germination and 50 per cent germination were also earlier in the same treatment. Cuttings of C. paniculata took up to four months for root initiation, and the sprouting percentage was moderate compared to other easy-to-root species. Among the different treatments, IBA at 2000 ppm was found to be the best, recording the maximum shoot and root parameters. The results of the present investigation will be helpful for the conservation of this endangered medicinal plant through propagation.

Keywords: conservation, germination, growth, propagation

Procedia PDF Downloads 430
25103 Development & Standardization of a Literacy Free Cognitive Rehabilitation Program for Patients Post Traumatic Brain Injury

Authors: Sakshi Chopra, Ashima Nehra, Sumit Sinha, Harsimarpreet Kaur, Ravindra Mohan Pandey

Abstract:

Background: Cognitive rehabilitation aims to retrain brain-injured individuals with cognitive deficits to restore or compensate for lost functions. As illiterates and people with low literacy levels represent a significant proportion of the world's population, specific rehabilitation modules for such populations are indispensable. Literacy is significantly associated with all neuropsychological measures, and retraining programs widely use written or spoken techniques that essentially require the patient to read or write. The aim of the study was therefore to develop and standardize a literacy-free neuropsychological rehabilitation program for improving cognitive functioning in patients with mild and moderate Traumatic Brain Injury (TBI). Several studies have pointed to the impairments seen in memory, executive functioning, and attention and concentration post-TBI, so the rehabilitation program focussed on these domains. Visual item memorization, stick constructions, symbol cancellations, and colouring techniques were used to construct the retraining program. Methodology: The development of the program consisted of planning, preparing, analyzing, and revising the different modules. The construction focussed on retraining immediate and delayed visual memory, planning ability, focused and divided attention, concentration, and response inhibition (to control irritability and aggression). A total of 98 home-based retraining modules were prepared across the 4 domains (42 for memory, 42 for executive functioning, 7 for attention and concentration, and 7 for response inhibition). Standardization was done on 20 healthy controls to review, select, and edit items. For each module, the time taken, errors made, and errors per second were noted down to establish its difficulty level, and the modules were arranged in increasing order of difficulty over a period of 6 weeks. The retraining tasks were then administered to 11 brain-injured individuals (5 after mild TBI and 6 after moderate TBI).
These patients were referred from the Trauma Centre to the Clinical Neuropsychology OPD, All India Institute of Medical Sciences, New Delhi, India. Results: The time taken, errors made, and errors per second were analysed for all domains. Education levels were divided into illiterate, up to 10 years, 10 years to graduation, and graduation and above. Means and standard deviations were calculated. Between-group and within-group analyses were done using the t-test. When the performance of the 20 healthy controls was analyzed, a significant difference was observed only in the time taken on the attention tasks; all other domains showed non-significant differences in performance between education levels. Comparing the errors and time taken between the patient and control groups, there was a significant difference in all domains at the 0.01 level except the errors made on executive functioning, indicating that the tool can successfully differentiate between healthy controls and patient groups. Conclusions: Apart from the time taken for symbol cancellations, the entire cognitive rehabilitation program is literacy-free. As it taps the major areas of impairment post-TBI, it could be a useful tool to rehabilitate patient populations with low literacy levels across the world. The next step, already underway, is to test its efficacy in improving cognitive functioning in a randomized controlled clinical trial.

Keywords: cognitive rehabilitation, illiterates, India, traumatic brain injury

Procedia PDF Downloads 333
25102 Frequent Item Set Mining for Big Data Using MapReduce Framework

Authors: Tamanna Jethava, Rahul Joshi

Abstract:

Frequent item sets play an essential role in many data mining tasks that try to find interesting patterns in databases. Typically, a frequent item set refers to a set of items that frequently appear together in a transaction dataset. Several mining algorithms are used for frequent item set mining, yet most do not scale to the type of data we are presented with today, so-called 'big data'. Big data is a collection of large data sets. Our approach is to perform frequent item set mining over large datasets in a scalable and speedy way. MapReduce, along with HDFS, is used to find frequent item sets from big data on a large cluster. This paper focuses on using a pre-processing and mining algorithm as a hybrid approach for big data over the Hadoop platform.
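The MapReduce approach described above can be sketched in miniature: a map step emits candidate item sets from each transaction, and a reduce step sums their counts, keeping only those that meet a minimum support threshold. The transactions below are hypothetical; a real deployment would distribute both phases over a Hadoop cluster with the data on HDFS:

```python
from collections import Counter
from itertools import combinations

def map_phase(transaction, k):
    """Emit (itemset, 1) pairs for every k-item subset of one transaction."""
    return [(frozenset(c), 1) for c in combinations(sorted(transaction), k)]

def reduce_phase(pairs, min_support):
    """Sum counts per itemset and keep those meeting min_support."""
    counts = Counter()
    for itemset, one in pairs:
        counts[itemset] += one
    return {s: c for s, c in counts.items() if c >= min_support}

# hypothetical market-basket transactions
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "butter", "bread"},
]
emitted = [pair for t in transactions for pair in map_phase(t, 2)]
frequent_pairs = reduce_phase(emitted, min_support=3)
```

The scalability comes from the fact that the map phase is embarrassingly parallel across transactions, and the framework shuffles identical item sets to the same reducer before counting.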

Keywords: frequent item set mining, big data, Hadoop, MapReduce

Procedia PDF Downloads 434
25101 The Role of Data Gathering in NGOs

Authors: Hussaini Garba Mohammed

Abstract:

Background/Significance: The lack of data gathering affects NGOs worldwide in obtaining good information about educational and health-related issues among communities in any country and around the world. For example, HIV/AIDS, smoking-related diseases such as tuberculosis, and the COVID-19 virus are becoming serious public health problems, especially among older men and women. Yet in some countries there is no detailed survey assessment of communities, villages, and rural areas to show the percentage of victims and patients, especially of the COVID-19 virus among the people. These data are essential to inform programming targets, strategies, and priorities and to obtain good information through data gathering in any society.

Keywords: reliable information, data assessment, data mining, data communication

Procedia PDF Downloads 179
25100 A Case for Ethics Practice under the Revised ISO 14001:2015

Authors: Reuben Govender, M. L. Woermann

Abstract:

The ISO 14001 management system standard was first published in 1996. It is a voluntary standard adopted by both private and public sector organizations globally. Adoption of the ISO 14001 standard at the corporate level is done to help manage business impacts on the environment, e.g., pollution control. The International Organization for Standardization (ISO) revised the standard in 2004 and, most recently, in 2015. The current revision of the standard appears to adopt a communitarian-type philosophy; the inclusion of requirements to consider the needs and expectations of external 'interested parties' implies as much. Therefore, at the operational level, businesses implementing ISO 14001 will have to consider needs and expectations beyond local laws. Should these external needs and expectations be included in the scope of the environmental management system, they become requirements to be complied with in much the same way as laws. The authors assert that the recent changes to ISO 14001 introduce an ethical dimension to the standard, and that business ethics as a discipline now finds relevance in ISO 14001 via contemporary stakeholder theory and discourse ethics. Finally, the authors postulate the implications of (not) addressing these requirements before July 2018, when the transition to the revised standard must be complete globally.

Keywords: business ethics, environmental ethics, ethics practice, ISO 14001:2015

Procedia PDF Downloads 261
25099 The Application of Data Mining Technology in Building Energy Consumption Data Analysis

Authors: Liang Zhao, Jili Zhang, Chongquan Zhong

Abstract:

Energy consumption data, in particular those involving public buildings, are impacted by many factors: building structure, climate and environmental parameters, construction, system operating conditions, and user behavior patterns. Traditional methods of data analysis are insufficient. This paper examines data mining technology to determine its application in the analysis of building energy consumption data, including energy consumption prediction, fault diagnosis, and optimal operation. Recent literature is reviewed and summarized, the problems faced by data mining technology in the area of energy consumption data analysis are enumerated, and directions for future studies are given.
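As one example of the prediction task surveyed above, a minimal least-squares sketch relating daily energy consumption to outdoor temperature. The readings are hypothetical; the literature reviewed typically uses richer models (neural networks, support vector regression) over many of the factors listed:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# hypothetical daily mean outdoor temperature (°C) vs. cooling load (kWh)
temp = [22.0, 25.0, 28.0, 31.0, 34.0]
load = [180.0, 210.0, 245.0, 270.0, 300.0]
a, b = fit_line(temp, load)
predicted = a + b * 30.0  # predicted load for a 30 °C day
```

Even this single-variable model illustrates the pattern: historical operating data are fitted once, then the model predicts consumption for new conditions, and large deviations between predicted and metered values can flag faults.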

Keywords: data mining, data analysis, prediction, optimization, building operational performance

Procedia PDF Downloads 852
25098 To Handle Data-Driven Software Development Projects Effectively

Authors: Shahnewaz Khan

Abstract:

Machine learning (ML) techniques are often used in projects to create data-driven applications. These tasks typically demand additional research and analysis, and the proper technique and strategy must be chosen to ensure the success of data-driven projects; otherwise, even with a great deal of effort, the necessary development might not be possible. This paper examines the workflow of data-driven software development projects and their implementation process in order to describe how to manage such a project successfully, which will assist in minimizing the added workload.

Keywords: data, data-driven projects, data science, NLP, software project

Procedia PDF Downloads 83