Search results for: survival data analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 41942

38732 A DEA Model in a Multi-Objective Optimization with Fuzzy Environment

Authors: Michael Gidey Gebru

Abstract:

Most DEA models operate in a static environment with input and output parameters determined by deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to expand crisp DEA into DEA in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The DEA model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed DEA model is illustrated with an application to real data from 50 educational institutions.
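
The full multi-objective fuzzy DEA formulation needs a linear programming solver, but its two basic ingredients — defuzzifying triangular data and scoring decision making units — can be sketched in the special single-input, single-output CCR case, where efficiency reduces to a normalized output/input ratio. All data and the defuzzification choice below are hypothetical, not the study's.

```python
# Sketch: fuzzy DEA reduced to the crisp single-input/single-output CCR case.

def defuzzify(tri):
    """Graded mean integration of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + 4 * b + c) / 6

def dea_efficiency(inputs, outputs):
    """With one input and one output under constant returns to scale,
    CCR efficiency is each DMU's output/input ratio divided by the
    best ratio observed in the sample."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical fuzzy data for four institutions: (low, mode, high).
fuzzy_inputs = [(8, 10, 12), (18, 20, 22), (9, 10, 11), (14, 15, 16)]
fuzzy_outputs = [(4, 5, 6), (9, 10, 11), (7, 8, 9), (5, 6, 7)]

crisp_in = [defuzzify(t) for t in fuzzy_inputs]
crisp_out = [defuzzify(t) for t in fuzzy_outputs]
scores = dea_efficiency(crisp_in, crisp_out)
for k, s in enumerate(scores):
    print(f"DMU {k}: efficiency = {s:.3f}")
```

With multiple inputs and outputs, each DMU instead requires its own linear program; the fuzzy multi-objective version layers alpha-cuts or goal weights on top of that.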

Keywords: efficiency, DEA, fuzzy, decision making units, higher education institutions

Procedia PDF Downloads 43
38731 Geological Mapping of Gabel Humr Akarim Area, Southern Eastern Desert, Egypt: Constrain from Remote Sensing Data, Petrographic Description and Field Investigation

Authors: Doaa Hamdi, Ahmed Hashem

Abstract:

The present study aims at integrating ASTER data and Landsat 8 data to discriminate and map alteration and/or mineralization zones, in addition to delineating the different lithological units of the Humr Akarim Granites area. The study area is located at 24º9' to 24º13'N and 34º1' to 34º2'45"E, covering a total exposed surface area of about 17 km². The area is characterized by rugged topography with low to moderate relief. Geologic fieldwork and petrographic investigations revealed that the basement complex of the study area is composed of metasediments, mafic dikes, older granitoids, and alkali-feldspar granites. Petrographic investigations revealed that the secondary minerals in the study area are mainly represented by chlorite, epidote, clay minerals and iron oxides. These minerals have specific spectral signatures in the visible/near-infrared and short-wave infrared region (0.4 to 2.5 µm), so the ASTER imagery processing was concentrated on VNIR-SWIR spectrometric data in order to achieve the purposes of this study (geologic mapping of hydrothermal alteration zones and delineation of possible radioactive potentialities). Mapping of hydrothermal alteration zones and discrimination of the lithological units in the study area are achieved through several image processing techniques, including color band composites (CBC) and data transformation techniques such as band ratios (BR), band ratio codes (BRCs), principal component analysis (PCA), the Crosta technique, and minimum noise fraction (MNF). The field verification and petrographic investigation confirm the results of the ASTER imagery and Landsat 8 data, and a geological map (scale 1:50,000) is proposed.
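
Two of the transforms listed above, band ratios and principal component analysis, can be sketched on a small synthetic image cube. The band indices and pixel values are hypothetical; real ASTER processing works on calibrated reflectance bands.

```python
# Sketch: a band ratio and a PCA on a tiny synthetic multispectral cube
# (bands x rows x cols). All values are random stand-ins for reflectance.
import numpy as np

rng = np.random.default_rng(0)
cube = rng.random((4, 5, 5))          # 4 bands, 5x5 pixels

# Band ratio: highlights materials whose reflectance differs between bands.
ratio = cube[2] / (cube[1] + 1e-9)    # e.g. a SWIR/VNIR-style ratio

# PCA: treat each pixel as a 4-dimensional observation and decorrelate bands.
pixels = cube.reshape(4, -1).T        # (25, 4)
centered = pixels - pixels.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]     # sort components by explained variance
pcs = centered @ eigvecs[:, order]    # principal component scores

print(ratio.shape, pcs.shape)
```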

Keywords: remote sensing, petrography, mineralization, alteration detection

Procedia PDF Downloads 154
38730 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis

Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho

Abstract:

This paper compares fuzzy-machine learning algorithms, namely Support Vector Machine (SVM) and K-Nearest Neighbour (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (Temperature, Smoke, and Flame). The data is pre-processed using the Interval Type-2 Fuzzy Logic (IT2FL) algorithm, Min-Max Normalization, and Principal Component Analysis (PCA), which are used to predict feature labels, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used in the training of the aforementioned machine learning models. The K-fold (K=10) cross-validation method is used to evaluate the performance of the models using the metrics ROC (Receiver Operating Characteristic) curve, Specificity, and Sensitivity. The models are also tested with 20% of the dataset. The validation result shows that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
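
As a minimal sketch of the classification step, here is a hand-rolled k-nearest-neighbour vote on two principal-component features. The data points are hypothetical; the actual study trained on the IT2FL/PCA pipeline described above.

```python
import numpy as np

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(d)[:k]
    votes = train_y[nearest]
    return np.bincount(votes).argmax()

# Hypothetical PC1/PC2 values; label 1 = fire outbreak, 0 = no fire.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [1.0, 0.9], [0.8, 1.0]])
y = np.array([0, 0, 1, 1, 1])

print(knn_predict(X, y, np.array([0.95, 0.85])))  # point near the fire cluster
```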

Keywords: machine learning algorithms, interval type-2 fuzzy logic, fire outbreak, support vector machine, k-nearest neighbour, principal component analysis

Procedia PDF Downloads 175
38729 Metacognition Skill on Collaborative Study with Self Evaluation

Authors: Suratno

Abstract:

Metacognitive thinking skills should be developed early in learning. The aim of this research was to build metacognitive thinking skills through collaborative learning with self-evaluation. An action research approach was used, involving 32 middle school students in Jember, Indonesia. The indicators of metacognitive skill were planning, information management strategies, comprehension monitoring, and debugging strategies. Data were analyzed by t-test and by analysis of instructional videos. The results showed significant differences in metacognitive skills before and after the implementation of collaborative learning with self-evaluation. Analysis of the instructional videos showed differences in the artifacts of student learning activities before and after the implementation of collaborative learning with self-evaluation. Self-evaluation makes the practice of metacognitive thinking skills familiar to students.
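
The before/after comparison described above rests on the paired t statistic, which can be computed directly. The scores below are hypothetical, not the study's data.

```python
import math

def paired_t(before, after):
    """Paired t statistic for before/after scores on the same subjects;
    compare the result against a t(n-1) critical value."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical metacognition scores for six students, before and after.
before = [60, 55, 70, 65, 58, 62]
after = [72, 66, 78, 75, 70, 71]
print(round(paired_t(before, after), 2))  # prints 15.5
```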

Keywords: metacognition, collaborative, evaluation, thinking skills

Procedia PDF Downloads 349
38728 Discursive Psychology of Emotions in Mediation

Authors: Katarzyna Oberda

Abstract:

The aim of this paper is to conceptualize emotions in the process of mediation. Although human emotions have been approached from various disciplines and perspectives, e.g. philosophy, linguistics, psychology and neurology, this complex phenomenon still needs further investigation into its discursive character, approached with an open mind and heart. To attain this aim, both theoretical and practical considerations are taken into account, to contextualize the discursive psychology of emotions in mediation and to show how cognitive and linguistic activity expressed in language may lead to the emotional turn in the process of mediation. This double direction of the research into the discursive psychology of emotions has been partially inspired by the evaluative components of mediation forms. In the conducted research, we apply the methodology of discursive psychology with discourse analysis as a tool. The practical data come from recorded online mediations. The major finding of the conducted research is the reconstruction of a model of emotional transformation in mediation.

Keywords: discourse analysis, discursive psychology, emotions, mediation

Procedia PDF Downloads 149
38727 Axial Load Capacity of Drilled Shafts from In-Situ Test Data at Semani Site, in Albania

Authors: Neritan Shkodrani, Klearta Rrushi, Anxhela Shaha

Abstract:

Generally, the design of the axial load capacity of deep foundations is based on data provided from field tests, such as the SPT (Standard Penetration Test) and CPT (Cone Penetration Test). This paper reports the results of an axial load capacity analysis of drilled shafts at a construction site at Semani, in Fier county, Fier prefecture, Albania. In this case, the axial load capacity analyses are based on the data of 416 SPT tests and 12 CPTU tests, carried out at this construction site in 12 boreholes (10 borings to a depth of 30.0 m and 2 borings to a depth of 80.0 m). The considered foundation widths range from 0.5 m to 2.5 m, and the foundation embedment length is fixed at a value of 25 m. SPT-based analytical methods from Japanese design practice (Building Standard Law of Japan) and the CPT-based analytical method of Eslami and Fellenius are used to obtain the axial ultimate load capacity of the drilled shafts. The considered drilled shaft (25 m long and 0.5 m - 2.5 m in diameter) is analyzed for the soil conditions of each borehole. The values obtained from the sets of calculations are shown in different charts. The axial load capacity values obtained from the SPT and CPTU data are then compared, and some conclusions are drawn regarding the mentioned methods of calculation.
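
SPT-based capacity methods share a common decomposition: ultimate capacity = end bearing + shaft friction, with both terms scaled by SPT blow counts. The sketch below shows only that generic structure; the coefficients are illustrative placeholders, not the values prescribed by the Japanese code or by Eslami and Fellenius.

```python
import math

def ultimate_capacity(diameter, length, n_tip, n_avg_shaft,
                      alpha=100.0, beta=2.0):
    """Generic SPT-based ultimate capacity (kN) of a drilled shaft.
    alpha: end-bearing factor (kPa per SPT blow), beta: shaft-friction
    factor (kPa per SPT blow); both values are hypothetical."""
    area_tip = math.pi * diameter ** 2 / 4     # m^2
    area_shaft = math.pi * diameter * length   # m^2
    q_end = alpha * n_tip * area_tip           # end bearing, kN
    q_shaft = beta * n_avg_shaft * area_shaft  # shaft friction, kN
    return q_end + q_shaft

# Hypothetical shaft: 1.0 m diameter, 25 m embedment, SPT N = 30 at the
# tip and an average N of 15 along the shaft.
print(round(ultimate_capacity(1.0, 25.0, 30, 15), 1))
```

In practice each design method supplies its own factors, limits, and safety coefficients; only the additive structure is common to all of them.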

Keywords: deep foundations, drilled shafts, axial load capacity, ultimate load capacity, allowable load capacity, SPT test, CPTU test

Procedia PDF Downloads 100
38726 Prevalence of Dietary Supplements among University Athlete Regime in Sri Lanka: A Cross-Sectional Study

Authors: S. A. N. Rashani, S. Pigera, P. N. J. Fernando, S. Jayawickema, M. A. Niriella, A. P. De Silva

Abstract:

Dietary supplement (DS) consumption is trending drastically among the young athlete generation in developing countries. Many athletes try to fulfill their nutrition requirements using dietary supplements without knowing their effects on health and performance. This study aimed to assess the DS usage patterns of university athletes in Sri Lanka. A self-administered questionnaire was employed to collect data from state university students representing a university team, and a sample of 200 respondents was selected based on a stratified random sampling technique. Incomplete questionnaires were omitted from the analysis. The data were analyzed using IBM SPSS Statistics for Windows, version 25. The level of significance was set at p<0.05. The prevalence of DS use was 48.2% (n=94), with no significant association between gender and DS intake. Protein (15.9%), vitamins (14.9%), sports drinks (12.8%), and creatine (8.2%) were the DS most consumed by students. Weightlifting (85.0%), football (62.5%), rugby (57.7%), and wrestling (40.9%) players showed higher DS usage than other sports. Coaches were reported as the persons who most frequently advised the use of DS (43.0%). Students who won inter-university games showed significantly lower DS intake (p=0.002) compared to others. Interestingly, DS use was significantly affected by the season of use (p<0.001), with use most frequent during competition and training seasons (62.4%). The pharmacy (27.0%) was the commonest place to buy DS. Students who used nutrient-dense meal plans during the training and competition period still showed a 61.0% tendency to consume DS. The most commonly claimed reason to use DS was to increase energy and strength (29.0%). A majority reported that they had used DS for less than one month (35.5%), while the second-highest duration was over three years (17.2%). Considering body mass index (BMI), healthy-weight students showed a 71.0% DS prevalence. DS prevalence was moderate among Sri Lankan university students, with the highest DS use during competition and training seasons. Moreover, the findings emphasize the need for nutrition and anti-doping counseling in the Sri Lankan university system.

Keywords: athlete, dietary, supplements, university

Procedia PDF Downloads 193
38725 Automated CNC Part Programming and Process Planning for Turned Components

Authors: Radhey Sham Rajoria

Abstract:

Pressure to increase competitiveness in the manufacturing sector and to survive in the market has led to the development of machining centres, which enhance productivity, improve quality, shorten the lead time, and reduce the manufacturing cost. With this innovation, production lines have been replaced by machining centres, which can perform various processes with multiple tooling for the same part using an automatic tool changer (ATC). Process plans can also be easily generated for complex components. Some means are, however, required to utilize the machining centre to its fullest. The present work concentrates on automated part program generation, and in turn automated process plan generation, for turned components on a Denford "MIRAC" 8-station ATC lathe machining centre. A package in C++ on the DOS platform has been developed which generates the complete CNC part program, process plan, and process sequence for turned components. The input to this system is a blueprint in graphical format with machining parameters and variables, and the output is the CNC part program, which is stored in a .mir file, ready for execution on the machining centre.
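
The core of such a system is text generation: turning geometry parameters into machine-readable blocks. The sketch below emits a minimal ISO-style facing pass; the G-codes are generic turning codes, and the real MIRAC post-processor and .mir format are not reproduced here.

```python
# Sketch: generating a minimal CNC facing pass from geometry parameters.
# G-codes are generic ISO-style turning codes, used here for illustration.

def facing_pass(x_start, z_face, feed):
    return [
        f"G00 X{x_start:.1f} Z2.0",    # rapid to a safe approach point
        f"G01 Z{z_face:.1f} F{feed}",  # feed down to the face
        "G01 X0.0",                    # face across to the centreline
        "G00 Z2.0",                    # retract
    ]

program = ["%", "O0001 (FACING DEMO)"] + facing_pass(26.0, 0.0, 0.15) + ["M30"]
print("\n".join(program))
```

A full part-program generator chains many such move generators (facing, turning, grooving, threading) in the sequence dictated by the process plan.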

Keywords: CNC, MIRAC, ATC, process planning

Procedia PDF Downloads 263
38724 Data-Driven Decision Making: Justification of Not Leaving Class without It

Authors: Denise Hexom, Judith Menoher

Abstract:

Teachers and administrators across America are being asked to use data and hard evidence to inform practice as they begin the task of implementing the Common Core State Standards. Yet the courses they are taking in schools of education are not preparing teachers or principals to understand the data-driven decision making (DDDM) process, nor to utilize data in a more sophisticated fashion. DDDM has been around for quite some time; however, it has only recently become systematically and consistently applied in the field of education. This paper discusses the theoretical framework of DDDM; empirical evidence supporting the effectiveness of DDDM; a process a department in a school of education has utilized to implement DDDM; and recommendations to other schools of education that attempt to implement DDDM in their decision-making processes and in their students' coursework.

Keywords: data-driven decision making, institute of higher education, special education, continuous improvement

Procedia PDF Downloads 379
38723 Netnography Research in Leisure, Tourism, and Hospitality: Lessons from Research and Education

Authors: Marisa P. De Brito

Abstract:

The internet is affecting the way the industry operates and communicates. It is also becoming a customary means for leisure, tourism, and hospitality consumers to seek and exchange information and views on hotels, destinations, events and attractions, or to develop social ties with other users. On the one hand, the internet is a rich field in which to conduct leisure, tourism, and hospitality research; on the other hand, however, few researchers formally embrace online methods of research, such as netnography. Within the social sciences, netnography falls under the interpretative/ethnographic research methods umbrella. It is an adaptation of anthropological techniques such as participant and non-participant observation, used to study online interactions happening on social media platforms, such as Facebook. It is, therefore, a research method applied to the study of online communities, the term itself being a contraction of the words network (as in internet) and ethnography. It was developed in the context of marketing research in the nineties, and in the last twenty years it has spread to other contexts such as education, psychology, and urban studies. Since netnography is not universally known, it may discourage researchers and educators from using it. This work offers guidelines for researchers wanting to apply this method in the field of leisure, tourism, and hospitality, and for educators wanting to teach about it. This is done by means of a double approach: a content analysis of the literature, side by side with educational data on the use of netnography. The content analysis covers the research using netnography in leisure, tourism, and hospitality in the last twenty years. The educational data consists of the author's and her colleagues' experience in coaching students through the process of writing a paper using primary netnographic data - from identifying the phenomenon to be studied, selecting an online community, and collecting and analyzing data, to writing up their findings. In the end, this work puts forward, on the one hand, a research agenda, and on the other, an educational roadmap for those wanting to apply netnography in the field or the classroom. The educator's roadmap summarises what can be expected from mini-netnographies conducted by students and how to set them up. The research agenda highlights the issues and research questions for which the method is most suitable, the most common bottlenecks and drawbacks of the method and its application, and where the greatest knowledge opportunities lie.

Keywords: netnography, online research, research agenda, educator's roadmap

Procedia PDF Downloads 173
38722 Performance Analysis of Artificial Neural Network with Decision Tree in Prediction of Diabetes Mellitus

Authors: J. K. Alhassan, B. Attah, S. Misra

Abstract:

Human beings have the ability to make logical decisions. Although human decision-making is often optimal, it is insufficient when a huge amount of data is to be classified. A medical dataset is a vital ingredient in predicting a patient's health condition, and obtaining the best prediction calls for the most suitable machine learning algorithms. This work compared the performance of Artificial Neural Network (ANN) and Decision Tree Algorithm (DTA) models with respect to several performance metrics using diabetes data. The evaluation was done using the Weka software, and it was found that the DTA models performed better than the ANN models. Multilayer Perceptron (MLP) and Radial Basis Function (RBF) were the two algorithms used for the ANN, while REPTree and LADTree were the DTA models used. The Root Mean Squared Error (RMSE) of MLP is 0.3913, that of RBF is 0.3625, that of REPTree is 0.3174, and that of LADTree is 0.3206.
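
The RMSE figures above are computed the same way for every model; a minimal sketch with hypothetical predictions shows the comparison:

```python
import math

def rmse(actual, predicted):
    """Root mean squared error between labels and model outputs."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

# Hypothetical labels and two hypothetical models' outputs.
actual = [1, 0, 1, 1, 0, 1]
model_a = [0.9, 0.2, 0.8, 0.7, 0.1, 0.9]   # e.g. one classifier's outputs
model_b = [0.6, 0.4, 0.7, 0.6, 0.5, 0.6]   # e.g. another's outputs

# The model with the lower RMSE fits these labels better.
print(rmse(actual, model_a) < rmse(actual, model_b))  # prints True
```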

Keywords: artificial neural network, classification, decision tree algorithms, diabetes mellitus

Procedia PDF Downloads 403
38721 Conception of a Predictive Maintenance System for Forest Harvesters from Multiple Data Sources

Authors: Lazlo Fauth, Andreas Ligocki

Abstract:

For the cost-effective use of harvesters, expensive repairs and unplanned downtimes must be reduced as far as possible. The predictive detection of failing systems and the calculation of intelligent service intervals, necessary to avoid these factors, require in-depth knowledge of the machines' behavior. Such know-how requires permanent monitoring of the machine state from different technical perspectives. In this paper, three approaches will be presented as they are currently pursued in the publicly funded project PreForst at Ostfalia University of Applied Sciences. These include the intelligent linking of workshop and service data, sensors on the harvester, and a special online hydraulic oil condition monitoring system. Furthermore, the paper shows the potentials as well as the challenges of using these data in the conception of a predictive maintenance system.

Keywords: predictive maintenance, condition monitoring, forest harvesting, forest engineering, oil data, hydraulic data

Procedia PDF Downloads 131
38720 On the Survival of Individuals with Type 2 Diabetes Mellitus in the United Kingdom: A Retrospective Case-Control Study

Authors: Njabulo Ncube, Elena Kulinskaya, Nicholas Steel, Dmitry Pshezhetskiy

Abstract:

Life expectancy in the United Kingdom (UK) has been nearly constant since 2010, particularly for individuals of 65 years and older. This trend has also been noted in several other countries. The slowdown in the increase of life expectancy was concurrent with an increase in the number of deaths caused by non-communicable diseases. Of particular concern is the worldwide exponential increase in the number of diabetes-related deaths. Previous studies have reported increased mortality hazards among diabetics compared to non-diabetics, and differing effects of antidiabetic drugs on mortality hazards. This study aimed to estimate the all-cause mortality hazards and related life expectancies among type 2 diabetes (T2DM) patients in the UK using a time-variant Gompertz-Cox model with frailty. The study also aimed to understand the major causes of the change in life expectancy growth in the last decade. A total of 221,182 (30.8% T2DM, 57.6% male) individuals aged 50 years and above, born between 1930 and 1960 inclusive and diagnosed between 2000 and 2016, were selected from The Health Improvement Network (THIN) database of UK primary care data and followed up to 31 December 2016. About 13.4% of participants died during the follow-up period. The overall all-cause mortality hazard ratio of T2DM compared to non-diabetic controls was 1.467 (1.381-1.558) for diagnosis at ages 50 to 59 years and 1.38 (1.307-1.457) for diagnosis at ages 60 to 74 years. The estimated life expectancies of T2DM individuals without further comorbidities diagnosed at the age of 60 years were 2.43 (1930-1939 birth cohort), 2.53 (1940-1949 birth cohort) and 3.28 (1950-1960 birth cohort) years less than those of non-diabetic controls. However, the 1950-1960 birth cohort had a steeper hazard function than the 1940-1949 birth cohort for both T2DM and non-diabetic individuals. In conclusion, mortality hazards for people with T2DM continue to be higher than for non-diabetics. The steeper mortality hazard slope for the 1950-1960 birth cohort might indicate the sub-population contributing to the slowdown in the growth of life expectancy.
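
The Gompertz-Cox structure combines a Gompertz baseline hazard with proportional-hazards multipliers. A minimal sketch with hypothetical parameters (not the study's estimates) shows how a hazard ratio translates into a lower survival curve:

```python
import math

def gompertz_survival(t, a, b, hr=1.0):
    """Survival S(t) for the hazard h(t) = hr * a * exp(b * t):
    S(t) = exp(-hr * (a / b) * (exp(b * t) - 1))."""
    return math.exp(-hr * (a / b) * (math.exp(b * t) - 1))

a, b = 1e-4, 0.09          # hypothetical baseline Gompertz parameters
years = 20
s_control = gompertz_survival(years, a, b)
s_t2dm = gompertz_survival(years, a, b, hr=1.38)  # HR reported for ages 60-74

# A hazard ratio above 1 lowers survival at every horizon.
print(s_t2dm < s_control)  # prints True
```

The fitted model additionally lets a, b, and the multipliers vary with covariates, cohort, and a frailty term; the closed-form survival above is the building block.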

Keywords: T2DM, Gompertz-Cox model with frailty, all-cause mortality, life expectancy

Procedia PDF Downloads 118
38719 Game of Funds: Efficiency and Policy Implications of the United Kingdom Research Excellence Framework

Authors: Boon Lee

Abstract:

Research publication is an essential output of universities, because it not only promotes university recognition but also attracts government funding. The history of university research culture has been one of 'publish or perish', and universities have consistently encouraged their academics and researchers to produce research articles in reputable journals in order to maintain a level of competitiveness. In turn, United Kingdom (UK) government funding is determined by the number and quality of research publications. This paper aims to investigate whether more government funding leads to more quality papers. To that end, the paper employs a network DEA model to evaluate UK higher education performance over a period. Sources of efficiency are also determined via a second-stage regression analysis.

Keywords: efficiency, higher education, network data envelopment analysis, universities

Procedia PDF Downloads 110
38718 Sampled-Data Control for Fuel Cell Systems

Authors: H. Y. Jung, Ju H. Park, S. M. Lee

Abstract:

A sampled-data controller is presented for solid oxide fuel cell systems expressed by a sector-bounded nonlinear model. Sector-bounded nonlinear systems have a feedback connection between a linear dynamical system and a nonlinearity satisfying certain sector-type constraints. The sampled-data control scheme is very useful, since it makes it possible to handle digital controllers, and increasing research effort has been devoted to sampled-data control systems with the development of modern high-speed computers. The proposed control law is obtained by solving a convex problem satisfying several linear matrix inequalities. Simulation results are given to show the effectiveness of the proposed design method.

Keywords: sampled-data control, fuel cell, linear matrix inequalities, nonlinear control

Procedia PDF Downloads 562
38717 Comparison of Different Data Acquisition Techniques for Shape Optimization Problems

Authors: Attila Vámosi, Tamás Mankovits, Dávid Huri, Imre Kocsis, Tamás Szabó

Abstract:

Non-linear FEM calculations are indispensable when important technical information, like the operating performance of a rubber component, is desired. Rubber bumpers built into air-spring structures may undergo large deformations under load, which in itself shows non-linear behavior. The changing contact range between the parts and the incompressibility of the rubber increase this non-linear behavior further. The material characterization of an elastomeric component is also a demanding engineering task. The shape optimization problem of rubber parts led to the study of FEM-based calculation processes. This type of problem has been posed and investigated by several authors. In this paper, the time demand of certain calculation methods is studied and the possibilities for time reduction are presented.

Keywords: rubber bumper, data acquisition, finite element analysis, support vector regression

Procedia PDF Downloads 467
38716 Working Capital Management and Profitability of UK Firms: A Contingency Theory Approach

Authors: Ishmael Tingbani

Abstract:

This paper adopts a contingency theory approach to investigate the relationship between working capital management and profitability using data of 225 listed British firms on the London Stock Exchange for the period 2001-2011. The paper employs a panel data analysis on a series of interactive models to estimate this relationship. The findings of the study confirm the relevance of the contingency theory. Evidence from the study suggests that the impact of working capital management on profitability varies and is constrained by organizational contingencies (environment, resources, and management factors) of the firm. These findings have implications for a more balanced and nuanced view of working capital management policy for policy-makers.
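
The interactive models referred to above add a product term between a working-capital measure and a contingency variable, so that the working-capital effect is allowed to vary with the firm's context. A minimal sketch on simulated data (all variable names and coefficients are hypothetical):

```python
import numpy as np

# Simulate: profitability depends on a working-capital measure (wcm), a
# contingency variable (env), and their interaction.
rng = np.random.default_rng(1)
n = 200
wcm = rng.normal(size=n)          # e.g. standardised cash conversion cycle
env = rng.normal(size=n)          # e.g. environmental dynamism
profit = 0.5 * wcm - 0.3 * env + 0.4 * wcm * env + rng.normal(scale=0.1, size=n)

# Fit by ordinary least squares with an explicit interaction column.
X = np.column_stack([np.ones(n), wcm, env, wcm * env])
coef, *_ = np.linalg.lstsq(X, profit, rcond=None)
print(np.round(coef, 2))          # intercept, wcm, env, interaction
```

A significant interaction coefficient is what signals that the working-capital effect is constrained by the contingency variable.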

Keywords: working capital management, profitability, contingency theory approach, interactive models

Procedia PDF Downloads 334
38715 Statistical Analysis Approach for the E-Glassy Mortar and Radiation Shielding Behaviors Using ANOVA

Authors: Abadou Yacine, Faid Hayette

Abstract:

Significant investigations have been performed on the use of recycled E-glass waste powder and its impact on physical properties and mechanical strength. However, how recycled E-glass waste may affect the characteristics and qualities of dune sand mortar has rarely been modelled. To contribute to this field, an investigation was carried out with the substitution of dune sand by recycled E-glass waste at a constant water-cement ratio. The linear relationship between the dune sand mortar and the E-glass mortar mix percentage contributes to the model's reliability. The experimental data were subjected to regression analysis using the JMP statistics software. The regression model with one predictor gives the general form of the equation for predicting five characteristic properties of dune sand mortar from the substitution ratio of E-glass waste and the curing age. The results illustrate that long-term curing produced an E-glass waste mortar specimen with the highest compressive strength of 68 MPa in the laboratory environment. The ANOVA analysis indicated that long-term curing has the greatest effect on the sorptivity level and the loss of ultrasonic pulse velocity, while the E-glass waste powder percentage has the greatest effect on the compressive strength and on the improvement in the dynamic elasticity modulus. A significant enhancement for radiation-shielding applications was also found.
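
The one-way ANOVA F statistic underlying such an analysis can be computed directly: it compares between-group variance to within-group variance. The strength values per mix group below are hypothetical, not the study's measurements.

```python
def anova_f(groups):
    """One-way ANOVA F statistic for a list of groups of observations."""
    all_vals = [v for g in groups for v in g]
    n = len(all_vals)
    k = len(groups)
    grand = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical compressive strengths (MPa) for three substitution ratios.
groups = [[52, 54, 53], [60, 62, 61], [66, 68, 67]]
print(round(anova_f(groups), 1))  # prints 148.0
```

A large F relative to the F(k-1, n-k) critical value indicates the group factor (here, the substitution ratio) explains a significant share of the variation.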

Keywords: ANOVA analysis, E-glass waste, durability and sustainability, radiation-shielding

Procedia PDF Downloads 54
38714 How Western Donors Allocate Official Development Assistance: New Evidence From a Natural Language Processing Approach

Authors: Daniel Benson, Yundan Gong, Hannah Kirk

Abstract:

Advancement in natural language processing techniques has increased data processing speeds and reduced the need for the cumbersome, manual data processing that is often required when processing data from multilateral organizations for specific purposes. Using named entity recognition (NER) modeling and the Organisation for Economic Co-operation and Development (OECD) Creditor Reporting System database, we present the first geotagged dataset of OECD donor Official Development Assistance (ODA) projects on a global, subnational basis. The resulting data contains 52,086 ODA projects geocoded to subnational locations across 115 countries, worth a combined $87.9bn. This represents the first global OECD donor ODA project database with geocoded projects. We use this new data to revisit old questions of how 'well' donors allocate ODA to the developing world. This understanding is imperative for policymakers seeking to improve ODA effectiveness.
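
After an NER model tags place names in project descriptions, each project is matched against a gazetteer and spending is aggregated by location. The sketch below uses substring matching as a stand-in for the NER step; the gazetteer, projects, and amounts are all hypothetical.

```python
# Sketch: gazetteer lookup and aggregation after (simulated) NER tagging.

GAZETTEER = {
    "Mombasa": ("Kenya", -4.05, 39.67),
    "Kisumu": ("Kenya", -0.09, 34.77),
    "Arusha": ("Tanzania", -3.39, 36.68),
}

projects = [
    ("Water supply rehabilitation in Mombasa", 2.5),
    ("Rural roads near Kisumu", 1.2),
    ("Health clinics, Arusha region", 3.1),
    ("Unallocated budget support", 0.9),   # no place name: stays unmatched
]

totals = {}
for description, amount_musd in projects:
    for place, (country, lat, lon) in GAZETTEER.items():
        if place in description:           # stand-in for a real NER tagger
            totals[country] = totals.get(country, 0.0) + amount_musd

print(totals)
```

A real pipeline replaces the substring test with a trained NER model and disambiguates place names (many names occur in several countries) before geocoding.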

Keywords: international aid, geocoding, subnational data, natural language processing, machine learning

Procedia PDF Downloads 65
38713 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments

Authors: Ana Londral, Burcu Demiray, Marcus Cheetham

Abstract:

Speech recording is a methodology used in many different studies related to cognitive and behaviour research. Modern advances in digital equipment have brought the possibility of continuously recording hours of speech in naturalistic environments and building rich sets of sound files. Speech analysis can then extract multiple features from these files for different scopes of research in language and communication. However, tools for analysing a large set of sound files and automatically extracting relevant features from them are often inaccessible to researchers who are not familiar with programming languages. Manual analysis is a common alternative, with a high cost in time and efficiency. In the analysis of long sound files, the first step is voice segmentation, i.e. detecting and labelling segments containing speech. We present a comprehensive methodology aiming to support researchers in voice segmentation as the first step of data analysis on a big set of sound files. Praat, an open source software package, is suggested as a tool to run a voice detection algorithm, label segments and files, and extract other quantitative features over a structure of folders containing a large number of sound files. We present the validation of our methodology with a set of 5000 sound files that were collected in the daily life of a group of voluntary participants aged over 65. A smartphone device was used to collect sound using the Electronically Activated Recorder (EAR): an app programmed to record 30-second sound samples randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster than a manual analysis performed by two independent coders. Furthermore, the methodology presented allows manual adjustment of voiced segments, with visualisation of the sound signal, and the automatic extraction of quantitative information on speech. In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers who have to work with large sets of sound files and are not familiar with programming tools.
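
The first pass of voice segmentation can be sketched as an energy threshold over short frames, a simplification of the Praat-based detection described above. The signal here is synthetic; real recordings need tuned frame sizes and thresholds.

```python
import numpy as np

def detect_speech(signal, rate, frame_ms=25, threshold=0.01):
    """Label a frame voiced if its mean squared energy exceeds threshold;
    return (start_s, end_s) spans of consecutive voiced frames."""
    frame = int(rate * frame_ms / 1000)
    spans, start = [], None
    for i in range(0, len(signal) - frame, frame):
        voiced = np.mean(signal[i:i + frame] ** 2) > threshold
        if voiced and start is None:
            start = i
        elif not voiced and start is not None:
            spans.append((start / rate, i / rate))
            start = None
    if start is not None:
        spans.append((start / rate, len(signal) / rate))
    return spans

rate = 8000
t = np.arange(rate) / rate                     # one second of audio
signal = np.where((t > 0.3) & (t < 0.7),       # "speech" in the middle
                  0.5 * np.sin(2 * np.pi * 220 * t), 0.001)
print(detect_speech(signal, rate))
```

Energy thresholds alone confuse loud noise with speech; Praat's intensity- and pitch-based detection, plus the manual adjustment step described above, addresses exactly that.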

Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation

Procedia PDF Downloads 278
38712 Role of Pulp Volume Method in Assessment of Age and Gender in Lucknow, India, an Observational Study

Authors: Anurag Tripathi, Sanad Khandelwal

Abstract:

Age and gender determination are required in forensics for victim identification. There is secondary dentine deposition throughout life, resulting in decreased pulp volume and size. Evaluation of pulp volume using Cone Beam Computed Tomography (CBCT) is a noninvasive method to evaluate the age and gender of an individual. The study was done to evaluate the efficacy of the pulp volume method in the determination of age and gender. Aims/Objectives: The study was conducted to estimate age and determine sex by measuring tooth pulp volume with the help of CBCT. An observational study of one year's duration on CBCT data of individuals was conducted in Lucknow. The maxillary central incisors (CI) and maxillary canines (C) of the randomly selected samples were assessed for pulp volume using a software package. Statistical analysis: chi-square test, arithmetic mean, standard deviation, Pearson's correlation, and linear and logistic regression analysis. Results: The CBCT data of ninety individuals with an age range of 18-70 years were evaluated for the pulp volume of the central incisor and canine (CI & C). The Pearson correlation coefficient between tooth pulp volume (CI & C) and chronological age suggested that pulp volume decreases with age. The validation of the equations for sex determination showed higher prediction accuracy for CI (56.70%) and lower for C (53.30%). Conclusion: Pulp volume obtained from CBCT is a reliable indicator for age estimation and gender prediction.

Keywords: forensic, dental age, pulp volume, cone beam computed tomography

Procedia PDF Downloads 93
38711 Development and Validation of a Coronary Heart Disease Risk Score in Indian Type 2 Diabetes Mellitus Patients

Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad

Abstract:

Diabetes in India is growing at an alarming rate, and the complications it causes need to be controlled. Coronary heart disease (CHD) is the complication addressed for prediction in this study. India has the second largest number of diabetes patients in the world, yet to the best of our knowledge, there is no CHD risk score for Indian type 2 diabetes patients. Any form of CHD was taken as the event of interest. A sample of 750 was determined and randomly collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. Collected variables include patients' data such as sex, age, height, weight, body mass index (BMI), blood sugar fasting (BSF), postprandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking, alcohol habits, total cholesterol (TC), triglycerides (TG), high-density lipoprotein (HDL), low-density lipoprotein (LDL), very-low-density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity, and history of CHD. Predictive risk scores for CHD events are designed by Cox proportional hazards regression. Model calibration and discrimination are assessed using the Hosmer-Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization techniques, and the best method is selected among ridge, lasso, and elastic net regression. Youden's index is used to choose the optimal cut-off point from the scores. The five-year probability of CHD is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The risk scores developed for CHD can be calculated by doctors and patients for self-management of diabetes.
Furthermore, the five-year probabilities can be used to forecast and monitor the condition of patients.
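Youden's index, used above to choose the optimal cut-off, maximizes J = sensitivity + specificity - 1 over candidate thresholds. A sketch of the selection is below; the risk scores and CHD outcomes are hypothetical illustrative values, not the study's data.

```python
def youden_cutoff(scores, labels):
    """Return the threshold maximizing Youden's J = sensitivity + specificity - 1."""
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Hypothetical risk scores (higher = riskier) with observed CHD events (1 = event).
scores = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
events = [0, 0, 0, 1, 0, 1, 1, 1]
cutoff, j = youden_cutoff(scores, events)
```

Patients scoring at or above the chosen cut-off would be flagged as high risk; the same scan over thresholds also yields the ROC curve used for discrimination.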

Keywords: coronary heart disease, Cox proportional hazards regression, ROC curve, type 2 diabetes mellitus

Procedia PDF Downloads 216
38710 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano

Authors: Guo Wenyu, Qu Youli

Abstract:

A practical and simple self-indexing data structure, the Partitioned Elias-Fano (PEF) Compressed Suffix Array (CSA), is built in linear time for the CSA based on PEF indexes. Moreover, the PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the tested data in terms of compression ratio and counting and locating times, except for evenly distributed data such as the proteins dataset. The experiments indicate that the distribution of φ is more important than the alphabet size for the compression ratio: unevenly distributed φ yields better compression, and the larger the number of hits, the longer the counting and locating times.
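Partitioned Elias-Fano builds on plain Elias-Fano coding of a non-decreasing integer sequence: each value is split into l explicit low bits and a unary-coded high part. The sketch below shows the unpartitioned core only; the PEF-CSA itself adds partitioning and the suffix-array machinery, which are omitted here.

```python
def ef_encode(values, universe):
    """Elias-Fano encoding of a non-decreasing integer sequence over [0, universe)."""
    n = len(values)
    l = max(0, (universe // n).bit_length() - 1)  # number of explicit low bits
    lows = [v & ((1 << l) - 1) for v in values]
    highs = []  # gap between consecutive high parts in unary: gap zeros, then a one
    for i, v in enumerate(values):
        prev_high = values[i - 1] >> l if i else 0
        highs.extend([0] * ((v >> l) - prev_high) + [1])
    return l, lows, highs

def ef_decode(l, lows, highs):
    """Recover the original sequence from the low bits and unary high bits."""
    out, high = [], 0
    it = iter(highs)
    for low in lows:
        for bit in it:  # consume zeros (high-part increments) up to the next one
            if bit == 1:
                break
            high += 1
        out.append((high << l) | low)
    return out

# Classic example sequence from the Elias-Fano literature.
seq = [3, 4, 7, 13, 14, 15, 21, 43]
l, lows, highs = ef_encode(seq, 44)
decoded = ef_decode(l, lows, highs)
```

The high bitvector uses at most 2n bits regardless of the universe, which is why the structure compresses well when φ values cluster unevenly.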

Keywords: compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA

Procedia PDF Downloads 247
38709 Investigation of Adherence to Treatment, Perception, and Predictors of Adherence among Patients with End-Stage Renal Disease on Haemodialysis in the Eastern Region of Saudi Arabia: A Descriptive Cross-Sectional Study

Authors: Rima Al Garni, Emad Al Shdaifat, Sahar Elmetwalli, Mohammad Alzaid, Abdulrahman Alghothayyan, Sara Al Abd Al Hai, Seham Al Rashidi

Abstract:

Aim: To investigate the prevalence of non-adherence among patients on haemodialysis, explore their perception of the importance of adherence to the therapeutic regime, and estimate the predictors of adherence. Background: End-stage renal disease is commonly treated by haemodialysis. Haemodialysis alone is not effective in replacing kidney function: diet and fluid restrictions, along with supplementary medications, are mandatory for the survival and well-being of patients. Hence, adherence to this therapeutic regimen is essential. However, non-adherence to diet and fluid restrictions, medications, and dialysis is common among patients on haemodialysis. Design: A descriptive cross-sectional design was applied to investigate the prevalence of non-adherence to treatment, including adherence to diet and fluid restrictions, medications, and dialysis sessions. Methods: Structured interviews were conducted using the Arabic version of the End-Stage Renal Disease Adherence Questionnaire. The sample included 230 patients undergoing haemodialysis in the Eastern Region of Saudi Arabia. Data were analysed using descriptive statistics and multiple regression. Results/Findings: Most patients had good adherence (71.3%), and only 3.9% had poor adherence. Post hoc tests showed that divorced or widowed patients had higher adherence than single (P=0.011) and married participants (P=0.045), and that patients above 60 years had higher adherence than patients below 40 years (P=0.016). On the subscale measuring perception of the importance of adherence to the therapeutic regime, two-thirds of the patients had low scores (<=11). Conclusion: Adherence to the therapeutic regime is high for about three-fourths of patients undergoing haemodialysis in the Eastern Region of Saudi Arabia, a finding similar to results from the local literature.
This result helps highlight the needs of patients who are not compliant with their treatment plans and motivates investigation of the consequences of non-adherence on their well-being and general health, so that individualised therapeutic programmes can be planned to raise awareness and improve adherence to therapeutic regimes.

Keywords: adherence to treatment, haemodialysis, end stage renal disease, diet and fluid restrictions

Procedia PDF Downloads 83
38708 Privacy Preserving in Association Rule Mining on Horizontally Partitioned Database

Authors: Manvar Sagar, Nikul Virpariya

Abstract:

The advancement of data mining techniques plays an important role in many applications. In the context of privacy and security, the problems caused by association rule mining have been investigated by many researchers: it has been shown that misuse of this technique may reveal the database owner's sensitive and private information to others, and much effort has gone into preserving privacy in association rule mining. Of the two basic approaches to privacy-preserving data mining, randomization-based and cryptography-based, the latter provides a high level of privacy but incurs higher computational and communication overheads. Hence, it is necessary to explore alternative techniques that reduce these overheads. In this work, we propose an efficient, collusion-resistant, cryptography-based approach to distributed association rule mining using Shamir's secret sharing scheme. As we show through theoretical and practical analysis, our approach is provably secure and requires a trusted third party only once. We use secret sharing to share information privately and a code-based identification scheme to add protection against malicious adversaries.
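The cryptographic primitive named above, Shamir's (k, n) secret sharing, splits a secret into n shares so that any k of them reconstruct it by Lagrange interpolation over a prime field, while fewer than k reveal nothing. A minimal sketch of the primitive alone follows; the paper's distributed mining protocol and code-based identification scheme are not shown.

```python
import random

PRIME = 2 ** 127 - 1  # a Mersenne prime defining the finite field

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it (Shamir, 1979)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the polynomial at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, k=3, n=5)
recovered = reconstruct(shares[:3])  # any 3 of the 5 shares suffice
```

In the distributed setting, each site would contribute shares of its local support counts so the global frequent itemsets can be computed without any site revealing its raw data.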

Keywords: privacy, privacy preservation in data mining (PPDM), horizontally partitioned database, EMHS, MFI, Shamir's secret sharing

Procedia PDF Downloads 398
38707 Parental Awareness and Willingness to Vaccinate Adolescent Daughters against Human Papilloma Virus for Cervical Cancer Prevention in Eastern Region of Kenya: Towards Affirmative Action

Authors: Jacinta Musyoka, Wesley Too

Abstract:

Cervical cancer is the second leading cause of cancer-related deaths in Kenya and the second most common cancer among women, yet it is preventable through the strategies put in place, which include vaccination of young adolescent girls with the Human Papilloma Virus (HPV) vaccine. Kenya carries a high burden of cervical cancer, a leading cause of death among women of reproductive age, and this burden is expected to double by 2025 if the necessary steps, vaccinating girls between the ages of 9 and 14 and screening women, are not taken. Parental decisions are critical in ensuring that daughters receive this vaccine. Hence, this study sought to establish parental willingness, and the factors associated with acceptability, to vaccinate adolescent daughters against the human papilloma virus for cervical cancer prevention in Machakos County, Eastern Region of Kenya. Method: A cross-sectional study design utilizing a mixed-methods approach was used to collect data from Nguluni Health Centre in Matungulu sub-county, Machakos County, Kenya. The study targeted all parents of adolescent girls seeking health care services in the Matungulu sub-county area who were aged 18 years and above. A total of 220 parents with adolescent girls aged 10-14 years were enrolled into the study after informed consent was obtained. All ethical considerations were observed. Quantitative data were analyzed using multivariate regression analysis, and thematic analysis was used for qualitative data on parents' perceptions of the HPV vaccine. Results, conclusions, and recommendations: ongoing. We expect to report findings and articulate contributions based on the study findings before October 2022.

Keywords: adolescents, human papilloma virus, Kenya, parents

Procedia PDF Downloads 106
38706 Speech Perception by Video Hosting Services Actors: Urban Planning Conflicts

Authors: M. Pilgun

Abstract:

The report presents the results of a study of how actors on video hosting services perceive speech, based on material from urban planning conflicts. To analyze the content, a multimodal approach using neural network technologies is employed. Analysis of word associations and the associative networks of relevant stimuli revealed the actors' evaluative reactions, and the analysis identified key topics that generated negative and positive perceptions among the participants. Calculating social stress and social well-being indices from user-generated content made it possible to rank road transport construction projects by the degree of negative and positive perception among actors.

Keywords: social media, speech perception, video hosting, networks

Procedia PDF Downloads 141
38705 Data, Digital Identity and Antitrust Law: An Exploratory Study of Facebook’s Novi Digital Wallet

Authors: Wanjiku Karanja

Abstract:

Facebook has monopoly power in the social networking market. It has grown and entrenched this power through the capture of its users' data value chains. However, antitrust law's consumer welfare roots have prevented it from effectively addressing the role of data capture in Facebook's market dominance. These regulatory blind spots are augmented in Facebook's proposed Diem cryptocurrency project and its Novi digital wallet. Novi, Diem's digital identity component, will enable Facebook to collect an unprecedented volume of consumer data. Consequently, Novi has seismic implications for internet identity, as the network effects of Facebook's large user base could establish it as the de facto internet identity layer. Moreover, the large tracts of data Facebook will collect through Novi will further entrench its market power, and the attendant lock-in effects of the project would be very difficult to reverse. Urgent regulatory action is therefore required to prevent this expansion of Facebook's data resources and monopoly power. This research thus highlights the importance of data capture to competition and market health in the social networking industry. It utilizes interviews with key experts to empirically interrogate the impact of Facebook's data capture, and its control of users' data value chains, on its market power. This inquiry is contextualized against Novi's expansive effect on Facebook's data value chains, and it thus addresses the novel antitrust issues arising at the nexus of Facebook's monopoly power and the privacy of its users' data. It also explores the impact of platform design principles, specifically data portability and data interoperability, in mitigating Facebook's anti-competitive practices. As such, this study finds that Facebook is a powerful monopoly that dominates the social media industry to the detriment of potential competitors.
Facebook derives its power from its size, its capture of the consumer data value chain, and its control of users' social graphs. Additionally, the platform design principles of data interoperability and data portability are not a panacea for restoring competition in the social networking market: their success depends on the establishment of robust technical standards and regulatory frameworks.

Keywords: antitrust law, data protection law, data portability, data interoperability, digital identity, Facebook

Procedia PDF Downloads 117
38704 Recommendations for Data Quality Filtering of Opportunistic Species Occurrence Data

Authors: Camille Van Eupen, Dirk Maes, Marc Herremans, Kristijn R. R. Swinnen, Ben Somers, Stijn Luca

Abstract:

In ecology, species distribution models are commonly implemented to study species-environment relationships. These models increasingly rely on opportunistic citizen science data when high-quality species records collected through standardized recording protocols are unavailable. While these opportunistic data are abundant, uncertainty is usually high, e.g., due to observer effects or a lack of metadata. Data quality filtering is often used to reduce these types of uncertainty in an attempt to increase the value of studies relying on opportunistic data. However, filtering should not be performed blindly. In this study, recommendations are developed for the data quality filtering of opportunistic species occurrence data used as input for species distribution models. Using an extensive database of 5.7 million citizen science records from 255 species in Flanders, the impact on model performance was quantified by applying three data quality filters, and these results were linked to species traits. More specifically, presence records were filtered based on record attributes that provide information on the observation process or post-entry data validation, and changes in the area under the receiver operating characteristic curve (AUC), sensitivity, and specificity were analyzed using the Maxent algorithm with and without filtering. Controlling for sample size enabled us to study the combined impact of data quality filtering, i.e., the simultaneous impact of an increase in data quality and a decrease in sample size. Further, the variation among species in their response to data quality filtering was explored by clustering species based on four traits often related to data quality: commonness, popularity, difficulty, and body size.
Findings show that model performance is affected by i) the quality of the filtered data, ii) the proportional reduction in sample size caused by filtering and the remaining absolute sample size, and iii) a species 'quality profile', resulting from a classification of species based on the four traits related to data quality. These findings resulted in recommendations on when and how to filter volunteer-generated and opportunistically collected data. This study confirms that correctly processed citizen science data can make a valuable contribution to ecological research and species conservation.
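The AUC changes analysed above can be computed directly from model scores and presence/absence labels via the Mann-Whitney (rank) formulation: the AUC is the probability that a random presence outranks a random absence. The sketch below uses hypothetical scores invented to illustrate a filter that improves ranking; they are not the study's Maxent outputs.

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney formulation (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0 for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical presence (1) / absence (0) labels with model scores
# before and after a data-quality filter is applied to the training data.
labels = [1, 1, 1, 0, 0, 0]
scores_raw = [0.8, 0.6, 0.4, 0.7, 0.3, 0.2]
scores_filtered = [0.9, 0.8, 0.6, 0.5, 0.3, 0.1]
auc_raw = auc(scores_raw, labels)
auc_filtered = auc(scores_filtered, labels)
```

Comparing such paired AUCs while holding sample size fixed is, in spirit, how the combined effect of higher data quality and smaller samples can be separated.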

Keywords: citizen science, data quality filtering, species distribution models, trait profiles

Procedia PDF Downloads 193
38703 Factors Promoting French-English Tweets in France

Authors: Taoues Hadour

Abstract:

Twitter has become a popular means of communication in a variety of fields, such as politics, journalism, and academia. This widely used online platform influences the way people express themselves and is changing language usage worldwide at an unprecedented pace. The language used online reflects the linguistic battle that has been going on in French society for several decades, and this study enables a deeper understanding of users' linguistic behavior online. The implications are important and raise awareness of intercultural and cross-language exchanges. This project investigates mixed French-English language usage among French users of Twitter using a topic analysis approach, drawing on Gumperz's theory of conversational code-switching. To collect tweets at scale, data were gathered in R (using the RStudio integrated development environment) with the rtweet package, which retrieves French tweets through Twitter's REST and streaming APIs (Application Programming Interfaces). The dataset was filtered manually, and certain recurring themes were observed. A total of nine topic categories were identified and analyzed in this study: entertainment, internet/social media, events/community, politics/news, sports, sex/pornography, innovation/technology, fashion/make-up, and business. The study reveals that entertainment, which includes movies, music, games, and books, is the most frequent topic discussed on Twitter. Anglicisms such as trailer, spoil, and live were identified in the data. Change in language usage is inevitable and a natural result of linguistic contact; the use of different languages online is an example of what the world would look like without linguistic regulation. Social media reveals a multicultural and multilingual richness that can deepen and expand our understanding of contemporary attitudes.
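The manual topic categorisation described above could, in principle, be approximated by keyword matching against per-topic word lists. The sketch below is purely illustrative: the keyword lists are invented for this example and are not the study's coding scheme.

```python
# Hypothetical keyword lists for a few of the nine topic categories; the real
# study categorised tweets manually, so these lists are illustrative only.
TOPIC_KEYWORDS = {
    "entertainment": ["trailer", "spoil", "live", "film", "musique"],
    "sports": ["match", "but", "score"],
    "politics/news": ["élection", "gouvernement", "loi"],
}

def categorize(tweet):
    """Return every topic whose keyword list matches the lowercased tweet text."""
    text = tweet.lower()
    return [topic for topic, words in TOPIC_KEYWORDS.items()
            if any(w in text for w in words)]

topics = categorize("Le trailer du film sort ce soir, ne me spoil pas !")
```

Note that the example tweet mixes French with the anglicisms "trailer" and "spoil", the kind of code-switching the study documents.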

Keywords: code-switching, French, sociolinguistics, Twitter

Procedia PDF Downloads 129