Search results for: multivariate data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24620

24050 Determinants of Domestic Violence among Married Women Aged 15-49 Years in Sierra Leone by an Intimate Partner: A Cross-Sectional Study

Authors: Tesfaldet Mekonnen Estifanos, Chen Hui, Afewerki Weldezgi

Abstract:

Background: Intimate partner violence (hereafter IPV) is a major global public health challenge that tortures and disables women in the place where they ought to be most secure: within their own families. Because the family unit is commonly viewed as a private sphere, violent acts against women remain under-recognized. There is limited research and knowledge about the factors linked to IPV in Sierra Leone. This study, therefore, estimates the prevalence of IPV and the factors that predict it. Methods: Data were taken from the Sierra Leone Demographic and Health Survey (SDHS, 2013), the first of its kind to incorporate information on domestic violence. A multistage cluster sampling design was used, and information was gathered with a standard questionnaire. A total of 5185 selected respondents were interviewed, of whom 870 had never been in union and were therefore excluded. To analyze the two dependent variables, experience of IPV 'ever' and 'in the 12 months prior to the survey', a total of 4315 women (currently or formerly married) and 4029 women (currently in union) were included, respectively. These dependent variables were constructed from three forms of violence, namely physical, emotional and sexual. Data analysis was carried out using SPSS version 23 in a three-step process. First, descriptive statistics were used to show the frequency distribution of both the outcome and explanatory variables. Second, bivariate analysis using the chi-square test was applied to assess the individual relationship between the outcome and explanatory variables. Third, multivariate logistic regression analysis was undertaken using a hierarchical modeling strategy to identify the influence of the explanatory variables on the outcome variables. Odds ratios (OR) and 95% confidence intervals (CI) were used to examine the associations, with p-values less than 0.05 considered statistically significant. Results: The prevalence of lifetime IPV among ever-married women was 48.4%, while 39.8% of those currently married experienced IPV in the year preceding the survey. Women with 1 to 4 and with 5 or more children ever born were more likely to experience lifetime IPV. However, women who owned property, and those who cited 3-5 reasons for which wife-beating is acceptable, were less likely to experience lifetime IPV. Witnessing parental violence, a partner's controlling marital behavior, and being afraid of the partner were associated with both experience of IPV 'ever' and 'in the year prior to the survey'. Respondents who agreed that wife-beating is justified in certain situations, and those in professional occupations, had lower odds of reporting IPV in the year prior to data collection. Conclusion: This study indicates that the factors significantly correlated with IPV in Sierra Leone are mostly husband-related, specifically marital controlling behaviors. Addressing IPV in Sierra Leone requires joint efforts that target men, raise awareness of controlling behavior, and promote safety within relationships.
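
As an illustrative aside (not part of the original abstract), a minimal sketch of the kind of multivariate logistic regression the study describes, using statsmodels on synthetic data; the variable names and data are hypothetical, not the SDHS.

```python
# Hypothetical sketch of a multivariate logistic regression reporting ORs and 95% CIs,
# in the spirit of the hierarchical modeling described above (synthetic data, not SDHS).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "witnessed_parental_violence": rng.integers(0, 2, n),
    "partner_controlling_behavior": rng.integers(0, 2, n),
    "owns_property": rng.integers(0, 2, n),
})
# Synthetic outcome loosely related to the predictors
logit = (-0.5 + 0.9 * df["partner_controlling_behavior"]
         + 0.6 * df["witnessed_parental_violence"] - 0.4 * df["owns_property"])
df["ipv_ever"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["witnessed_parental_violence",
                        "partner_controlling_behavior", "owns_property"]])
model = sm.Logit(df["ipv_ever"], X).fit(disp=False)

# Odds ratios with 95% confidence intervals and p-values
or_table = pd.DataFrame({"OR": np.exp(model.params),
                         "CI_low": np.exp(model.conf_int()[0]),
                         "CI_high": np.exp(model.conf_int()[1]),
                         "p": model.pvalues})
print(or_table.round(3))
```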

Keywords: husband behavior, married women, partner violence, Sierra Leone

Procedia PDF Downloads 110
24049 A Novel Mediterranean Diet Index from the Middle East and North Africa Region: Comparison with Europe

Authors: Farah Naja, Nahla Hwalla, Leila Itani, Shirine Baalbaki, Abla Sibai, Lara Nasreddine

Abstract:

Purpose: To propose an index for assessing adherence to a Middle-Eastern version of the Mediterranean diet, as represented by the traditional Lebanese Mediterranean diet (LMD); to evaluate the association between the LMD index and selected European Mediterranean diet (EMD) indexes; and to examine the socio-demographic and lifestyle correlates of adherence to the Mediterranean diet (MD) among Lebanese adults. Methods: Using nationally representative dietary intake data of Lebanese adults, an index to measure adherence to the LMD was derived. The choice of food groups used for calculating the LMD score was based on the results of previous factor analyses conducted on the same dataset. These food groups included fruits, vegetables, legumes, olive oil, burghol, dairy products, starchy vegetables, dried fruits, and eggs. Using Pearson's correlation and agreement between score tertile distributions, the derived LMD index was compared to previously published EMD indexes from Greece, Spain, Italy, France, and EPIC. Results: Fruits, vegetables and olive oil were common denominators to all MD scores. Food groups specific to the LMD included burghol and dried fruits. The LMD score correlated significantly with the EMD scores, being closest to the Italian (r=0.57) and farthest from the French (r=0.21). Percent agreement between the scores' tertile distributions and kappa statistics confirmed these findings. Multivariate linear regression showed that older age, higher educational level, female gender, and healthy lifestyle characteristics were associated with increased adherence to all the MD scores studied. Conclusion: A novel LMD index was proposed to characterize the Mediterranean diet in Lebanon, complementing international efforts to characterize the MD and its association with disease risk.
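
As a hedged illustration of the comparison step (not taken from the paper), a short sketch computing Pearson's correlation and tertile agreement (percent agreement and Cohen's kappa) between two hypothetical diet scores:

```python
# Hypothetical sketch: comparing two diet-adherence scores via Pearson correlation
# and agreement of tertile assignments (Cohen's kappa). Data are synthetic.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
lmd_score = rng.normal(size=500)
emd_score = 0.6 * lmd_score + rng.normal(scale=0.8, size=500)  # correlated by construction

r, p = pearsonr(lmd_score, emd_score)

# Tertile assignment (0, 1, 2) for each score, then percent agreement and kappa
lmd_tertile = pd.qcut(lmd_score, 3, labels=False)
emd_tertile = pd.qcut(emd_score, 3, labels=False)
percent_agreement = np.mean(lmd_tertile == emd_tertile)
kappa = cohen_kappa_score(lmd_tertile, emd_tertile)

print(f"r={r:.2f} (p={p:.3g}), agreement={percent_agreement:.0%}, kappa={kappa:.2f}")
```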

Keywords: mediterranean diet, adherence, Middle-East, Lebanon, Europe

Procedia PDF Downloads 390
24048 Quantification of Lawsone and Adulterants in Commercial Henna Products

Authors: Ruchi B. Semwal, Deepak K. Semwal, Thobile A. N. Nkosi, Alvaro M. Viljoen

Abstract:

Lawsonia inermis L. (Lythraceae), commonly known as henna, has many medicinal benefits and is used in folk medicine as a remedy for diarrhoea, cancer, inflammation, headache, jaundice and skin diseases. Although long used for hair dyeing and temporary tattooing, henna body art has grown in popularity over the last 15 years and has changed from a traditional bridal and festival adornment into an exotic fashion accessory. The naphthoquinone lawsone is one of the main constituents of the plant and is responsible for its dyeing property. Henna leaves typically contain 1.8–1.9% lawsone, which is used as a marker compound for the quality control of henna products. Adulteration of henna with various toxic chemicals such as p-phenylenediamine, p-methylaminophenol, p-aminobenzene and p-toluenodiamine to produce a variety of colours is very common and has resulted in serious health problems, including allergic reactions. This study aims to assess the quality of henna products collected from different parts of the world by determining the lawsone content, as well as the concentrations of any adulterants present. Ultra-high performance liquid chromatography-mass spectrometry (UPLC-MS) was used to determine the lawsone concentrations in 172 henna products. Separation of the chemical constituents was achieved on an Acquity UPLC BEH C18 column using gradient elution (0.1% formic acid and acetonitrile). The UPLC-MS results revealed that of the 172 henna products, 11 contained 1.0-1.8% lawsone and 110 contained 0.1-0.9% lawsone, whereas 51 samples did not contain detectable levels of lawsone. High performance thin layer chromatography was investigated as a cheaper, more rapid technique for the quality control of henna with respect to the lawsone content. The samples were applied with an automatic TLC Sampler 4 (CAMAG) to pre-coated silica plates, which were subsequently developed with acetic acid, acetone and toluene (0.5:1.0:8.5 v/v). A Reprostar 3 digital system allowed the images to be captured. The results obtained corresponded to those from the UPLC-MS analysis. Vibrational spectroscopy analysis (MIR or NIR) of the powdered henna, followed by chemometric modelling of the data, indicates that this technique shows promise as an alternative quality control method. Principal component analysis (PCA) was used to explore the data by observing clustering and identifying outliers. Partial least squares (PLS) multivariate calibration models were constructed for the quantification of lawsone. In conclusion, only a few of the samples analysed contained lawsone in high concentrations, indicating that most are of poor quality. The presence of adulterants that may have been added to enhance the dyeing properties of the products is currently being investigated.
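
As an illustrative sketch (not from the study), a PCA exploration plus a PLS calibration model of the kind described, applied to synthetic spectra; the data, wavelength grid and component counts are all hypothetical.

```python
# Hypothetical sketch: PCA for exploring spectral data and a PLS calibration model
# for quantifying an analyte (e.g., lawsone) from vibrational spectra. Synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_samples, n_wavelengths = 120, 400
concentration = rng.uniform(0.0, 2.0, n_samples)        # % lawsone (synthetic)
basis = rng.normal(size=n_wavelengths)                   # pseudo absorption profile
spectra = (np.outer(concentration, basis)
           + rng.normal(scale=0.3, size=(n_samples, n_wavelengths)))

# PCA: look at clustering / outliers in the first two score dimensions
scores = PCA(n_components=2).fit_transform(spectra)

# PLS calibration: predict concentration from the spectra
X_train, X_test, y_train, y_test = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=5).fit(X_train, y_train)
print("R^2 on held-out samples:", round(pls.score(X_test, y_test), 3))
```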

Keywords: Lawsonia inermis, paraphenylenediamine, temporary tattooing, lawsone

Procedia PDF Downloads 438
24047 Big Data Strategy for Telco: Network Transformation

Authors: F. Amin, S. Feizi

Abstract:

Big data has the potential to improve the quality of services; enable the infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents new methods to reverse this trend and improve profitability. The benefits of big data and next-generation networks, however, go well beyond improved customer relationship management. Next-generation networks are in a prime position to monetize rich supplies of customer information, while being mindful of legal and privacy issues. Transforming data assets into new revenue streams will become integral to high performance.

Keywords: big data, next generation networks, network transformation, strategy

Procedia PDF Downloads 340
24046 REDUCER: An Architectural Design Pattern for Reducing Large and Noisy Data Sets

Authors: Apkar Salatian

Abstract:

To relieve the burden of reasoning on a point-to-point basis, in many domains there is a need to reduce large and noisy data sets into trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored to particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies or noise; and Compression, which takes the filtered data and derives trends in the data. We also show how REDUCER has been successfully applied to three different case studies.
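
As a hedged sketch (not the authors' implementation), one way to realize the two REDUCER stages in code, with a median filter standing in for the Filter step and windowed linear fits standing in for the Compression step; the window sizes are arbitrary.

```python
# Hypothetical sketch of the REDUCER pattern: Filter (noise/outlier removal)
# followed by Compression (trend derivation). Parameters are illustrative only.
import numpy as np
from scipy.signal import medfilt

def reducer(data, filter_kernel=5, trend_window=20):
    # Filter: a median filter suppresses spikes/outliers in the raw signal
    filtered = medfilt(np.asarray(data, dtype=float), kernel_size=filter_kernel)

    # Compression: fit a straight line to each window and keep only its slope,
    # turning the signal into a coarse sequence of trends (rising/steady/falling)
    trends = []
    for start in range(0, len(filtered) - trend_window + 1, trend_window):
        window = filtered[start:start + trend_window]
        slope, _ = np.polyfit(np.arange(trend_window), window, 1)
        trends.append((start, start + trend_window, float(slope)))
    return trends

signal = np.sin(np.linspace(0, 6, 200)) + np.random.default_rng(3).normal(scale=0.2, size=200)
print(reducer(signal)[:3])  # first few (start, end, slope) trend segments
```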

Keywords: design pattern, filtering, compression, architectural design

Procedia PDF Downloads 192
24045 Fuzzy Expert Systems Applied to Intelligent Design of Data Centers

Authors: Mario M. Figueroa de la Cruz, Claudia I. Solorzano, Raul Acosta, Ignacio Funes

Abstract:

This technological development project seeks to create a tool that allows companies that need to implement a data center to intelligently determine the factors for allocating cooling and power supply (UPS) resources at the design stage. The results should clearly show the speed, robustness and reliability of a system designed for deployment in environments that must manage and protect large volumes of data.

Keywords: telecommunications, data center, fuzzy logic, expert systems

Procedia PDF Downloads 327
24044 Genetic Testing and Research in South Africa: The Sharing of Data Across Borders

Authors: Amy Gooden, Meshandren Naidoo

Abstract:

Genetic research is not confined to a particular jurisdiction. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA laws cover the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position where genetic data comes into the country depends on the laws of the country from where it is sent – making the legal position less clear.

Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa

Procedia PDF Downloads 136
24043 Comprehensive Profiling and Characterization of Untargeted Extracellular Metabolites in Fermentation Processes: Insights and Advances in Analysis and Identification

Authors: Marianna Ciaccia, Gennaro Agrimi, Isabella Pisano, Maurizio Bettiga, Silvia Rapacioli, Giulia Mensa, Monica Marzagalli

Abstract:

Objective: Untargeted metabolomic analysis of extracellular metabolites is a powerful approach that comprehensively profiles metabolites in the extracellular space. In this study, we applied extracellular metabolomic analysis to investigate the metabolism of two probiotic microorganisms whose health benefits extend far beyond the digestive tract and the immune system. Methods: The analytical techniques employed in extracellular metabolomic analysis encompass various technologies, including mass spectrometry (MS), which enables the identification of metabolites present in the fermentation media as well as the comparison of metabolic profiles under different experimental conditions. Multivariate statistical techniques such as principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA) play a crucial role in uncovering metabolic signatures and understanding the dynamics of metabolic networks. Results: Different types of supernatants from the fermentation processes, such as dairy and dairy-free media, cell-free media, and pasteurized media, were subjected to metabolite profiling; they contained a complex mixture of metabolites, including substrates, intermediates, and end-products. This profiling provided insights into the metabolic activity of the microorganisms. The integration of advanced software tools facilitated the identification and characterization of metabolites across different fermentation conditions and microorganism strains. Conclusions: Untargeted extracellular metabolomic analysis, combined with software tools, allowed the study of the metabolites consumed and produced during the fermentation of probiotic microorganisms. Ongoing advancements in data analysis methods will further enhance the application of extracellular metabolomic analysis in fermentation research, leading to improved bioproduction and more sustainable manufacturing processes.

Keywords: biotechnology, metabolomics, lactic bacteria, probiotics, postbiotics

Procedia PDF Downloads 47
24042 Role of P53, KI67 and Cyclin A Immunohistochemical Assay in Predicting Wilms’ Tumor Mortality

Authors: Ahmed Atwa, Ashraf Hafez, Mohamed Abdelhameed, Adel Nabeeh, Mohamed Dawaba, Tamer Helmy

Abstract:

Introduction and Objective: Tumour staging and grading do not always reflect the future behaviour of Wilms' tumour (WT) with regard to mortality. Therefore, in this study, p53, Ki67 and cyclin A immunohistochemistry (IHC) were used in an attempt to predict WT cancer-specific survival (CSS). Methods: In this nonconcurrent cohort study, patients' archived data, including age at presentation, gender, history, clinical examination and radiological investigations, were retrieved; the patients were then reviewed at the outpatient clinic of a tertiary care centre by history-taking, clinical examination and radiological investigations to determine the oncological outcome. Cases that received preoperative chemotherapy or died of causes other than WT were excluded. Formalin-fixed, paraffin-embedded specimens obtained from the previously preserved blocks at the pathology laboratory were mounted on positively charged slides for IHC with p53, Ki67 and cyclin A. All specimens were examined by an experienced histopathologist devoted to urological practice and blinded to the patients' clinical findings. p53 and cyclin A staining were scored as 0 (no nuclear staining), 1 (<10% nuclear staining), 2 (10-50% nuclear staining) and 3 (>50% nuclear staining). The Ki67 proliferation index (PI) was graded as low, borderline or high. Results: Of the 75 cases, 40 (53.3%) were males and 35 (46.7%) were females, and the median age was 36 months (range 2-216). With a mean follow-up of 78.6±31 months, cancer-specific mortality (CSM) occurred in 15 (20%) and 11 (14.7%) patients, respectively. The Kaplan-Meier curve was used for survival analysis, and groups were compared using the log-rank test. Multivariate logistic regression and Cox regression were not used because only one variable (cyclin A) showed statistical significance (P=.02), whereas the other significant factor (residual tumour) had few cases. Conclusions: Cyclin A IHC should be considered as a marker for the prediction of WT CSS. Prospective studies with a larger sample size are needed.
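
As an illustrative, hedged sketch (not the study's data), Kaplan-Meier survival curves compared with a log-rank test using the lifelines library on synthetic groups; the grouping by cyclin A staining is hypothetical.

```python
# Hypothetical sketch: Kaplan-Meier estimation and a log-rank comparison of two groups
# (e.g., high vs. low cyclin A staining), using synthetic follow-up times.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
# Synthetic follow-up times (months) and event indicators (1 = cancer-specific death)
t_high = rng.exponential(60, 40); e_high = rng.random(40) < 0.5
t_low = rng.exponential(120, 35); e_low = rng.random(35) < 0.3

kmf = KaplanMeierFitter()
kmf.fit(t_high, event_observed=e_high, label="high cyclin A")
print("median survival (months):", kmf.median_survival_time_)

result = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
print("log-rank p-value:", round(result.p_value, 4))
```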

Keywords: wilms’ tumour, nephroblastoma, urology, survival

Procedia PDF Downloads 50
24041 Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems

Authors: Baris Can Yalcin

Abstract:

Motion sensors are commonly used as valuable components in mechatronic systems; however, many mechatronic designs and applications that need motion sensors cost an enormous amount of money, especially high-tech systems. Designing software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low-cost motion data acquisition setup consisting of an MPU 6050 motion sensor (3-axis gyroscope and accelerometer) and an Arduino Mega2560 microcontroller. The design covers calibration of the sensor, identification of and communication between the sensor and the data acquisition card, and interpretation of the data collected by the sensor.
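
As a hedged illustration (not the paper's software), a host-side Python sketch that reads MPU-6050 samples streamed by an Arduino over a serial port; the port name and the comma-separated line format are assumptions, and the scale factors correspond to the sensor's default ±2 g / ±250 °/s ranges.

```python
# Hypothetical sketch: reading accelerometer/gyro samples sent by the Arduino as
# CSV lines "ax,ay,az,gx,gy,gz" (raw 16-bit values) and converting to physical units.
import serial  # pyserial

PORT = "/dev/ttyACM0"        # assumption: adjust to the actual Arduino port
ACCEL_LSB_PER_G = 16384.0    # MPU-6050 default +/-2 g range
GYRO_LSB_PER_DPS = 131.0     # MPU-6050 default +/-250 deg/s range

with serial.Serial(PORT, 115200, timeout=1) as ser:
    for _ in range(100):                      # read 100 samples, then stop
        line = ser.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            ax, ay, az, gx, gy, gz = (int(v) for v in line.split(","))
        except ValueError:
            continue                          # skip malformed lines
        accel_g = [v / ACCEL_LSB_PER_G for v in (ax, ay, az)]
        gyro_dps = [v / GYRO_LSB_PER_DPS for v in (gx, gy, gz)]
        print(accel_g, gyro_dps)
```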

Keywords: design, mechatronics, motion sensor, data acquisition

Procedia PDF Downloads 563
24040 Determinants of Cessation of Exclusive Breastfeeding in Ankesha Guagusa Woreda, Awi Zone, Northwest Ethiopia: A Cross-Sectional Study

Authors: Tebikew Yeneabat, Tefera Belachew, Muluneh Haile

Abstract:

Background: Exclusive breast-feeding (EBF) is the practice of feeding only breast milk (including expressed breast milk) during the first six months of life, with no other liquids or solid foods except medications. The time to cessation of exclusive breast-feeding, however, differs across countries depending on different factors. Studies have shown that the risk of diarrhoeal morbidity and mortality is higher among non-exclusively breast-fed infants, and is common when other foods are introduced. However, no study has evaluated the time to cessation of exclusive breast-feeding in the study area. The aim of this study was to determine the time to cessation of EBF and its predictors among mothers of index infants less than twelve months old. Methods: We conducted a community-based cross-sectional study from February 13 to March 3, 2012 using both quantitative and qualitative methods. The study included a total of 592 mothers of index infants selected with a multi-stage sampling method. Data were collected using an interviewer-administered structured questionnaire. Bivariate and multivariate Cox regression analyses were performed. Results: Cessation of exclusive breast-feeding occurred in 392 (69.63%) cases. Of these, 224 (57.1%) occurred before six months, while 145 (37.0%) and 23 (5.9%) occurred at six months and after six months of age of the index infant, respectively. The median time for infants to stay on exclusive breast-feeding was 6.36 months in rural areas and 5.13 months in urban areas, and this difference was statistically significant on a log-rank (Cox-Mantel) test. Maternal and paternal occupation, place of residence, postnatal counseling on exclusive breast-feeding, mode of delivery, and birth order of the index infant were significant predictors of cessation of exclusive breast-feeding. Conclusion: Providing postnatal counseling on EBF, together with routine follow-up and support of mothers with infants, with particular emphasis on working mothers, can help implement the national strategy on infant and young child feeding.
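
As an illustrative, hedged sketch (not the study's data), a Cox proportional hazards model for time to cessation of EBF fitted with lifelines on synthetic data; the covariate names are assumptions.

```python
# Hypothetical sketch: Cox regression for time to cessation of exclusive breast-feeding,
# with synthetic durations (months), an event indicator, and two illustrative covariates.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 400
df = pd.DataFrame({
    "duration_months": rng.uniform(1, 12, n),
    "ceased_ebf": rng.integers(0, 2, n),          # 1 = cessation observed
    "urban_residence": rng.integers(0, 2, n),
    "postnatal_counseling": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_months", event_col="ceased_ebf")
print(cph.summary[["exp(coef)", "p"]])  # hazard ratios and p-values
```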

Keywords: exclusive breastfeeding, cessation, median duration, Ankesha Guagusa Woreda

Procedia PDF Downloads 295
24039 Speed Characteristics of Mixed Traffic Flow on Urban Arterials

Authors: Ashish Dhamaniya, Satish Chandra

Abstract:

Speed and traffic volume data were collected on different sections of four-lane and six-lane roads in three metropolitan cities in India. The speed data were analyzed to fit statistical distributions to the speeds of individual vehicle categories and to the combined speeds of all vehicles. It is noted that the speed data of individual vehicle categories generally follow a normal distribution, but the speed data of all vehicles combined at a section of urban road may or may not follow a normal distribution, depending upon the composition of the traffic stream. A new term, the Speed Spread Ratio (SSR), is introduced in this paper; it is the ratio of the difference between the 85th and 50th percentile speeds to the difference between the 50th and 15th percentile speeds. If the SSR is unity, then the speed data are truly normally distributed. It is noted that on six-lane urban roads, speed data follow a normal distribution only when the SSR is in the range 0.86-1.11. The range of SSR was also validated on four-lane roads.
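
As a hedged illustration (not the paper's dataset), the SSR computation on a synthetic sample of spot speeds:

```python
# Hypothetical sketch: Speed Spread Ratio SSR = (V85 - V50) / (V50 - V15),
# computed from a synthetic sample of spot speeds (km/h).
import numpy as np

rng = np.random.default_rng(6)
speeds = rng.normal(loc=45, scale=8, size=1000)   # synthetic spot speeds

v15, v50, v85 = np.percentile(speeds, [15, 50, 85])
ssr = (v85 - v50) / (v50 - v15)
print(f"V15={v15:.1f}, V50={v50:.1f}, V85={v85:.1f}, SSR={ssr:.2f}")
# For a truly normal sample the SSR should be close to 1.
```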

Keywords: normal distribution, percentile speed, speed spread ratio, traffic volume

Procedia PDF Downloads 394
24038 An Exploratory Analysis of Brisbane's Commuter Travel Patterns Using Smart Card Data

Authors: Ming Wei

Abstract:

Over the past two decades, location-based service (LBS) data have been increasingly applied to urban and transportation studies due to their comprehensiveness and consistency. However, compared to other LBS data, including mobile phone data, GPS, and social networking platforms, smart card data collected from public transport users have arguably yet to be fully exploited in urban systems analysis. Using five weekdays of passenger travel transaction data taken from go card, Southeast Queensland's transit smart card, this paper analyses the spatiotemporal distribution of passenger movement with regard to land use patterns in Brisbane. Work and residential places for public transport commuters were identified after extracting journey-to-work patterns. Our results show that the workplaces identified from the go card data and the residential suburbs are largely consistent with those marked in the land use map. However, the intensity of some residential locations, in terms of population or commuter densities, does not match well between the map and the go card data. This indicates a certain degree of misalignment between residential areas and workplaces, shedding light on how enhancements to service management and infrastructure expansion might be undertaken.
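
As a hedged sketch (not the go card pipeline), one simple way to infer home and work stops per card from tap transactions with pandas; the column names and heuristics (first morning boarding ≈ home, most frequent morning destination ≈ work) are assumptions.

```python
# Hypothetical sketch: inferring home/work stops from smart card transactions.
# Assumed columns: card_id, timestamp, boarding_stop, alighting_stop.
import pandas as pd

def infer_home_work(trips: pd.DataFrame) -> pd.DataFrame:
    morning = trips[trips["timestamp"].dt.hour < 10].copy()
    morning["date"] = morning["timestamp"].dt.date

    # Heuristic 1: modal first boarding stop of each morning ~ home
    first_morning = (morning.sort_values("timestamp")
                            .groupby(["card_id", "date"], as_index=False).first())
    home = first_morning.groupby("card_id")["boarding_stop"].agg(lambda s: s.mode().iloc[0])

    # Heuristic 2: modal morning destination ~ work
    work = morning.groupby("card_id")["alighting_stop"].agg(lambda s: s.mode().iloc[0])

    return pd.DataFrame({"home_stop": home, "work_stop": work})

demo = pd.DataFrame({
    "card_id": [1, 1, 1, 1],
    "timestamp": pd.to_datetime(["2015-03-02 07:30", "2015-03-02 17:10",
                                 "2015-03-03 07:35", "2015-03-03 17:20"]),
    "boarding_stop": ["A", "C", "A", "C"],
    "alighting_stop": ["C", "A", "C", "A"],
})
print(infer_home_work(demo))   # home_stop A, work_stop C for card 1
```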

Keywords: big data, smart card data, travel pattern, land use

Procedia PDF Downloads 264
24037 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process

Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek

Abstract:

As big data analysis becomes increasingly important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect the final test results using die-map-based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analysis has limitations, as the test data are less directly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than the existing die data. The study consists of three phases. In the first phase, a die map is created from the fail-bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. In the third phase, patterns that affect the final test result are identified. Finally, the proposed three-step process is applied to actual industrial data, and the experimental results show its potential for field application.
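
As a hedged illustration of the second phase (not the authors' pipeline), clustering per-die fail-bit maps with k-means after flattening each map into a feature vector; the map size, feature choice and cluster count are assumptions.

```python
# Hypothetical sketch: clustering die maps built from fail-bit counts per sub-area.
# Each die is an 8x8 grid of fail-bit counts; maps are flattened and clustered.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n_dies, grid = 300, (8, 8)

# Synthetic die maps: mostly low fail counts, with an "edge failure" pattern in some dies
die_maps = rng.poisson(1.0, size=(n_dies, *grid)).astype(float)
die_maps[:60, :, -1] += rng.poisson(8.0, size=(60, grid[0]))   # edge-heavy failures

features = die_maps.reshape(n_dies, -1)          # flatten each map to a feature vector
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Inspect each cluster's size and mean fail level to characterize its pattern
for c in range(3):
    print(f"cluster {c}: {np.sum(labels == c)} dies, mean fails {features[labels == c].mean():.2f}")
```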

Keywords: die-map clustering, feature extraction, pattern recognition, semiconductor manufacturing process

Procedia PDF Downloads 380
24036 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data are the fundamental tool used by exploration companies to identify potential hydrocarbons. However, the value of seismic trace data will be undermined unless the geospatial component of the data is understood. Deriving a proposed well to be drilled from data that has positional ambiguity jeopardizes business decisions and millions of dollars of investment that every oil and gas company would like to avoid. A spatial integrity QC workflow has been introduced in PETRONAS to ensure that positional errors within seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data are referenced to the appropriate coordinate reference system, survey configuration validation, and geometry loading verification. The direct outcome of the workflow implementation helps improve the reliability and integrity of the sub-surface geological models produced by geoscientists and provides important input to potential hazard assessment, where positional accuracy is crucial. This workflow development initiative is part of a larger geospatial integrity management effort, whereby nearly eighty percent of oil and gas data are location-dependent.

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 200
24035 Single-Cell Visualization with Minimum Volume Embedding

Authors: Zhenqiu Liu

Abstract:

Visualizing the heterogeneity within cell populations for single-cell RNA-seq data is crucial for studying the functional diversity of cells. However, because of the high levels of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low dimension. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, it is computationally inefficient to solve a semi-definite program (SDP) when the sample size is large. Therefore, it is not applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm with an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations more accurately in single-cell data sets than other existing dimension reduction methods.

Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method

Procedia PDF Downloads 209
24034 Cloud Data Security Using Map/Reduce Implementation of Secret Sharing Schemes

Authors: Sara Ibn El Ahrache, Tajje-eddine Rachidi, Hassan Badir, Abderrahmane Sbihi

Abstract:

Recently, there has been increasing confidence in the beneficial use of big data drawn from the huge amount of information deposited in cloud computing systems. Data kept on such systems can be retrieved through the network at the user's convenience. However, the data that users send include private information, and therefore information leakage from these data is now a major social problem. The use of secret sharing schemes for cloud computing has lately been shown to be relevant: users deal out their data to several servers. Notably, in a (k,n) threshold scheme, data security is assured if and only if, throughout the whole life of the secret, the opponent cannot compromise k or more of the n servers. A number of secret sharing algorithms have been suggested to deal with these security issues. In this paper, we present a MapReduce implementation of Shamir's secret sharing scheme to increase its performance and to achieve optimal security for cloud data. Different tests were run, demonstrating the contributions of the proposed approach. These contributions are considerable in terms of both security and performance.
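
As an illustrative sketch of the underlying primitive (not the paper's MapReduce implementation), Shamir's (k,n) threshold scheme over a prime field, splitting a secret into n shares of which any k reconstruct it:

```python
# Hypothetical sketch of Shamir's (k, n) secret sharing over a prime field.
# split() evaluates a random degree-(k-1) polynomial with the secret as constant term;
# reconstruct() uses Lagrange interpolation at x = 0 on any k shares.
import secrets

PRIME = 2**127 - 1  # a Mersenne prime; secrets must be smaller than this modulus

def split(secret: int, k: int, n: int):
    coeffs = [secret % PRIME] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):          # Horner's rule
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * xj) % PRIME
                den = (den * (xj - xi)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
```

In a MapReduce setting of the kind the abstract describes, the map stage would typically compute shares per data block and the reduce stage would gather any k shares for reconstruction; the sketch above only shows the share arithmetic itself.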

Keywords: cloud computing, data security, Mapreduce, Shamir's secret sharing

Procedia PDF Downloads 280
24033 Science of Social Work: Recognizing Its Existence as a Scientific Discipline by a Method Triangulation

Authors: Sandra Mendes

Abstract:

Social Work has, over time, encountered a wide range of demands in its field of action, supplying frameworks of knowledge and praxis. Over the years, we have observed a transformation of society and, consequently, of the public that social work practitioners deal with. Both training and the profession have had to adapt and readapt their ways of working, tying theory to action, while action in turn gives rise to new theories. The theoretical questioning of this subject draws on classical authors from the Social Sciences and contemporary authors of Social Work. In fact, both groups attribute to Social Work an integration and social cohesion function, creating a culture of action and theory, and assign its method a relevant role as a promoter of social change in various dimensions of individual and collective life, as well as of scientific knowledge. On the other hand, it is assumed that Social Work, through its professionalism and through the academy, is now closer to distinguishing itself from the other Social Sciences as an autonomous scientific field, while remaining at the center of power struggles. This paper seeks to fill the gap in the Social Work literature concerning the study of the scientific field of this area of knowledge.

Keywords: field theory, knowledge, science, social work

Procedia PDF Downloads 326
24032 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills

Authors: Kyle De Freitas, Margaret Bernard

Abstract:

Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have little technical data mining skill and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that allows future researchers to focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy for enabling analysis through either the use of predefined questions or a guided data mining process, and shows how the developed questions and the analysis conducted can be reused and extended over time.

Keywords: educational data mining, learning management system, learning analytics, EDM framework

Procedia PDF Downloads 305
24031 Using Audit Tools to Maintain Data Quality for ACC/NCDR PCI Registry Abstraction

Authors: Vikrum Malhotra, Manpreet Kaur, Ayesha Ghotto

Abstract:

Background: Cardiac registries such as the ACC Percutaneous Coronary Intervention (PCI) Registry require high-quality data to be abstracted, including data elements such as nuclear cardiology, diagnostic coronary angiography, and PCI. Introduction: The audit tool created is used by data abstractors to provide data audits and to assess the accuracy and inter-rater reliability of the abstraction performed for a health system. This audit tool solution has been developed across 13 registries, including the ACC/NCDR registries, PCI, STS, and Get with the Guidelines. Methodology: The data audit tool was used to audit internal registry abstraction for all data elements, including stress test performed, type of stress test, date of stress test, results of stress test, risk/extent of ischemia, diagnostic catheterization detail, and PCI data elements for the ACC/NCDR PCI registry. It is being used internally across 20 hospital systems, providing abstraction and audit services for them. Results: The data audit tool showed data accuracy and inter-rater reliability (IRR) scores greater than 95% for the PCI registry in 50 registry cases in 2021. Conclusion: The tool is used internally for surgical societies and across hospital systems. It enables the abstractor to be assessed by an external abstractor and includes all of the data dictionary fields for each registry.

Keywords: abstraction, cardiac registry, cardiovascular registry, registry, data

Procedia PDF Downloads 84
24030 Artificial Intelligence Based Comparative Analysis for Supplier Selection in Multi-Echelon Automotive Supply Chains via GEP and ANN Models

Authors: Seyed Esmail Seyedi Bariran, Laysheng Ewe, Amy Ling

Abstract:

Since supplier selection is a vital decision, selecting suppliers in the best and most accurate way is of great importance for enterprises. In this study, a new artificial intelligence approach is applied to address the weaknesses of supplier selection. The paper has three parts. The first part is choosing the appropriate criteria for assessing the suppliers' performance. The second is collecting the data set based on expert judgment. The data set is then divided into two parts, a training data set and a testing data set. The training data set is used to select the best structures of GEP and ANN, and the testing data set is used to evaluate the power of these methods. The results obtained show that the accuracy of GEP is higher than that of ANN. Moreover, unlike ANN, GEP provides an explicit mathematical equation for supplier selection.
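
As a hedged sketch of the ANN side of such a comparison (the GEP model is not shown, and scikit-learn has no GEP implementation), training an MLP on a synthetic supplier data set with a train/test split; the criteria and target are hypothetical.

```python
# Hypothetical sketch: ANN benchmark for supplier performance scoring on synthetic data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
n = 300
# Illustrative criteria: price competitiveness, delivery reliability, quality score
X = rng.uniform(0, 1, size=(n, 3))
y = 0.5 * X[:, 1] + 0.3 * X[:, 2] - 0.2 * X[:, 0] + rng.normal(scale=0.05, size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)
print("Held-out R^2:", round(ann.score(X_test, y_test), 3))
```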

Keywords: supplier selection, automotive supply chains, ANN, GEP

Procedia PDF Downloads 606
24029 Molecular Topology and TLC Retention Behaviour of s-Triazines: QSRR Study

Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević

Abstract:

Quantitative structure-retention relationship (QSRR) analysis was used to predict the chromatographic behaviour of s-triazine derivatives using theoretical descriptors computed from the chemical structure. The fundamental aim of the reported investigation is to relate molecular topological descriptors to the chromatographic behaviour of s-triazine derivatives obtained by reversed-phase (RP) thin layer chromatography (TLC) on silica gel impregnated with paraffin oil, using ethanol-water mobile phases (φ = 0.5-0.8; v/v). The retention parameter (RM0) of the 14 investigated s-triazine derivatives was used as the dependent variable, while simple connectivity indices of different orders were used as independent variables. The best QSRR model for predicting the RM0 value was obtained with the simple third-order connectivity index (3χ) in a second-degree polynomial equation. The numerical values of the correlation coefficient (r=0.915), Fisher's value (F=28.34) and root mean square error (RMSE=0.36) indicate that the model is statistically significant. In order to test the predictive power of the QSRR model, the leave-one-out cross-validation technique was applied. The parameters of the internal cross-validation analysis (r2CV=0.79, r2adj=0.81, PRESS=1.89) reflect the high predictive ability of the generated model and confirm that it can be used to predict the RM0 value. A multivariate classification technique, hierarchical cluster analysis (HCA), was applied in order to group the molecules according to their molecular connectivity indices. HCA is a descriptive statistical method and is one of the most frequently used techniques for the important data-processing task of classification. The HCA performed on the simple molecular connectivity indices obtained from the 2D structures of the investigated s-triazine compounds resulted in two main clusters, in which the compounds were grouped according to the number of atoms in the molecule. This is in agreement with the fact that these descriptors were calculated on the basis of the number of atoms in the molecules of the investigated s-triazine derivatives.
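
As a hedged illustration of the modelling step (not the paper's data), fitting a second-degree polynomial of a single descriptor and checking it with leave-one-out cross-validation:

```python
# Hypothetical sketch: second-degree polynomial QSRR model RM0 = f(3-chi) with
# leave-one-out cross-validation (synthetic descriptor and retention values).
import numpy as np

rng = np.random.default_rng(9)
chi3 = rng.uniform(1.0, 4.0, 14)                       # third-order connectivity index
rm0 = 0.4 * chi3**2 - 1.0 * chi3 + 2.0 + rng.normal(scale=0.3, size=14)

# Full fit and correlation between observed and fitted values
coeffs = np.polyfit(chi3, rm0, deg=2)
pred_full = np.polyval(coeffs, chi3)
r = np.corrcoef(rm0, pred_full)[0, 1]

# Leave-one-out cross-validation: refit without each point, accumulate PRESS
press = 0.0
for i in range(len(chi3)):
    mask = np.arange(len(chi3)) != i
    c = np.polyfit(chi3[mask], rm0[mask], deg=2)
    press += (rm0[i] - np.polyval(c, chi3[i]))**2
r2_cv = 1 - press / np.sum((rm0 - rm0.mean())**2)

print(f"r={r:.3f}, PRESS={press:.2f}, r2_CV={r2_cv:.2f}")
```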

Keywords: s-triazines, QSRR, chemometrics, chromatography, molecular descriptors

Procedia PDF Downloads 371
24028 Increasing the Apparent Time Resolution of Tc-99m Diethylenetriamine Pentaacetic Acid Galactosyl Human Serum Albumin Dynamic SPECT by Use of an 180-Degree Interpolation Method

Authors: Yasuyuki Takahashi, Maya Yamashita, Kyoko Saito

Abstract:

In general, dynamic SPECT data acquisition needs a few minutes for one rotation. Thus, the time-activity curve (TAC) derived from dynamic SPECT is relatively coarse. In order to effectively shorten the interval between data points, we adopted a 180-degree interpolation method. This method is already used for the reconstruction of X-ray CT data. In this study, we applied the 180-degree interpolation method to SPECT and investigated its effectiveness. To briefly describe the method: the 180-degree data from the second half of one rotation are combined with the 180-degree data from the first half of the next rotation to generate a 360-degree data set appropriate for the time halfway between the first and second rotations. In both a phantom and a patient study, the data points from the interpolated images were in good agreement with the data points tracking the accumulation of 99mTc activity over time for the appropriate regions of interest. We conclude that data derived from interpolated images improve the apparent time resolution of dynamic SPECT.
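
As a hedged sketch of the data-handling idea (not the authors' reconstruction code), interleaving half-rotations of a projection stack to create intermediate 360-degree data sets; the array shapes are illustrative only.

```python
# Hypothetical sketch: build intermediate 360-degree projection sets by combining the
# second half of rotation i with the first half of rotation i+1.
import numpy as np

n_rotations, n_angles, n_bins = 6, 60, 128          # 60 projection angles per 360 degrees
projections = np.random.default_rng(10).random((n_rotations, n_angles, n_bins))

half = n_angles // 2
interpolated = [
    # keep angular order 0-360: angles 0-180 from rotation i+1, angles 180-360 from rotation i
    np.concatenate([projections[i + 1, :half], projections[i, half:]], axis=0)
    for i in range(n_rotations - 1)
]

# Interleave originals and intermediates: the time sampling is effectively doubled
frames = []
for i in range(n_rotations - 1):
    frames.extend([projections[i], interpolated[i]])
frames.append(projections[-1])
print(len(frames), frames[0].shape)                  # 11 frames of shape (60, 128)
```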

Keywords: dynamic SPECT, time resolution, 180-degree interpolation method, 99mTc-GSA

Procedia PDF Downloads 481
24027 Prevalence of Dengue in Sickle Cell Disease in Pre-school Children

Authors: Nikhil A. Gavhane, Sachin Shah, Ishant S. Mahajan, Pawan D. Bahekar

Abstract:

Introduction: Millions of people are affected by dengue fever every year, which drives up healthcare expenses in many low-income countries. Organ failure and other serious symptoms may result. Another worldwide public health problem is sickle cell anaemia, which is most prevalent in Africa, the Caribbean, and Europe. Dengue epidemics have reportedly occurred in locations with a high frequency of sickle cell disease, compounding the health problems in these areas. Aims and Objectives: This study examines dengue infection in pre-school children with sickle cell disease. Method: This retrospective cohort study examined paediatric patients. Young people with sickle cell disease (SCD) and dengue infection, and a control group without SCD or dengue, were studied. Data on demographics, SCD consequences, medical treatments, and laboratory findings were gathered to analyse the influence of SCD on dengue severity and clinical outcomes, classified as severe or non-severe by the 2009 WHO classification. Using fever onset or admission symptoms, the study estimated the duration of the acute illness. Results: Table 1 compares the features of dengue episodes by haemoglobin genotype (SS, SC, and controls). Table 2 shows that severe dengue cases are older, have longer delays to admission, and present particular symptoms. The multivariate analysis in Table 3 indicates a strong association of the SS genotype with severe dengue, multiorgan failure, and acute pulmonary complications. Table 4 relates severe dengue to higher white blood cell counts, anaemia, elevated liver enzymes, and reduced lactate dehydrogenase. Conclusion: This study is valuable but confined to hospitalised dengue patients with sickle cell disease, and the small cohorts limit comparisons. Further study is needed, since some findings contradict expectations.

Keywords: dengue, chills, headache, severe myalgia, vomiting, nausea, prostration

Procedia PDF Downloads 50
24026 Risk Tolerance and Individual Worthiness Based on Simultaneous Analysis of the Cognitive Performance and Emotional Response to a Multivariate Situational Risk Assessment

Authors: Frederic Jumelle, Kelvin So, Didan Deng

Abstract:

A method and system for neuropsychological performance testing is presented, comprising a mobile terminal used to interact with a cloud server that stores user information and is logged into by the user through the terminal device. The user information is accessed directly through the terminal device and processed by an artificial neural network; it comprises the user's facial emotion information, performance test answers, and chronometrics. The assessment is used to evaluate the cognitive performance and emotional response of the subject to a series of dichotomous questions describing various situations of daily life and challenging the user's knowledge, values, ethics, and principles. In industrial applications, the timing of this assessment will depend on the user's need to obtain a service from a provider, such as opening a bank account, getting a mortgage or an insurance policy, authenticating clearance at work, or securing online payments.

Keywords: artificial intelligence, neurofinance, neuropsychology, risk management

Procedia PDF Downloads 116
24025 Genetic Data of Deceased People: Solving the Gordian Knot

Authors: Inigo de Miguel Beriain

Abstract:

Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response considerably to these pathologies if we could use these data. Unfortunately, at the present moment, the status of data on the deceased is far from being satisfactorily resolved by the EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue. Consequently, each EU member state offers very different solutions. For instance, Denmark considers the data as personal data of the deceased person for a set period of time while some others, such as Spain, do not consider this data as such, but have introduced some specifically focused regulations on this type of data and their access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain as personal data of his or her biological relatives. Hence, the general regime provided for in the GDPR may apply to them. As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and on the legal bases included in Article 6. This may be complicated in practice, given that, since we are dealing with data that refer to several data subjects, it may be complex to refer to some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. In this contribution, it is shown, however, that none of these objections is of sufficient substance to delegitimize the argument exposed. Therefore, the conclusion of this contribution is that we can indeed build a general framework on the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although it is true that some clarifications will be necessary for its practical application.

Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people

Procedia PDF Downloads 138
24024 Job Satisfaction and Associated Factors of Urban Health Extension Professionals in Addis Ababa City, Ethiopia

Authors: Metkel Gebremedhin, Biruk Kebede, Guash Abay

Abstract:

Job satisfaction largely determines the productivity and efficiency of human resources for health. There is scant evidence on the factors influencing the job satisfaction of health extension professionals (HEPs) in Addis Ababa. The objective of this study was to determine the level of, and the factors influencing, job satisfaction among urban health extension professionals in Addis Ababa city. This was a cross-sectional study conducted in Addis Ababa, Ethiopia. A multistage sampling technique was employed among the public health centers under the Addis Ababa city administration health bureau: the study health centers were selected randomly, and urban health extension professionals were then selected from those health centers. In-depth interviews were also carried out to gain a comprehensive understanding of the factors affecting job satisfaction among HEPs in Addis Ababa. HEPs working in the Addis Ababa area were the primary study population. Multivariate logistic regression with 95% CI at P ≤ 0.05 was used to assess the factors associated with job satisfaction. The overall satisfaction rate was only 10.7%, while 89.3% were dissatisfied with their jobs. The findings revealed that variables such as marital status, staff relations, community support, supervision, and rewards have a significant influence on the level of job satisfaction. For those who were not satisfied, the working environment, job description, low salary, poor leadership and limited training opportunities were the major causes. Other factors influencing the level of satisfaction were lack of medical equipment, lack of transport facilities, lack of training opportunities, and poor support from woreda experts. Our study documented a very low level of overall satisfaction among health extension professionals in Addis Ababa city public health centers. Considering the factors responsible for this state of affairs, urgent and concrete strategies must be developed to address the concerns of health extension professionals, as they represent a sensitive domain of the health system of Addis Ababa city. Improving the overall work environment, reviewing job descriptions and offering better salaries might bring about a positive change.

Keywords: job satisfaction, extension health professionals, Addis Ababa

Procedia PDF Downloads 57
24023 Steps towards the Development of National Health Data Standards in Developing Countries

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray

Abstract:

The health data standards that have proliferated today are somewhat overlapping and conflicting, resulting in market confusion and leading to increasing proprietary interests. The government's role in, and support of, standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps required to be undertaken by the government towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of different experts in the areas of data exchange, standards and medical informatics in Saudi Arabia and the UK, a list of steps required towards the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners to plan the development of health data standards, in particular in developing countries.

Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia

Procedia PDF Downloads 321
24022 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City

Procedia PDF Downloads 411
24021 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations have structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level, these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools, which are of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision-making and research. The following paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational models, spatial data models, and the baseline of data modeling under UML and big data, seeking to deliver an agile methodology for the development of data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for pattern generation and for models derived from the structured fact objects.

Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse

Procedia PDF Downloads 386