Search results for: weighted permutation entropy (WPE)
510 Association of Genetically Proxied Cholesterol-Lowering Drug Targets and Head and Neck Cancer Survival: A Mendelian Randomization Analysis
Authors: Danni Cheng
Abstract:
Background: Preclinical and epidemiological studies have reported potential protective effects of low-density lipoprotein cholesterol (LDL-C) lowering drugs on head and neck squamous cell cancer (HNSCC) survival, but the evidence for causality has been inconsistent. Genetic variants associated with LDL-C lowering drug targets can predict the effects of their therapeutic inhibition on disease outcomes. Objective: We aimed to evaluate the causal association of genetically proxied cholesterol-lowering drug targets and circulating lipid traits with cancer survival in HNSCC patients stratified by human papillomavirus (HPV) status, using two-sample Mendelian randomization (MR) analyses. Method: Single-nucleotide polymorphisms (SNPs) in the gene regions of LDL-C lowering drug targets (HMGCR, NPC1L1, CETP, PCSK9, and LDLR) associated with LDL-C levels in a genome-wide association study (GWAS) from the Global Lipids Genetics Consortium (GLGC) were used to proxy LDL-C lowering drug action. SNPs proxying circulating lipids (LDL-C, HDL-C, total cholesterol, triglycerides, apolipoprotein A and apolipoprotein B) were also derived from the GLGC data. Genetic associations of these SNPs with cancer survival were derived from 1,120 HPV-positive oropharyngeal squamous cell carcinoma (OPSCC) and 2,570 non-HPV-driven HNSCC patients in the VOYAGER program. We estimated the causal associations of LDL-C lowering drugs and circulating lipids with HNSCC survival using the inverse-variance weighted method. Results: Genetically proxied HMGCR inhibition was significantly associated with worse overall survival (OS) in non-HPV-driven HNSCC patients (inverse variance-weighted hazard ratio (HR IVW), 2.64 [95% CI, 1.28-5.43]; P = 0.01) but better OS in HPV-positive OPSCC patients (HR IVW, 0.11 [95% CI, 0.02-0.56]; P = 0.01). Estimates for NPC1L1 were strongly associated with worse OS in both total HNSCC (HR IVW, 4.17 [95% CI, 1.06-16.36]; P = 0.04) and non-HPV-driven HNSCC patients (HR IVW, 7.33 [95% CI, 1.63-32.97]; P = 0.01). Similarly, genetically proxied PCSK9 inhibition was significantly associated with poor OS in non-HPV-driven HNSCC (HR IVW, 1.56 [95% CI, 1.02-2.39]). Conclusion: Genetically proxied long-term HMGCR inhibition was significantly associated with decreased OS in non-HPV-driven HNSCC and increased OS in HPV-positive OPSCC, while genetically proxied NPC1L1 and PCSK9 inhibition were associated with worse OS in total and non-HPV-driven HNSCC patients. Further research is needed to understand whether these drugs have consistent associations with head and neck tumor outcomes. Keywords: Mendelian randomization analysis, head and neck cancer, cancer survival, cholesterol, statin
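The inverse-variance weighted estimator used in this abstract can be illustrated with a minimal sketch. The SNP-level summary statistics below are invented placeholders rather than GLGC or VOYAGER values, and the code shows only the standard fixed-effect IVW formula, not the authors' full MR pipeline.

```python
# Minimal sketch of an inverse-variance weighted (IVW) Mendelian randomization
# estimate, for illustration only; the SNP-level effects below are invented and
# do not come from the GLGC or VOYAGER data used in the abstract.
import numpy as np

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect IVW estimate of the causal effect (log hazard ratio here)."""
    beta_exposure = np.asarray(beta_exposure, dtype=float)
    beta_outcome = np.asarray(beta_outcome, dtype=float)
    se_outcome = np.asarray(se_outcome, dtype=float)
    weights = beta_exposure ** 2 / se_outcome ** 2   # inverse-variance weights
    wald_ratios = beta_outcome / beta_exposure       # per-SNP causal estimates
    beta_ivw = np.sum(weights * wald_ratios) / np.sum(weights)
    se_ivw = np.sqrt(1.0 / np.sum(weights))
    return beta_ivw, se_ivw

# Hypothetical SNP-level summary statistics (exposure = LDL-C, outcome = log HR of OS)
beta_x = [0.12, 0.08, 0.15]   # SNP effects on LDL-C
beta_y = [0.10, 0.05, 0.14]   # SNP effects on survival (log hazard ratio)
se_y = [0.04, 0.03, 0.05]     # standard errors of the outcome effects

b, se = ivw_estimate(beta_x, beta_y, se_y)
print(f"HR_IVW = {np.exp(b):.2f}, 95% CI = [{np.exp(b - 1.96*se):.2f}, {np.exp(b + 1.96*se):.2f}]")
```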
Procedia PDF Downloads 101
509 Wildlife Habitat Corridor Mapping in Urban Environments: A GIS-Based Approach Using Preliminary Category Weightings
Authors: Stefan Peters, Phillip Roetman
Abstract:
The global loss of biodiversity is threatening the benefits nature provides to human populations; it has become an even more pressing issue than climate change and requires immediate attention. While there have been successful global agreements for environmental protection, such as the Montreal Protocol, these are rare, and we cannot rely on them solely. Thus, it is crucial to take national and local actions to support biodiversity. Australia is one of the 17 countries in the world with a high level of biodiversity, and its cities are vital habitats for endangered species, with more of them found in urban areas than in non-urban ones. However, the protection of biodiversity in metropolitan Adelaide has been inadequate, with over 130 species disappearing since European colonization in 1836. In this research project, we conceptualized, developed and implemented a framework for wildlife Habitat Hotspot and Habitat Corridor modelling in an urban context using geographic data and GIS modelling and analysis. We used detailed topographic and other geographic data provided by a local council, including spatial and attributive properties of trees, parcels, water features, vegetated areas, roads, verges, traffic, and census data. Weighted factors considered in our raster-based Habitat Hotspot model include parcel size, parcel shape, population density, canopy cover, habitat quality and proximity to habitats and water features. Weighted factors considered in our raster-based Habitat Corridor model include habitat potential (resulting from the Habitat Hotspot model), verge size, road hierarchy, road widths, human density, and presence of remnant indigenous vegetation species. We developed a GIS model, using Python scripting and ArcGIS Pro ModelBuilder, to establish an automated, reproducible and adjustable geoprocessing workflow, adaptable to any study area of interest. Our habitat hotspot and corridor modelling framework allows us to determine and map existing habitat hotspots and wildlife habitat corridors. Our research was applied to the case study of Burnside, a local council in Adelaide, Australia, which encompasses an area of 30 km2. We applied end-user expertise-based category weightings to refine our models and optimize the use of our habitat map outputs towards informing local strategic decision-making. Keywords: biodiversity, GIS modeling, habitat hotspot, wildlife corridor
Procedia PDF Downloads 116
508 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications
Authors: K. P. Sandesh, M. H. Suman
Abstract:
Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures. Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms
Procedia PDF Downloads 518
507 Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC) within Operational Research (OR) with Sustainability and Phenomenology
Authors: Hussain Abdullah Al-Salamin, Elias Ogutu Azariah Tembe
Abstract:
Supply chain (SC) is an operational research (OR) approach and technique which acts as a catalyst within the central nervous system of business today. Without SC, any type of business is in the doldrums and hence tends toward entropy. SC is the lifeblood of business today because it is the pivotal hub which provides an imperative competitive advantage. The paper presents a conceptual framework dubbed the Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC). The term homomorphic is derived from the abstract-algebraic term homomorphism (same shape), which also embeds the following related mathematical notions: monomorphism, isomorphism, automorphism, and endomorphism. The HCEFSC is intertwined and integrated with wide and broad sets of elements. Keywords: homomorphism, isomorphism, monomorphisms, automorphisms, epimorphisms, endomorphism, supply chain, operational research (OR)
Procedia PDF Downloads 373
506 Finding Viable Pollution Routes in an Urban Network under a Predefined Cost
Authors: Dimitra Alexiou, Stefanos Katsavounis, Ria Kalfakakou
Abstract:
In an urban area, transportation routes should be planned so as to minimize the pollution they cause while taking into account their cost. In the sequel, these routes are referred to as pollution routes. The transportation network is expressed by a weighted graph G = (V, E, D, P), where every vertex represents a location to be served and E contains unordered pairs (edges) of elements of V that indicate a simple road. A distance/cost and a weight depicting the air pollution caused by a vehicle traversing the road are assigned to each edge; these are the items of the sets D and P, respectively. Furthermore, the investigated pollution routes must not exceed predefined values for the route cost and the route pollution level during the vehicle transition. In this paper, we present an algorithm that generates such routes so that the decision maker can select the most appropriate one. Keywords: bi-criteria, pollution, shortest paths, computation
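A minimal sketch of one way to search for such routes is given below: a label-setting search that minimizes accumulated pollution subject to a cost budget, with simple dominance pruning. The road network, edge costs, pollution weights and budget are hypothetical, and this is a generic bi-criteria shortest-path technique rather than the authors' specific algorithm.

```python
# A minimal sketch of finding a minimum-pollution route whose total cost does not
# exceed a predefined budget, on a weighted graph G = (V, E, D, P) as defined in the
# abstract. This label-setting approach with dominance pruning is a standard way to
# handle the bi-criteria constraint; it is not necessarily the authors' algorithm.
import heapq

def min_pollution_route(edges, source, target, cost_budget):
    """edges: dict mapping node -> list of (neighbor, cost, pollution)."""
    # Each label is (pollution, cost, node, path); labels are expanded in order of pollution.
    heap = [(0.0, 0.0, source, [source])]
    best = {}  # node -> list of non-dominated (pollution, cost) labels seen so far
    while heap:
        pollution, cost, node, path = heapq.heappop(heap)
        if node == target:
            return pollution, cost, path
        # Dominance check: skip if an already-seen label is better in both criteria.
        if any(p <= pollution and c <= cost for p, c in best.get(node, [])):
            continue
        best.setdefault(node, []).append((pollution, cost))
        for neighbor, edge_cost, edge_pollution in edges.get(node, []):
            new_cost = cost + edge_cost
            if new_cost <= cost_budget and neighbor not in path:
                heapq.heappush(heap, (pollution + edge_pollution, new_cost,
                                      neighbor, path + [neighbor]))
    return None  # no feasible route within the cost budget

# Hypothetical road network: node -> [(neighbor, distance/cost, pollution weight)]
network = {
    "A": [("B", 2.0, 5.0), ("C", 4.0, 1.0)],
    "B": [("D", 2.0, 5.0)],
    "C": [("D", 3.0, 1.0)],
}
print(min_pollution_route(network, "A", "D", cost_budget=8.0))
```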
Procedia PDF Downloads 375
505 Currency Exchange Rate Forecasts Using Quantile Regression
Authors: Yuzhi Cai
Abstract:
In this paper, we discuss a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. Together with a forecast-combining technique, we then predict USD to GBP currency exchange rates. Combined forecasts contain all the information captured by the fitted QAR models at different quantile levels and are therefore better than those obtained from individual models. Our results show that an unequally weighted combining method performs better than other forecasting methodologies. We found that a median AR model can perform well in point forecasting when the predictive density functions are symmetric. However, in practice, using the median AR model alone may involve the loss of information about the data captured by other QAR models. We recommend that combined forecasts should be used whenever possible. Keywords: combining forecasts, MCMC, predictive density functions, quantile forecasting, quantile modelling
Procedia PDF Downloads 256
504 Modeling Driving Distraction Considering Psychological-Physical Constraints
Authors: Yixin Zhu, Lishengsa Yue, Jian Sun, Lanyue Tang
Abstract:
Modeling driving distraction in microscopic traffic simulation is crucial for enhancing simulation accuracy. Current driving distraction models are mainly derived from physical motion constraints under distracted states, in which distraction-related error terms are added to existing microscopic driver models. However, the model accuracy is not very satisfying, due to a lack of modeling of the cognitive mechanism underlying the distraction. This study models driving distraction based on the Queueing Network-Model Human Processor (QN-MHP). The study utilizes the queueing structure of the model to perform task invocation and switching for distracted operation and control of the vehicle under driver distraction. Based on the assumption of the QN-MHP model about the cognitive sub-network, server F is a structural bottleneck: later information must wait for earlier information to leave server F before it can be processed there. Therefore, the waiting time for task switching needs to be calculated. Since the QN-MHP model has different information-processing paths for auditory and visual information, this study divides driving distraction into two types: auditory distraction and visual distraction. For visual distraction, both the visual distraction task and the driving task need to go through the visual perception sub-network, and their stimuli are asynchronous, which is called stimulus onset asynchrony (SOA); this must be considered when calculating the waiting time for task switching. In the case of auditory distraction, the auditory distraction task and the driving task do not need to compete for the server resources of the perceptual sub-network, and their stimuli can be treated as synchronized without considering the time difference in receiving the stimuli. Following the Theory of Planned Behavior (TPB) for drivers, this study uses risk entropy as the decision criterion for driver task switching. A logistic regression model with risk entropy as the independent variable is used to determine whether the driver performs a distraction task and to explain the relationship between perceived risk and distraction. Furthermore, to model a driver’s perception characteristics, a neurophysiological model of visual distraction tasks is incorporated into the QN-MHP, and the framework executes the classical Intelligent Driver Model (IDM). The proposed driving distraction model integrates the psychological cognitive process of a driver with the physical motion characteristics, resulting in both high accuracy and interpretability. This paper uses 773 segments of distracted car-following from the Shanghai Naturalistic Driving Study (SH-NDS) data to classify the patterns of distracted behavior on different road facilities and obtains three types of distraction patterns: numbness, delay, and aggressiveness. The model was calibrated and verified by simulation. The results indicate that the model can effectively simulate distracted car-following behavior of different patterns on various roadway facilities, and its performance is better than the traditional IDM with distraction-related error terms. The proposed model overcomes the limitations of physical-constraints-based models in replicating dangerous driving behaviors and the internal characteristics of an individual.
Moreover, the model is demonstrated to effectively generate more dangerous distracted driving scenarios, which can be used to construct high-value automated driving test scenarios. Keywords: computational cognitive model, driving distraction, microscopic traffic simulation, psychological-physical constraints
Procedia PDF Downloads 92
503 Robust Noisy Speech Identification Using Frame Classifier Derived Features
Authors: Punnoose A. K.
Abstract:
This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean speech and noisy speech at the frame level. Appropriate density functions are used to fit the softmax probability of the clean and noisy speech. A function that takes into account the ratio of the softmax probability density of noisy speech to that of clean speech is formulated. This phoneme-independent scoring is weighted using a phoneme-specific weightage to make the scoring more robust. Simple thresholding is used to distinguish noisy speech recordings from clean speech recordings. The approach is benchmarked on standard databases, with a focus on precision. Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering
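A hedged sketch of the scoring idea described above is shown below. The beta-distribution fits of the softmax posteriors, the phoneme-specific weights and the decision threshold are all assumptions made for illustration; the paper's actual density functions and weightage scheme are not reproduced.

```python
# Illustrative sketch of the frame-level scoring idea in the abstract: compare the
# fitted density of MLP softmax posteriors under noisy and clean conditions, weight
# the log-ratio by a phoneme-specific weight, and threshold the aggregate score.
# The beta-distribution fits, weights, and threshold below are assumptions.
import numpy as np
from scipy.stats import beta

# Hypothetical per-phoneme densities of the winning softmax probability,
# fitted separately on clean and noisy development data.
clean_fit = {"ah": beta(8, 2), "s": beta(9, 2)}
noisy_fit = {"ah": beta(3, 3), "s": beta(4, 3)}
phoneme_weight = {"ah": 1.0, "s": 1.5}      # phoneme-specific weightage

def recording_score(frames):
    """frames: list of (predicted_phoneme, max_softmax_probability) per frame."""
    score = 0.0
    for phoneme, p in frames:
        ratio = noisy_fit[phoneme].pdf(p) / max(clean_fit[phoneme].pdf(p), 1e-12)
        score += phoneme_weight[phoneme] * np.log(ratio)
    return score / len(frames)

frames = [("ah", 0.55), ("s", 0.50), ("ah", 0.60)]
THRESHOLD = 0.0                              # assumed decision threshold
print("noisy" if recording_score(frames) > THRESHOLD else "clean")
```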
Procedia PDF Downloads 128
502 Turkish Validation of the Nursing Outcomes for Urinary Incontinence and Their Sensitivities on Nursing Interventions
Authors: Dercan Gencbas, Hatice Bebis, Sue Moorhead
Abstract:
In the nursing process, many nursing classification systems have been created for international use; among them are NANDA-I, the Nursing Outcomes Classification (NOC), and the Nursing Interventions Classification (NIC). In this direction, the main objective of this study is to establish a model for caregivers in hospitals and communities in Turkey and to ensure that nursing outcomes are assessed by NOC-based measures. Although many scales exist to measure Urinary Incontinence (UI), which is very common in children, in old age, and after vaginal birth, NOC scales are ideal for use in the nursing process for a comprehensive and holistic assessment. For this reason, the purpose of this study is to evaluate the validity of the NOC outcomes and indicators used for the UI-related NANDA-I diagnoses. This research is a methodological study. In addition to the validity of the scale indicators, experts assessed how much each indicator would contribute to recovery after the nursing intervention. Content validity was applied and calculated according to Fehring's (1987) model. Accordingly, expert inclusion criteria and scores were determined: at least four years of clinical experience scored 4 points; at least one year of experience with a nursing classification system scored 1 point; publication experience on nursing classification scored 1 point; a doctoral degree in nursing scored 2 points; and a master's degree scored 1 point. According to this expert scoring, a total of 55 experts were rated at the "senior degree" level, with a score of 90. The experts were asked to what extent the indicators would contribute to recovery when the nursing interventions were applied. For content validity tailored to Fehring's model, the specialists were asked to score each NOC outcome and indicator between 1 and 5, from 1 = not important to 5 = very important. After the expert opinions were obtained, the weighted scores for each NOC outcome and indicator were classified as critical (0.8 or above), supplemental (between 0.5 and 0.8), or excluded (below 0.5). In the NANDA-I/NOC/NIC system (guideline), five NOC outcomes are proposed for the nursing diagnoses related to UI. These outcomes are: Urinary Continence, Urinary Elimination, Tissue Integrity, Self-Care: Toileting, and Medication Response. After the scales were translated into Turkish, the weighted averages of the expert scores for the content validity of all five NOC outcomes and for the contribution of nursing interventions exceeded 0.8. Based on the expert opinions, 79 of the 82 indicators were classified as critical and 3 as supplemental; since no indicator scored below 0.5, none were removed. All NOC outcomes were identified as valid and usable scales in Turkey. In this study, five NOC outcomes were verified for evaluating the outcomes of individuals receiving nursing care for UI and its variant types. Nurses in Turkey can benefit from these NOC outcome scales in providing care for elderly patients with incontinence. Keywords: nursing outcomes, content validity, nursing diagnosis, urinary incontinence
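The Fehring-style weighting described above can be sketched as follows; the expert ratings in the example are invented, and the 5-to-1 rating-to-weight mapping is the commonly used Fehring scheme, assumed here rather than taken from the paper.

```python
# A small sketch of the Fehring-style weighting used in the abstract: expert ratings
# on a 1-5 scale are converted to weights (5 -> 1.0, 4 -> 0.75, ..., 1 -> 0.0), averaged,
# and the indicator is classified as critical (>= 0.8), supplemental (0.5-0.8) or
# excluded (< 0.5). The example ratings are invented for illustration.
def fehring_weighted_ratio(ratings):
    """ratings: list of expert ratings on a 1-5 scale for one NOC indicator."""
    weights = [(r - 1) / 4.0 for r in ratings]   # map 1..5 onto 0..1
    return sum(weights) / len(weights)

def classify(ratio):
    if ratio >= 0.8:
        return "critical"
    if ratio >= 0.5:
        return "supplemental"
    return "excluded"

expert_ratings = [5, 5, 4, 5, 4, 5]              # hypothetical ratings for one indicator
ratio = fehring_weighted_ratio(expert_ratings)
print(f"weighted ratio = {ratio:.2f} -> {classify(ratio)}")
```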
Procedia PDF Downloads 125
501 The Analysis of Different Classes of Weighted Fuzzy Petri Nets and Their Features
Authors: Yurii Bloshko, Oksana Olar
Abstract:
This paper presents an analysis of six different classes of Petri nets: fuzzy Petri nets (FPN), generalized fuzzy Petri nets (GFPN), parameterized fuzzy Petri nets (PFPN), T2GFPN, flexible generalized fuzzy Petri nets (FGFPN), and binary Petri nets (BPN). These classes were simulated in the dedicated software PNeS® to analyze their pros and cons, using example models dedicated to the decision-making process of passenger transport logistics. The paper includes the analysis of two approaches: one in which the input values are filled with the experts’ knowledge, and one in which fuzzy expectations, represented by the output values, are added as well. These approaches exploit the possibilities of triples of functions, which are replaced with different combinations of t-norms and s-norms. Keywords: fuzzy petri net, intelligent computational techniques, knowledge representation, triangular norms
Procedia PDF Downloads 143
500 Association Between Swallowing Disorders and Cognitive Disorders in Adults: Systematic Review and Meta-Analysis
Authors: Shiva Ebrahimian Dehaghani, Afsaneh Doosti, Morteza Zare
Abstract:
Background: There is no consensus regarding the association between dysphagia and cognition. Purpose: The aim of this study was to quantitatively and qualitatively analyze the available evidence on the direction and strength of the association between dysphagia and cognition. Methodology: PubMed, Scopus, Embase, and Web of Science were searched for studies on the association between dysphagia and cognition. A random-effects model was used to determine weighted odds ratios (OR) and 95% confidence intervals (CI). Sensitivity analysis was performed to determine the impact of each individual study on the pooled results. Results: Data from a total of 1,427 participants showed that some cognitive disorders were significantly associated with dysphagia (OR = 3.23; 95% CI, 2.33-4.48). Conclusion: The association between cognition and swallowing disorders suggests that multiple neuroanatomical systems are involved in these two functions. Keywords: adult, association, cognitive impairment, dysphagia, systematic review
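A minimal sketch of the random-effects pooling summarized above is given below, using the DerSimonian-Laird estimator as a stand-in for the authors' exact procedure; the study-level odds ratios and confidence intervals are invented for illustration.

```python
# A minimal sketch of an inverse-variance random-effects pooling of odds ratios
# (DerSimonian-Laird), of the kind summarised in the abstract. The per-study
# odds ratios and confidence intervals below are invented for illustration.
import numpy as np

def pooled_or_random_effects(ors, ci_lowers, ci_uppers):
    log_or = np.log(ors)
    se = (np.log(ci_uppers) - np.log(ci_lowers)) / (2 * 1.96)   # SE recovered from 95% CI
    w = 1.0 / se**2                                             # fixed-effect weights
    fixed = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - fixed) ** 2)                       # Cochran's Q
    df = len(ors) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                               # DerSimonian-Laird tau^2
    w_star = 1.0 / (se**2 + tau2)                               # random-effects weights
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    pooled_se = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(pooled), np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)

# Hypothetical study-level results (OR, lower 95% CI, upper 95% CI)
ors = np.array([2.8, 3.9, 2.5])
low = np.array([1.6, 2.1, 1.2])
high = np.array([4.9, 7.2, 5.2])
print(pooled_or_random_effects(ors, low, high))
```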
Procedia PDF Downloads 161
499 Beyond Geometry: The Importance of Surface Properties in Space Syntax Research
Authors: Christoph Opperer
Abstract:
Space syntax is a theory and method for analyzing the spatial layout of buildings and urban environments to understand how they can influence patterns of human movement, social interaction, and behavior. While direct visibility is a key factor in space syntax research, important visual information such as light, color, texture, etc., are typically not considered, even though psychological studies have shown a strong correlation to the human perceptual experience within physical space – with light and color, for example, playing a crucial role in shaping the perception of spaciousness. Furthermore, these surface properties are often the visual features that are most salient and responsible for drawing attention to certain elements within the environment. This paper explores the potential of integrating these factors into general space syntax methods and visibility-based analysis of space, particularly for architectural spatial layouts. To this end, we use a combination of geometric (isovist) and topological (visibility graph) approaches together with image-based methods, allowing a comprehensive exploration of the relationship between spatial geometry, visual aesthetics, and human experience. Custom-coded ray-tracing techniques are employed to generate spherical panorama images, encoding three-dimensional spatial data in the form of two-dimensional images. These images are then processed through computer vision algorithms to generate saliency-maps, which serve as a visual representation of areas most likely to attract human attention based on their visual properties. The maps are subsequently used to weight the vertices of isovists and the visibility graph, placing greater emphasis on areas with high saliency. Compared to traditional methods, our weighted visibility analysis introduces an additional layer of information density by assigning different weights or importance levels to various aspects within the field of view. This extends general space syntax measures to provide a more nuanced understanding of visibility patterns that better reflect the dynamics of human attention and perception. Furthermore, by drawing parallels to traditional isovist and VGA analysis, our weighted approach emphasizes a crucial distinction, which has been pointed out by Ervin and Steinitz: the difference between what is possible to see and what is likely to be seen. Therefore, this paper emphasizes the importance of including surface properties in visibility-based analysis to gain deeper insights into how people interact with their surroundings and to establish a stronger connection with human attention and perception.Keywords: space syntax, visibility analysis, isovist, visibility graph, visual features, human perception, saliency detection, raytracing, spherical images
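A minimal sketch of the vertex-weighting idea is shown below: visibility-graph nodes carry saliency values, and a connectivity-style measure sums the saliency of visible cells instead of counting them equally. The graph and saliency values are made up; in the paper they come from isovist/VGA computation and a computer-vision saliency model applied to spherical panoramas.

```python
# A minimal sketch of weighting visibility-graph vertices by saliency, in the spirit
# of the approach described above. The visibility edges and saliency values are
# hypothetical; in the paper they come from ray-traced panoramas and saliency maps.
import networkx as nx

# Hypothetical visibility graph: nodes are grid cells, edges mean mutual visibility.
G = nx.Graph()
G.add_edges_from([("c1", "c2"), ("c1", "c3"), ("c2", "c3"), ("c3", "c4")])

# Hypothetical per-cell saliency in [0, 1] taken from a saliency map.
saliency = {"c1": 0.9, "c2": 0.2, "c3": 0.6, "c4": 0.1}
nx.set_node_attributes(G, saliency, name="saliency")

# Saliency-weighted "visual connectivity": instead of counting visible cells equally,
# sum the saliency of the cells visible from each location.
weighted_connectivity = {
    node: sum(G.nodes[nbr]["saliency"] for nbr in G.neighbors(node))
    for node in G.nodes
}
print(weighted_connectivity)
```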
Procedia PDF Downloads 77
498 A Survey on Lossless Compression of Bayer Color Filter Array Images
Authors: Alina Trifan, António J. R. Neves
Abstract:
Although most digital cameras acquire images in a raw format, based on a Color Filter Array that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient; by performing lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power-efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors that increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage allows an improvement of more than 15% in prediction-based methods. Keywords: bayer image, CFA, lossless compression, image coding standards
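The channel-splitting pre-processing step described above can be sketched as follows, assuming an RGGB Bayer layout; actual cameras may use a different CFA arrangement, and the compression step itself is omitted.

```python
# A minimal sketch of the pre-processing step described above: splitting a raw Bayer
# mosaic into per-colour images before lossless compression. An RGGB layout is assumed
# here; actual cameras may use a different CFA arrangement.
import numpy as np

def split_bayer_rggb(raw):
    """raw: 2-D array holding the Bayer mosaic (repeating 2x2 RGGB pattern)."""
    r = raw[0::2, 0::2]                                             # red samples
    g = np.concatenate((raw[0::2, 1::2], raw[1::2, 0::2]), axis=0)  # both green planes
    b = raw[1::2, 1::2]                                             # blue samples
    return r, g, b

# Tiny synthetic mosaic for illustration
raw = np.arange(16, dtype=np.uint16).reshape(4, 4)
r, g, b = split_bayer_rggb(raw)
print(r.shape, g.shape, b.shape)   # each channel is smoother, lowering its entropy
```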
Procedia PDF Downloads 322
497 Biologically Inspired Small Infrared Target Detection Using Local Contrast Mechanisms
Authors: Tian Xia, Yuan Yan Tang
Abstract:
In order to obtain higher small-target detection accuracy, this paper presents an effective algorithm inspired by the local contrast mechanism. The proposed method can enhance the target signal and suppress background clutter simultaneously. In the first stage, an enhanced image is obtained using the proposed Weighted Laplacian of Gaussian. In the second stage, an adaptive threshold is adopted to segment the target. Experimental results on two challenging image sequences show that the proposed method can detect bright and dark targets simultaneously and is not sensitive to the sea-sky line of the infrared image, so it is well suited for small infrared target detection. Keywords: small target detection, local contrast, human vision system, Laplacian of Gaussian
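A hedged sketch of the two-stage pipeline is given below, with a plain Laplacian-of-Gaussian filter and a mean-plus-k-sigma threshold standing in for the paper's Weighted Laplacian of Gaussian and its adaptive threshold; the weighting scheme itself is not reproduced, and the synthetic frame contains only a bright target.

```python
# A minimal sketch of the two-stage pipeline described above: a Laplacian-of-Gaussian
# enhancement followed by an adaptive threshold. The weighting scheme of the paper's
# "Weighted Laplacian of Gaussian" is not reproduced here; a plain LoG and a
# mean-plus-k-sigma threshold are used as stand-ins.
import numpy as np
from scipy import ndimage

def detect_small_targets(image, sigma=1.5, k=4.0):
    # Stage 1: enhance blob-like small targets and suppress smooth background clutter.
    enhanced = -ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
    # Stage 2: adaptive threshold based on the statistics of the enhanced image.
    threshold = enhanced.mean() + k * enhanced.std()
    return enhanced > threshold

# Synthetic infrared-like frame: smooth background plus one bright point target.
frame = np.random.normal(100.0, 2.0, size=(64, 64))
frame[32, 32] += 40.0
mask = detect_small_targets(frame)
print(np.argwhere(mask))   # should report a detection near (32, 32)
```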
Procedia PDF Downloads 469
496 Performance Study of Cascade Refrigeration System Using Alternative Refrigerants
Authors: Gulshan Sachdeva, Vaibhav Jain, S. S. Kachhwaha
Abstract:
Cascade refrigeration systems employ a series of single-stage vapor compression units that are thermally coupled through evaporator/condenser cascades. Different refrigerants are used in each of the circuits, depending on the optimum characteristics shown by the refrigerant for a particular application. In the present research study, a steady-state thermodynamic model is developed which simulates the working of an actual cascade system. The model provides the COP and all other system parameters, such as total compressor work, temperature, pressure, enthalpy and entropy at different state points. The working fluid in the Low Temperature Circuit (LTC) is CO2 (R744), while ammonia (R717), propane (R290), propylene (R1270), R404A and R12 are the refrigerants in the High Temperature Circuit (HTC). The performance curves of ammonia, propane, propylene, and R404A are compared with R12 to find its nearest substitute. Results show that ammonia is the best substitute for R12. Keywords: cascade system, refrigerants, thermodynamic model, production engineering
Procedia PDF Downloads 361
495 Quantitative Comparisons of Different Approaches for Rotor Identification
Authors: Elizabeth M. Annoni, Elena G. Tolkacheva
Abstract:
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia and a known prognostic marker for stroke, heart failure and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping systems have been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors using unique characteristics of the electrical recordings at the pivot point and that can be applied to clinical intracardiac mapping. Recently, we have developed four new signal analysis approaches – Shannon entropy (SE), Kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize different cardiac signal characteristics (other than local activation) to uncover the intrinsic complexity of the electrical activity in the rotors, which is not taken into account in current mapping methods. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts were possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two examples of rotors were used, showing 3-sec episodes of a single stationary rotor and of figure-8 reentry with one rotor stationary and one meandering. Movies were captured at a rate of 600 frames per second for 3 sec with 64x64 pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of the SE, Kt, MSF and MSE techniques with respect to the following clinical limitations: different durations of recordings, different spatial resolutions, and the presence of meandering rotors. To quantitatively compare the results, the SE, Kt, MSF and MSE techniques were compared to the “true” rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, but Kt was the best in identifying the pivot point of the meandering rotor. Artifacts mildly affected the performance of the Kt, MSF and MSE techniques but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients to see if these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures. Keywords: atrial fibrillation, optical mapping, signal processing, rotors
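One of the four measures, Shannon entropy, can be sketched for a single pixel's optical-mapping time series as follows; the histogram bin count and the synthetic signals are illustrative assumptions, not the study's implementation.

```python
# A minimal sketch of one of the four measures mentioned above: the Shannon entropy
# of a pixel's optical-mapping time series, computed from a histogram of signal
# amplitudes. The synthetic signals and bin count below are for illustration only.
import numpy as np

def shannon_entropy(signal, bins=32):
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                                  # ignore empty bins
    return float(-np.sum(p * np.log2(p)))

t = np.linspace(0, 3, 1800)                       # 3 s at 600 frames per second
periodic_pixel = np.sin(2 * np.pi * 8 * t)        # regular activation far from the pivot
irregular_pixel = np.sin(2 * np.pi * 8 * t * (1 + 0.3 * np.random.rand(t.size)))

print(shannon_entropy(periodic_pixel), shannon_entropy(irregular_pixel))
```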
Procedia PDF Downloads 324
494 Metric Suite for Schema Evolution of a Relational Database
Authors: S. Ravichandra, D. V. L. N. Somayajulu
Abstract:
Stakeholders' requirements for adding more detail to the database are the main cause of schema evolution in a relational database. Further, this schema evolution causes instability in the database. Hence, this work aims to define a metric suite for the schema evolution of a relational database. The metric suite calculates the metrics based on the features of the database, analyses the queries on the database, and measures the coupling, cohesion and component dependencies of the schema for the existing and evolved versions of the database. This metric suite also provides an indicator for problems related to the stability and usability of the evolved database. The degree of change in the schema of a database is presented in the form of graphs that act as an indicator and also provide the relations between the various parameters (metrics) related to the database architecture. The acquired information is used to defend and improve the stability of the database architecture. The challenges that arise in incorporating these metrics with varying parameters to formulate a suitable metric suite are discussed. To validate the proposed metric suite, experiments have been performed on publicly available datasets. Keywords: cohesion, coupling, entropy, metric suite, schema evolution
Procedia PDF Downloads 451
493 On Generalized Cumulative Past Inaccuracy Measure for Marginal and Conditional Lifetimes
Authors: Amit Ghosh, Chanchal Kundu
Abstract:
Recently, the notion of the cumulative past inaccuracy (CPI) measure has been proposed in the literature as a generalization of cumulative past entropy (CPE) in the univariate as well as the bivariate setup. In this paper, we introduce the notion of CPI of order α (alpha) and study the proposed measure for conditionally specified models of two components that failed at different time instants, called the generalized conditional CPI (GCCPI). We provide some bounds using the usual stochastic order and investigate several properties of the GCCPI. The effect of monotone transformations on this proposed measure has also been examined. Furthermore, we characterize some bivariate distributions under the assumption of the conditional proportional reversed hazard rate model. Moreover, the role of the GCCPI in reliability modeling has also been investigated for a real-life problem. Keywords: cumulative past inaccuracy, marginal and conditional past lifetimes, conditional proportional reversed hazard rate model, usual stochastic order
Procedia PDF Downloads 254
492 The Association between Affective States and Sexual/Health-Related Status among Men Who Have Sex with Men in China: An Exploration Study Using Social Media Data
Authors: Zhi-Wei Zheng, Zhong-Qi Liu, Jia-Ling Qiu, Shan-Qing Guo, Zhong-Wei Jia, Chun Hao
Abstract:
Objectives: The purpose of this study was to understand and examine the association between diurnal mood variation and sexual/health-related status among men who have sex with men (MSM) using data from MSM Chinese Twitter messages. The study consists of 843,745 postings of 377,610 MSM users located in Guangdong that were culled from the MSM Chinese Twitter App. Positive affect, negative affect, sexual-related behaviors, and health-related status were measured using the Simplified Chinese Linguistic Inquiry and Word Count. Emotions, including joy, sadness, anger, fear, and disgust, were measured using the Weibo Basic Mood Lexicon. A positive sentiment score and a positive emotions score were also calculated. Linear regression models based on a permutation test were used to assess associations between affective states and sexual/health-related status. In the results, 5,871 active MSM users and their 477,374 postings were finally selected. MSM expressed positive affect and joy at 8 a.m. and expressed negative affect and negative emotions between 2 a.m. and 4 a.m. In addition, 25.1% of negative postings were directly related to health, and 13.4% reported seeking social support during that sensitive period. MSM who were senior, educated, overweight or obese, self-identified as performing a versatile sex role, and had fewer followers, more followers, and fewer chat groups mainly expressed more negative affect and negative emotions. MSM who talked more about sexual-related behaviors had a higher positive sentiment score (β = 0.29, p < 0.001) and a higher positive emotions score (β = 0.16, p < 0.001). MSM who reported more on their health status had a lower positive sentiment score (β = -0.83, p < 0.001) and a lower positive emotions score (β = -0.37, p < 0.001). The study concluded that an app-based psychological intervention for MSM should be conducted, as it may improve their mental health. Keywords: affect, men who have sex with men, sexual related behavior, health-related status, social media
Procedia PDF Downloads 163
491 Wrist Pain, Technological Device Used, and Perceived Academic Performance Among the College of Computer Studies Students
Authors: Maquiling Jhuvie Jane R., Ojastro Regine B., Peroja Loreille Marie B., Pinili Joy Angela., Salve Genial Gail M., Villavicencio Marielle Irene B., Yap Alther Francis Garth B.
Abstract:
Introduction: This study investigated the impact of prolonged device usage on wrist pain and perceived academic performance among college students in Computer Studies. The research aims to explore the correlation between the frequency of technological device use and the incidence of wrist pain, as well as how this pain affects students' academic performance. The study seeks to provide insights that could inform interventions to promote better musculoskeletal health among students engaged in intensive technology use to further improve their academic performance. Method: The study utilized descriptive-correlational and comparative design, focusing on bona fide students from Silliman University’s College of Computer Studies during the second semester of 2023-2024. Participants were recruited through a survey sent via school email, with responses collected until March 30, 2024. Data was gathered using a password-protected device and Google Forms, ensuring restricted access to raw data. The demographic profile was summarized, and the prevalence of wrist pain and device usage were analyzed using percentages and weighted means. Statistical analyses included Spearman’s rank correlation coefficient to assess the relationship between wrist pain and device usage and an Independent T-test to evaluate differences in academic performance based on wrist pain presence. Alpha was set at 0.05. Results: The study revealed that 40% of College of Computer Studies students experience wrist pain, with 2 out of every 5 students affected. Laptops and desktops were the most frequently used devices for academic work, achieving a weighted mean of 4.511, while mobile phones and tablets received lower means of 4.183 and 1.911, respectively. The average academic performance score among students was 29.7, classified as ‘Good Performance.’ Notably, there was no significant relationship between the frequency of device usage and wrist pain, as indicated by p-values exceeding 0.05. However, a significant difference in perceived academic performance was observed, with students without wrist pain scoring an average of 30.39 compared to 28.72 for those with wrist pain and a p-value of 0.0134 confirming this distinction. Conclusion: The study revealed that about 40% of students in the College of Computer Studies experience wrist pain, but there is no significant link between device usage and pain occurrence. However, students without wrist pain demonstrated better academic performance than those with pain, suggesting that wrist health may impact academic success. These findings imply that physical therapy practices in the Philippines should focus on preventive strategies and ergonomic education to improve student health and performance.Keywords: wrist pain, frequency of use of technological devices, perceived academic performance, physical therapy
Procedia PDF Downloads 16
490 Eco-Index for Assessing Ecological Disturbances at Downstream of a Hydropower Project
Authors: Chandra Upadhyaya, Arup Kumar Sarma
Abstract:
In the north-eastern part of India, several hydropower projects are being proposed, and execution of some of them has already been initiated. There are controversies surrounding these constructions. The impact of these dams on the downstream part of the rivers needs to be assessed so that the ecosystem and the people living downstream are protected, by redesigning the projects if it becomes necessary. This may result in reducing the stresses to the affected ecosystem and the people living downstream. At present, many index-based ecological methods exist to assess impacts on ecology. However, none of these methods is capable of assessing the effect resulting from dam-induced diurnal variation of flow in the downstream. We need an environmental flow methodology based on a hydrological index which can address the effect resulting from dam-induced diurnal variation of flow, play an important role in riverine ecosystem management, and provide a qualitative idea about changes in the habitat for aquatic and riparian species. Keywords: ecosystem, environmental flow assessment, entropy, IHA, TNC
Procedia PDF Downloads 385
489 Controversies and Contradiction in (IR) Reversibility and the Equilibrium of Reactive Systems
Authors: Joao Teotonio Manzi
Abstract:
Reversibility, irreversibility, equilibrium, and steady state, which play a central role in the thermodynamic analysis of processes arising in the context of reactive systems, are discussed in this article. Such concepts have generated substantial doubts, even among the most experienced researchers and engineers, because conclusive or definitive statements cannot be extracted from the literature. Concepts such as the time-reversibility of irreversible processes seem paradoxical, requiring further analysis. Equilibrium and reversibility, which appear to be of the same nature, have also been re-examined in the light of maximum entropy. The goal of this paper is to revisit and explore these concepts based on classical thermodynamics in order to understand them better, given their impact on technological advances, and, as a result, to generate an optimal procedure for design, monitoring, and engineering optimization. Furthermore, an effective graphical procedure for dimensioning a Plug Flow Reactor has been provided. Thus, to meet the needs of chemical engineering with a simple conceptual analysis that has significant practical effects, a macroscopic approach is taken so as to integrate the different parts of this paper. Keywords: reversibility, equilibrium, steady-state, thermodynamics, reactive system
Procedia PDF Downloads 106
488 Contextual Factors of Innovation for Improving Commercial Banks' Performance in Nigeria
Authors: Tomola Obamuyi
Abstract:
The banking system in Nigeria adopted innovative banking with the aim of enhancing financial inclusion, making financial services readily and cheaply available to the majority of the people, and contributing to the efficiency of the financial system. Some of the innovative services include Automatic Teller Machines (ATMs), National Electronic Fund Transfer (NEFT), Point of Sale (PoS), internet (Web) banking, Mobile Money payment (MMO), Real-Time Gross Settlement (RTGS), and agent banking, among others. The introduction of these payment systems is expected to increase bank efficiency and customers' satisfaction, culminating in better performance for the commercial banks. However, opinions differ on the possible effects of the various innovative payment systems on the performance of commercial banks in the country. Thus, this study empirically determines how commercial banks use innovation to gain competitive advantage in the specific context of Nigeria's finance and business. The study also analyses the effects of financial innovation on the performance of commercial banks when different periods of analysis are considered. The study employed secondary data from 2009 to 2018, the period that witnessed aggressive innovation in the financial sector of the country. The Vector Autoregression (VAR) estimation technique was used to forecast the relative variance contribution of each random innovation to the variables in the VAR, to examine the effect of a standard-deviation shock to one of the innovations on the current and future values through the impulse response, and to determine the causal relationship between the variables (VAR Granger causality test). The study also employed Multi-Criteria Decision Making (MCDM) to rank the innovations against the performance criteria of Return on Assets (ROA) and Return on Equity (ROE). The entropy method of MCDM was used to determine which of the performance criteria better reflects the contributions of the various innovations in the banking sector. On the other hand, the Range of Values (ROV) method was used to rank the contributions of the seven innovations to performance. The analysis was done for the medium term (five years) and the long run (ten years) of innovations in the sector. The impulse response function derived from the VAR system indicated that the response of ROA to the values of cheque transactions, NEFT transactions, and POS transactions was positive and significant in the periods of analysis. The paper also confirmed, using the entropy and range-of-values methods, that in the long run both cheques and MMO performed best, while NEFT was next in performance. The paper concluded that commercial banks would enhance their performance by continuously improving the services provided through cheques, National Electronic Fund Transfer and Point of Sale, since these instruments have long-run effects on their performance. This will increase the confidence of the populace and encourage more usage/patronage of these services. The banking sector will in turn experience better performance, which will improve the economy of the country. Keywords: bank performance, financial innovation, multi-criteria decision making, vector autoregression
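The entropy method of MCDM mentioned above can be sketched as follows; the decision matrix of innovations against ROA and ROE is invented for illustration, and the normalization follows the common entropy-weight formulation rather than the exact computation in the paper.

```python
# A minimal sketch of the entropy method of MCDM used in the abstract to judge which
# performance criterion (ROA or ROE) better reflects the innovations' contributions.
# The decision matrix below (innovations x criteria) is invented for illustration.
import numpy as np

def entropy_weights(matrix):
    """matrix: rows = alternatives (innovations), columns = criteria (ROA, ROE)."""
    m = np.asarray(matrix, dtype=float)
    p = m / m.sum(axis=0)                      # normalise each criterion column
    p = np.where(p > 0, p, 1e-12)              # guard against log(0)
    k = 1.0 / np.log(m.shape[0])
    entropy = -k * np.sum(p * np.log(p), axis=0)
    divergence = 1.0 - entropy                 # degree of diversification per criterion
    return divergence / divergence.sum()       # criterion weights

# Hypothetical average contributions of 7 innovations to ROA and ROE
decision_matrix = [
    [0.9, 1.2], [1.1, 0.8], [0.7, 0.9], [1.3, 1.0], [0.6, 0.7], [1.0, 1.1], [0.8, 0.6]
]
print(entropy_weights(decision_matrix))   # higher weight = criterion carries more information
```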
Procedia PDF Downloads 121
487 A Network-Theoretical Perspective on Music Analysis
Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria
Abstract:
The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs encompass statistical information about music elements such as notes, chords, rhythms, intervals, etc., and the relations among them, and so become helpful in visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis. Keywords: computational musicology, mathematical music modelling, music analysis, style classification
Procedia PDF Downloads 104
486 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Currently, in analyzing large-scale recurrent event data, there are many challenges such as memory limitations, unscalable computing time, etc. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators as the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of this proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
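A toy sketch of the divide-and-conquer combination step is given below, with ordinary least squares standing in for the parametric frailty models of the paper; the simulated data, the number of subsets and the inverse-variance weighting rule are illustrative assumptions.

```python
# A minimal sketch of the divide-and-conquer idea described above: fit the model on
# random subsets, then combine the subset estimators with inverse-variance weights.
# Ordinary least squares stands in here for the parametric frailty model of the paper.
import numpy as np

rng = np.random.default_rng(0)
n, k = 1_000_000, 20                       # a large data set split into k subsets
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)           # true slope = 2

estimates, variances = [], []
for x_sub, y_sub in zip(np.array_split(x, k), np.array_split(y, k)):
    beta_hat = np.sum(x_sub * y_sub) / np.sum(x_sub ** 2)          # subset estimator
    resid = y_sub - beta_hat * x_sub
    var_hat = np.sum(resid ** 2) / (len(x_sub) - 1) / np.sum(x_sub ** 2)
    estimates.append(beta_hat)
    variances.append(var_hat)

weights = 1.0 / np.asarray(variances)      # inverse-variance weights per subset
combined = np.sum(weights * np.asarray(estimates)) / np.sum(weights)
print(combined)                            # close to the full-data estimate of the slope
```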
Procedia PDF Downloads 167
485 Implementation and Comparative Analysis of PET and CT Image Fusion Algorithms
Authors: S. Guruprasad, M. Z. Kurian, H. N. Suma
Abstract:
Medical imaging modalities are becoming life-saving components. These modalities are very much essential to doctors for proper diagnosis, treatment planning and follow-up. Some modalities provide anatomical information, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI) and X-rays, and some provide only functional information, such as Positron Emission Tomography (PET). Therefore, a single-modality image does not give complete information. This paper presents the fusion of the structural information in CT and the functional information present in the PET image. This fused image is essential for detecting the stage and location of abnormalities and is particularly needed in oncology for improved diagnosis and treatment. We have implemented and compared image fusion techniques such as pyramid, wavelet, and principal component fusion methods, along with a hybrid method of DWT and PCA. The performance of the algorithms is evaluated quantitatively and qualitatively. The system is implemented and tested using MATLAB software. Based on the MSE, PSNR and entropy analysis, the PCA and DWT-PCA methods showed the best results over all experiments. Keywords: image fusion, pyramid, wavelets, principal component analysis
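The quantitative evaluation mentioned above can be sketched with the following metric computations, written in Python rather than the MATLAB used by the authors; the "fused" image here is a plain pixel average used only to exercise the metrics, not one of the pyramid, wavelet, PCA or DWT-PCA methods compared in the paper.

```python
# A minimal sketch of the quantitative evaluation mentioned above: MSE, PSNR and
# entropy of a fused image. The fusion step is a plain pixel average, used only so
# the metrics have something to evaluate.
import numpy as np

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

def entropy(img, bins=256):
    hist, _ = np.histogram(img, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

ct = np.random.randint(0, 256, (128, 128), dtype=np.uint8)    # stand-in CT slice
pet = np.random.randint(0, 256, (128, 128), dtype=np.uint8)   # stand-in PET slice
fused = ((ct.astype(float) + pet.astype(float)) / 2).astype(np.uint8)

print("MSE vs CT:", mse(fused, ct), "PSNR vs CT:", psnr(fused, ct), "Entropy:", entropy(fused))
```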
Procedia PDF Downloads 284
484 Dimension Free Rigid Point Set Registration in Linear Time
Authors: Jianqin Qu
Abstract:
This paper proposes a rigid point set matching algorithm in arbitrary dimensions based on the idea of symmetric covariant functions. A group of functions of the points in the set is formulated using rigid invariants. Each of these functions computes a pair of corresponding points from the given point set. The computed correspondences are then used to recover the unknown rigid transform parameters. Each computed point can be geometrically interpreted as a weighted mean center of the point set. The algorithm is compact, fast, and dimension-free, without any optimization process. It either computes the desired transform for noiseless data in linear time or fails quickly in exceptional cases. Experimental results for synthetic data and 2D/3D real data are provided, which demonstrate potential applications of the algorithm to a wide range of problems. Keywords: covariant point, point matching, dimension free, rigid registration
Procedia PDF Downloads 168
483 Synthesis and Characterization of Thiourea-Formaldehyde Coated Fe3O4 (TUF@Fe3O4) and Its Application for Adsorption of Methylene Blue
Authors: Saad M. Alshehri, Tansir Ahamad
Abstract:
A thiourea-formaldehyde pre-polymer (TUF) was prepared by the reaction of thiourea and formaldehyde in a basic medium and used as a coating material for magnetite Fe3O4. The synthesized polymer-coated microspheres (TUF@Fe3O4) were characterized using FTIR, TGA, SEM and TEM. Their BET surface area was up to 1680 m2 g-1. The adsorption capacity of this product was evaluated through its adsorption of Methylene Blue (MB) in water under different pH values and different temperatures. We found that the adsorption process was well described both by the Langmuir and the Freundlich isotherm models. The kinetic processes of MB adsorption onto TUF@Fe3O4 were described in order to provide a clearer interpretation of the adsorption rate and uptake mechanism. The overall kinetic data were acceptably explained by a pseudo-second-order rate model. The evaluated ∆G° and ∆H° indicate the spontaneous and exothermic nature of the reaction. The adsorption takes place with a decrease in entropy (∆S° is negative). The monolayer capacity for MB was up to 450 mg g-1 and was one of the highest among similar polymeric products, due to the large BET surface area. Keywords: TGA, FTIR, magnetite, thiourea formaldehyde resin, methylene blue, adsorption
Procedia PDF Downloads 351
482 Thermodynamics of Stable Micro Black Holes Production by Modeling from the LHC
Authors: Aref Yazdani, Ali Tofighi
Abstract:
We study a simulative model for the production of stable micro black holes based on an investigation of the thermodynamics of the LHC experiment. We show how this production can be achieved through a thermodynamic process of stabilization. Indeed, this process can be accomplished with a very small amount of powerful fuel. By applying the second law of black hole thermodynamics at the scale of quantum gravity and a perturbation expansion of the given entropy function, a time-dependent potential function is obtained, which is illustrated with exact numerical values in higher dimensions. Seeking the conditions for the stability of micro black holes is another purpose of this study. This is proven through an injection method: putting an exact amount of energy into the final phase of the production, which is equivalent to injecting the same energy into the center of collision at the LHC in order to stabilize the produced particles. Injection of energy into the center of collision at the LHC is a new approach that is worth trying for the first time. Keywords: micro black holes, LHC experiment, black holes thermodynamics, extra dimensions model
Procedia PDF Downloads 144
481 Towards Establishing a Universal Theory of Project Management
Authors: Divine Kwaku Ahadzie
Abstract:
Project management (PM) as a concept has evolved from the early 20th century into a recognized academic and professional discipline, and indications are that it has come to stay in the 21st century as a worldwide paradigm shift for managing successful construction projects. However, notwithstanding the strong inroads that PM has made in legitimizing its academic and professional status in construction management practice, the underlying philosophies are still based on cases and conventional practices. An important theoretical issue yet to be addressed is the lack of a universal theory that offers philosophical legitimacy for the PM concept as a uniquely specialized management concept. Here, it is hypothesized that the law of entropy, the theory of uncertainties and the theory of risk management offer plausible explanations for addressing the lacuna of what constitutes PM theory. The theoretical bases of these plausible underlying theories are argued, and attempts are made to establish the functional relationships that exist between these theories and the PM concept. The paper then draws on data related to the success and/or failure of a number of construction projects to validate the theory. Keywords: concepts, construction, project management, universal theory
Procedia PDF Downloads 328