Search results for: λ-levelwise statistical convergence
3633 Controlled Shock Response Spectrum Test on Spacecraft Subsystem Using Electrodynamic Shaker
Authors: M. Madheswaran, A. R. Prashant, S. Ramakrishna, V. Ramesh Naidu, P. Govindan, P. Aravindakshan
Abstract:
Shock Response Spectrum (SRS) tests are among the tests conducted on critical spacecraft systems as part of environmental testing. SRS tests are conducted to simulate the pyro shocks that occur during the launch phases as well as during deployment of spacecraft appendages. Methods for carrying out SRS tests include the pyrotechnic method, the impact hammer method, the drop shock method and the use of electrodynamic shakers. The pyrotechnic, impact hammer and drop shock methods are open-loop tests, whereas SRS testing using an electrodynamic shaker is a controlled, closed-loop test. SRS testing using an electrodynamic shaker offers various advantages, such as a simple test setup, better controllability and repeatability. However, it is important to devise a proper test methodology so that the safety of the electrodynamic shaker and of the test specimen is not compromised. This paper discusses the challenges involved in conducting SRS tests, shaker validation and the necessary precautions to be considered. The approach to choosing various test parameters, such as the synthesis waveform and the spectrum convergence level, is discussed. A case study of an SRS test conducted on an optical payload of an Indian geostationary spacecraft is presented.
Keywords: maxi-max spectrum, SRS (shock response spectrum), SDOF (single degree of freedom), wavelet synthesis
Procedia PDF Downloads 360
3632 FEM Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli
Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha
Abstract:
Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This total database served to create the same number of ensembles as there were segments in the tested beam. Statistics of these ensembles were then assigned to the given segments of the beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. Here, a detailed geometrical arrangement of individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending, to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit the full-scale measurements of the timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change of the Young's modulus distribution takes place in laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
Keywords: Bayesian inference, FEM, four point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young's modulus
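The workflow above combines LHS sampling of segment moduli with a Bayesian update from measured deflections. The sketch below illustrates that combination on a deliberately simplified surrogate: a homogeneous four-point-bending deflection formula stands in for the 2D FEM and Mindlin-beam models, and all numerical values (span, load, priors, measurement) are invented for illustration.

```python
# Minimal sketch (not the authors' Mindlin-beam model): LHS sampling of
# segment moduli plus a Metropolis step that updates the prior mean of E
# from one measured midspan deflection. All constants are assumptions.
import numpy as np
from scipy.stats import qmc, norm

L_span, a, F, I = 4.0, 1.2, 10e3, 8.0e-5   # m, m, N, m^4 (assumed)

def deflection(E_eff):
    # Four-point-bending midspan deflection of a homogeneous beam,
    # used here as a cheap surrogate for the 2D FEM model.
    return F * a * (3 * L_span**2 - 4 * a**2) / (24 * E_eff * I)

# LHS ensemble: 100 realizations of 10 segment moduli (lognormal prior)
sampler = qmc.LatinHypercube(d=10, seed=0)
u = sampler.random(n=100)
E_segments = np.exp(norm.ppf(u, loc=np.log(11e9), scale=0.1))
deflections = deflection(E_segments.mean(axis=1))   # ensemble of outputs

# Metropolis update of the prior mean log-modulus given one measurement
d_obs, sigma_obs = 0.025, 1e-3                      # m (assumed)
def log_post(mu):
    return norm.logpdf(mu, loc=np.log(11e9), scale=0.2) + \
           norm.logpdf(d_obs, loc=deflection(np.exp(mu)), scale=sigma_obs)

rng, mu, chain = np.random.default_rng(1), np.log(11e9), []
for _ in range(5000):
    prop = mu + 0.02 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(mu):
        mu = prop
    chain.append(mu)
print("posterior mean E = %.2f GPa" % (np.exp(np.mean(chain[1000:])) / 1e9))
```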
Procedia PDF Downloads 283
3631 Evaluation of IMERG Performance at Estimating the Rainfall Properties through Convective and Stratiform Rain Events in a Semi-Arid Region of Mexico
Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez
Abstract:
Rain varies greatly in its duration, intensity, and spatial coverage, so sub-daily rainfall data are important for various applications, including risk prevention. However, ground measurements are limited by the low and irregular density of rain gauges. Satellite Precipitation Products (SPPs) such as IMERG, which use passive microwave and infrared sensors to estimate rainfall, are an alternative; however, these SPPs have to be validated before their application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) final run V06B SPP in a semi-arid region of Mexico, using sub-daily data from four automatic rain gauges (pluviographs) for October 2019 and June to September 2021, and applying the Minimum Inter-event Time (MIT) criterion, with a dry period of 10 h, to separate individual rain events for the purpose of evaluating the rainfall properties (depth, duration and intensity). Point-to-pixel analysis and continuous, categorical, and volumetric statistical metrics were used. Results show that IMERG is capable of estimating the rainfall depth with a slight overestimation but is unable to identify the real duration and intensity of the rain events, showing large overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events; the rest were stratiform rain events, classified by the depth magnitude variation of IMERG pixels and pluviographs. IMERG showed poorer performance at detecting the former but performed well at estimating the stratiform rain events, which originate from cold fronts.
Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation
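As a concrete illustration of the event-separation step described above, the sketch below applies the MIT criterion with a 10 h dry period to an invented hourly gauge series and reports depth, duration and mean intensity per event.

```python
# A minimal sketch of the Minimum Inter-event Time (MIT) criterion:
# hourly gauge records are split into independent rain events whenever a
# dry spell of at least 10 h separates wet hours. The series is fabricated.
import numpy as np

rain = np.array([0, 0, 2.1, 4.0, 0.5, 0] + [0] * 10
                + [1.2, 3.3, 0] + [0] * 12, float)   # mm/h
MIT = 10  # hours of dry period that ends an event

events, current, dry = [], [], 0
for value in rain:
    if value > 0:
        current.append(value); dry = 0
    elif current:
        dry += 1
        if dry >= MIT:                 # event closed by a long dry spell
            events.append(current); current = []
        else:
            current.append(0.0)        # short gap stays inside the event
if current:
    events.append(current)

for i, ev in enumerate(events, 1):
    ev = np.trim_zeros(np.array(ev))   # drop trailing within-event zeros
    print(f"event {i}: depth={ev.sum():.1f} mm, duration={len(ev)} h, "
          f"mean intensity={ev.mean():.2f} mm/h")
```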
Procedia PDF Downloads 69
3630 Optimization of Ultrasound-Assisted Extraction of Oil from Spent Coffee Grounds Using a Central Composite Rotatable Design
Authors: Malek Miladi, Miguel Vegara, Maria Perez-Infantes, Khaled Mohamed Ramadan, Antonio Ruiz-Canales, Damaris Nunez-Gomez
Abstract:
Coffee is the second most consumed commodity worldwide, yet it also generates colossal waste. Proper management of coffee waste means converting it into products with higher added value, to achieve sustainability of the economic and ecological footprint and protect the environment. Accordingly, studies on the recovery of coffee waste have become more relevant in recent decades. Spent coffee grounds (SCGs) resulting from brewing coffee represent the major waste produced by the coffee industry. The facts that SCGs have little economic value, are abundant in nature and industry, do not compete with agriculture and, especially, have a high oil content (between 7-15% of total dry matter weight, depending on the coffee variety, Arabica or Robusta) encourage their use as a sustainable feedstock for bio-oil production. Bio-oil extraction is a crucial step towards biodiesel production by the transesterification process. However, conventional methods used for oil extraction are not recommended due to their high consumption of energy and time and their use of toxic volatile organic solvents. Thus, finding a sustainable, economical, and efficient extraction technique is crucial to scale up the process and to ensure a more environment-friendly production. From this perspective, the aim of this work was a statistical study to establish an efficient strategy for oil extraction by n-hexane using indirect sonication. Coffee waste containing mixed Arabica and Robusta was used in this work. The effects of temperature, sonication time, and solvent-to-solid ratio on the oil yield were statistically investigated by a Central Composite Rotatable Design (CCRD) 2³. The results were analyzed using STATISTICA 7 StatSoft software. The CCRD showed the significance of all the variables tested (P < 0.05) on the process output. The validation of the model by analysis of variance (ANOVA) showed good adjustment of the results obtained for a 95% confidence interval, and the graph of predicted vs. experimental values confirmed the satisfactory correlation of the model results. Besides, the identification of the optimum experimental conditions was based on the study of the response surface graphs (2-D and 3-D) and the critical statistical values. Based on the CCRD results, 29 ºC, 56.6 min, and a solvent-to-solid ratio of 16 were the best experimental conditions defined statistically for coffee waste oil extraction using n-hexane as solvent. Under these conditions, the oil yield was >9% in all cases. The results confirmed the efficiency of using an ultrasound bath for extracting oil as a more economical, green, and efficient way when compared to the Soxhlet method.
Keywords: coffee waste, optimization, oil yield, statistical planning
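For readers unfamiliar with the design, the sketch below generates the runs of a rotatable 2³ central composite design and decodes them to the three factors studied; the factor ranges are assumptions for illustration, not the authors' levels.

```python
# A sketch of how a rotatable central composite design (CCRD, 2^3) can be
# generated and decoded. The optimum reported in the abstract (29 degC,
# 56.6 min, ratio 16) came from the fitted response surface, not from here.
from itertools import product
import numpy as np

alpha = 2 ** (3 / 4)                      # rotatability: (2^k)^(1/4), k = 3
factorial = np.array(list(product([-1, 1], repeat=3)), float)
axial = np.array([[s * alpha if i == j else 0.0 for j in range(3)]
                  for i in range(3) for s in (-1, 1)])
center = np.zeros((6, 3))                 # replicated center runs
design = np.vstack([factorial, axial, center])

names = ["temperature (degC)", "time (min)", "solvent:solid ratio"]
lo = np.array([25.0, 20.0, 8.0])          # coded -1 level (assumed)
hi = np.array([45.0, 70.0, 20.0])         # coded +1 level (assumed)
actual = (lo + hi) / 2 + design * (hi - lo) / 2
for run in actual[:5]:
    print(dict(zip(names, np.round(run, 1))))
print(f"{len(design)} runs in total, axial distance alpha = {alpha:.3f}")
```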
Procedia PDF Downloads 119
3629 Evidence of Climate Change from Statistical Analysis of Temperature and Rainfall Data of Kaduna State, Nigeria
Authors: Iliya Bitrus Abaje
Abstract:
This study examines the evidence of a climate change scenario in Kaduna State from the analysis of temperature and rainfall data (1976-2015) from three meteorological stations along a geographic transect from the southern part to the northern part of the State. Different statistical methods were used in determining the changes in both the temperature and rainfall series. The result of the linear trend lines revealed a mean increase in average temperature of 0.73 °C over the 40-year period of study in the State. The plotted standard deviation for the temperature anomalies generally revealed that years with temperatures above the mean standard deviation (hotter than normal conditions) in the last two decades (1996-2005 and 2006-2015) outnumbered those below (colder than normal conditions). Cramer's test and Student's t-test generally revealed an increasing temperature trend in the recent decades. The increase in temperature is evidence that the earth's atmosphere is getting warmer in recent years. The linear trend line equation of the annual rainfall for the period of study showed a mean increase of 316.25 mm for the State. Findings also revealed that the plotted standard deviation for the rainfall anomalies, and the 10-year non-overlapping and 30-year overlapping sub-period analyses in all three stations, generally showed an increasing trend from the beginning of the data to the recent years. This is evidence that the study area is now experiencing wetter conditions in recent years and hence climate change. The study recommends diversification of the economic base of the populace, with emphasis on moving away from activities that are sensitive to temperature and rainfall extremes. Also, appropriate strategies to ameliorate the scourge of climate change at all levels/sectors should always take into account the recent changes in temperature and rainfall amount in the area.
Keywords: anomalies, linear trend, rainfall, temperature
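The sketch below reproduces the kind of diagnostics described: a linear trend line, standardized anomalies flagging hotter or colder-than-normal years, and a t-test between sub-periods, all on a synthetic warming series rather than the Kaduna data.

```python
# Trend and anomaly diagnostics on a synthetic annual temperature series
# that merely imitates a warming record; values are not the study's data.
import numpy as np
from scipy import stats

years = np.arange(1976, 2016)
rng = np.random.default_rng(0)
temp = 24.5 + 0.018 * (years - 1976) + rng.normal(0, 0.3, years.size)

res = stats.linregress(years, temp)
print(f"trend: {res.slope * 40:.2f} degC per 40 years (p = {res.pvalue:.3f})")

anom = (temp - temp.mean()) / temp.std(ddof=1)   # standardized anomalies
print("years above +1 s.d.:", years[anom > 1.0].tolist())
print("years below -1 s.d.:", years[anom < -1.0].tolist())

# Student's t-test between the first and last two decades, as in the text
t, p = stats.ttest_ind(temp[:20], temp[20:], equal_var=False)
print(f"first vs last 20 years: t = {t:.2f}, p = {p:.4f}")
```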
Procedia PDF Downloads 318
3628 Analyzing Oil Seeps Manifestations and Petroleum Impregnation in Northwestern Tunisia from Aliphatic Biomarkers and Statistical Data
Authors: Sawsen Jarray, Tahani Hallek, Mabrouk Montacer
Abstract:
The tectonically deformed terrain of northwestern Tunisia is reflected in the country's numerous oil seeps. Finding a genetic link between these oil seeps and the area's putative source rocks is the goal of this investigation. Here, we use aliphatic biomarkers assessed by GC-MS to describe the organic geochemical data of 18 oil seep samples and 4 source rocks (M'Cherga, Fahdene, Bahloul, and BouDabbous). In order to establish oil-oil and oil-source rock correlations, terpane, hopane, and sterane biomarkers were identified. The source rocks under study were deposited in a suboxic marine environment, with minor signs of continental input for the M'Cherga Formation. According to the C27 18α(H)-22,29,30-trisnorneohopane (Ts) and C27 17α(H)-22,29,30-trisnorhopane (Tm) biomarkers, these source rocks are mature and have reached the oil window. Regarding the oil seeps, geochemical data indicate that, with the exception of four samples that showed some continental markers, the bulk of the samples were deposited in an open marine environment. These four samples have a distinct lithology (marl) that distinguishes them from the others (carbonate). There are two classes of oil seeps, according to statistical analysis of the oil-oil and oil-source rock relationships. The first comprises samples that showed a positive correlation with the carbonate-lithology, marine-derived BouDabbous black shales. The second derives from the M'Cherga source rock and is made up of oil seeps with traces of a terrestrial environment and a marl-trending lithology. The Fahdene and Bahloul source rocks have no connection to the observed oil seeps. In addition to the existence of two generations of hydrocarbon spills in Northwest Tunisia (Lower Cretaceous/Ypresian), there are two different types of hydrocarbon occurrences depending on their link to tectonic deformations (oil seeps) or outcropping mature source rocks (oil impregnations).
Keywords: petroleum seeps, source rocks, biomarkers, statistics, Northern Tunisia
Procedia PDF Downloads 69
3627 The Impact of Model Specification Decisions on the Teacher Value-Added Effectiveness: Choosing the Correct Predictors
Authors: Ismail Aslantas
Abstract:
Value-Added Models (VAMs), the statistical methods for evaluating the effectiveness of teachers and schools based on student achievement growth, have attracted decision-makers' and researchers' attention over the last decades. As a result of this attention, many studies have been conducted in recent years to discuss these statistical models from different aspects. This research focused on the importance of contextual variables in VAM estimations; therefore, it was undertaken to examine the extent to which value-added effectiveness estimates for teachers can be affected by using context predictors. Using longitudinal data over three years from an international school context, value-added teacher effectiveness was estimated by ordinary least-squares value-added models, and the effectiveness of the teachers was examined. The longitudinal dataset in this study consisted of three major sources: students' attainment scores over up to three years and their characteristics, teacher background information, and school characteristics. A total of 1,027 teachers and their 35,355 eighth-grade students were examined to understand the impact of model specifications on the value-added teacher effectiveness evaluation. Models were created using a selection method that adds a predictor at each step, then removes it and adds another one at a subsequent step; changes in model fit were checked by reviewing changes in R² values. Cohen's effect size statistics were also employed in order to find out the degree of the relationship between teacher characteristics and their effectiveness. Overall, the results indicated that the prior attainment score is the most powerful predictor of the current attainment score: 47.1 percent of the variation in grade 8 math scores can be explained by the prior attainment score in grade 7. The research findings raise issues to be considered in VAM implementations for teacher evaluations and make suggestions to researchers and practitioners.
Keywords: model specification, teacher effectiveness, teacher performance evaluation, value-added model
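The stepwise specification exercise can be illustrated as below: OLS value-added models are grown one predictor at a time and compared by the change in R². The variable names and simulated data are placeholders, not the study's dataset.

```python
# A minimal sketch of comparing OLS value-added specifications by R^2
# change; "grade7" plays the role of the prior attainment score.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "grade7": rng.normal(50, 10, n),
    "ses": rng.normal(0, 1, n),                 # student background
    "teacher_exp": rng.integers(1, 30, n),      # teacher characteristic
})
df["grade8"] = 5 + 0.7 * df.grade7 + 1.5 * df.ses \
               + 0.1 * df.teacher_exp + rng.normal(0, 7, n)

r2_prev = 0.0
for spec in ["grade8 ~ grade7",
             "grade8 ~ grade7 + ses",
             "grade8 ~ grade7 + ses + teacher_exp"]:
    fit = smf.ols(spec, df).fit()
    print(f"{spec:45s} R^2 = {fit.rsquared:.3f} "
          f"(change: {fit.rsquared - r2_prev:+.3f})")
    r2_prev = fit.rsquared
```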
Procedia PDF Downloads 133
3626 Urbanization and Income Inequality in Thailand
Authors: Acumsiri Tantikarnpanit
Abstract:
This paper aims to examine the relationship between urbanization and income inequality in Thailand during the period 2002-2020, using a panel of data for 76 provinces collected from Thailand's National Statistical Office (Labor Force Survey: LFS), as well as geospatial data from the U.S. Air Force Defense Meteorological Satellite Program (DMSP) and the Visible Infrared Imaging Radiometer Suite Day/Night Band (VIIRS-DNB) satellite for nineteen selected years. This paper employs two different definitions to identify urban areas: 1) urban areas defined by Thailand's National Statistical Office (Labor Force Survey: LFS), and 2) urban areas estimated using nighttime light data from the DMSP and VIIRS-DNB satellites. The second method includes two sub-categories: 2.1) determining urban areas from the nighttime light density corresponding to a population density of 300 people per square kilometer, and 2.2) calculating urban areas from the nighttime light density corresponding to a population density of 1,500 people per square kilometer. The empirical analysis, based on Ordinary Least Squares (OLS), fixed effects, and random effects models, reveals a consistent U-shaped relationship between income inequality and urbanization. The findings from the econometric analysis demonstrate that urbanization, or population density, has a significant negative impact on income inequality, while the square of urbanization shows a statistically significant positive impact. Additionally, there is a negative association between logarithmically transformed income and income inequality. This paper also proposes the inclusion of satellite imagery, geospatial data, and spatial econometric techniques in future studies to conduct quantitative analysis of spatial relationships.
Keywords: income inequality, nighttime light, population density, Thailand, urbanization
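A U-shape of this kind is usually tested by adding the square of urbanization to a fixed-effects regression; the sketch below shows the pattern on a simulated panel and recovers the turning point. Nothing here uses the Thai data.

```python
# A sketch of the U-shape test: province fixed effects are absorbed with
# dummies and inequality is regressed on urbanization and its square.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
prov = np.repeat(np.arange(20), 19)            # 20 provinces x 19 years
urban = rng.uniform(0.05, 0.95, prov.size)
gini = 0.45 - 0.30 * urban + 0.28 * urban**2 \
       + rng.normal(0, 0.02, prov.size) + 0.01 * (prov % 5)

df = pd.DataFrame({"gini": gini, "urban": urban, "province": prov})
fe = smf.ols("gini ~ urban + I(urban ** 2) + C(province)", df).fit()
b1, b2 = fe.params["urban"], fe.params["I(urban ** 2)"]
print(f"urban: {b1:.3f}, urban^2: {b2:.3f}  ->  "
      f"turning point at an urban share of {-b1 / (2 * b2):.2f}")
```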
Procedia PDF Downloads 76
3625 Application of a Model-Free Artificial Neural Networks Approach for Structural Health Monitoring of the Old Lidingö Bridge
Authors: Ana Neves, John Leander, Ignacio Gonzalez, Raid Karoumi
Abstract:
Systematic monitoring and inspection are needed to assess the present state of a structure and predict its future condition. If an irregularity is noticed, repair actions may take place, and the adequate intervention will most probably reduce future maintenance costs, minimize downtime and increase safety by avoiding the failure of the structure as a whole or of one of its structural parts. For this to be possible, decisions must be made at the right time, which implies using systems that can detect abnormalities at an early stage. In this sense, Structural Health Monitoring (SHM) is seen as an effective tool for improving the safety and reliability of infrastructures. This paper explores the decision-making problem in SHM regarding the maintenance of civil engineering structures. The aim is to assess the present condition of a bridge based exclusively on measurements, using the method suggested in this paper, such that action is taken coherently with the information made available by the monitoring system. Artificial neural networks are trained, and their ability to predict structural behavior is evaluated in the light of a case study in which acceleration measurements are acquired from a bridge located in Stockholm, Sweden. This relatively old bridge is still in operation despite experiencing obvious problems already reported in previous inspections. The prediction errors provide a measure of the accuracy of the algorithm and are subjected to further investigation, which comprises concepts like clustering analysis and statistical hypothesis testing. These enable interpretation of the obtained prediction errors, conclusions about the state of the structure, and thus support for decision-making regarding its maintenance.
Keywords: artificial neural networks, clustering analysis, model-free damage detection, statistical hypothesis testing, structural health monitoring
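The sketch below illustrates the model-free idea in miniature: a neural network learns to predict the next acceleration sample from past samples of the healthy structure, and a statistical test on the prediction errors of new data flags a change. The signals are synthetic, and the network is a generic scikit-learn MLP, not the authors' architecture.

```python
# Model-free damage-detection sketch: train on healthy-state signals,
# test whether the prediction-error distribution shifts on new data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy import stats

def windows(signal, width=10):
    X = np.lib.stride_tricks.sliding_window_view(signal, width)[:-1]
    return X, signal[width:]              # X[i] predicts signal[i+width]

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 6000)
healthy = np.sin(2 * np.pi * 2.0 * t) + 0.05 * rng.standard_normal(t.size)
damaged = np.sin(2 * np.pi * 1.8 * t) + 0.05 * rng.standard_normal(t.size)

X, y = windows(healthy)
net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=500,
                   random_state=0).fit(X[:4000], y[:4000])

err_ref = y[4000:] - net.predict(X[4000:])          # healthy reference
Xd, yd = windows(damaged)
err_new = yd[4000:] - net.predict(Xd[4000:])        # possibly damaged

# two-sample test on the error distributions, in the spirit of the paper
stat, p = stats.levene(err_ref, err_new)
print(f"error std: {err_ref.std():.3f} vs {err_new.std():.3f}, "
      f"Levene p = {p:.2e}")
```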
Procedia PDF Downloads 208
3624 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing
Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah
Abstract:
The aim of this paper is to present a distributed implementation of the Type-2 Fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents, the AVPE (Agent Virtual Processing Element) model, in order to improve the processing resources needed for performing big-data image segmentation. In this work, we focus on the application of this algorithm to processing a big-data MRI (Magnetic Resonance Imaging) image of size (m x n). The image is encapsulated in the mobile agent team leader in order to be split into (m x n) pixels, one per AVPE. Each AVPE performs and exchanges the segmentation results and maintains asynchronous communication with its team leader until the convergence of the algorithm. Some interesting experimental results are obtained in terms of accuracy and efficiency of the proposed implementation, thanks to the several interesting skills of mobile agents introduced in this distributed computational model.
Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing
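The split/process/merge pattern described above can be sketched as follows. This is a strongly simplified stand-in: ordinary worker processes emulate the AVPE agents, and a type-1 fuzzy-c-means-style membership update replaces the Type-2 fuzzy algorithm.

```python
# SPMD-style sketch: a "team leader" splits the image into blocks, worker
# processes (standing in for the AVPE mobile agents) compute a local fuzzy
# two-class membership, and the leader gathers the partial results.
import numpy as np
from multiprocessing import Pool

def fuzzy_memberships(block, centers=(0.2, 0.8), m=2.0):
    # One fuzzy-c-means-style membership evaluation for a pixel block.
    d = np.stack([np.abs(block - c) + 1e-9 for c in centers])
    u = 1.0 / (d ** (2 / (m - 1)))
    return (u / u.sum(axis=0))[1]          # membership of class 1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((512, 512))          # stand-in for an MRI slice
    blocks = np.array_split(image, 8, axis=0)
    with Pool(4) as pool:                   # 4 workers emulate the AVPEs
        parts = pool.map(fuzzy_memberships, blocks)
    segmented = np.vstack(parts) > 0.5
    print("segmented shape:", segmented.shape,
          "| class-1 fraction:", segmented.mean().round(3))
```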
Procedia PDF Downloads 429
3623 Effects of Neem (Azadirachta indica A. Juss) Kernel Inclusion in Broiler Diet on Growth Performance, Organ Weight and Gut Morphometry
Authors: Olatundun Bukola Ezekiel, Adejumo Olusoji
Abstract:
A feeding trial was conducted with 100 two-week-old broiler chickens to evaluate the influence of including neem kernel in broiler diets at 0, 2.5, 5, 7.5 and 10% (used to replace an equal quantity of maize) on their performance, organ weight and gut morphometry. The birds were randomly allotted to five dietary treatments, each treatment having four replicates consisting of five broilers, in a completely randomized design. The diets were formulated to be iso-nitrogenous (23% CP). Weekly feed intake and changes in body weight were calculated and feed efficiency determined. At the end of the 28-day feeding trial, four broilers per treatment were selected and sacrificed for carcass evaluation. Results were subjected to statistical analysis using the analysis of variance procedures of Statistical Analysis Software. The treatment means were presented with group standard errors of means and, where significant, were compared using the Duncan multiple range test of the same software. The results showed that broilers fed diets with 2.5% neem kernel inclusion had growth performance statistically comparable to those fed the control diet. Birds on 5, 7.5 and 10% neem kernel diets showed a significant (P<0.05) increase in the relative weight of the liver. The absolute weight of the spleen also increased significantly (P<0.05) in birds on the 10% neem kernel diet. Diets with more than 5% neem kernel gave a significant (P<0.05) increase in the relative weight of the kidney. The length of the small intestine increased significantly in birds fed 7.5 and 10% neem kernel diets. Significant differences (P<0.05) did not occur in the length of the large intestine or the right and left caeca. It is recommended that neem kernel can be included at up to 2.5% in broiler chicken diets without any deleterious effects on the performance and physiological status of the birds.
Keywords: broiler chicken, growth performance, gut morphometry, neem kernel, organ weight
Procedia PDF Downloads 763
3622 A Proposed Mechanism for Skewing Symmetric Distributions
Authors: M. T. Alodat
Abstract:
In this paper, we propose a mechanism for skewing any symmetric distribution. The new distribution is called the deflation-inflation distribution (DID). We discuss some statistical properties of the DID, such as moments, stochastic representation, and log-concavity. We also fit the distribution to real data and compare it to the normal distribution and Azzalini's skew-normal distribution. Numerical results show that the DID fits the tree-ring data better than the other two distributions.
Keywords: normal distribution, moments, Fisher information, symmetric distributions
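The abstract does not state the DID density itself; for context, the block below records the classical Azzalini-type skewing mechanism against which the DID is benchmarked.

```latex
% The skewing mechanism behind Azzalini's construction: for a symmetric
% density f and a symmetric c.d.f. G,
\[
  f_{\lambda}(x) \;=\; 2\, f(x)\, G(\lambda x), \qquad x \in \mathbb{R},
\]
% is a valid density for any skewness parameter \lambda, and \lambda = 0
% recovers the symmetric f. The skew-normal arises for f = \phi, G = \Phi.
```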
Procedia PDF Downloads 658
3621 Ibrutinib and the Potential Risk of Cardiac Failure: A Review of Pharmacovigilance Data
Authors: Abdulaziz Alakeel, Roaa Alamri, Abdulrahman Alomair, Mohammed Fouda
Abstract:
Introduction: Ibrutinib is a selective, potent, and irreversible small-molecule inhibitor of Bruton's tyrosine kinase (BTK). It forms a covalent bond with a cysteine residue (CYS-481) at the active site of BTK, leading to inhibition of BTK enzymatic activity. The drug is indicated to treat certain types of cancer, such as mantle cell lymphoma (MCL), chronic lymphocytic leukaemia and Waldenström's macroglobulinaemia (WM). Cardiac failure is a condition referring to the inability of the heart muscle to pump adequate blood to the body's organs. There are multiple types of cardiac failure, including left- and right-sided heart failure and systolic and diastolic heart failure. The aim of this review is to evaluate the risk of cardiac failure associated with the use of ibrutinib and to suggest regulatory recommendations if required. Methodology: The Signal Detection team at the National Pharmacovigilance Center (NPC) of the Saudi Food and Drug Authority (SFDA) performed a comprehensive signal review using its national database as well as the World Health Organization (WHO) database (VigiBase) to retrieve related information for assessing the causality between cardiac failure and ibrutinib. We used the WHO-Uppsala Monitoring Centre (UMC) criteria as the standard for assessing the causality of the reported cases. Results: Case review: the number of cases retrieved for the combined drug/adverse drug reaction was 212 global ICSRs as of July 2020. The reviewers selected and assessed the causality of the well-documented ICSRs with completeness scores of 0.9 and above (35 ICSRs); the value 1.0 represents the highest score for the best-documented ICSRs. Among the reviewed cases, more than half provide a supportive association (four probable and 15 possible cases). Data mining: the disproportionality between the observed and expected reporting rates for a drug/adverse drug reaction pair is estimated using the information component (IC), a tool developed by the WHO-UMC to measure the reporting ratio. A positive IC reflects a higher statistical association, while negative values indicate a weaker statistical association, the null value being equal to zero. The result (IC = 1.5) revealed a positive statistical association for the drug/ADR combination, which means that "ibrutinib" with "cardiac failure" has been observed more often than expected when compared to other medications available in the WHO database. Conclusion: Health regulators and health care professionals must be aware of the potential risk of cardiac failure associated with ibrutinib, and the monitoring of any signs or symptoms in treated patients is essential. The weighted cumulative evidence identified from the causality assessment of the reported cases and from data mining is sufficient to support a causal association between ibrutinib and cardiac failure.
Keywords: cardiac failure, drug safety, ibrutinib, pharmacovigilance, signal detection
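The IC used in the data-mining step has a standard shrinkage form; the sketch below computes it from a 2x2 contingency summary. The counts are invented for illustration, not VigiBase figures.

```python
# WHO-UMC information component (IC) with the standard +0.5 shrinkage:
# IC = log2((N_observed + 0.5) / (N_expected + 0.5)), where N_expected is
# the count implied by independence of drug and reaction.
import math

def information_component(n_drug_adr, n_drug, n_adr, n_total):
    expected = n_drug * n_adr / n_total          # under independence
    return math.log2((n_drug_adr + 0.5) / (expected + 0.5))

# hypothetical contingency counts for a drug / cardiac-failure pair
ic = information_component(n_drug_adr=212, n_drug=10_000,
                           n_adr=150_000, n_total=20_000_000)
print(f"IC = {ic:.2f}  (positive -> reported more often than expected)")
```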
Procedia PDF Downloads 129
3620 Impact of Changes in Travel Behavior Triggered by the Covid-19 Pandemic on Tourist Infrastructure: Water Reservoirs of the Vltava Cascade (Czechia) Case Study
Authors: Jiří Vágner, Dana Fialová
Abstract:
The Covid-19 pandemic and its effects have triggered significant changes in travel behavior. In contrast to the deep decline in international tourism, domestic tourism has recovered. It has not fully replaced the total volume of national tourism so far. However, from a regional point of view, and especially by type of destination, regional targeting has changed significantly compared to the previous period. Urban destinations, which used to be the domain of foreign tourists, have been relatively orphaned, in contrast to destinations tied to natural attractions, which have seen seasonal increases. Even here, at a lower hierarchical geographic level, we can observe the differentiation resulting from the existing localization and infrastructure. The case study focuses on the three largest water reservoirs of the Vltava Cascade in Czechia: Lipno, Orlík, and Slapy. Based on a detailed field survey in the periods before and during the pandemic, as well as available statistical data (Tourdata; Czech Statistical Office; Czech Cadaster and Ordnance Survey), different trends in the exploitation of these destinations with regard to existing or planned infrastructure are documented, analyzed and explained. This gives us the opportunity to discuss, with concrete examples, phenomena that are generally known but usually neglected in tourism: slums, brownfields, and greenfields. Changes in travel behavior, especially the preference for spending leisure time individually in naturally attractive destinations, can affect the use of sites that can be defined as tourist or recreational slums or brownfields, but also as tourist greenfield developments. Sociocultural changes and the perception of destinations by tourists and other actors represent, besides environmental changes, major trends in current tourism.
Keywords: Covid-19 pandemic, Czechia, sociocultural and environmental impacts, tourist infrastructure, travel behavior, the Vltava Cascade water reservoirs
Procedia PDF Downloads 146
3619 Effectiveness of Impairment Specified Muscle Strengthening Programme in a Group of Disabled Athletes
Authors: A. L. I. Prasanna, E. Liyanage, S. A. Rajaratne, K. P. A. P. Kariyawasam, A. A. J. Rajaratne
Abstract:
Maintaining or improving the muscle strength of the injured body part is essential to optimizing performance among disabled athletes. General conditioning and strengthening exercises may be ineffective if not sufficiently intense or not targeted at each participant's specific impairment. Specific strengthening programmes, targeted at the affected body part, are essential to improve the strength of impaired muscles, and an increase in strength will help reduce the impact of disability. Methods: The muscle strength of the hip, knee and ankle joints was assessed in a group of randomly selected disabled athletes, using the Medical Research Council (MRC) grading. Those having muscle strength of grade 4 or less were selected for this study (24 in number) and were given a custom-made exercise program designed to strengthen their hip, knee or ankle joint musculature, according to the muscle or group of muscles affected. The effectiveness of the strengthening program was assessed after a period of 3 months. Results: Statistical analysis was done using the Minitab 16 statistical software. A Mann-Whitney U test was used to compare the strength of each muscle group before and after the exercise programme. Statistical significance was determined at the p < 0.05 level. A significant difference was observed after the three-month strengthening program for the knee flexors (left and right) (P = 0.0889, 0.0312), hip flexors (left and right) (P = 0.0312, 0.0466), hip extensors (left and right) (P = 0.0478, 0.0513), ankle plantar flexors (left and right) (P = 0.0466, 0.0423) and right ankle dorsiflexors (P = 0.0337). No significant difference in strength was observed after the strengthening program in the knee extensors (left and right), hip abductors (left and right) and left ankle dorsiflexors. Conclusion: Impairment-specific exercise programmes appear to be beneficial for disabled athletes, significantly improving the muscle strength of the affected joints.
Keywords: muscle strengthening programme, disabled athletes, physiotherapy, rehabilitation sciences
Procedia PDF Downloads 357
3618 Determination of Anti-Fungal Activity of Cedrus deodara Oil against Oligoporus placentus, Trametes versicolor and Xylaria acuminata on Populus deltoides
Authors: Sauradipta Ganguly, Akhato Sumi, Sanjeet Kumar Hom, Ajan T. Lotha
Abstract:
Populus deltoides is a hardwood used predominantly for the manufacture of plywood, matchsticks, and paper in India and hence has high economic significance. Wood-decaying fungi cause serious damage to Populus deltoides products, as the wood itself is perishable and vulnerable to decaying agents, decreasing its aesthetic value, which in turn results in significant monetary loss for the wood industries concerned. The aim of the study was to determine the antifungal activity of Cedrus deodara oil against three primary wood-decaying fungi, namely a white-rot fungus (Trametes versicolor), a brown-rot fungus (Oligoporus placentus) and a soft-rot fungus (Xylaria acuminata), on Populus deltoides samples under optimum laboratory conditions. The susceptibility of the Populus deltoides samples to fungal attack and the ability of deodar oil to control colonization of the wood-rotting fungi on the samples were assessed. Three concentrations of deodar oil were considered for the study as treating solutions: 4%, 5%, and 6%. The Populus deltoides samples were treated with the treating solutions, and the ability of the latter to prevent fungal attack on the samples was assessed using an accelerated laboratory test in a Biochemical Oxygen Demand incubator at a temperature of 25 ± 2 °C and a relative humidity of 70 ± 4%. The efficacy test and statistical analysis of deodar oil against Trametes versicolor, Oligoporus placentus, and Xylaria acuminata on P. deltoides samples exhibited light, minor and negligible mycelial growth at 4%, 5% and 6% concentrations of deodar oil, respectively, whereas moderate to heavy attack was observed on the surface of the control samples. Statistical analysis further established that the treatments were statistically significant and had inhibited the fungal growth of all three fungal species by almost 3 to 5 times.
Keywords: Populus deltoides, Trametes versicolor, Oligoporus placentus, Xylaria acuminata, deodar oil, treatment
Procedia PDF Downloads 125
3617 AI Ethical Values as Dependent on the Role and Perspective of the Ethical AI Code Founder: A Mapping Review
Authors: Moshe Davidian, Shlomo Mark, Yotam Lurie
Abstract:
With the rapid development of technology and the concomitant growth in the capability and power of Artificial Intelligence (AI) systems, the ethical challenges involved in these systems are also evolving and increasing. In recent years, various organizations, including governments, international institutions, professional societies, civic organizations, and commercial companies, have chosen to address these challenges by publishing ethical codes for AI systems. However, despite the apparent agreement that AI should be "ethical," there is debate about the definition of "ethical artificial intelligence." This study investigates the various AI ethical codes and their key ethical values. From the vast collection of codes that exist, it analyzes and compares 25 ethical codes found to be representative of different types of organizations. In addition, as part of its literature review, the study overviews data collected in three recent reviews of AI codes. The results of the analyses demonstrate a convergence around seven key ethical values. However, the key finding is that the different AI ethical codes ultimately reflect the type of organization that designed the code; i.e., the organization's role as regulator, user, or developer affects its view of what ethical AI is. The results show a relationship between the organization's role and the dominant values in its code. The main contribution of this study is the development of a list of the key values for all AI systems, and of specific values that need to shape the development and design of AI systems, while allowing for differences according to the organization for which the system is being developed. This will allow an analysis of AI values in relation to stakeholders.
Keywords: artificial intelligence, ethical codes, principles, values
Procedia PDF Downloads 107
3616 Analysis of NMDA Receptor 2B Subunit Gene (GRIN2B) mRNA Expression in the Peripheral Blood Mononuclear Cells of Alzheimer's Disease Patients
Authors: Ali̇ Bayram, Semih Dalkilic, Remzi Yigiter
Abstract:
The N-methyl-D-aspartate (NMDA) receptor is a subtype of glutamate receptor and plays a pivotal role in learning, memory, neuronal plasticity, neurotoxicity and synaptic mechanisms. Animal experiments have suggested that glutamate-induced excitotoxic injury and NMDA receptor blockage lead to amnesia and other neurodegenerative diseases, including Alzheimer's disease (AD), Huntington's disease, and amyotrophic lateral sclerosis. The aim of this study is to investigate the association between the expression level of the NMDA receptor coding gene GRIN2B and Alzheimer's disease. The study was approved by the local ethics committees, and it was conducted according to the principles of the Declaration of Helsinki and the guidelines for Good Clinical Practice. Peripheral blood was collected from 50 patients diagnosed with AD and 49 healthy control individuals. Total RNA was isolated with the RNeasy midi kit (Qiagen) according to the manufacturer's instructions. After RNA quality and quantity were checked with a spectrophotometer, GRIN2B expression levels were detected by quantitative real-time PCR (qRT-PCR). Statistical analyses were performed; the variance between the two groups was compared with the Mann-Whitney U test in the GraphPad InStat software with a 95% confidence interval and p < 0.05. After the statistical analyses, we determined that GRIN2B expression levels were downregulated in the AD patient group with respect to the control group, although the expression level of this gene in each group showed high variability. In this study, we determined that the expression level of the NMDA receptor coding gene GRIN2B was downregulated in AD patients when compared with healthy control individuals. According to our results, we speculate that the GRIN2B expression level is associated with AD, but it is necessary to validate these results with a bigger sample size.
Keywords: Alzheimer's disease, N-methyl-D-aspartate receptor, NR2B, GRIN2B, mRNA expression, RT-PCR
Procedia PDF Downloads 394
3615 Feasibility of Simulating External Vehicle Aerodynamics Using Spalart-Allmaras Turbulence Model with Adjoint Method in OpenFOAM and Fluent
Authors: Arpit Panwar, Arvind Deshpande
Abstract:
A study of external vehicle aerodynamics using the Spalart-Allmaras turbulence model with the adjoint method was conducted. The accessibility and ease of working with the Fluent module of ANSYS and with OpenFOAM were considered. The objective of the study was to understand and analyze the possibility of bringing high-level aerodynamic simulation to the average consumer vehicle. A form factor of the BMW M6 vehicle was designed in SolidWorks and analyzed in OpenFOAM and Fluent. The turbulence model, being a single-equation model, provides a much faster convergence rate when clubbed with the adjoint method. Fluent, being commercial software, still does not allow the Spalart-Allmaras turbulence model to be solved using the adjoint method; hence, the turbulence model was solved using the SIMPLE method in Fluent. OpenFOAM, being open source, provides flexibility in simulation but is not user-friendly; it does, however, support solving the defined turbulence model with the adjoint method. The results generated from the simulation give acceptable values of drag when validated against the percentage error in drag values for a notchback vehicle model from an extensive simulation produced at the 6th ANSA and μETA conference, Greece. The success of this approach will allow us to bring more aerodynamic vehicle body design to all segments of the automobile market, not limiting it to just high-end sports cars.
Keywords: Spalart-Allmaras turbulence model, OpenFOAM, adjoint method, SIMPLE method, vehicle aerodynamic design
Procedia PDF Downloads 200
3614 Count Data Regression Modeling: An Application to Spontaneous Abortion in India
Authors: Prashant Verma, Prafulla K. Swain, K. K. Singh, Mukti Khetan
Abstract:
Objective: In India, around 20,000 women die every year due to abortion-related complications. In the modelling of count variables, there is sometimes a preponderance of zero counts. This article concerns the estimation of various count regression models to predict the average number of spontaneous abortions among women in the Punjab state of India. It also assesses the factors associated with the number of spontaneous abortions. Materials and methods: The study included 27,173 married women of Punjab, with data obtained from the DLHS-4 survey (2012-13). Poisson regression (PR), negative binomial (NB) regression, zero-hurdle negative binomial (ZHNB), and zero-inflated negative binomial (ZINB) models were employed to predict the average number of spontaneous abortions and to identify the determinants affecting the number of spontaneous abortions. Results: Statistical comparisons among the four estimation methods revealed that the ZINB model provides the best prediction for the number of spontaneous abortions. Antenatal care (ANC) place, place of residence, total children born to a woman, and the woman's education and economic status were found to be the most significant factors affecting the occurrence of spontaneous abortion. Conclusions: The study offers a practical demonstration of techniques designed to handle count variables. Statistical comparisons among the four estimation models revealed that the ZINB model provided the best prediction for the number of spontaneous abortions and is recommended for predicting the number of spontaneous abortions. The study suggests that women should receive institutional antenatal care to attain limited parity. It also advocates promoting higher education among women in Punjab, India.
Keywords: count data, spontaneous abortion, Poisson model, negative binomial model, zero hurdle negative binomial, zero-inflated negative binomial, regression
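A minimal sketch of the headline comparison, using statsmodels' zero-inflated negative binomial on simulated stand-ins for the DLHS-4 covariates; lower AIC indicates the better-fitting model.

```python
# ZINB vs NB comparison on simulated excess-zero count data.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(3)
n = 5000
educ = rng.integers(0, 15, n)              # years of schooling (assumed)
parity = rng.integers(1, 8, n)             # children born (assumed)
X = sm.add_constant(np.column_stack([educ, parity]).astype(float))

# simulate counts with many structural zeros plus an NB component
mu = np.exp(-1.5 - 0.05 * educ + 0.15 * parity)
y = rng.negative_binomial(1, 1 / (1 + mu)) * (rng.random(n) > 0.6)

zinb = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X, p=2).fit(
    maxiter=500, disp=False)
nb = sm.NegativeBinomial(y, X).fit(disp=False)
print(f"AIC  ZINB: {zinb.aic:.1f}   NB: {nb.aic:.1f}")  # lower is better
```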
Procedia PDF Downloads 155
3613 Non-Dominated Sorting Genetic Algorithm (NSGA-II) for the Redistricting Problem in Mexico
Authors: Antonin Ponsich, Eric Alfredo Rincon Garcia, Roman Anselmo Mora Gutierrez, Miguel Angel Gutierrez Andrade, Sergio Gerardo De Los Cobos Silva, Pedro Lara Velázquez
Abstract:
The electoral zone design problem consists of redrawing the boundaries of legislative districts for electoral purposes in such a way that federal or state requirements are fulfilled. In Mexico, this process has historically been carried out by the National Electoral Institute (INE) by optimizing an integer nonlinear programming model in which population equality and compactness of the designed districts are considered as two conflicting objective functions, while contiguity is included as a hard constraint. The solution technique used by the INE is a Simulated Annealing (SA) based algorithm, which handles the multi-objective nature of the problem through an aggregation function. The present work represents the first attempt to apply a classical Multi-Objective Evolutionary Algorithm (MOEA), the second version of the Non-dominated Sorting Genetic Algorithm (NSGA-II), to this hard combinatorial problem. First results show that, when compared with the SA algorithm, NSGA-II obtains promising results. The MOEA manages to produce well-distributed solutions over a widespread front, even though convergence trouble on some instances remains an issue that should be corrected in future adaptations of MOEAs to the redistricting problem.
Keywords: multi-objective optimization, NSGA-II, redistricting, zone design problem
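The core of NSGA-II referenced above is the fast non-dominated sort; a minimal sketch on toy two-objective points (think population inequality vs. compactness penalty, both minimized) follows. Selection, crowding distance and the district-encoding genetic operators are omitted.

```python
# Fast non-dominated sort, the ranking step at the heart of NSGA-II.
def dominates(a, b):
    # a dominates b if it is no worse in all objectives, better in one
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(points):
    fronts, S, n = [[]], {}, {}
    for p in range(len(points)):
        S[p], n[p] = [], 0                 # S: who p dominates; n: rank count
        for q in range(len(points)):
            if dominates(points[p], points[q]):
                S[p].append(q)
            elif dominates(points[q], points[p]):
                n[p] += 1
        if n[p] == 0:
            fronts[0].append(p)
    i = 0
    while fronts[i]:
        nxt = []
        for p in fronts[i]:
            for q in S[p]:
                n[q] -= 1
                if n[q] == 0:
                    nxt.append(q)
        fronts.append(nxt)
        i += 1
    return fronts[:-1]

objs = [(0.1, 9.0), (0.4, 4.0), (0.9, 1.0), (0.5, 5.0), (0.2, 8.0)]
for rank, front in enumerate(fast_nondominated_sort(objs), 1):
    print(f"front {rank}: {[objs[i] for i in front]}")
```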
Procedia PDF Downloads 367
3612 The Impact of Human Rights Violation in Modern Society
Authors: Hanania Nasan Shokry Abdelmasih
Abstract:
The interface between development and human rights has long been the subject of scholarly debate. As a result, a set of principles, ranging from the right to development to a human rights-based approach to development, has been adopted to understand the dynamics between the two concepts. Despite these attempts, the precise link between development and human rights is not yet fully understood. However, the inevitable interdependence between the two concepts, and the idea that development efforts must be made while respecting human rights, have gained prominence in recent years. On the other hand, the emergence of sustainable development as a widely accepted approach in development goals and policies further complicates this unresolved convergence. The place of sustainable development in the human rights discourse and its role in ensuring the sustainability of development programs require systematic research. The purpose of this article is, therefore, to examine the relationship between development and human rights, with particular attention to the place of the principles of sustainable development in international human rights law. It will examine whether international human rights law recognizes a right to sustainable development. Hence, the article argues that the principles of sustainable development are recognized, directly or implicitly, in numerous human rights instruments, which is an affirmative answer to the question posed above. Therefore, this article scrutinizes international and regional human rights instruments, as well as the case law and interpretations of human rights bodies, to support this hypothesis.
Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers' rights, justice, security
Procedia PDF Downloads 28
3611 Bright Light Effects on the Concentration and Diffuse Attention Reaction Time, Tension, Anger, Fatigue and Alertness among Shift Workers
Authors: Mohammad Imani, Jabraeil Nasl Seraji, Abolfazl Zakerian
Abstract:
Background: Reaction time is the amount of time it takes to respond to a stimulus, i.e., the time that passes between the introduction of a stimulus and the subject's reaction to it. The aim of this interventional study is to evaluate the effects of bright light on concentration and diffuse attention reaction time, tension, anger, fatigue and alertness among shift workers. Several stimuli can reduce or increase reaction time; bright light, as one of the environmental factors, can reduce it. Material & Method: This cross-sectional descriptive study was conducted in 1391 on 88 subjects (44 fixed morning workers and 44 shift workers). At eight times over a 24 h period (13, 16, 19, 22, 1, 4, 7 and 10 o'clock), under ordinary light conditions, and after random selection of a calculated sample size, a concentration and diffuse attention (reaction time) test was performed. After the intervention, using bright light (4500 lux), the reaction time test was performed again. After analysis by the ELISA method, the obtained data were analyzed with the statistical software SPSS 19 using t-tests and ANOVA. Results: Between the average reaction times under ordinary light exposure in fixed morning workers and under bright light exposure in shift workers before the intervention, there was no significant relationship (95% CI, P > 0.05). After the intervention and the use of bright light (4500 lux), there was a significant relationship between the average concentration and diffuse attention reaction times under ordinary light exposure in fixed morning workers and under bright light exposure in shift workers (95% CI, P < 0.05). Conclusion: At some times of the 24 h period under ordinary light exposure, concentration and diffuse attention reaction time changed in shift workers. After the intervention, during bright light (4500 lux) exposure as a light shower, concentration and diffuse attention reaction time, tension, anger and fatigue decreased.
Keywords: bright light, reaction time, tension, anger, fatigue, alertness
Procedia PDF Downloads 385
3610 Variability of Energy Efficiency with the Application of Technologies Embedded in Locomotives of a Heavy Haul Railway: Case Study of Vitoria Minas Railway, Brazil
Authors: Eric Wilson Santos Cabral, Marta Monteiro Da Costa Cruz, Rodrigo Pirola Pestana, Vivian Andréa Parreira
Abstract:
In the transportation sector in Brazil, a great challenge is maintaining profit in the face of large variations in the price of diesel, which directly affect the variable cost of transport companies. Within the railways, a major challenge is to meet the annual budget for cargo and ore transported while reducing costs compared to previous years, becoming more efficient each year. In this scenario, railway companies are looking for effective measures to reduce the ratio of liters of diesel consumed per KTKB (gross ton kilometer multiplied by a thousand). This ratio is the energy efficiency indicator of several railroads in Brazil and in other countries. This study analyzes the behavior of the energy efficiency indicator in two parts: the first covers the application of technologies used in locomotives, such as the start-stop system for the diesel engine and the fuel tracking and monitoring system; the second covers the evaluation of the effect of variation in the type of cargo transported (loading mix). The part of the study focused on locomotive technology is carried out using statistical analysis and behavioral evaluation under different operating conditions, such as train maneuvers, service trains and freight trains. The analysis also covers the evaluation of the loading mix, using statistical analysis of the existing railroad database and comparing the energy efficiency per loading mine and type of product. With the completion of this study, railway undertakings should be able to better target decision-making in order to achieve substantial reductions in transport costs.
Keywords: railway transport, energy efficiency, railway technology, fuel consumption
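A worked example of the indicator, under the reading that KTKB denotes thousands of gross tonne-kilometers; the trip figures are invented for illustration.

```python
# Liters of diesel per KTKB (thousand gross tonne-kilometers).
def liters_per_ktkb(liters, gross_tonnes, km):
    ktkb = gross_tonnes * km / 1000.0      # thousand gross tonne-km
    return liters / ktkb

# a hypothetical heavy-haul trip: 15,000 t gross, 500 km, 15,000 L burned
print(f"{liters_per_ktkb(15000, 15000, 500):.2f} L/KTKB")
```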
Procedia PDF Downloads 304
3609 A Hierarchical Bayesian Calibration of Data-Driven Models for Composite Laminate Consolidation
Authors: Nikolaos Papadimas, Joanna Bennett, Amir Sakhaei, Timothy Dodwell
Abstract:
Composite modeling of consolidation processes plays an important role in process and part design by indicating the formation of possible unwanted defects prior to expensive experimental iterative trial-and-development programs. Composite materials in their uncured state display complex constitutive behavior, which has received much academic interest, with different models proposed. Errors from modeling and from statistical fitting will propagate through any simulation in which the material model is used. A general hyperelastic polynomial representation is proposed, which can be readily implemented in various nonlinear finite element packages; in our case, FEniCS was chosen. The coefficients are assumed uncertain, and therefore the distribution of parameters is learned using Markov Chain Monte Carlo (MCMC) methods. In engineering, the approach often followed is to select a single set of model parameters which, on average, best fits a set of experiments. There are good statistical reasons why this is not a rigorous approach to take. To overcome these challenges, a hierarchical Bayesian framework is proposed in which the population distribution of model parameters is inferred from an ensemble of experimental tests. The resulting sampled distribution of hyperparameters is approximated using maximum entropy methods so that the distribution can be readily sampled when embedded within a stochastic finite element simulation. The methodology is validated and demonstrated on a set of consolidation experiments on AS4/8852 with various stacking sequences. The resulting distributions are then applied to stochastic finite element simulations of the consolidation of curved parts, leading to a distribution of possible model outputs. With this, the paper, as far as the authors are aware, represents the first stochastic finite element implementation in composite process modelling.
Keywords: data-driven, material consolidation, stochastic finite elements, surrogate models
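A minimal hierarchical sketch of the idea, not the authors' hyperelastic model: each consolidation test has its own parameter drawn from a population distribution, and a Gibbs sampler learns the population mean and spread from all tests jointly. Data and priors are synthetic.

```python
# Normal-normal hierarchical model fitted by Gibbs sampling: theta_i is
# the per-experiment parameter, (mu, tau) the population hyperparameters.
import numpy as np

rng = np.random.default_rng(0)
true_mu, true_tau, sigma = 1.0, 0.3, 0.2
theta_true = rng.normal(true_mu, true_tau, 8)            # 8 experiments
data = [rng.normal(t, sigma, 20) for t in theta_true]    # 20 obs each

mu, tau2, draws = 0.0, 1.0, []
for it in range(4000):
    # conjugate update of each experiment's theta_i
    theta = np.array([
        rng.normal((y.sum() / sigma**2 + mu / tau2) /
                   (len(y) / sigma**2 + 1 / tau2),
                   np.sqrt(1 / (len(y) / sigma**2 + 1 / tau2)))
        for y in data])
    # conjugate updates of the population hyperparameters
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / len(theta)))
    tau2 = 1 / rng.gamma(1 + len(theta) / 2,
                         1 / (1 + 0.5 * ((theta - mu) ** 2).sum()))
    draws.append((mu, np.sqrt(tau2)))

mu_s, tau_s = np.array(draws[1000:]).T
print(f"population mean ~ {mu_s.mean():.2f}, spread ~ {tau_s.mean():.2f}")
```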
Procedia PDF Downloads 146
3608 A Web and Cloud-Based Measurement System Analysis Tool for the Automotive Industry
Authors: C. A. Barros, Ana P. Barroso
Abstract:
Any industrial company needs to determine the amount of variation that exists within its measurement process and guarantee the reliability of its data by studying the performance of its measurement system in terms of linearity, bias, repeatability, reproducibility and stability. This issue is critical for automotive industry suppliers, who are required to be certified under the IATF 16949:2016 standard (which replaces ISO/TS 16949) of the International Automotive Task Force, defining the requirements of a quality management system for companies in the automotive industry. Measurement System Analysis (MSA) is one of the mandatory tools. Frequently, the measurement system in companies is not connected to the equipment and does not incorporate the methods proposed by the Automotive Industry Action Group (AIAG). To address these constraints, an R&D project is in progress whose objective is to develop a web and cloud-based MSA tool. This MSA tool incorporates Industry 4.0 concepts, such as Internet of Things (IoT) protocols to assure the connection with the measuring equipment, cloud computing, artificial intelligence, statistical tools, and advanced mathematical algorithms. This paper presents the preliminary findings of the project. The web and cloud-based MSA tool is innovative because it implements all statistical tests proposed in the MSA-4 reference manual from the AIAG, as well as other emerging methods and techniques. As it is integrated with the measuring devices, it reduces the manual input of data and therefore the errors. The tool ensures the traceability of all performed tests and can be used in quality laboratories and on production lines. Besides, it monitors MSAs over time, allowing both the analysis of deviations from the variation of the measurements performed and the management of measurement equipment and calibrations. To develop the MSA tool, a ten-step approach was implemented. Firstly, a benchmarking analysis was performed of the current competitors and commercial solutions linked to MSA, concerning the Industry 4.0 paradigm. Next, an analysis of the size of the target market for the MSA tool was done. Afterwards, data flow and traceability requirements were analysed in order to implement an IoT data network that interconnects with the equipment, preferably wirelessly. The MSA web solution was designed under UI/UX principles, and an API in the Python language was developed to perform the algorithms and the statistical analysis. Continuous validation of the tool by companies is being performed to assure real-time management of "big data". The main results of this R&D project are: the MSA tool, web and cloud-based; the Python API; new algorithms for the market; and the UI/UX style guide of the tool. The proposed MSA tool adds value to the state of the art as it ensures an effective response to the new challenges of measurement systems, which are increasingly critical in production processes. Although the automotive industry has triggered the development of this innovative MSA tool, other industries would also benefit from it. Currently, companies from the molds and plastics, chemical and food industries are already validating it.
Keywords: automotive industry, industry 4.0, Internet of Things, IATF 16949:2016, measurement system analysis
Procedia PDF Downloads 214
3607 Evaluation of Nuts as a Source of Selenium in Diet
Authors: Renata Markiewicz-Żukowska, Patryk Nowakowski, Sylwia K. Naliwajko, Jakub M. Bołtryk, Katarzyna Socha, Anna Puścion-Jakubik, Jolanta Soroczyńska, Maria H. Borawska
Abstract:
Selenium (Se) is an essential element for human health. As an integral part of glutathione peroxidase, it has antioxidant, anti-inflammatory and anticancer activities. Unfortunately, Se dietary intake is often insufficient, especially in regions where the soil is low in Se. Therefore, in the search for good sources of Se, the content of this element in food products should be monitored. A food product can be considered a source of Se when its standard portion covers more than 15% of the recommended daily allowance. In the case of nuts, 42 g is recognized as the standard portion. The aim of this study was to determine the Se content in nuts and to answer the question of whether the studied nuts can be considered a source of Se in the diet. The material for the study consisted of 10 types of nuts (12 samples of each): almonds, Brazil nuts, cashews, hazelnuts, macadamia nuts, peanuts, pecans, pine nuts, pistachios and walnuts. The nuts were mineralized using a microwave technique (Berghof, Germany). The content of Se was determined by atomic absorption spectrometry with electrothermal atomization in a graphite tube with Zeeman background correction (Hitachi, Japan). The accuracy of the method was verified on certified reference material: Simulated Diet D. The statistical analysis was performed using Statistica v. 13.0 software, with statistical significance determined at the p < 0.05 level. The highest content of Se was found in Brazil nuts (4566.21 ± 3393.9 µg/kg) and the lowest in almonds (36.07 ± 18.8 µg/kg). A standard portion (42 g) of almonds, Brazil nuts, cashews, hazelnuts, macadamia nuts, peanuts, pecans, pine nuts, pistachios and walnuts covers the recommended daily allowance for Se in 2, 192, 28, 2, 16, 7, 4, 3, 12 and 6%, respectively. Brazil nuts, cashews and macadamia nuts can be considered good sources of Se in the diet.
Keywords: atomic absorption spectrometry, diet, nuts, selenium
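A worked check of the coverage arithmetic: percent of the recommended daily allowance supplied by a 42 g portion. The published percentages (e.g., 192% for Brazil nuts) are consistent with an RDA reference of about 100 µg, which is assumed here; it is not stated in the abstract.

```python
# Coverage = Se content per kg * portion mass / RDA * 100.
PORTION_KG, RDA_UG = 0.042, 100.0   # 42 g portion; assumed RDA in ug

se_content = {"Brazil nuts": 4566.21, "almonds": 36.07}  # ug Se per kg
for nut, ug_per_kg in se_content.items():
    portion_ug = ug_per_kg * PORTION_KG
    print(f"{nut}: {portion_ug:.1f} ug/portion = "
          f"{100 * portion_ug / RDA_UG:.0f}% of the RDA")
```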
Procedia PDF Downloads 185
3606 Statistical Optimization of Adsorption of a Harmful Dye from Aqueous Solution
Abstract:
Textile industries cater to varied customer preferences and contribute substantially to the economy. However, these industries also produce a considerable amount of effluents. Prominent among these are the azo dyes, which impart considerable color and toxicity even at low concentrations. Azo dyes are also used as coloring agents in the food and pharmaceutical industries. Despite their applications, azo dyes are notorious pollutants and carcinogens. Popular techniques like photo-degradation, biodegradation, and the use of oxidizing agents are not applicable to all kinds of dyes, as most dyes are stable to these techniques. Chemical coagulation produces a large amount of toxic sludge, which is undesirable, and is also ineffective towards a number of dyes. Most azo dyes are stable to UV-visible light irradiation and may even resist aerobic degradation. Adsorption has been the most preferred technique owing to its low cost, high capacity and process efficiency, and the possibility of regenerating and recycling the adsorbent. Adsorption is also preferred because it may produce a high quality of treated effluent and is able to remove different kinds of dyes. However, the adsorption process is influenced by many variables whose interdependence makes it difficult to identify optimum conditions. These variables include stirring speed, temperature, initial concentration, and adsorbent dosage. Further, the internal diffusional resistance inside the adsorbent particle leads to slow uptake of the solute within the adsorbent. Hence, it is necessary to identify optimum conditions that lead to a high capacity and uptake rate for these pollutants. In this work, commercially available activated carbon was chosen as the adsorbent owing to its high surface area. A typical azo dye found in textile effluent waters, the monoazo Acid Orange 10 dye (CAS: 1936-15-8), was chosen as the representative pollutant. Adsorption studies focused mainly on obtaining equilibrium and kinetic data for the batch adsorption process at different process conditions, namely different stirring speed, temperature, adsorbent dosage, and initial dye concentration settings. A full factorial design was the chosen statistical framework for carrying out the experiments and identifying the important factors and their interactions. The optimum conditions identified from the experimental model were validated with actual experiments at the recommended settings. The equilibrium and kinetic data obtained were fitted to different models and the model parameters were estimated, giving more detail about the nature of the adsorption taking place. Critical data required to design batch adsorption systems for the removal of Acid Orange 10 dye, and the identification of factors that critically influence the separation efficiency, are the key outcomes of this research. Keywords: acid orange 10, activated carbon, optimum adsorption conditions, statistical design
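A minimal sketch of how such a design can be generated for the four factors named in the abstract; the low/high levels below are hypothetical placeholders, since the abstract does not state the actual settings.

```python
# A 2^4 full factorial design over stirring speed, temperature, adsorbent
# dosage, and initial dye concentration. Levels are illustrative only.
from itertools import product

import pandas as pd

factors = {
    "stirring_speed_rpm": (200, 600),
    "temperature_C": (30, 50),
    "adsorbent_dosage_g_per_L": (0.5, 2.0),
    "initial_dye_conc_mg_per_L": (50, 200),
}

# The Cartesian product of the factor levels gives all 2^4 = 16 runs.
runs = pd.DataFrame(list(product(*factors.values())), columns=list(factors))
print(runs)  # in practice, replicate each run and randomize the run order
```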
Procedia PDF Downloads 169
3605 Mediation Analysis of the Efficacy of Nimotuzumab-Cisplatin-Radiation (NCR) in Improving Overall Survival (OS): An HPV-Negative Oropharyngeal Cancer Patient (HPVNOCP) Cohort
Authors: Akshay Patil
Abstract:
Objective: Mediation analysis identifies causal pathways by testing the relationships between the treatment (nimotuzumab-cisplatin-radiation, NCR), the outcome (OS), and an intermediate variable that mediates the relationship between them. Introduction: In randomized controlled trials, the primary interest is often in the mechanisms by which an intervention exerts its effects on the outcomes; clinicians want to know how the intervention works (or why it does not work) through hypothesized causal mechanisms. In this work, we highlight the value of understanding causal mechanisms by applying causal mediation analysis to a randomized trial in oncology. Methods: Data were obtained from a phase III randomized trial (subgroup of HPVNOCP). NCR is reported to significantly improve the OS of patients with locally advanced head and neck cancer undergoing definitive chemoradiation. Here, based on the trial data, the mediating effect of NCR on overall survival was systematically quantified through progression-free survival (PFS), disease-free survival (DFS), locoregional failure (LRF), disease control rate (DCR), and overall response rate (ORR). The effects of potential mediators on the HR for OS with NCR versus cisplatin-radiation (CR) were analyzed by Cox regression models. Statistical analyses were performed using R software version 3.6.3 (The R Foundation for Statistical Computing). Results: The potential mediator PFS explained part of the association between NCR treatment and OS, with an indirect effect (IE) of 0.76 (95% CI: 0.62–0.95), mediating 60.69% of the treatment effect. Taking baseline confounders into account, the overall adjusted hazard ratio of death was 0.64 (95% CI: 0.43–0.96; P=0.03). DFS was also a significant mediator, with an IE of 0.77 (95% CI: 0.62–0.93; 58% mediated). Smaller mediation effects (at most 27%) were observed for LRF, with an IE of 0.88 (95% CI: 0.74–1.06). DCR and ORR mediated 10% and 15%, respectively, of the effect of NCR vs. CR on OS, with IEs of 0.65 (95% CI: 0.81–1.08) and 0.94 (95% CI: 0.79–1.04). Conclusion: Our findings suggest that PFS and DFS were the most important mediators of the OS benefit of adding nimotuzumab to weekly cisplatin-radiation in HPVNOCP. Keywords: mediation analysis, cancer data, survival, NCR, HPV negative oropharyngeal
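As a back-of-the-envelope check, one common definition of the proportion mediated for survival outcomes works on the log-hazard-ratio scale, PM = log(IE) / log(TE); with the abstract's reported figures this gives roughly 61%, close to the stated 60.69%. The sketch below uses only the two reported hazard ratios and this definition, which may differ from the authors' exact estimator.

```python
# Proportion mediated on the log-hazard-ratio scale: PM = log(IE) / log(TE).
# Both hazard ratios are taken from the abstract.
import math

total_hr = 0.64     # adjusted total-effect HR of death, NCR vs. CR
indirect_hr = 0.76  # indirect effect (IE) through PFS

pm = math.log(indirect_hr) / math.log(total_hr)
print(f"Proportion mediated via PFS ~ {100 * pm:.1f}%")  # ~61%, close to the reported 60.69%
```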
Procedia PDF Downloads 145
3604 Choosing an Optimal Epsilon for Differentially Private Arrhythmia Analysis
Authors: Arin Ghazarian, Cyril Rakovski
Abstract:
Differential privacy has become the leading technique for protecting the privacy of individuals in a database while still allowing useful analysis to be done and the results to be shared. It puts a guarantee on the amount of privacy loss in the worst-case scenario. Differential privacy is not a toggle between full privacy and zero privacy: it controls the tradeoff between the accuracy of the results and the privacy loss using a single key parameter called epsilon (ε). Keywords: arrhythmia, cardiology, differential privacy, ECG, epsilon, medical data, privacy preserving analytics, statistical databases
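The role of epsilon can be illustrated with the Laplace mechanism, the canonical way to achieve ε-differential privacy for numeric queries. The sketch below releases the mean of a synthetic heart-rate table at several epsilon values; the data, bounds, and function names are illustrative, not taken from the paper.

```python
# Laplace mechanism: the noise scale is sensitivity / epsilon, so lowering
# epsilon buys stronger privacy at the cost of accuracy.
import numpy as np

rng = np.random.default_rng(0)
heart_rates = rng.normal(75, 12, size=1000).clip(40, 180)  # synthetic, bounded records

def private_mean(data, lo, hi, epsilon):
    """Release the mean under epsilon-differential privacy."""
    # Changing one record bounded in [lo, hi] moves the mean by at most this much.
    sensitivity = (hi - lo) / len(data)
    return data.mean() + rng.laplace(0.0, sensitivity / epsilon)

for eps in (0.01, 0.1, 1.0, 10.0):
    print(f"epsilon = {eps:>5}: private mean ~ {private_mean(heart_rates, 40, 180, eps):.2f}")
```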
Procedia PDF Downloads 152