Search results for: survival and hazard functions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3956

3896 Association of Genetically Proxied Cholesterol-Lowering Drug Targets and Head and Neck Cancer Survival: A Mendelian Randomization Analysis

Authors: Danni Cheng

Abstract:

Background: Preclinical and epidemiological studies have reported potential protective effects of low-density lipoprotein cholesterol (LDL-C) lowering drugs on head and neck squamous cell cancer (HNSCC) survival, but the evidence for causality is inconsistent. Genetic variants associated with LDL-C lowering drug targets can predict the effects of their therapeutic inhibition on disease outcomes. Objective: We aimed to evaluate the causal association of genetically proxied cholesterol-lowering drug targets and circulating lipid traits with cancer survival in HNSCC patients stratified by human papillomavirus (HPV) status, using two-sample Mendelian randomization (MR) analyses. Method: Single-nucleotide polymorphisms (SNPs) in the gene regions of LDL-C lowering drug targets (HMGCR, NPC1L1, CETP, PCSK9, and LDLR) associated with LDL-C levels in a genome-wide association study (GWAS) from the Global Lipids Genetics Consortium (GLGC) were used to proxy LDL-C lowering drug action. SNPs proxying circulating lipids (LDL-C, HDL-C, total cholesterol, triglycerides, apolipoprotein A and apolipoprotein B) were also derived from the GLGC data. Genetic associations of these SNPs with cancer survival were derived from 1,120 HPV-positive oropharyngeal squamous cell carcinoma (OPSCC) and 2,570 non-HPV-driven HNSCC patients in the VOYAGER program. We estimated the causal associations of LDL-C lowering drugs and circulating lipids with HNSCC survival using the inverse-variance weighted (IVW) method. Results: Genetically proxied HMGCR inhibition was significantly associated with worse overall survival (OS) in non-HPV-driven HNSCC patients (IVW hazard ratio (HR IVW), 2.64 [95% CI, 1.28-5.43]; P = 0.01) but better OS in HPV-positive OPSCC patients (HR IVW, 0.11 [95% CI, 0.02-0.56]; P = 0.01). Estimates for NPC1L1 were strongly associated with worse OS in both total HNSCC (HR IVW, 4.17 [95% CI, 1.06-16.36]; P = 0.04) and non-HPV-driven HNSCC patients (HR IVW, 7.33 [95% CI, 1.63-32.97]; P = 0.01). Similarly, genetically proxied PCSK9 inhibition was significantly associated with poor OS in non-HPV-driven HNSCC (HR IVW, 1.56 [95% CI, 1.02-2.39]). Conclusion: Genetically proxied long-term HMGCR inhibition was significantly associated with decreased OS in non-HPV-driven HNSCC and increased OS in HPV-positive OPSCC, while genetically proxied NPC1L1 and PCSK9 inhibition were associated with worse OS in total and non-HPV-driven HNSCC patients. Further research is needed to understand whether these drugs have consistent associations with head and neck tumor outcomes.
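The IVW estimator behind the hazard ratios above can be sketched in a few lines: each SNP contributes a Wald ratio (SNP-outcome effect divided by SNP-exposure effect), and the ratios are pooled with inverse-variance weights. The per-SNP numbers below are made up for illustration, not the GLGC/VOYAGER estimates.

```python
import math

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect inverse-variance weighted (IVW) MR estimate.

    Each SNP contributes a Wald ratio beta_outcome / beta_exposure,
    weighted by the inverse variance of that ratio (first-order
    approximation, ignoring uncertainty in beta_exposure)."""
    ratios = [bo / be for be, bo in zip(beta_exposure, beta_outcome)]
    weights = [(be / se) ** 2 for be, se in zip(beta_exposure, se_outcome)]
    est = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = 1.0 / math.sqrt(sum(weights))
    return est, se

# Illustrative per-SNP associations (hypothetical numbers):
beta_exp = [0.12, 0.08, 0.15]   # SNP -> LDL-C effect
beta_out = [0.10, 0.07, 0.14]   # SNP -> log hazard of death
se_out = [0.04, 0.03, 0.05]     # SEs of the outcome associations
log_hr, se_log_hr = ivw_estimate(beta_exp, beta_out, se_out)
hr = math.exp(log_hr)           # pooled hazard ratio per unit LDL-C
```

Exponentiating the pooled log-hazard estimate gives the HR IVW scale reported in the abstract.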

Keywords: Mendelian randomization analysis, head and neck cancer, cancer survival, cholesterol, statin

Procedia PDF Downloads 61
3895 Application of Failure Mode and Effects Analysis (FMEA) on the Virtual Process Hazard Analysis of Acetone Production Process

Authors: Princes Ann E. Prieto, Denise F. Alpuerto, John Rafael C. Unlayao, Neil Concibido, Monet Concepcion Maguyon-Detras

Abstract:

Failure Mode and Effects Analysis (FMEA) has been used in the virtual Process Hazard Analysis (PHA) of the Acetone production process through the dehydrogenation of isopropyl alcohol, for which very limited process risk assessment has been published. In this study, the potential failure modes, effects, and possible causes of selected major equipment in the process were identified. During the virtual FMEA mock sessions, the risks in the process were evaluated and recommendations to reduce and/or mitigate the process risks were formulated. The risk was estimated using the calculated risk priority number (RPN) and was classified into four (4) levels according to their effects on acetone production. Results of this study were also used to rank the criticality of equipment in the process based on the calculated criticality rating (CR). Bow tie diagrams were also created for the critical hazard scenarios identified in the study.
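The RPN logic used in FMEA is a product of three ratings. A minimal sketch follows; the 1-10 rating scales are the FMEA convention, but the four-level cutoffs and the example failure mode are placeholders, since the abstract does not report the thresholds used in the study.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of severity, occurrence and
    detection ratings, each on the conventional 1-10 scale."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings must lie in 1..10")
    return severity * occurrence * detection

def risk_level(value):
    """Map an RPN to one of four levels. The cutoffs are placeholders;
    the abstract does not state the thresholds used in the study."""
    if value >= 200:
        return "very high"
    if value >= 120:
        return "high"
    if value >= 60:
        return "medium"
    return "low"

# Hypothetical failure mode: reactor overpressure during dehydrogenation
score = rpn(severity=8, occurrence=4, detection=5)
level = risk_level(score)
```

Ranking equipment by criticality then amounts to sorting failure modes by RPN (or a criticality rating) from highest to lowest.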

Keywords: chemical process safety, failure mode and effects analysis (FMEA), process hazard analysis (PHA), process safety management (PSM)

Procedia PDF Downloads 93
3894 Consumer Health Risk Assessment from Some Heavy Metal Bioaccumulation in Common Carp (Cyprinus Carpio) from Lake Koka, Ethiopia

Authors: Mathewos Temesgen, Lemi Geleta

Abstract:

Lake Koka is one of the Ethiopian Central Rift Valley lakes; it receives domestic, agricultural, and industrial waste from the nearby industrial and agro-industrial activities. The aim of this research was to assess heavy metal bioaccumulation in the edible parts of common carp (Cyprinus carpio) in Lake Koka and the health risks associated with dietary intake of the fish. Three sampling sites were selected randomly for primary data collection. Physicochemical parameters (pH, Total Dissolved Solids, Dissolved Oxygen and Electrical Conductivity) were measured in-situ. Four heavy metals (Cd, Cr, Pb, and Zn) in water and their bioaccumulation in the edible parts of the fish were analyzed with flame atomic absorption spectrometry. The mean values of TDS, EC, DO and pH of the lake water were 458.1 mg/L, 905.7 µS/cm, 7.36 mg/L, and 7.9, respectively. The mean concentrations of Zn, Cr, and Cd in the edible part of fish were 0.18 mg/kg, ND-0.24 mg/kg, and ND-0.03 mg/kg, respectively; Pb was not detected. The amount of Cr in the examined fish muscle was above the level set by FAO, and the accumulation of the metals showed marked differences between sampling sites (p<0.05). The concentrations of Cd, Pb, and Zn were below the maximum permissible limits. The results also indicated that Cr has the highest transfer factor value and Zn the lowest. The carcinogenic hazard ratio values were below the threshold value (<1) for the edible parts of fish. The estimated weekly intake of heavy metals from fish muscles ranked as Cr>Zn>Cd, but the values were lower than the Reference Dose limits for these metals. The carcinogenic risk values indicated a low health risk due to the intake of individual metals from fish. Furthermore, the hazard index of the edible part of fish was less than unity. Generally, the water quality does not pose a risk to the survival and reproduction of fish, and the heavy metal contents in the edible parts of fish exhibited low carcinogenic risk through the food chain.
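The estimated weekly intake and hazard index quoted above follow standard risk-assessment arithmetic. The sketch below uses the mean muscle concentrations from the abstract, but the fish intake rate, body weight, and oral reference doses are illustrative assumptions (US-EPA-style RfD values), not the study's inputs.

```python
def estimated_weekly_intake(conc_mg_per_kg, intake_kg_per_week, bw_kg):
    """EWI of a metal (mg per kg body weight per week) from fish muscle."""
    return conc_mg_per_kg * intake_kg_per_week / bw_kg

def hazard_index(concs, rfd, daily_intake_kg=0.05, bw_kg=60.0):
    """Hazard index = sum of target hazard quotients, where each
    quotient is the estimated daily intake over the oral reference
    dose (RfD); HI < 1 suggests no appreciable non-carcinogenic risk."""
    hi = 0.0
    for metal, conc in concs.items():
        edi = conc * daily_intake_kg / bw_kg   # mg/kg bw/day
        hi += edi / rfd[metal]
    return hi

# Mean muscle concentrations from the abstract (mg/kg); intake rate,
# body weight and reference doses below are illustrative assumptions.
concs = {"Zn": 0.18, "Cr": 0.24, "Cd": 0.03}
rfd = {"Zn": 0.3, "Cr": 0.003, "Cd": 0.001}   # assumed oral RfDs, mg/kg/day
hi = hazard_index(concs, rfd)
ewi_cr = estimated_weekly_intake(0.24, 0.35, 60.0)
```

With these assumed inputs the hazard index stays well below unity, consistent with the abstract's conclusion.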

Keywords: bio-accumulation, cyprinus carpio, hazard index, heavy metals, Lake Koka

Procedia PDF Downloads 70
3893 The Impact of Technology on Computer Systems and Technology

Authors: Bishoy Abouelsoud Saad Amin

Abstract:

This paper examines computer use and its related health hazards among computer users in the South-Western zone of Nigeria. Two hundred and eighteen (218) computer users constituted the population used to evaluate the association between posture, extensive computer use and related health hazards. The instruments for the study were a questionnaire on demographics, lifestyle, body features and work ability index, while mean rating, standard deviation and t-test were used for data analysis. Identified health-related hazards include damage to the eyesight, bad posture, arthritis, musculoskeletal disorders, headache and stress. The results showed that factors such as work demand, posture, closeness to the computer screen and excessive working hours on computers constitute health hazards in both old and young computer users of either gender. It is therefore recommended that the total number of hours spent with the computer be monitored and controlled.

Keywords: computer-related health hazard, musculoskeletal disorders, computer usage, work ability index

Procedia PDF Downloads 11
3892 Utilization of Online Risk Mapping Techniques versus Desktop Geospatial Tools in Making Multi-Hazard Risk Maps for Italy

Authors: Seyed Vahid Kamal Alavi

Abstract:

Italy has experienced a notable number of disasters due to natural hazards and technological accidents affecting its physical, technological, and human/sociological infrastructures during the past decade. This study discusses the frequency and impacts of the three most devastating natural hazards in Italy for the period 2000-2013. The approach examines the reliability of a range of open source WebGIS techniques against a proposed multi-hazard risk management methodology. Spatial and attribute data, which include publicly available USGS hazard data and thirteen years of Munich RE loss records for Italy with different severities, have been processed and visualized in a GIS (Geographic Information System) framework. Comparison of results from the study showed that the multi-hazard risk maps generated using open source techniques do not provide a reliable system for analyzing infrastructure losses with respect to national risk sources, although they can be adopted for general international risk management purposes. Additionally, this study establishes the possibility to critically examine and calibrate different integrated techniques in evaluating what better protection measures can be taken in an area.

Keywords: multi-hazard risk mapping, risk management, GIS, Italy

Procedia PDF Downloads 331
3891 Seismic Microzonation of El-Fayoum New City, Egypt

Authors: Suzan Salem, Heba Moustafa, Abd El-Aziz Abd El-Aal

Abstract:

Seismic microzonation for urban areas is the first step towards a seismic risk analysis and mitigation strategy. Essential here is to obtain a proper understanding of the local subsurface conditions and to evaluate ground-shaking effects. In the present study, an attempt has been made to evaluate the seismic hazard considering local site effects by carrying out detailed geotechnical and geophysical site characterization in El-Fayoum New City. Seismic hazard analysis and microzonation of El-Fayoum New City are addressed in three parts: in the first part, estimation of seismic hazard is done using seismotectonic and geological information. The second part deals with site characterization using geotechnical and shallow geophysical techniques. In the last part, local site effects are assessed by carrying out one-dimensional (1-D) ground response analysis using the equivalent linear method with the program SHAKE2000. Finally, microzonation maps have been prepared. The detailed methodology, along with experimental details, collected data, results and maps, are presented in this paper.

Keywords: El-Fayoum, microzonation, seismotectonic, Egypt

Procedia PDF Downloads 345
3890 Prognostic Value of Tumor Markers in Younger Patients with Breast Cancer

Authors: Lola T. Alimkhodjaeva, Lola T. Zakirova, Soniya S. Ziyavidenova

Abstract:

Background: Breast cancer occupies the first place among cancers in women in the world. It is urgent today to study the role of molecular markers which are capable of predicting the dynamics and outcome of the disease. The aim of this study is to define the prognostic value of the content of estrogen receptor (ER), progesterone receptor (PgR), and amplification of HER-2/neu oncoprotein by studying 3- and 5-year overall and relapse-free survival in 470 patients with primary operable and 280 patients with locally-advanced breast cancer. Materials and methods: Results of 3- and 5-year overall and relapse-free survival, depending on the content of ER and PgR in primary operable patients (figures below give the 3-year rate with the 5-year rate in parentheses, for overall and relapse-free survival, respectively), showed that for ER-positive (+) and PgR-positive (+) tumors survival was 100 (96.2%) and 97.3 (94.6%); for ER-negative (-) and PgR-negative (-), 69.2 (60.3%) and 65.4 (57.7%); for ER-positive (+) and PgR-negative (-), 87.4 (80.1%) and 81.5 (79.3%); for ER-negative (-) and PgR-positive (+), 97.4 (93.4%) and 90.4 (88.5%), respectively. Survival results also depended on the level of HER-2/neu expression. In HER-2/neu-negative patients the survival rates were 98.6 (94.7%) and 96.2 (92.3%). In the group with HER-2/neu (2+) expression these figures were 45.3 (44.3%) and 45.1 (40.2%), and in the group with HER-2/neu (3+) expression, 41.2 (33.1%) and 34.3 (29.4%). For the combination of ER-negative (-), PgR-negative (-) and HER-2/neu-negative (-) they were 27.2 (25.4%) and 19.5 (15.3%), respectively. In patients with locally-advanced breast cancer the results of 3- and 5-year overall and relapse-free survival for ER (+) and PgR (+) were 76.3 (69.3%) and 62.2 (61.4%); for ER (-) and PgR (-), 29.1 (23.7%) and 18.3 (12.6%); for ER (+) and PgR (-), 61.2 (47.2%) and 39.4 (25.6%); for ER (-) and PgR (+), 54.3 (43.1%) and 41.3 (18.3%), respectively. The level of HER-2/neu expression also affected the survival results: in HER-2/neu-negative patients the survival rates were 74.1 (67.6%) and 65.1 (57.3%); with the level of expression (2+), 20.4 (14.2%) and 8.6 (6.4%); with the level of expression (3+), 6.2 (3.1%) and 1.2 (1.5%), respectively. For the ER-, PgR- and HER-2/neu-negative combination they were 22.1 (14.3%) and 8.4 (1.2%). Conclusion: Thus, the presence of steroid hormone receptors in breast tumor tissues in both primary operable and locally-advanced disease, as well as the lack of HER-2/neu oncoprotein, correlates with the highest rates of 3- and 5-year overall and relapse-free survival. The absence of steroid hormone receptors as well as HER-2/neu overexpression in malignant breast tissues significantly degrades the 3- and 5-year overall and relapse-free survival. Tumors negative for ER, PgR and HER-2/neu have the most unfavorable prognosis.

Keywords: breast cancer, estrogen receptor, oncoprotein, progesterone receptor

Procedia PDF Downloads 145
3889 Comparative Survival Rates of Yeasts during Freeze-Drying, Traditional Drying and Spray Drying

Authors: Latifa Hamoudi-Belarbi, L'Hadi Nouri, Khaled Belkacemi

Abstract:

The effect of three methods of drying (traditional drying, freeze-drying and spray-drying) on the survival of concentrated cultures of Geotrichum fragrans and Wickerhamomyces anomalus was studied. The survival of the yeast cultures was first compared immediately after freeze-drying using HES 12% (w/v) + Sucrose 7% (w/v) as protectant, traditional drying in dry rice cakes, and spray-drying with whey proteins. The survival of G. fragrans and W. anomalus was then studied during 4 months of storage at 4°C and 25°C, in the darkness, under vacuum and at 0% relative humidity. The results demonstrated that high survival was obtained using the traditional method of preservation in rice cakes (60% for G. fragrans and 65% for W. anomalus) and freeze-drying (68% for G. fragrans and 74% for W. anomalus). However, poor survival was obtained with the spray-drying method in whey protein (20% for G. fragrans and 29% for W. anomalus). During storage at 25°C, yeast cultures of G. fragrans and W. anomalus preserved by the traditional and freeze-drying methods showed no significant loss of viable cells up to 3 months of storage. Spray-dried yeast cultures had the greatest loss of viable count during the 4 months of storage at 25°C. During storage at 4°C, preservation of yeast cultures using the traditional method provided better survival than freeze-drying. This study demonstrated the effectiveness of the traditional method to preserve yeast cultures compared to the high-cost methods like freeze-drying and spray-drying.

Keywords: freeze-drying, traditional drying, spray drying, yeasts

Procedia PDF Downloads 446
3888 Comparison of Parametric and Bayesian Survival Regression Models in Simulated and HIV Patient Antiretroviral Therapy Data: Case Study of Alamata Hospital, North Ethiopia

Authors: Zeytu G. Asfaw, Serkalem K. Abrha, Demisew G. Degefu

Abstract:

Background: HIV/AIDS remains a major public health problem in Ethiopia, heavily affecting people of productive and reproductive age. We aimed to compare the performance of parametric and Bayesian survival analysis using simulations and a real dataset application focused on determining predictors of HIV patient survival. Methods: Parametric survival models with Exponential, Weibull, Log-normal, Log-logistic, Gompertz and Generalized gamma distributions were considered. A simulation study was carried out under two prior specifications: informative and noninformative priors. A retrospective cohort study was implemented for HIV-infected patients under Highly Active Antiretroviral Therapy in Alamata General Hospital, North Ethiopia. Results: A total of 320 HIV patients were included in the study, of whom 52.19% were female and 47.81% male. According to Kaplan-Meier survival estimates for the two sex groups, females showed better survival time than their male counterparts. The median survival time of HIV patients was 79 months. During the follow-up period, 89 (27.81%) deaths and 231 (72.19%) censored individuals were registered. The average baseline cluster of differentiation 4 (CD4) cell count for HIV/AIDS patients was 126.01, but after a three-year antiretroviral therapy follow-up the average CD4 cell count was 305.74, which was quite encouraging. Age, functional status, tuberculosis screen, past opportunistic infection, baseline CD4 cell count, World Health Organization clinical stage, sex, marital status, employment status, occupation type and baseline weight were found to be statistically significant factors for longer survival of HIV patients. The standard error of every covariate in the Bayesian log-normal survival model was less than in the classical one. Hence, Bayesian survival analysis showed better performance than classical parametric survival analysis when subjective data analysis was performed by considering expert opinions and historical knowledge about the parameters. Conclusions: HIV/AIDS patient mortality could be reduced through timely antiretroviral therapy with special attention to the potential factors. Moreover, the Bayesian log-normal survival model was preferable to the classical log-normal survival model for determining predictors of HIV patient survival.
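The Kaplan-Meier estimates cited above come from the product-limit computation, which is compact enough to sketch directly. The follow-up data below are made up for illustration, not the Alamata cohort.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.

    times: follow-up times; events: 1 for death, 0 for censoring.
    Returns a list of (time, S(time)) at each distinct death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed   # deaths and censorings leave the risk set
    return curve

# Toy follow-up times in months (hypothetical data)
times = [6, 12, 12, 20, 33, 40, 79, 90]
events = [1, 1, 0, 1, 0, 1, 1, 0]
km = kaplan_meier(times, events)
```

Plotting the curve per sex group gives the comparison of survival times reported in the abstract.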

Keywords: antiretroviral therapy (ART), Bayesian analysis, HIV, log-normal, parametric survival models

Procedia PDF Downloads 149
3887 Fuzzy Control and Pertinence Functions

Authors: Luiz F. J. Maia

Abstract:

This paper presents an approach to fuzzy control, with the use of new pertinence (membership) functions, applied to the case of an inverted pendulum. Appropriate definitions of the pertinence functions of the fuzzy sets make it possible to implement the controller with only one control rule, resulting in a smooth control surface. The fuzzy control system can be implemented with analog devices, affording true real-time performance.
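The paper's single-rule formulation relies on its specific pertinence functions, which the abstract does not reproduce. As a point of reference, here is a conventional zero-order Sugeno sketch with triangular memberships, which likewise yields a smooth control surface; the gains and breakpoints are arbitrary.

```python
def tri(x, a, b, c):
    """Triangular pertinence (membership) function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_force(theta, omega):
    """Toy zero-order Sugeno controller for an inverted pendulum.

    Three rules of the form 'if error is NEG/ZERO/POS then force u_i',
    combined by a weighted average of rule strengths."""
    e = theta + 0.1 * omega                      # combined error signal
    mfs = [(-1.0, -0.5, 0.0), (-0.5, 0.0, 0.5), (0.0, 0.5, 1.0)]
    u = [-10.0, 0.0, 10.0]                       # rule consequents (N)
    w = [tri(e, *m) for m in mfs]
    if sum(w) == 0.0:                            # saturate outside support
        return 10.0 if e > 0 else -10.0
    return sum(wi * ui for wi, ui in zip(w, u)) / sum(w)
```

The weighted-average defuzzification is what makes the force vary smoothly with the angle and angular velocity.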

Keywords: control surface, fuzzy control, inverted pendulum, pertinence functions

Procedia PDF Downloads 405
3886 Flood Hazard and Risk Mapping to Assess Ice-Jam Flood Mitigation Measures

Authors: Karl-Erich Lindenschmidt, Apurba Das, Joel Trudell, Keanne Russell

Abstract:

In this presentation, we explore options for mitigating ice-jam flooding along the Athabasca River in western Canada. We consider not only flood hazard, expressed in this case as the probability of flood depths and extents being exceeded, but also flood risk, expressed as annual expected damages. Calculating flood risk allows a cost-benefit analysis to be made, so that decisions on the best mitigation options are not based solely on flood hazard but also on the costs related to flood damages and the benefits of mitigation. A river ice model is used to simulate extreme ice-jam flood events, with which scenarios are run to determine flood exposure and damages in flood-prone areas along the river. We concentrate on four mitigation options: the placement of a dike, artificial breakage of the ice cover along the river, the installation of an ice-control structure, and the construction of a reservoir. However, no mitigation option is totally failsafe. For example, dikes can still be overtopped and breached, and ice jams may still occur in areas of the river where ice covers have been artificially broken up. Hence, for all options, it is recommended that zoning of building developments away from greater flood hazard areas be upheld. Flood mitigation can have the negative effect of giving inhabitants a false sense of security that flooding may not happen again, leading to zoning policies being relaxed. (Text adapted from Lindenschmidt [2022] "Ice Destabilization Study - Phase 2", submitted to the Regional Municipality of Wood Buffalo, Alberta, Canada)
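Annual expected damage, the flood-risk measure described above, is the integral of damage over annual exceedance probability. A minimal trapezoidal sketch follows; the damage figures and return periods are hypothetical, not the Athabasca study's results.

```python
def expected_annual_damage(events):
    """Flood risk as expected annual damage: trapezoidal integration of
    damage over annual exceedance probability.

    events: list of (annual exceedance probability, damage) pairs."""
    ev = sorted(events, reverse=True)            # frequent -> rare
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(ev, ev[1:]):
        ead += 0.5 * (d1 + d2) * (p1 - p2)
    return ead

# Hypothetical ice-jam damage estimates by return period (CAD)
events = [(0.5, 0.0),        # 2-year flood: no damage
          (0.1, 2.0e6),      # 10-year flood
          (0.01, 2.0e7)]     # 100-year flood
ead = expected_annual_damage(events)
```

Comparing the EAD with and without a mitigation option, against its cost, is the cost-benefit comparison the abstract describes.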

Keywords: ice jam, flood hazard, river ice modelling, flood risk

Procedia PDF Downloads 131
3885 A New Approach for Generalized First Derivative of Nonsmooth Functions Using Optimization

Authors: Mohammad Mehdi Mazarei, Ali Asghar Behroozpoor

Abstract:

In this paper, we define an optimization problem corresponding to smooth and nonsmooth functions whose optimal solution is the first derivative of these functions on a domain. For this purpose, a linear programming problem corresponding to the optimization problem is obtained. The optimal solution of this linear programming problem is the approximate generalized first derivative. In effect, we approximate the generalized first derivative of nonsmooth functions as a Taylor series. We show the efficiency of our approach on some smooth and nonsmooth functions in several examples.
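The abstract does not spell out the linear program, so the following is only one plausible instantiation: fit the best slope a to sampled differences f(x0+h) - f(x0) in the L1 sense, min_a Σ|d_i - a·h_i|. That is a linear program, and its optimum reduces to a weighted median of the difference quotients, which the sketch takes directly.

```python
def l1_slope(samples):
    """Best L1-fit slope a for samples [(h, f(x0+h) - f(x0))], h != 0.

    Minimizing sum_i |d_i - a*h_i| is a linear program; its optimum is
    a weighted median of the difference quotients d_i / h_i with
    weights |h_i|."""
    pts = sorted((d / h, abs(h)) for h, d in samples)
    total = sum(w for _, w in pts)
    acc = 0.0
    for slope, w in pts:
        acc += w
        if acc >= total / 2.0:   # weighted median reached
            return slope

def generalized_derivative(f, x0, n=8, h0=1e-3):
    """Approximate generalized first derivative of f at x0."""
    hs = [h0 * k for k in range(1, n + 1)]
    hs += [-h for h in hs]
    return l1_slope([(h, f(x0 + h) - f(x0)) for h in hs])

d_smooth = generalized_derivative(lambda x: x * x, 1.0)  # close to 2
d_kink = generalized_derivative(abs, 0.0)                # a subgradient in [-1, 1]
```

For smooth functions the result matches the classical derivative; at a kink it returns a value from the generalized (subdifferential) set.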

Keywords: general derivative, linear programming, optimization problem, smooth and nonsmooth functions

Procedia PDF Downloads 514
3884 Songs from the Cradle: An Analysis of Some Selected Nupe Songs

Authors: Zainab Zendana Shafii

Abstract:

Lullabies have been broadly defined as songs that are sung to calm and soothe children. While this is true, this paper intends to show that lullabies exceed these functions. The paper, in exploring Nupe lullabies, examines the various functions that lullabies perform in terms of language development, cultural enrichment and also the retelling of history as it relates to the culture of the Nupe people of northern Nigeria. The theoretical framework used is the functionalist theory. This theory postulates that all cultural or social phenomena have a positive function and that all are indispensable. The functionalist theory is based on the premise that all aspects of a society—institutions, roles, norms, etc.—serve a purpose and that all are indispensable for the long-term survival of the society. To this end, this paper dissects the various lullabies in Nupeland with a view to exploring the meaning that these songs generate and why they are even sung at all. The qualitative research methodology has been used to gather materials.

Keywords: Nupe, lullabies, Nigeria, northern

Procedia PDF Downloads 143
3883 Generalization of Tsallis Entropy from a Q-Deformed Arithmetic

Authors: J. Juan Peña, J. Morales, J. García-Ravelo, J. García-Martínes

Abstract:

It is known that by introducing alternative forms of exponential and logarithmic functions, the Tsallis entropy Sq is itself a generalization of Shannon entropy S. In this work, from a deformation through a scaling function applied to the differential operator, it is possible to generate a q-deformed calculus as well as a q-deformed arithmetic, which not only allows generalizing the exponential and logarithmic functions but also any other standard function. The updated q-deformed differential operator leads to an updated integral operator under which the functions are integrated together with a weight function. For each differentiable function, it is possible to identify its q-deformed partner, which is useful to generalize other algebraic relations proper of the original functions. As an application of this proposal, in this work, a generalization of exponential and logarithmic functions is studied in such a way that their relationship with the thermodynamic functions, particularly the entropy, allows us to have a q-deformed expression of these. As a result, from a particular scaling function applied to the differential operator, a q-deformed arithmetic is obtained, leading to the generalization of the Tsallis entropy.
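The q-deformed logarithm and the entropy it generates can be sketched directly; note the identity S_q = Σ p_i ln_q(1/p_i), which recovers Shannon entropy as q → 1. The probability vector below is an arbitrary example.

```python
import math

def q_log(x, q):
    """q-deformed logarithm ln_q(x); recovers ln(x) as q -> 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i^q) / (q - 1), with the Shannon limit at q = 1.
    Equivalently, S_q = sum_i p_i * ln_q(1 / p_i)."""
    if q == 1.0:
        return -sum(pi * math.log(pi) for pi in p if pi > 0.0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
s_shannon = tsallis_entropy(p, 1.0)     # 1.5 * ln 2 nats
s_q = tsallis_entropy(p, 1.001)         # approaches the Shannon value
```

The same deformation pattern is what generalizes the exponential and, through it, the other thermodynamic functions discussed in the abstract.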

Keywords: q-calculus, q-deformed arithmetic, entropy, exponential functions, thermodynamic functions

Procedia PDF Downloads 15
3882 Component Level Flood Vulnerability Framework for the United Kingdom

Authors: Mohammad Shoraka, Francesco Preti, Karen Angeles, Raulina Wojtkiewicz, Karthik Ramanathan

Abstract:

Catastrophe modeling has evolved significantly over the last four decades. Verisk introduced its pioneering comprehensive inland flood model tailored for the U.K. in 2008. Over the course of the last 15 years, Verisk has built a suite of physically driven flood models for several countries and regions across the globe. This paper aims to spotlight a selection of these advancements, tailored to the development of vulnerability estimation, which forms an integral part of a forthcoming update to Verisk’s U.K. inland flood model. Vulnerability functions are critical to evaluating and robustly modeling flood-induced damage to buildings and contents. The subsequent damage assessments then allow for direct quantification of losses for entire building portfolios. Notably, today’s flood loss models often prioritize enhanced development of hazard characterization, while vulnerability functions lack sufficient granularity for a robust assessment. This study proposes a novel, engineering-driven, physically based component-level flood vulnerability framework for the U.K. Various aspects of the framework, including component classification and comprehensive cost analysis, meticulously tailored to capture the distinct building characteristics unique to the U.K., will be discussed. This analysis will elucidate how the cost distribution across individual components contributes to translating component-level damage functions into building-level damage functions. Furthermore, a succinct overview of essential datasets employed to gauge regional building vulnerability will be highlighted.
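The roll-up from component-level to building-level damage functions described above amounts to a cost-weighted sum of component fragilities. Everything numeric below (cost shares, fragility shapes, component names) is a hypothetical illustration, not Verisk's data.

```python
def step_fn(threshold_m, ramp_m):
    """Component fragility: damage fraction rises linearly with depth
    once a threshold is exceeded, capped at total loss (1.0)."""
    return lambda depth: max(0.0, min(1.0, (depth - threshold_m) / ramp_m))

def building_damage_ratio(depth_m, components):
    """Cost-weighted roll-up of component damage functions into a
    building-level damage ratio; cost shares should sum to 1."""
    return sum(share * fragility(depth_m) for share, fragility in components)

# Hypothetical dwelling: cost shares and fragility parameters are
# illustrative assumptions only.
components = [
    (0.30, step_fn(0.0, 1.0)),   # ground-floor finishes and contents
    (0.20, step_fn(0.1, 0.5)),   # services (wiring, boiler)
    (0.50, step_fn(0.5, 2.0)),   # structure and fabric
]
ratio = building_damage_ratio(0.6, components)
```

Evaluating the roll-up over a range of depths yields the building-level damage function used for portfolio loss quantification.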

Keywords: catastrophe modeling, inland flood, vulnerability, cost analysis

Procedia PDF Downloads 24
3881 Dividend Policy, Overconfidence and Moral Hazard

Authors: Richard Fairchild, Abdullah Al-Ghazali, Yilmaz Guney

Abstract:

This study analyses the relationship between managerial overconfidence, dividends, and firm value by developing theoretical models that examine the conditions under which the effects of managerial overconfidence on dividends and firm value may be positive or negative. Furthermore, the models incorporate moral hazard, in terms of managerial effort shirking, and the potential for the manager to choose negative-NPV projects due to private benefits. Our models demonstrate that overconfidence can lead to higher dividends (when the manager is overconfident about his current ability) or lower dividends (when the manager is overconfident about his future ability). The models also demonstrate that higher overconfidence may result in an increase or a decrease in firm value. Numerical examples are illustrated for both models, which interestingly support the models’ propositions.

Keywords: behavioural corporate finance, dividend policy, overconfidence, moral hazard

Procedia PDF Downloads 296
3880 Survival and Growth Factors of Korean Start-Ups: Focusing on the Industrial Characteristics

Authors: Hanei Son

Abstract:

Since the beginning of the 2010s, a 'start-up boom' has continued with the creation of many new enterprises in Korea. This tendency was driven by various changes in society, such as the emergence and diffusion of smartphones. In particular, the Korean government has been interested in start-ups and entrepreneurship as an alternative engine for Korea's economic growth. With strong support from the government, many new enterprises have been established in recent years, and the Korean government seems to have achieved its goal of expanding the base of start-ups. However, it is unclear which factors affect the survival and growth of these new enterprises after their creation. Therefore, this study aims to identify which start-ups from the early 2010s survived and which factors influenced their survival and growth. The study focuses strongly on the industries the new enterprises were in, as environmental elements are expected to be critical factors for the business of start-ups in the Korean context. For this purpose, 105 companies which were introduced as high-potential start-ups from 2010 to 2012 were considered in the analysis. According to their current status, dead or alive, the start-ups were categorized by their industries and service areas. Through this analysis, it was observed that many start-ups that are still in business are in internet or mobile platform businesses across four major sectors. In each group, a representative case has been studied to reveal its survival and growth factors. The results point to the importance of industrial characteristics for the survival and success of Korean start-ups and offer policy implications as to which sectors and businesses hold more potential for start-ups in Korea.

Keywords: government support for start-ups, industrial characteristics, Korean start-ups, survival of start-ups

Procedia PDF Downloads 152
3879 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented in earlier work respond to data mixed with noise and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original papers, we add noise to the data and observe how the model response changes. Moreover, we generate data from the published models and analyse the effect of noise. From this, we can assess the robustness of each model and its applicability in Korea.

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 691
3878 Analysis of Rockfall Hazard along Himalayan Road Cut Slopes

Authors: Sarada Prasad Pradhan, Vikram Vishal, Tariq Siddique

Abstract:

With a vast area of India comprising hilly terrain and road cut slopes, landslides and rockfalls are a common phenomenon. However, while landslide studies have received much attention in India in the past, very little literature and analysis are available regarding the rockfall hazard of many rockfall-prone areas, specifically in the Uttarakhand Himalaya, India. The consequent lack of knowledge and understanding of the rockfall phenomenon, as well as frequent incidences of rockfall-related fatalities, urge the necessity of conducting site-specific rockfall studies to highlight the importance of addressing this issue and to provide data for the safe design of preventive structures. The present study has been conducted across 10 rockfall-prone road cut slopes over a distance of 15 km starting from Devprayag, India along National Highway 58 (NH-58). In order to make a qualitative assessment of the rockfall hazard posed by these slopes, Rockfall Hazard Rating using standards for Indian rockmass has been conducted at 10 locations under different slope conditions. Moreover, to accurately predict the characteristics of possible rockfall events, numerical simulation was carried out to calculate the maximum bounce heights, total kinetic energies, translational velocities and trajectories of the falling rockmass blocks when simulated on each of these slopes according to real-life conditions. As it was observed that slope geometry had a more critical impact on rockfall hazard than the size of rock masses, several optimizations have been suggested for each slope regarding the location of barriers and modification of slope geometry in order to minimize damage by falling rocks. This study can be extremely useful in emphasizing the significance of rockfall studies and the construction of mitigative barriers and structures along NH-58 around Devprayag.
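As a toy version of the quantities such simulations track (bounce heights and impact kinetic energies), consider a block falling and rebounding with a normal coefficient of restitution; the mass, drop height and restitution below are arbitrary assumptions, far simpler than a trajectory simulation on a real slope profile.

```python
import math

def bounce_profile(drop_height_m, mass_kg=250.0, restitution=0.35, n=4, g=9.81):
    """Successive rebound heights (m) and impact kinetic energies (J)
    of a free-falling block, using a normal coefficient of restitution."""
    heights, energies = [], []
    h = drop_height_m
    for _ in range(n):
        v_impact = math.sqrt(2.0 * g * h)
        energies.append(0.5 * mass_kg * v_impact ** 2)   # equals m*g*h
        v_rebound = restitution * v_impact
        h = v_rebound ** 2 / (2.0 * g)                   # next rebound height
        heights.append(h)
    return heights, energies

heights, energies = bounce_profile(10.0)
```

Barrier placement and height design then follow from the envelope of bounce heights and kinetic energies along the slope.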

Keywords: rockfall, slope stability, rockmass, hazard

Procedia PDF Downloads 172
3877 An Extension of the Generalized Extreme Value Distribution

Authors: Serge Provost, Abdous Saboor

Abstract:

A q-analogue of the generalized extreme value distribution which includes the Gumbel distribution is introduced. The additional parameter q allows for increased modeling flexibility. The resulting distribution can have a finite, semi-infinite or infinite support. It can also produce several types of hazard rate functions. The model parameters are determined by making use of the method of maximum likelihood. It will be shown that it compares favourably to three related distributions in connection with the modeling of a certain hydrological data set.
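The abstract notes that the q-extended family can produce several types of hazard rate functions. As a baseline for comparison, a sketch of the hazard rate h(x) = f(x)/S(x) for the plain Gumbel distribution (the special case the family includes; parameter values here are illustrative) can be written directly:

```python
import math

def gumbel_pdf(x, mu=0.0, beta=1.0):
    """Density of the Gumbel (maximum) distribution."""
    z = (x - mu) / beta
    return math.exp(-z - math.exp(-z)) / beta

def gumbel_cdf(x, mu=0.0, beta=1.0):
    """CDF of the Gumbel (maximum) distribution."""
    return math.exp(-math.exp(-(x - mu) / beta))

def gumbel_hazard(x, mu=0.0, beta=1.0):
    """Hazard rate h(x) = f(x) / S(x), with S = 1 - F."""
    return gumbel_pdf(x, mu, beta) / (1.0 - gumbel_cdf(x, mu, beta))
```

For the Gumbel case the hazard is monotonically increasing toward the asymptote 1/beta; the appeal of the q-analogue is precisely that it can escape this single shape.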

Keywords: extreme value theory, generalized extreme value distribution, goodness-of-fit statistics, Gumbel distribution

Procedia PDF Downloads 301
3876 Communication of Expected Survival Time to Cancer Patients: How It Is Done and How It Should Be Done

Authors: Geir Kirkebøen

Abstract:

Most patients with serious diagnoses want to know their prognosis, in particular their expected survival time. As part of the informed consent process, physicians are legally obligated to communicate such information to patients. However, there is no established (evidence-based) ‘best practice’ for how to do this. The two questions explored in this study are: how do physicians communicate expected survival time to patients, and how should it be done? We explored the first, descriptive question in a study with Norwegian oncologists as participants. The study had a scenario part and a survey part. In the scenario part, the doctors were asked to imagine that a patient, recently diagnosed with a serious cancer, had asked them: ‘How long can I expect to live with such a diagnosis? I want an honest answer from you!’ The doctors were to assume that the diagnosis was certain and that, from an extensive recent study, they had optimal statistical knowledge, described in detail as a right-skewed survival curve, of how long patients with this diagnosis could be expected to live. The main finding was that very few of the oncologists would explain to the patient the variation in survival time as described by the survival curve. The majority would not give the patient an answer at all. Of those who gave an answer, the typical reply was that survival time varies a lot, that it is hard to say in a specific case, that they would come back to it later, etc. The survey part of the study clearly indicates that the main reason the oncologists would not deliver the mortality prognosis was discomfort with its uncertainty. The scenario part confirmed this finding: the majority of the oncologists explicitly cited the uncertainty, the variation in survival time, as a reason not to give the patient an answer. Many studies show that patients want realistic information about their mortality prognosis and that they should be given hope.
The question then is how to communicate the uncertainty of the prognosis in a realistic and optimistic, hopeful, way. Based on psychological research, our hypothesis is that the best way to do this is to explicitly describe the variation in survival time, the (usually) right-skewed survival curve of the prognosis, and to emphasize to the patient the (small) possibility of being a ‘lucky outlier’. We tested this hypothesis in two scenario studies with lay people as participants. The data clearly show that people prefer to receive expected survival time as a median value together with explicit information about the survival curve’s right skewness (e.g., concrete examples of ‘positive outliers’), and that communicating expected survival time this way not only provides people with hope but also gives them a more realistic understanding compared with the typical way expected survival time is communicated. Our data indicate that it is not the existence of uncertainty in the mortality prognosis that is the problem for patients, but how this uncertainty is, or is not, communicated and explained.
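The ‘median plus lucky outlier’ framing can be made concrete. A minimal sketch, assuming a lognormal survival model purely for illustration (the distribution and parameter values are my assumptions, not from the study), computes the median survival and the probability of surviving several times longer than it:

```python
import math

def lognormal_survival(t, mu, sigma):
    """S(t) = P(T > t) for a lognormal survival time (right-skewed)."""
    z = (math.log(t) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def outlier_message(mu, sigma, multiple=3.0):
    """Median survival plus the chance of living `multiple` times the median,
    i.e. the quantitative content of a 'lucky outlier' statement."""
    median = math.exp(mu)  # median of a lognormal is exp(mu)
    p_outlier = lognormal_survival(multiple * median, mu, sigma)
    return median, p_outlier
```

With a 12-month median and sigma = 1, roughly one patient in seven outlives three times the median, which is the kind of concrete, hopeful yet realistic figure the study argues for.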

Keywords: cancer patients, decision psychology, doctor-patient communication, mortality prognosis

Procedia PDF Downloads 288
3875 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications

Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino

Abstract:

The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to progressively more in-depth studies of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. In fact, vulnerability models can be integrated with the wind hazard, which consists of associating a probability with each intensity level in a time interval (e.g., by means of return periods), to provide an assessment of future losses due to extreme wind. This has also given impetus to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given the wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires logical combination rules that define the building’s damage state given the damage states of its components, as well as a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure’s behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II.
ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on in-built or user-defined wind hazard data. This software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely: the roof covering, roof structure, envelope wall and envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations that are comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach is shown to lie in the fact that a database of building component fragility curves can be put to use for the development of new wind vulnerability models to cover building typologies not yet adequately covered by existing works and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
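One of the simplest combination rules of the kind described, assuming independence between component failures and lognormal fragility curves (both are common simplifying assumptions; the component names and parameters below are illustrative, not ERMESS internals), can be sketched as:

```python
import math

def lognormal_fragility(v, median, beta):
    """P(component reaches its damage state | gust speed v),
    using a lognormal fragility curve with the given median and
    logarithmic standard deviation beta."""
    return 0.5 * math.erfc(-math.log(v / median) / (beta * math.sqrt(2.0)))

def building_damage_prob(v, components):
    """P(at least one component is damaged at gust speed v),
    assuming statistical independence between components --
    a simplifying combination rule, not a general result."""
    p_none = 1.0
    for median, beta in components:
        p_none *= 1.0 - lognormal_fragility(v, median, beta)
    return 1.0 - p_none

# Illustrative components: roof covering, roof structure,
# envelope wall, envelope openings (median gust speed m/s, beta).
COMPONENTS = [(40.0, 0.30), (55.0, 0.25), (60.0, 0.30), (45.0, 0.35)]
```

The building-level probability is bounded below by the most fragile component and above by the sum of the component probabilities, which gives a quick sanity check on any more refined combination rule.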

Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses

Procedia PDF Downloads 150
3874 Deep Learning Approach for Chronic Kidney Disease Complications

Authors: Mario Isaza-Ruget, Claudia C. Colmenares-Mejia, Nancy Yomayusa, Camilo A. González, Andres Cely, Jossie Murcia

Abstract:

Quantification of the risks associated with the development of complications from chronic kidney disease (CKD) through accurate survival models can help with patient management. A retrospective cohort study was carried out that included patients diagnosed with CKD in a primary care program and followed up between 2013 and 2018. Time-dependent and static covariates associated with demographic, clinical, and laboratory factors were included. Deep learning (DL) survival analyses were developed for three CKD outcomes: CKD stage progression, >25% decrease in estimated glomerular filtration rate (eGFR), and renal replacement therapy (RRT). Models were evaluated and compared with Random Survival Forest (RSF) based on the concordance index (C-index) metric. 2,143 patients were included. Two models were developed for each outcome; the deep neural network (DNN) model reported C-index=0.9867 for CKD stage progression, C-index=0.9905 for reduction in eGFR, and C-index=0.9867 for RRT. The RSF model reached C-index=0.6650 for CKD stage progression, C-index=0.6759 for decreased eGFR, and C-index=0.8926 for RRT. DNN models applied in a survival analysis context, with longitudinal covariates considered at the start of follow-up, can predict renal stage progression, a significant decrease in eGFR, and RRT. The success of these survival models lies in the appropriate definition of survival times and the analysis of covariates, especially those that vary over time.
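The C-index used to compare the models is Harrell's concordance index: among comparable patient pairs, the fraction in which the model assigns the higher risk to the patient who fails earlier. A minimal sketch on toy data (not the authors' evaluation pipeline):

```python
def concordance_index(times, events, risks):
    """Harrell's C-index. A pair (i, j) is comparable when subject i
    has an observed event and a shorter time than subject j; it is
    concordant when the model gives i the higher risk score.
    Risk ties count as half-concordant."""
    concordant, ties, comparable = 0, 0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / comparable
```

A C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, which is why the reported DNN values near 0.99 indicate near-perfect discrimination relative to the RSF baseline.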

Keywords: artificial intelligence, chronic kidney disease, deep neural networks, survival analysis

Procedia PDF Downloads 88
3873 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment

Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai

Abstract:

Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, quantification of the hazard is important in order to assess it. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that damage to life and property is minimized. Seismic hazard analysis plays an important role in the design of earthquake-resistant structures by providing rational values of the input parameters. In this paper, both mathematical and computational methods adopted by researchers globally in the past five years will be discussed. Mathematical approaches involving the concepts of Poisson’s ratio, Convex Set Theory, Empirical Green’s Function, Bayesian probability estimation applied to seismic hazard, and FOSM (first-order second-moment) algorithms will be discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study dynamic soil-structure interaction problems, are discussed in this paper. The GIS-based tool, predominantly used in the assessment of seismic hazards, will also be discussed.
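The random, unpredictable occurrence of earthquakes is most often quantified by assuming a homogeneous Poisson occurrence model, under which the probability of at least one exceedance of a design ground motion in a given exposure time follows directly from its return period. A minimal sketch (the 475-year return period below is the common design convention, used here as an illustrative input):

```python
import math

def exceedance_probability(return_period, exposure_years):
    """P(at least one exceedance in `exposure_years`), assuming a
    homogeneous Poisson occurrence model with mean annual rate
    1 / return_period:  p = 1 - exp(-t / T_R)."""
    rate = 1.0 / return_period
    return 1.0 - math.exp(-rate * exposure_years)
```

For the conventional 475-year return period and a 50-year design life, this yields the familiar ~10% probability of exceedance that underlies many seismic design codes.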

Keywords: computational methods, MATLAB, seismic hazard, seismic measurements

Procedia PDF Downloads 296
3872 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable

Authors: Xinyuan Y. Song, Kai Kang

Abstract:

Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always be supported because unobservable traits, namely, latent variables, which are indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach to deal with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is used to perform statistical inference. An application of the proposed joint model to a study on the Alzheimer’s Disease Neuroimaging Initiative is presented.
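The third model component can be sketched in discrete time: the hazard combines a baseline with a time-invariant predictor x and a latent trajectory eta(t), here taken linear (eta(t) = a + b*t) purely for illustration; the functional forms and parameter values are my assumptions, not the authors' specification:

```python
import math

def hazard(t, h0, beta, x, gamma, a, b):
    """Proportional-hazards form h(t) = h0 * exp(beta*x + gamma*eta(t)),
    with a linear latent trajectory eta(t) = a + b*t."""
    eta = a + b * t
    return h0 * math.exp(beta * x + gamma * eta)

def survival(t_grid, h0, beta, x, gamma, a, b):
    """Discrete-time survival: product of (1 - hazard) over the grid."""
    s = 1.0
    for t in t_grid:
        s *= max(0.0, 1.0 - hazard(t, h0, beta, x, gamma, a, b))
    return s
```

The key property of the proportional-hazards part survives discretization: two subjects whose latent trajectories differ by one unit have hazards in the ratio exp(gamma) at every time point.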

Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data

Procedia PDF Downloads 100
3871 Numerical Solution for Integro-Differential Equations by Using Quartic B-Spline Wavelet and Operational Matrices

Authors: Khosrow Maleknejad, Yaser Rostami

Abstract:

In this paper, semi-orthogonal B-spline scaling functions and wavelets and their dual functions are presented to approximate the solutions of integro-differential equations. The B-spline scaling functions and wavelets, their properties, and the operational matrices of the derivative for these functions are presented in order to reduce the solution of integro-differential equations to the solution of algebraic equations. Here we compute B-spline scaling functions of degree 4 and their duals, and we show that using them yields better approximation results for the solution of integro-differential equations than scaling functions of lower degree.
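The degree-4 (quartic) B-spline scaling functions used above can be evaluated with the standard Cox-de Boor recursion; a minimal sketch (uniform knots chosen for illustration), whose partition-of-unity property on the interior of the knot span is a basic sanity check:

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value of the i-th B-spline basis
    function of degree k at point t over the knot vector `knots`."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + k] - knots[i]
    if d1 > 0:
        left = (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + k + 1] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return left + right
```

On the uniform knot vector 0, 1, ..., 11 there are seven quartic basis functions (i = 0..6), and on the interior interval [4, 7] they are nonnegative and sum to one, the property that makes expansion coefficients directly interpretable.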

Keywords: integro-differential equations, quartic B-spline wavelet, operational matrices, dual functions

Procedia PDF Downloads 414
3870 High Accuracy Analytic Approximation for Special Functions Applied to Bessel Functions J₀(x) and Its Zeros

Authors: Fernando Maass, Pablo Martin, Jorge Olivares

Abstract:

The Bessel function J₀(x) is very important in electrodynamics and physics, as are its zeros. In this work, a method to obtain high-accuracy approximations is presented through an application to that function. In most applications of this function, the values of the zeros are very important. Analytic approximations have been obtained that are valid for all positive values of the variable x and have high accuracy both for the function and for its zeros. The approximation is determined by the simultaneous use of the power series and the asymptotic expansion. The structure of the approximation is a combination of two rational functions with elementary functions such as trigonometric functions and fractional powers. As in the Padé method, rational functions are used, but here they are combined with elementary functions such as fractional powers, hyperbolic or trigonometric functions, and others. The reason is that the power series of the exact function is used together with the asymptotic expansion, which usually includes fractional powers, trigonometric functions, and other types of elementary functions. The approximation must be a bridge between both expansions, and this cannot be accomplished using rational functions alone. In the simplest approximation, using four parameters, the maximum absolute error is less than 0.006, at x ∼ 4.9. In this case, the maximum relative error for the zeros is less than 0.003, attained at the second zero, and that value decreases rapidly for the other zeros. The same kind of behaviour holds for the relative error at the maxima and minima of the function. Approximations with higher accuracy and more parameters will also be shown. All the approximations are valid for any positive value of x and can be calculated easily.
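The two expansions that such an approximation must bridge can be written down directly: the power series of J₀, accurate for small x, and the leading asymptotic form, accurate for large x. A minimal sketch (this is the standard mathematics the abstract builds on, not the authors' quasirational approximant itself):

```python
import math

def j0_series(x, terms=40):
    """Power series J0(x) = sum_k (-1)^k (x/2)^(2k) / (k!)^2,
    accurate for small-to-moderate x."""
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        term *= -(x / 2.0) ** 2 / ((k + 1) ** 2)  # ratio of consecutive terms
    return total

def j0_asymptotic(x):
    """Leading asymptotic form for large x:
    J0(x) ~ sqrt(2/(pi*x)) * cos(x - pi/4)."""
    return math.sqrt(2.0 / (math.pi * x)) * math.cos(x - math.pi / 4.0)
```

Around x = 10 the two expressions already agree to a few parts in a thousand, and the asymptotic cosine factor is what makes the zeros of any bridging approximant come out accurately at large x.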

Keywords: analytic approximations, asymptotic approximations, Bessel functions, quasirational approximations

Procedia PDF Downloads 210
3869 A Qualitative Case Study Exploring Zambian Mathematics Teachers' Content Knowledge of Functions

Authors: Priestly Malambo, Sonja Van Putten, Hanlie Botha, Gerrit Stols

Abstract:

The relevance of the content taught in tertiary teacher training has long been in question. This study attempts to understand how advanced mathematics courses equip student teachers to teach functions at the secondary school level. This paper reports on an investigation conducted in an African university, where preservice teachers were purposefully selected to participate in individual semi-structured interviews after completing a test on functions as taught at secondary school. They were asked to justify their reasoning in the test and to explain functions in a way that might bring about understanding of the topic in someone who did not know how functions work. These were final-year preservice mathematics teachers who had studied advanced mathematics courses for three years. More than 50% of the students were not able to explain concepts or justify their reasoning about secondary school functions in a coherent way. The results of this study suggest that the study of advanced mathematics does not automatically enable students to teach secondary school functions and that, although these students were able to do advanced mathematics, they were unable to explain the working of functions in a way that would allow them to teach this topic successfully.

Keywords: secondary school, mathematical reasoning, student-teachers, functions

Procedia PDF Downloads 229
3868 Recurrent Neural Networks for Complex Survival Models

Authors: Pius Marthin, Nihal Ata Tutkun

Abstract:

Survival analysis has become one of the paramount procedures for modeling time-to-event data. When we encounter complex survival problems, the traditional approach remains limited in accounting for the complex correlational structure between the covariates and the outcome, owing to strong assumptions that limit the inference and prediction ability of the resulting models. Several studies exist on deep learning approaches to survival modeling; however, their application to complex survival problems still needs improvement. In addition, the existing models do not fully address the complexity of the data structure and are subject to noise and redundant information. In this study, we design a deep learning technique (CmpXRnnSurv_AE) that overcomes the limitations imposed by traditional approaches and addresses the above issues to jointly predict the risk-specific probabilities and the survival function for recurrent events with competing risks. We introduce a component termed Risks Information Weights (RIW) as an attention mechanism to compute the weighted cumulative incidence function (WCIF), and an external auto-encoder (ExternalAE) as a feature selector to extract complex characteristics among the set of covariates responsible for the cause-specific events. We train our model using synthetic and real data sets and employ the appropriate metrics for complex survival models for evaluation. As benchmarks, we selected both traditional and machine learning models, and our model demonstrates better performance across all datasets.
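The weighted cumulative incidence function builds on the standard discrete-time CIF for competing risks, which can be sketched in a few lines (this is the textbook construction the WCIF extends, with toy hazards as illustrative inputs):

```python
def cumulative_incidence(hazards):
    """Discrete-time cumulative incidence functions for competing risks.
    `hazards` is a list of per-period cause-specific hazards
    [h_cause1, h_cause2, ...]; the CIF for cause k accumulates
    h_k(s) times the overall survival up to period s:
    CIF_k(t) = sum_s h_k(s) * prod_{u<s} (1 - sum_j h_j(u))."""
    n_causes = len(hazards[0])
    cif = [0.0] * n_causes
    surv = 1.0
    for h in hazards:
        for k in range(n_causes):
            cif[k] += surv * h[k]
        surv *= 1.0 - sum(h)
    return cif, surv
```

By construction the cause-specific CIFs and the overall survival partition the probability mass, i.e. sum_k CIF_k(t) + S(t) = 1, a useful invariant when checking any learned estimate of these quantities.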

Keywords: cumulative incidence function (CIF), risk information weight (RIW), autoencoders (AE), survival analysis, recurrent events with competing risks, recurrent neural networks (RNN), long short-term memory (LSTM), self-attention, multilayers perceptrons (MLPs)

Procedia PDF Downloads 43
3867 A Study on the Waiting Time for the First Employment of Arts Graduates in Sri Lanka

Authors: Imali T. Jayamanne, K. P. Asoka Ramanayake

Abstract:

Transition from tertiary-level education to employment is one of the challenges that many fresh university graduates face after graduation. The transition period, or the waiting time to obtain the first employment, varies with socio-economic factors and the general characteristics of a graduate. Compared to other fields of study, Arts graduates in Sri Lanka have to wait a long time to find their first employment. The objective of this study is to identify the determinants of the transition from higher education to employment for these graduates using survival models. The study is based on a survey conducted in 2016 on a stratified random sample of Arts graduates from Sri Lankan universities who had graduated in 2012. Among the 469 responses, 36 (8%) waiting times were interval-censored and 13 (3%) were right-censored. Waiting time for the first employment varied between zero and 51 months. Initially, the log-rank and Gehan-Wilcoxon tests were performed to identify the significant factors. Gender, ethnicity, GCE Advanced Level English grade, civil status, university, class received, degree type, sector of first employment, type of first employment and the educational qualifications required for the first employment were significant at 10%. The Cox proportional hazards model was fitted to model the waiting time to first employment with these significant factors. All factors except ethnicity and type of employment were significant at 5%. However, since the proportional hazards assumption was violated, a lognormal accelerated failure time (AFT) model was fitted to model the waiting time to first employment. The same factors were significant in the AFT model as in the Cox proportional hazards model.
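Before fitting Cox or AFT models, waiting-time data of this kind are typically summarized with the Kaplan-Meier estimator, which handles right censoring directly (the interval censoring in the study needs more machinery than this sketch; the toy data below are illustrative):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of S(t). `events[i]` is 1 if the first
    employment was observed at `times[i]`, 0 if right-censored.
    Returns (time, survival) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, curve = 1.0, []
    idx = 0
    while idx < len(data):
        t = data[idx][0]
        events_here = censored_here = 0
        while idx < len(data) and data[idx][0] == t:
            if data[idx][1]:
                events_here += 1
            else:
                censored_here += 1
            idx += 1
        if events_here:
            s *= 1.0 - events_here / at_risk   # KM product-limit step
            curve.append((t, s))
        at_risk -= events_here + censored_here
    return curve
```

The median waiting time can then be read off as the first time at which the curve drops to 0.5 or below, which is the nonparametric counterpart of the parametric medians implied by the AFT fit.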

Keywords: AFT model, first employment, proportional hazard, survey design, waiting time

Procedia PDF Downloads 265