Search results for: probability of failure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3415


3025 The Association of Slope Failure and Lineament Density along the Ranau-Tambunan Road, Sabah, Malaysia

Authors: Norbert Simon, Rodeano Roslee, Abdul Ghani Rafek, Goh Thian Lai, Azimah Hussein, Lee Khai Ern

Abstract:

The 54 km stretch of the Ranau-Tambunan (RTM) road in Sabah is subjected to slope failures almost every year. This study focuses on identifying sections of the road that are susceptible to failure, based on temporal landslide density and lineament density analyses. In addition, rock slopes in several sections of the road were assessed using the geological strength index (GSI) technique. The analysis involved 148 landslides recorded in 1978, 1994, 2009 and 2011. The landslides were digitized as points, and the point density was calculated for every 1 km² of the road. The lineaments of the area were interpreted from the Landsat 7 15 m panchromatic band, and the lineament density was calculated for every 1 km² of the area using the same technique as the slope failure density calculation. The landslide and lineament densities were classified into three classes indicating the level of susceptibility (low, moderate, high). Subsequently, the two density maps were overlaid to produce the final susceptibility map. The combination of the high susceptibility classes from both maps signifies a high potential for slope failure in those locations in the future. The final susceptibility map indicates that 22 sections of the road are highly susceptible. Seven rock slopes were assessed along the RTM road using the GSI technique. The assessment found that the rock slopes along this road are highly fractured and weathered, and can be classified into fair to poor categories. The poor condition of the rock slopes can be attributed to the high lineament density present in the study area. Six of the rock slopes are located in the high susceptibility zones. A detailed investigation of the 22 highly susceptible sections of the RTM road should be conducted in order to prevent untoward incidents to road users in the future.
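The density-and-overlay procedure described above can be sketched as follows. This is a hypothetical illustration: the cell identifiers, density thresholds, and class labels are assumptions for the example, not the study's actual values.

```python
# Sketch of the method: count landslides/lineaments per 1 km^2 grid cell,
# bin the densities into low/moderate/high, then overlay the two maps.

from collections import Counter

def density_per_cell(points):
    """Count point features (landslide or lineament hits) per 1 km^2 cell."""
    return Counter(points)

def classify(density, low_max, mod_max):
    """Bin a density value into a three-level susceptibility class."""
    if density <= low_max:
        return "low"
    if density <= mod_max:
        return "moderate"
    return "high"

def overlay(landslide_cls, lineament_cls):
    """Flag a cell as highly susceptible only where both maps agree on 'high'."""
    cells = set(landslide_cls) | set(lineament_cls)
    return {c for c in cells
            if landslide_cls.get(c) == "high" and lineament_cls.get(c) == "high"}

# Toy example: cell ids stand in for 1 km^2 squares along the road.
slides = density_per_cell(["c1", "c1", "c1", "c2", "c3"])
lines_ = density_per_cell(["c1", "c1", "c1", "c2"])
ls_cls = {c: classify(d, 1, 2) for c, d in slides.items()}
ln_cls = {c: classify(d, 1, 2) for c, d in lines_.items()}
print(overlay(ls_cls, ln_cls))  # cells rated high on both maps
```

In the study itself the overlay yielded 22 such road sections; here the toy data flags only cell c1.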

Keywords: GSI, landslide, landslide density, landslide susceptibility, lineament density

Procedia PDF Downloads 372
3024 River Bank Erosion Studies: A Review on Investigation Approaches and Governing Factors

Authors: Azlinda Saadon

Abstract:

This paper provides a detailed review of river bank erosion studies with respect to their processes, methods of measurement, and the factors governing river bank erosion. Bank erosion processes are commonly associated with the initiation and development of river changes, through width adjustment and planform evolution. They consist of two main types: basal erosion due to fluvial hydraulic force, and bank failure under the influence of gravity. Most studies have focused on only one factor rather than integrating both, although evidence from previous works shows the interaction between fluvial hydraulic force and bank failure. Bank failure is often treated as a probabilistic phenomenon, without accounting for the physical characteristics and geotechnical aspects of the bank. This review summarizes the findings of previous investigators with respect to measurement techniques and predicted rates of river bank erosion through field investigation, physical model, and numerical model approaches. Factors governing river bank erosion, considering the physical characteristics of fluvial erosion, are defined.

Keywords: river bank erosion, bank erosion, dimensional analysis, geotechnical aspects

Procedia PDF Downloads 407
3023 Surprise Fraudsters Before They Surprise You: A South African Telecommunications Case Study

Authors: Ansoné Human, Nantes Kirsten, Tanja Verster, Willem D. Schutte

Abstract:

Every year the telecommunications industry suffers huge losses due to fraud. Mobile fraud, or more generally telecommunications fraud, is the use of telecommunication products or services to obtain money illegally from a telecommunication company, or the failure to pay for such products or services. A South African telecommunication operator developed two internal fraud scorecards to mitigate the future risk of application fraud events. The scorecards aim to predict the likelihood of an application being fraudulent and to surprise fraudsters before they surprise the telecommunication operator by identifying fraud at the time of application. The scorecards are used in the vetting process to evaluate the fraud risk an applicant would present to the telecommunication operator. Telecommunication providers can use these scorecards to profile customers, as well as to isolate fraudulent and/or high-risk applicants. We provide the complete methodology used in the development of the scorecards. Furthermore, a Determination and Discrimination (DD) ratio is introduced in the methodology to select the most influential variables from a group of related variables. The development of these scorecards revealed the following about fraudulent cases and fraudster behaviour in the telecommunications industry. Fraudsters typically target high-value handsets. Debit order dates scheduled for the end of the month have the highest fraud probability. Fraudsters target specific stores. Applicants who acquire an expensive package on a medium income, as well as those who acquire an expensive package on a high income, have higher fraud percentages. If, one month prior to application, an account is already in arrears (two months or more), the applicant has a high probability of fraud. Applicants with the highest average spend on calls have a higher probability of fraud. If the amount collected changes from month to month, the likelihood of fraud is higher. Lastly, young and middle-aged applicants have a higher probability of being targeted by fraudsters than other age groups.
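The vetting step described above can be sketched as a simple points-based scorecard. This is a hypothetical illustration using risk factors named in the abstract; the factor names, weights, and cutoff are invented for the example and are not the operator's actual model.

```python
# Hypothetical points-based application-fraud scorecard: each risk factor
# present on the application adds points; high totals are referred for review.

RISK_POINTS = {
    "high_value_handset": 30,
    "end_of_month_debit": 20,
    "account_in_arrears": 40,
    "expensive_package_medium_income": 25,
}

def fraud_score(applicant):
    """Sum the points for every risk factor present on the application."""
    return sum(pts for factor, pts in RISK_POINTS.items() if applicant.get(factor))

def vet(applicant, cutoff=50):
    """Refer the application for manual review when the score reaches the cutoff."""
    return "refer" if fraud_score(applicant) >= cutoff else "accept"

app = {"high_value_handset": True, "account_in_arrears": True}
print(fraud_score(app), vet(app))  # 70 refer
```

A production scorecard would instead derive the weights statistically (e.g. via regression, as the keywords suggest) and select variables with the DD ratio.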

Keywords: application fraud scorecard, predictive modeling, regression, telecommunications

Procedia PDF Downloads 97
3022 Cooperative Scheme Using Adjacent Base Stations in Wireless Communication

Authors: Young-Min Ko, Seung-Jun Yu, Chang-Bin Ha, Hyoung-Kyu Song

Abstract:

In a wireless communication system, the failure of a base station can cause a communication disruption in its cell. This paper proposes a way to deal with base station failure in an OFDM-based wireless communication system. Cooperative communication among the adjacent base stations can solve the problem: high performance is obtained by configuring the transmission signals with a cyclic delay diversity (CDD) scheme in the cooperative transmission. The cooperative scheme can be an effective solution in such failure situations.
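The CDD configuration mentioned above can be illustrated with a minimal sketch: each cooperating base station transmits the same time-domain OFDM symbol with a different cyclic shift, which the receiver perceives as additional frequency diversity. The symbol values and delays below are illustrative assumptions.

```python
# Cyclic delay diversity (CDD) sketch: every cooperating station sends a
# cyclically delayed copy of the same time-domain OFDM symbol.

def cyclic_shift(symbol, delay):
    """Cyclically delay a time-domain symbol by `delay` samples."""
    d = delay % len(symbol)
    if d == 0:
        return list(symbol)
    return symbol[-d:] + symbol[:-d]

ofdm_symbol = [1, 2, 3, 4, 5, 6, 7, 8]   # one time-domain OFDM symbol
delays = [0, 2]                           # per-station cyclic delays

transmissions = [cyclic_shift(ofdm_symbol, d) for d in delays]
print(transmissions[1])  # [7, 8, 1, 2, 3, 4, 5, 6]
```

Because a cyclic time shift corresponds to a per-subcarrier phase rotation after the FFT, the receiver can exploit the combined signal without extra equalization complexity.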

Keywords: base station, CDD, OFDM, diversity gain, MIMO

Procedia PDF Downloads 462
3021 Sodium-Glucose Co-Transporter-2 Inhibitors in Heart Failure with Mildly Reduced Ejection Fraction: Future Perspectives in Patients with Neoplasia

Authors: M. A. Munteanu, A. M. Lungu, A. I. Chivescu, V. Teodorescu, E. Tufanoiu, C. Nicolae, T. I. Nanea

Abstract:

Introduction: Sodium-glucose co-transporter 2 inhibitors (SGLT2i), first developed as antidiabetic medications, have demonstrated numerous positive effects on the cardiovascular system, especially in the prevention of heart failure (HF). HF is a challenging, multifaceted disease that needs all-encompassing therapy; it should not be viewed as a limited form of heart illness but rather as a systemic disease that leads to multiple organ failure and death. Through its pleiotropic effects, SGLT2i is an extremely effective tool for treating HF. In addition to its use in patients with diabetes mellitus who are at high cardiovascular risk or who have already experienced a cardiovascular event, SGLT2i administration has been shown to have positive effects on a variety of HF manifestations and stages, regardless of whether the patient has diabetes mellitus. Material and Methods: In accordance with the guidelines, 110 patients (83 males and 27 females) with heart failure with mildly reduced ejection fraction (HFmrEF), type 2 diabetes (T2D), and neoplasia were enrolled in the prospective study. The structural and functional state of the left ventricular myocardium and the ejection fraction were assessed by echocardiography. Patients were randomized to receive once-daily dapagliflozin 10 mg. Results: Patients with HFmrEF were divided into 3 subgroups according to age: 7% (8 patients) aged < 45 years, 35% (28 patients) aged 46-59 years, and 58% (74 patients) aged > 60 years. The most prevalent comorbidities were hypertension (43.1%), coronary heart disease (40%), and obesity (33.2%). Study drug discontinuation and serious adverse events have so far been infrequent in the subgroups, in both men and women. Conclusions: SGLT-2 inhibitors are a novel class of antidiabetic agents that have demonstrated positive efficacy and safety outcomes in the setting of HFmrEF. In our study to date, dapagliflozin has been safe and well-tolerated irrespective of sex.

Keywords: diabetes mellitus type 2, Sodium-glucose co-transporters-2 inhibitors, heart failure, neoplasia

Procedia PDF Downloads 62
3020 Global Best Practice Paradox: The Failure of the One-Size-Fits-All Approach to Development, a Case Study of Pakistan

Authors: Muhammad Naveed Iftikhar, Farah Khalid

Abstract:

Global best practices, as ordained by international organizations, constitute a broad top-down approach to development problems that does not take country-specific factors into account. The political economy of each country is different, and the failure of several attempts by international organizations to implement global best practice models in developing countries, each with its own unique set of variables, shows that this is not the most efficient solution to development problems. This paper is a humble attempt to shed light on some specific examples of failures of the global best practices. Pakistan has its unique set of problems, and unless those are added to the broader equation of development, country-specific reform and growth will continue to pose a challenge to reform programs initiated by international organizations. The three case studies presented in this paper are a few prominent examples of the failure of the global best practice, top-down, universalistic approach to development as ordained by international organizations. Development and reform can only be achieved if local dynamics are given their due importance. The modus operandi of international organizations needs to be tailored to each country's unique politico-economic environment.

Keywords: best practice, development, context

Procedia PDF Downloads 440
3019 A Crossover Study of Therapeutic Equivalence of Generic Product Versus Reference Product of Ivabradine in Patients with Chronic Heart Failure

Authors: Hadeer E. Eliwa, Naglaa S. Bazan, Ebtissam A. Darweesh, Nagwa A. Sabri

Abstract:

Background: Generic substitution of brand ivabradine prescriptions can reduce drug expenditure and improve adherence. However, practitioners' and patients' distrust of generic medicines, due to doubts about their quality and fear of counterfeiting, compromises the acceptance of this practice. Aim: The goal of this study is to compare the therapeutic equivalence of the brand product and the generic product of ivabradine in adult patients with chronic heart failure with reduced ejection fraction (≤ 40%) (HFrEF). Methodology: Thirty-two Egyptian patients with chronic HFrEF were treated with branded ivabradine (Procrolan©) and generic ivabradine (Bradipect©) during 24 (2x12) weeks. Primary outcomes were resting heart rate (HR), NYHA functional class (FC), quality of life (QoL) using the Minnesota Living with Heart Failure (MLWHF) questionnaire, and EF. Secondary outcomes were the number of hospitalizations for worsening HFrEF and adverse effects. No washout period was allowed. Findings: At the 12th week, the reduction in HR was comparable in the two groups (90.13±7.11 to 69±11.41 vs 96.13±17.58 to 67.31±8.68 bpm in the brand and generic groups, respectively). The increase in EF was also comparable (27.44±4.59 to 33.38±5.62 vs 32±5.96 to 39.31±8.95, respectively), as was the improvement in NYHA FC (87.5% in the brand group vs 93.8% in the generic group). The mean QoL score improved from 31.63±15.8 to 19.6±14.7 vs 35.68±17.63 to 22.9±15.1 for the brand and generic groups, respectively. Similarly, at the end of 24 weeks, no significant changes were observed relative to the 12th-week data for HR, EF, QoL, or NYHA FC. Only minor side effects, mainly phosphenes, and a comparable number of hospitalizations were observed in both groups. Conclusion: The study revealed no statistically significant differences in therapeutic effect or safety between generic and branded ivabradine. We assume that practitioners can safely interchange between them for economic reasons.

Keywords: Bradipect©, heart failure, ivabradine, Procrolan©, therapeutic equivalence

Procedia PDF Downloads 441
3018 Data-Driven Dynamic Overbooking Model for Tour Operators

Authors: Kannapha Amaruchkul

Abstract:

We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic-based and service-based. In the economic-based policy, we minimize the expected oversold and underused cost, whereas in the service-based policy, we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we test the proposed overbooking policy on 2019 data. We also compare the data-driven approach with the conventional approach of fitting the data to a probability distribution.
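The service-based policy can be sketched as follows, under the simplifying assumption of independent per-reservation cancellations with a known show probability; the paper's model additionally accounts for group sizes and cancellation timing, which this sketch omits. The capacity, show probability, and threshold are illustrative.

```python
# Service-based overbooking sketch: accept the largest number of reservations
# such that the probability of an oversold tour stays within a threshold,
# with shows modelled as Binomial(bookings, show_prob).

from math import comb

def p_oversold(bookings, capacity, show_prob):
    """P(number of shows > capacity) for Binomial(bookings, show_prob)."""
    return sum(comb(bookings, k) * show_prob**k * (1 - show_prob)**(bookings - k)
               for k in range(capacity + 1, bookings + 1))

def booking_limit(capacity, show_prob, threshold):
    """Largest booking level whose oversold probability stays within the threshold."""
    b = capacity
    while p_oversold(b + 1, capacity, show_prob) <= threshold:
        b += 1
    return b

limit = booking_limit(capacity=40, show_prob=0.9, threshold=0.05)
print(limit)
```

The economic-based policy would instead weigh the expected oversold cost against the expected underused cost and pick the booking level minimizing their sum.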

Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator

Procedia PDF Downloads 111
3017 Model-Based Software Regression Test Suite Reduction

Authors: Shiwei Deng, Yang Bao

Abstract:

In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and the probability-driven greedy algorithm is then used to select, from the reduced regression test suite, the minimum set of test cases that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
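The probability-driven greedy selection can be sketched as a weighted set-cover heuristic: repeatedly pick the test case that covers the largest remaining probability mass of uncovered interaction patterns. The test names, patterns, and probabilities below are illustrative assumptions, not the paper's data.

```python
# Greedy set-cover sketch of probability-driven test selection.

def greedy_reduce(coverage, pattern_prob):
    """Select tests until every interaction pattern is covered.

    coverage: {test name: set of patterns it covers}
    pattern_prob: {pattern: probability weight}
    """
    uncovered = set(pattern_prob)
    selected = []
    while uncovered:
        # Score each unused test by the probability mass it would newly cover.
        best = max((t for t in coverage if t not in selected),
                   key=lambda t: sum(pattern_prob[p] for p in coverage[t] & uncovered))
        selected.append(best)
        uncovered -= coverage[best]
    return selected

coverage = {
    "t1": {"p1", "p2"},
    "t2": {"p2", "p3"},
    "t3": {"p1", "p2", "p3"},
}
probs = {"p1": 0.5, "p2": 0.3, "p3": 0.2}
print(greedy_reduce(coverage, probs))  # ['t3']
```

Here t3 alone covers all three patterns, so the greedy pass reduces the suite to a single test.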

Keywords: dependence analysis, EFSM model, greedy algorithm, regression test

Procedia PDF Downloads 406
3016 Sudden Death of a Cocaine Body Packer: Autopsy Examination Findings

Authors: Parthasarathi Pramanik

Abstract:

Body packing is a way of transferring drugs across international borders or into any area where drugs are prohibited. The drugs are usually hidden in packets concealed in anatomical body cavities such as the mouth, intestines, rectum, ear, and vagina. Cocaine is a very common drug for body packing across the world. A 48-year-old male was reported dead in his hotel after complaining of chest pain and vomiting. At autopsy, there were eighty-two white cylindrical body packs in the stomach and the small and large intestines. The seals of a few of the packets had opened. Toxicological examination revealed the presence of cocaine in the stomach, liver, kidney, and hair samples. Microscopically, myocardial necrosis with interstitial oedema, along with hypertrophy and fibrosis of the myocardial fibres, suggested heart failure due to cocaine cardiotoxicity. Focal lymphocyte infiltration and perivascular fibrosis in the myocardium also indicated chronic cocaine toxicity. After careful autopsy examination, it was concluded that the victim died of congestive heart failure secondary to acute and chronic cocaine poisoning.

Keywords: cardiac failure, cocaine, body packer, sudden death

Procedia PDF Downloads 298
3015 Consumption of Fat Burners Leads to Acute Liver Failure: A Systematic Review Protocol

Authors: Anjana Aggarwal, Sheilja Walia

Abstract:

The prevalence of obesity and overweight is increasing due to sedentary lifestyles and the busy schedules of people who spend little time on physical exercise. To reduce weight, people look for easier and more convenient ways; the easiest is the use of dietary supplements and fat burners, products that aim to decrease body weight by increasing the basal metabolic rate. Various reports have linked the consumption of fat burners to heart palpitations, seizures, anxiety, depression, psychosis, bradycardia, insomnia, muscle contractions, hepatotoxicity, and even liver failure. Case reports and case series indicate that ingredients present in fat burners have caused acute liver failure (ALF) and hepatic toxicity in many cases. Another contributing factor is the absence of Food and Drug Administration regulation of these products, leading to increased consumption and a higher risk of liver disease in the population. This systematic review aims to attain a better understanding of the dietary supplements used globally to reduce weight and to document the case reports and series of acute liver failure caused by the consumption of fat burners. Electronic databases such as PubMed, Cochrane, and Google Scholar will be systematically searched for relevant articles. Websites of dietary products and brands that sell such supplements, journals of hepatology, national and international projects on ALF and their reports, and the grey literature will also be reviewed to gain a better understanding of the topic. Selection and screening of the articles will be performed by the author in discussion with the co-author, based on predefined inclusion and exclusion criteria. The case reports and case series included in the final list of studies will be assessed for methodological quality using the CARE guidelines. The results of this study will provide insights into and a better understanding of fat burners. Since these supplements are easily available in the market without any restrictions on their sale, people are unaware of their adverse effects, and their consumption can cause acute liver failure. This review will thus provide a platform for larger future studies.

Keywords: acute liver failure, dietary supplements, fat burners, weight loss supplements

Procedia PDF Downloads 61
3014 Software Reliability Prediction Model Analysis

Authors: Lela Mirtskhulava, Mariam Khunjgurua, Nino Lomineishvili, Koba Bakuria

Abstract:

Software reliability prediction provides a great opportunity to measure the software failure rate at any point during system test, and a software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities; it differs from hardware reliability in that it reflects design perfection. The main cause of software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article, we focus on a software reliability model, assuming that there is time redundancy whose value (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the distribution function (DF) of the transmission time of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures following an exponential distribution.
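The time-redundancy idea can be illustrated with a small Monte Carlo sketch: under an exponential failure law with rate lam, a basic block of duration tau is transmitted successfully with probability exp(-lam*tau), and a failed block is retransmitted until it succeeds. The rate, block duration, and block count are illustrative assumptions, not the paper's parameters.

```python
# Monte Carlo sketch of block transmission with time redundancy under an
# exponential failure law: each failed block is retransmitted until success.

import random
from math import exp

def total_transmission_time(n_blocks, tau, lam, rng):
    """Total time to send n_blocks, retransmitting each block until success."""
    p_ok = exp(-lam * tau)          # P(no failure during one block of length tau)
    total = 0.0
    for _ in range(n_blocks):
        attempts = 1
        while rng.random() > p_ok:  # failed attempt -> retransmit
            attempts += 1
        total += attempts * tau
    return total

rng = random.Random(42)
samples = [total_transmission_time(10, 1.0, 0.1, rng) for _ in range(1000)]
mean_time = sum(samples) / len(samples)
# Expected attempts per block are 1/p_ok = exp(0.1) ~ 1.105, so the mean
# total time should be near 10 * 1.105 ~ 11.05.
print(round(mean_time, 2))
```

The empirical distribution of such samples approximates the transmission-time DF that the paper derives analytically.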

Keywords: exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability

Procedia PDF Downloads 442
3013 Influence of Model Hydrometeor Form on Probability of Discharge Initiation from Artificial Charged Water Aerosol Cloud

Authors: A. G. Temnikov, O. S. Belova, L. L. Chernensky, T. K. Gerastenok, N. Y. Lysov, A. V. Orlov, D. S. Zhuravkova

Abstract:

Hypotheses of lightning initiation on arrays of large hydrometeors are under consideration. There is no agreement on which hydrometeor form is most favourable for lightning initiation from a thundercloud. Artificial charged water aerosol clouds of positive or negative polarity can help investigate the possible influence of hydrometeor form on the peculiarities and probability of lightning discharge initiation between the thundercloud and the ground. Artificial charged aerosol clouds creating electric field strengths in the range of 5-6 kV/cm to 16-18 kV/cm were used in the experiments. Arrays of model hydrometeors of volume and plate form were placed near the bottom cloud boundary. It was established that different kinds of discharge could be initiated in the presence of the model hydrometeor array, from cloud discharges up to diffuse and channel discharges between the charged cloud and the ground. The form of the model hydrometeors was found to significantly influence channel discharge initiation from artificial charged aerosol clouds of negative and positive polarity, respectively. Analysis and generalization of the experimental results showed that the maximal probability of channel discharge initiation and stimulated propagation was observed for the artificial charged cloud of positive polarity when arrays of model hydrometeors in the form of a cylinder of revolution were used. For artificial charged clouds of negative polarity, an array of model hydrometeors of plate rhombus form provided the maximal probability of channel discharge formation between the charged cloud and the ground. The established influence of the model hydrometeor form on channel discharge initiation from the artificial charged water aerosol cloud, and on its subsequent successful propagation, is related to the different character of positive and negative streamer and volume leader development on the hydrometeor array near the bottom boundary of the charged cloud. The experimental results show the potentially important role of the form of large hail particles precipitated in a thundercloud in discharge initiation.

Keywords: cloud and channel discharges, hydrometeor form, lightning initiation, negative and positive artificial charged aerosol cloud

Procedia PDF Downloads 292
3012 Paternity Index Analysis on Disputed Paternity Cases at Sardjito Hospital Yogyakarta, Indonesia

Authors: Taufik Hidayat, Yudha Nurhantari, Bambang U. D. Rianto

Abstract:

Introduction: The examination of short tandem repeat (STR) loci on nuclear DNA is very useful in solving paternity cases. The purpose of this study is to describe paternity cases and to analyse the paternity index/probability of paternity based on Indonesian allele frequencies at Sardjito Hospital, Yogyakarta. Method: This was an observational study with a cross-sectional analytic method. The population and sample comprised all disputed paternity cases from January 2011 to June 2015 that fulfilled the inclusion and exclusion criteria and were examined at the Forensic Medicine Unit of Sardjito Hospital, Medical Faculty of Gadjah Mada University. The paternity index was calculated with the EasyDNA program by Fung (2013). The analysis compared the results through an unpaired categorical test using the Kolmogorov-Smirnov test. The study was designed with a 95% confidence interval (CI), α = 5%, and a significance level of p < 0.05. Results: Of the 42 disputed paternity cases, 32 (76.2%) were trio paternity cases and 10 (23.8%) were duos without a mother. The majority of the fathers' estimated ages were 21-30 years (33.3%), and of the mothers' ages, 31-40 years (38.1%). The majority of children examined for paternity were under 12 months of age (47.6%). The majority of clients were ethnically Javanese. Inclusion conclusions accounted for 57.1% of cases and exclusions for 42.9%. The Kolmogorov-Smirnov test gave p = 0.673. Conclusion: There is no significant difference in the paternity index/probability of paternity based on Indonesian allele frequencies between trio and duo paternity cases.
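The underlying calculation of a probability of paternity from per-locus paternity indices can be sketched as follows. The per-locus PI values below are illustrative; in practice each one is a likelihood ratio computed from the STR genotypes of the child, mother, and alleged father together with population (here, Indonesian) allele frequencies.

```python
# Combined paternity index (CPI) and probability of paternity (W) sketch:
# CPI is the product of per-locus likelihood ratios, and W follows from
# Bayes' rule with a prior probability of paternity (conventionally 0.5).

from math import prod

def probability_of_paternity(locus_pis, prior=0.5):
    """Return (CPI, W) where W = CPI*prior / (CPI*prior + (1 - prior))."""
    cpi = prod(locus_pis)
    w = cpi * prior / (cpi * prior + (1 - prior))
    return cpi, w

cpi, w = probability_of_paternity([2.0, 4.0, 1.5])  # three illustrative loci
print(cpi, round(w, 4))  # 12.0 0.9231
```

With the conventional prior of 0.5, W reduces to CPI / (CPI + 1); real casework combines many more STR loci, driving W toward 1 for true fathers.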

Keywords: disputed paternity, paternity index, probability of paternity, short tandem repeats

Procedia PDF Downloads 151
3011 A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment

Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay

Abstract:

Machine learning (ML) and deep learning (DL) are predominantly used in image/video processing, natural language processing (NLP), and audio and speech recognition, but much less in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer, constructed using ML/DL, to detect system failure. The proposed system detects system failure by evaluating the performance metrics of an IoT service deployed in a constrained infrastructure environment. The system has been tested on a manually annotated data set containing different system metrics, such as number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments: edge (an Atom-based gateway) and cloud (AWS EC2). The main challenge in developing such a system is that the classification accuracy should be 100%, as errors in the system degrade service performance and consequently affect the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers work with 100% accuracy on the data set of nearly 4,000 samples captured within the organization.
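As a hedged stand-in for the classifier described above (the abstract does not specify the exact model), a minimal nearest-centroid classifier over system metric vectors might look like the sketch below. The training vectors, labels, and metric scales are synthetic illustrations, not the paper's annotated data.

```python
# Nearest-centroid sketch: classify a system state as healthy or failing
# from a metric vector (threads, throughput, response time, CPU, memory).

from math import dist

def centroid(vectors):
    """Component-wise mean of a list of equal-length metric vectors."""
    return tuple(sum(col) / len(vectors) for col in zip(*vectors))

def train(samples):
    """samples: {label: [metric vectors]} -> {label: centroid}."""
    return {label: centroid(vecs) for label, vecs in samples.items()}

def predict(model, x):
    """Return the label whose centroid is closest to the observation x."""
    return min(model, key=lambda label: dist(model[label], x))

samples = {
    "healthy": [(8, 900, 120, 0.35, 0.40), (10, 950, 110, 0.30, 0.45)],
    "failure": [(40, 120, 2500, 0.97, 0.92), (35, 150, 2300, 0.95, 0.90)],
}
model = train(samples)
print(predict(model, (38, 130, 2400, 0.96, 0.91)))  # failure
```

A production detector would normalize the metrics and use a stronger model, but the pipeline shape (annotated metrics in, failure label out) is the same.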

Keywords: machine learning, system performance, performance metrics, IoT, edge

Procedia PDF Downloads 175
3010 Hypocalcaemia Inducing Heart Failure: A Rare Presentation

Authors: A. Kherraf, M. Bouziane, L. Azzouzi, R. Habbal

Abstract:

Introduction: Hypocalcaemia is a rare cause of heart failure. We report the clinical case of a young patient with reversible dilated cardiomyopathy secondary to hypocalcaemia in the context of hypoparathyroidism. Clinical case: We report the case of a 23-year-old patient with a history of thyroidectomy for papillary thyroid carcinoma 3 years previously, who presented to the emergency room with progressive-onset dyspnea and edema of the lower limbs. Clinical examination showed hypotension at 90/70 mmHg, tachycardia at 102 bpm, and edema of the lower limbs. The ECG showed a regular sinus rhythm with a corrected QT interval prolonged to 520 ms. The chest X-ray showed cardiomegaly. Echocardiography revealed dilated cardiomyopathy with biventricular dysfunction and a left ventricular ejection fraction of 45%, as well as moderate mitral insufficiency due to restriction of the posterior mitral leaflet, moderate tricuspid insufficiency, and a dilated inferior vena cava, with a pulmonary arterial pressure estimated at 46 mmHg. Blood tests revealed severe hypocalcemia at 38 mg/L with normal albumin and thyroxine levels, as well as hyperphosphatemia and increased TSH. The patient received calcium and vitamin D supplementation and was treated with beta blockers, ACE inhibitors, and diuretics, with good progress and progressive normalization of cardiac function. Discussion: The cardiovascular manifestations of hypocalcaemia usually appear at profoundly low serum calcium levels and can include hypotension, arrhythmias, ventricular fibrillation, a prolonged QT interval, and even heart failure. Heart failure is a rare and serious complication of hypocalcemia, but it is most often characterized by complete normalization of myocardial function after treatment. The etiology of the hypocalcaemia in this case was probably accidental parathyroid removal during thyroidectomy, which is why careful monitoring of calcium levels is recommended after surgery. Conclusion: Hypocalcemic heart failure is a rare but reversible heart disease. Systematic monitoring of serum calcium should be performed in all patients after thyroid surgery to avoid complications related to hypoparathyroidism.

Keywords: hypocalcemia, heart failure, thyroid surgery, hypoparathyroidism

Procedia PDF Downloads 121
3009 Optimal Sequential Scheduling of Imperfect Maintenance Last Policy for a System Subject to Shocks

Authors: Yen-Luan Chen

Abstract:

Maintenance has a great impact on production capacity and product quality, and therefore deserves continuous improvement. A maintenance procedure performed before a failure is called preventive maintenance (PM). Sequential PM, which specifies that a system should be maintained at a sequence of intervals with unequal lengths, is one of the commonly used PM policies. This article proposes a generalized sequential PM policy for a system subject to shocks, with imperfect maintenance and random working times. The shocks arrive according to a non-homogeneous Poisson process (NHPP) with a varied intensity function in each maintenance interval. When a shock occurs, the system suffers one of two types of failure with number-dependent probabilities: a type-I (minor) failure, which is rectified by a minimal repair, or a type-II (catastrophic) failure, which is removed by corrective maintenance (CM). Imperfect maintenance is carried out to improve the system failure characteristic under the altered shock process. The sequential preventive-maintenance-last (PML) policy is defined such that the system is maintained, before any CM occurs, at a planned time Ti or at the completion of a working time in the i-th maintenance interval, whichever occurs last. At the N-th maintenance, the system is replaced rather than maintained. This article is the first to take up the sequential PML policy with random working time and imperfect maintenance in reliability engineering. The optimal preventive maintenance schedule that minimizes the mean cost rate of a replacement cycle is derived analytically, and its existence and uniqueness are determined. The proposed models provide a general framework for analyzing maintenance policies in reliability theory.

Keywords: optimization, preventive maintenance, random working time, minimal repair, replacement, reliability

Procedia PDF Downloads 247
3008 Genetic Algorithm Based Node Fault Detection and Recovery in Distributed Sensor Networks

Authors: N. Nalini, Lokesh B. Bhajantri

Abstract:

In Distributed Sensor Networks, sensor nodes are prone to failure due to energy depletion, among other causes, so fault tolerance is essential in a distributed sensor environment. Energy efficiency, network or topology control, and fault tolerance are the most important issues in the development of next-generation Distributed Sensor Networks (DSNs). This paper proposes node fault detection and recovery using a Genetic Algorithm (GA) in a DSN when some of the sensor nodes are faulty. The main objective of this work is to provide a fault tolerance mechanism that is energy efficient and responsive to the network, using a GA to detect faulty nodes based on node energy depletion and link failure between nodes. The proposed fault detection model detects faults at the node level and at the network level (link failure and packet error). Finally, the performance parameters of the proposed scheme are evaluated.
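A GA of the kind described can be sketched as follows: candidate solutions are bitmasks marking suspected-faulty nodes, and the fitness rewards marking exactly those nodes with depleted energy or failed links. The node energies, threshold, and GA parameters are illustrative assumptions, not the paper's settings.

```python
# GA sketch for node fault detection: evolve a bitmask of suspected-faulty
# nodes; fitness rewards agreement with the energy/link-failure evidence.

import random

ENERGY = [0.9, 0.05, 0.8, 0.02, 0.7]          # residual energy per node
LINK_FAILED = [False, True, False, True, False]
THRESHOLD = 0.1                                # energy-depletion cutoff

def fitness(mask):
    """+1 per node whose marking matches the evidence, -1 per mismatch."""
    score = 0
    for i, marked in enumerate(mask):
        suspect = ENERGY[i] < THRESHOLD or LINK_FAILED[i]
        score += 1 if marked == suspect else -1
    return score

def evolve(generations=60, pop_size=20, rng=None):
    rng = rng or random.Random(0)
    n = len(ENERGY)
    pop = [[rng.random() < 0.5 for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]         # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.2:             # mutation: flip one bit
                j = rng.randrange(n)
                child[j] = not child[j]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([i for i, m in enumerate(best) if m])    # indices of detected faulty nodes
```

Recovery would then reroute traffic around the detected nodes; here the GA converges on nodes 1 and 3, the ones with depleted energy and failed links.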

Keywords: distributed sensor networks, genetic algorithm, fault detection and recovery, information technology

Procedia PDF Downloads 429
3007 Rupture Probability of Type of Coarse Aggregate on Fracture Surface of Concrete

Authors: B. Ramakrishna, S. Sivamurthy Reddy

Abstract:

Various types of aggregates, namely granite, dolerite, quartzite, dolomitic limestone, limestone and river gravel, were used to produce concrete with 28-day target compressive strengths of 35, 60, and 80 MPa. The compressive strength of the concrete, as well as of the aggregates, was measured to study the effect of the rupture probability of the coarse aggregate (RPCA) on the fracture surface of the concrete. Petrographic studies were also carried out to characterize the texture, the minerals present and their relative proportions in the various aggregate types. Concretes of various grades produced with the same aggregate showed a rise in RPCA with strength. However, this relationship ceased to exist among concretes of the same grade made with different types of aggregates. The carbonate aggregates, namely limestone and dolomitic limestone, produced concrete with higher RPCA irrespective of concrete strength. The mode of origin, texture and mineralogical composition of the aggregates have a significant impact on their pulse velocity, and thereby on the pulse velocity of the concrete.

Keywords: rupture probability of coarse aggregate (RPCA), dolomitic limestone, granite, limestone, river gravel

Procedia PDF Downloads 271
3006 Drug Therapy Problems and Associated Factors among Patients with Heart Failure in the Medical Ward of Arba Minch General Hospital, Ethiopia

Authors: Debalke Dale, Bezabh Geneta, Yohannes Amene, Yordanos Bergene, Mohammed Yimam

Abstract:

Background: A drug therapy problem (DTP) is an event or circumstance involving drug therapy that actually or potentially interferes with the desired outcome and requires professional judgment to resolve. Heart failure is an emerging worldwide threat whose prevalence and health-loss burden constantly increase, especially among the young and in low-to-middle-income countries. Population-based studies of the incidence and prevalence of heart failure (HF) are lacking in sub-Saharan African countries, including Ethiopia. Objective: This study was designed to assess drug therapy problems and associated factors among patients with HF in the medical ward of Arba Minch General Hospital (AGH), Ethiopia, from June 5 to August 20, 2022. Methods: A retrospective cross-sectional study was conducted among 180 patients with HF admitted to the medical ward of AGH. Data were collected from patients' cards using questionnaires, then categorized and analyzed with SPSS version 25.0 and presented in tables and text according to the nature of the data. Result: Out of the total, 85 (57.6%) were female, and 113 (75.3%) were aged over fifty years. Of the 150 study participants, 86 (57.3%) had at least one identified DTP; 116 DTPs were identified in total, or 0.77 DTPs per patient. The most common types of DTP were the need for additional drug therapy (36%), followed by unnecessary drug therapy (32%) and dose too low (15%). Patients on polypharmacy were 5.86 times more likely to develop DTPs than those who were not (AOR = 5.86, 95% CI = 1.625–16.536, P = 0.005), and patients with more co-morbid conditions developed DTPs 3.68 times more often than those with fewer co-morbidities (AOR = 3.68, 95% CI = 1.28–10.5, P = 0.015). Conclusion: Drug therapy problems were common among medical ward patients with heart failure. These problems adversely affect patients' treatment outcomes, so special attention from healthcare professionals is required to optimize therapy.
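The association measures reported above are adjusted odds ratios from multivariable analysis. As a minimal illustration of the underlying measure only, the sketch below computes a crude odds ratio with a Woolf-method 95% confidence interval from a 2x2 table; the counts are hypothetical and are not taken from the study.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Returns (OR, (lower, upper)) with a Woolf-method 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)    # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# hypothetical counts: DTP vs no DTP, polypharmacy vs non-polypharmacy patients
or_value, ci = odds_ratio(30, 20, 10, 40)
```

An adjusted OR such as the study's 5.86 would instead come from logistic regression controlling for the other covariates.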

Keywords: heart failure, drug therapy problems, Arba Minch general hospital, Ethiopia

Procedia PDF Downloads 77
3005 The Falling Point of Lubricant

Authors: Arafat Husain

Abstract:

Lubricants are among the most heavily used resources in today's world, and many major economies depend on them to function. To ensure that lubricants are not adulterated, we need an efficient way to detect whether a foreign fluid has been added. To observe such malpractice, we propose the following method. An elastic ball is thrown with a fixed force at a probability circle submerged in the lubricant, and the pitching distance and the point of fall are recorded. We then take the ratio of the falling distance to the pitching distance: if the measured ratio is greater than one, the fluid is less viscous, and if the ratio is less than one, the lubricant is more viscous. We first determine the falling point of the pure lubricant at the fixed force; every pure lubricant has a fixed falling point. We then adulterate the lubricant and note the falling point: if it is less than the standard value, the adulterant is a solid, and if the adulterant is a liquid, the falling point will be greater than the standard value. Comparison with the standard falling point therefore indicates the quality of the lubricant.
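The decision rules described above reduce to two simple threshold tests, sketched below. The tolerance band and the labels are assumptions added for illustration; the abstract itself only states the greater-than-one / less-than-one rule.

```python
def viscosity_verdict(pitch_distance, fall_distance, tol=0.02):
    """Classify a test fluid by the ratio of falling to pitching distance.

    Ratio > 1 suggests a less viscous fluid; ratio < 1 a more viscous one."""
    ratio = fall_distance / pitch_distance
    if ratio > 1 + tol:
        return "less viscous"
    if ratio < 1 - tol:
        return "more viscous"
    return "consistent with reference"

def adulterant_type(standard_falling_point, measured_falling_point, tol=0.02):
    """Compare against the pure lubricant's fixed falling point:
    a lower value suggests a solid adulterant, a higher one a liquid."""
    if measured_falling_point < standard_falling_point * (1 - tol):
        return "solid adulterant"
    if measured_falling_point > standard_falling_point * (1 + tol):
        return "liquid adulterant"
    return "no adulteration detected"
```

In practice the tolerance would be calibrated against the measurement scatter of repeated throws at the fixed force.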

Keywords: falling point of lubricant, falling point ratios, probability circle, octane number

Procedia PDF Downloads 470
3004 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. This paper therefore presents a probability-based damage detection (PBDD) procedure built on model updating, in which a one-stage, model-based damage identification technique based on the dynamic features of a structure is investigated. The framework combines finite element model updating with a Monte Carlo simulation that accounts for the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main model updating algorithm. Ideal gas molecular movement (IGMM) is a multi-agent algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in all directions and cover the entire available space owing to their high speed and their collisions with each other and with the surrounding barriers. To reach optimal solutions, the IGMM algorithm randomly generates an initial population of gas molecules and applies governing equations for molecular velocity and for collisions between molecules. In this paper, an enhanced version of IGMM is developed that removes variables left unchanged after a specified number of iterations. The proposed method is applied to two numerical examples in structural damage detection. The results show that the method performs well and is competitive for PBDD of structures.
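The probabilistic layer of such a framework can be illustrated independently of the optimizer. The Monte Carlo sketch below estimates the probability that an identified stiffness reduction exceeds a damage threshold under Gaussian measurement noise; the noise level, threshold and stiffness values are illustrative assumptions, and the updating step (EIGMM in the paper) is replaced here by a direct noisy measurement.

```python
import random

def damage_probability(true_stiffness, nominal, noise_sd, threshold=0.05,
                       n_samples=2000, seed=0):
    """Monte Carlo estimate of the probability that the identified stiffness
    reduction of an element exceeds `threshold`, given multiplicative
    Gaussian measurement noise with standard deviation `noise_sd`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        measured = true_stiffness * (1 + rng.gauss(0.0, noise_sd))
        reduction = 1.0 - measured / nominal    # identified damage ratio
        if reduction > threshold:
            hits += 1
    return hits / n_samples

# element with a true 10% stiffness loss, 2% measurement noise
p = damage_probability(true_stiffness=0.90, nominal=1.0, noise_sd=0.02)
```

In the full PBDD procedure each sample would trigger a model update, so the damage probability reflects the identified, not the directly measured, stiffness.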

Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification

Procedia PDF Downloads 258
3003 Stability Analysis of Slopes during Pile Driving

Authors: Yeganeh Attari, Gudmund Reidar Eiksund, Hans Peter Jostad

Abstract:

In geotechnical practice, there is no industry-standard method to account for the reduction of the safety factor of a slope caused by soil displacement and pore-pressure build-up during pile installation. Pile driving causes large strains and generates excess pore pressure in a zone that can extend many pile diameters from the installed pile, reducing the shear strength of the surrounding soil. This phenomenon may cause slope failure. Moreover, dissipation of the excess pore pressure set up during installation may weaken areas outside the volume of soil remoulded during installation. Because of the complex interactions between changes in mean stress and shearing, predicting the installation-induced pore-pressure response is challenging, and following the rate and path of pore-pressure dissipation in a slope stability analysis is a complex task. In cohesive soils, it is necessary to use soil models that account for strain softening. Several cases of slope failure due to pile driving have been reported in the literature, for instance a landslide in Gothenburg that destroyed more than thirty houses and the Rigaud landslide in Quebec, which resulted in loss of life. Several methods have been suggested to predict the effect of pile driving on total and effective stress, pore-pressure changes and their effect on soil strength, but the problem is still not well understood or agreed upon. In Norway, the general approaches applied by geotechnical engineers are based on old empirical methods with little rigorous theoretical background. While discussing the limitations of such methods, this paper attempts to capture the reduction in the factor of safety of a slope during pile driving using coupled finite element analysis and the cavity expansion method. This is demonstrated by analyzing a case of slope failure due to pile driving in Norway.
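The cavity expansion method mentioned above has a classical closed-form result for undrained clay: cylindrical cavity expansion predicts a plastic (remoulded) zone of radius R_p = r_0·sqrt(G/s_u) with the excess pore pressure decaying logarithmically inside it (a Randolph–Wroth-type solution). The sketch below implements that textbook estimate; the parameter values are illustrative, and a real analysis would feed this field into a coupled FE dissipation model as the paper describes.

```python
import math

def excess_pore_pressure(r, r0, su, G):
    """Classical cylindrical cavity-expansion estimate of the excess pore
    pressure at radius r around a driven pile of radius r0 in undrained clay.

    su: undrained shear strength, G: shear modulus (consistent units)."""
    Rp = r0 * math.sqrt(G / su)       # radius of the plastic (remoulded) zone
    if r >= Rp:
        return 0.0                    # negligible outside the plastic zone
    return 2.0 * su * math.log(Rp / r)

# at the pile wall (r = r0) this reduces to su * ln(G / su)
du_wall = excess_pore_pressure(0.4, 0.4, su=30.0, G=3000.0)   # kPa
```

For G/s_u = 100 this predicts excess pore pressures of roughly 4.6 times the undrained shear strength at the pile wall, consistent with the "many diameters" influence zone noted in the abstract.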

Keywords: cavity expansion method, excess pore pressure, pile driving, slope failure

Procedia PDF Downloads 126
3002 Mecano-Reliability Coupled of Reinforced Concrete Structure and Vulnerability Analysis: Case Study

Authors: Kernou Nassim

Abstract:

The current study presents a coupled vulnerability and mechanical-reliability approach for evaluating the seismic performance of reinforced concrete structures and determining their probability of failure. The performance function reflecting the non-linear behavior of the structure is modeled by a response surface, which establishes an analytical relationship between the random variables (concrete strength and steel yield strength) and the mechanical response of the structure (inter-storey displacement) obtained from the pushover results of finite element simulations. The pushover analysis is executed with the software SAP2000. The results show that properly designed frames perform well under seismic loads. The work is a comparative study of the behavior of an existing structure before and after strengthening, using the pushover method. Coupling the mechanical and reliability analyses indirectly through the response surface avoids prohibitive calculation times. Finally, the results of the proposed approach are compared with Monte Carlo simulation. The comparison shows that the structure is more reliable after the introduction of new shear walls.
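Once the response surface is fitted, the failure probability follows from plain Monte Carlo sampling of the random variables. The sketch below shows that final step; the response-surface coefficients, the strength distributions and the drift capacity are hypothetical placeholders, not values from the study.

```python
import random

def probability_of_failure(response_surface, capacity, n=50_000, seed=0):
    """Monte Carlo failure probability: sample the random variables, evaluate
    the fitted response surface, and count exceedances of the drift capacity."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        fc = rng.gauss(30.0, 3.0)     # concrete strength, MPa (assumed stats)
        fy = rng.gauss(400.0, 20.0)   # steel yield strength, MPa (assumed)
        drift = response_surface(fc, fy)
        if drift > capacity:
            fails += 1
    return fails / n

# hypothetical fitted surface: inter-storey drift decreases with both strengths
surface = lambda fc, fy: 0.060 - 0.0005 * fc - 0.00008 * fy
pf = probability_of_failure(surface, capacity=0.02)
```

The point of the response surface is exactly this: each sample costs one polynomial evaluation instead of one non-linear pushover run.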

Keywords: finite element method, surface response, reliability, reliability mechanical coupling, vulnerability

Procedia PDF Downloads 103
3001 Distance Education: Using a Digital Platform to Improve Struggling University Students' Mathematical Skills

Authors: Robert Vanderburg, Nicholas Gibson

Abstract:

Objectives: There has been an increased focus on education students’ mathematics skills in the last two years, and universities have had particular difficulty teaching students who struggle with mathematics. This paper examines the ability of a digital platform to significantly improve the mathematics skills of struggling students. Methods: 32 students who scored low on a mathematics test were selected to take part in a one-month tutorial program using a digital mathematics portal. Students were provided feedback on questions posted on the portal and a fortnightly tutorial session. Results: A pre-test/post-test design was analyzed using a one-way analysis of variance (ANOVA). The analysis suggested that students improved their skills in algebra, geometry, statistics, probability, ratios, and fractions. Conclusion: Distance university students can improve their mathematics skills using a digital platform.
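For reference, the F statistic behind a one-way ANOVA can be computed directly from the group scores. The sketch below implements it from scratch; the score groups are made-up examples, not the study's data.

```python
def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square divided by
    within-group mean square."""
    data = [x for g in groups for x in g]
    grand = sum(data) / len(data)
    k, n = len(groups), len(data)
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical pre- and post-test scores for a handful of students
f_stat = one_way_anova_f([55, 60, 58, 52], [70, 75, 68, 71])
```

A large F relative to the F distribution's critical value for (k-1, n-k) degrees of freedom indicates a significant pre/post difference.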

Keywords: digital education, distance education, higher education, mathematics education

Procedia PDF Downloads 167
3000 Cooperative Spectrum Sensing Using Hybrid IWO/PSO Algorithm in Cognitive Radio Networks

Authors: Deepa Das, Susmita Das

Abstract:

Cognitive Radio (CR) is an emerging technology for combating spectrum scarcity. This is achieved by consistently sensing the spectrum and detecting under-utilized frequency bands without causing undue interference to the primary user (PU). In soft decision fusion (SDF) based cooperative spectrum sensing, various evolutionary algorithms have been proposed to optimize the weight coefficient vector that maximizes detection performance. In this paper, we propose a hybrid invasive weed optimization and particle swarm optimization (IWO/PSO) algorithm as a fast, global optimization method that improves the detection probability with a shorter sensing time. The efficiency of this algorithm is then compared with standard invasive weed optimization (IWO), particle swarm optimization (PSO), the genetic algorithm (GA) and other conventional SDF-based methods in terms of convergence and detection probability.
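The weight-optimization step can be sketched with the PSO half of the hybrid alone (the IWO seeding stage is omitted here). The toy objective below, a normalized correlation with per-user SNRs, is a hypothetical stand-in for the detection-probability objective; the SNR values and PSO coefficients are assumptions.

```python
import random

def pso_weights(fitness, dim, n_particles=20, iters=100, seed=0):
    """Plain particle swarm optimization maximizing `fitness`
    over a weight vector constrained to [0, 1]^dim."""
    rng = random.Random(seed)
    pos = [[rng.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal bests
    pbest_f = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]        # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            f = fitness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# toy SDF objective: weighted combination normalized by the weight norm
snrs = [0.5, 1.0, 2.0]                              # per-user SNRs (hypothetical)
fit = lambda w: (sum(wi * si for wi, si in zip(w, snrs))
                 / ((sum(wi ** 2 for wi in w)) ** 0.5 + 1e-12))
best_w, best_f = pso_weights(fit, dim=3)
```

The hybrid in the paper would first scatter IWO "seeds" over the search space and then hand the survivors to a swarm like this one.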

Keywords: cognitive radio, spectrum sensing, soft decision fusion, GA, PSO, IWO, hybrid IWO/PSO

Procedia PDF Downloads 437
2999 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing

Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed

Abstract:

Classically, an energy detector is implemented in the time domain (TD); however, frequency domain (FD) energy detectors have demonstrated improved performance. This paper compares the two approaches to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and the probability of detection (Pd) are derived for both approaches. The derived expressions lead naturally to analytical as well as intuitive reasoning for the improved Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on buffer size: Pf is improved in FD-based energy detectors, whereas Pd is enhanced in TD-based ones. Finally, Monte Carlo simulation results confirm the conclusions reached through the derived expressions.
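As a concrete reference point, one common Gaussian (central-limit) approximation for the time-domain energy detector expresses both probabilities through the Q-function. The sketch below implements that generic textbook approximation for a noise-normalized threshold, N samples and a linear SNR; it is not one of the exact expressions derived in the paper.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P(Z > x) for standard normal Z."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def energy_detector_pf_pd(threshold, n, snr):
    """CLT approximation for a TD energy detector.

    threshold: decision threshold normalized by the noise variance,
    n: number of samples, snr: linear signal-to-noise ratio.
    Under H0 the statistic is approx N(n, 2n); under H1 it is approx
    N(n(1+snr), 2n(1+snr)^2)."""
    pf = q_func((threshold - n) / math.sqrt(2.0 * n))
    pd = q_func((threshold - n * (1.0 + snr))
                / (math.sqrt(2.0 * n) * (1.0 + snr)))
    return pf, pd

pf, pd = energy_detector_pf_pd(threshold=120.0, n=100, snr=0.5)
```

The explicit dependence of both arguments on n mirrors the paper's observation that the Pf/Pd improvement depends on the buffer size.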

Keywords: cognitive radio, energy detector, periodogram, spectrum sensing

Procedia PDF Downloads 354
2998 Failure Simulation of Small-Scale Walls with Chases Using the Lattice Discrete Element Method

Authors: Karina C. Azzolin, Luis E. Kosteski, Alisson S. Milani, Raquel C. Zydeck

Abstract:

This work aims to numerically reproduce tests performed experimentally on reduced-scale walls with horizontal and inclined chases, using the Lattice Discrete Element Method (LDEM) implemented in the Abaqus/Explicit environment. The chases were cut with depths of 20%, 30%, and 50% in walls subjected to centered and eccentric loading. The parameters used to evaluate the numerical model are its strength, the failure mode, and the in-plane and out-of-plane displacements.

Keywords: structural masonry, wall chases, small scale, numerical model, lattice discrete element method

Procedia PDF Downloads 155
2997 The Integrated Methodological Development of Reliability, Risk and Condition-Based Maintenance in the Improvement of the Thermal Power Plant Availability

Authors: Henry Pariaman, Iwa Garniwa, Isti Surjandari, Bambang Sugiarto

Abstract:

The availability of a complex system such as a thermal power plant is strongly influenced by the reliability of its spare parts and by maintenance management policies. Reliability-centered maintenance (RCM) is an established method of analysis and the main reference for maintenance planning. This method considers the consequences of failure, but does not address the further risks of downtime associated with failures, loss of production or high maintenance costs. Risk-based maintenance (RBM) provides support strategies to minimize the risks posed by failure, selecting maintenance tasks with cost effectiveness in mind. Condition-based maintenance (CBM), meanwhile, focuses on condition monitoring so that maintenance or other action can be planned and scheduled to avoid the risk of failure ahead of time-based maintenance. RCM, RBM and CBM are applied in thermal power plants individually, or in the combinations RCM with RBM or RCM with CBM. Implementing all three techniques in an integrated maintenance scheme will increase the availability of thermal power plants compared to using the techniques individually or in pairs of two. This study therefore uses reliability-, risk- and condition-based maintenance in an integrated manner to increase the availability of thermal power plants. The method generates a Priority Maintenance Index (MPI), obtained by multiplying the Risk Priority Number (RPN) by the Risk Index (RI), and a Failure Defense Task (FDT), which can generate condition monitoring and assessment tasks in addition to maintenance tasks. Both the MPI and the FDT, obtained from the development of a functional tree, failure mode and effects analysis, fault-tree analysis, and risk analysis (risk assessment and risk evaluation), were then used to develop and implement a maintenance, monitoring and condition-assessment plan and schedule, and ultimately to perform an availability analysis. The results of this study indicate that reliability-, risk- and condition-based maintenance methods, applied in an integrated manner, can increase the availability of thermal power plants.
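The index construction can be made concrete in a few lines. In this sketch the RPN is formed as the usual FMEA product of severity, occurrence and detection ratings, and the MPI is that RPN multiplied by the risk index; the 1–10 rating scale, the risk-index values and the failure modes are illustrative assumptions.

```python
def rpn(severity, occurrence, detection):
    """FMEA Risk Priority Number: product of severity, occurrence and
    detection ratings (a 1-10 scale is assumed for each)."""
    return severity * occurrence * detection

def mpi(severity, occurrence, detection, risk_index):
    """Priority Maintenance Index: the RPN weighted by the risk index."""
    return rpn(severity, occurrence, detection) * risk_index

# rank two hypothetical failure modes of a feedwater pump by MPI
modes = {
    "bearing wear": mpi(7, 5, 4, risk_index=1.2),
    "seal leakage": mpi(5, 6, 3, risk_index=0.8),
}
ranked = sorted(modes, key=modes.get, reverse=True)
```

Failure modes at the top of the ranking would receive maintenance tasks first, with the FDT supplying monitoring and condition-assessment tasks for the remainder.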

Keywords: integrated maintenance techniques, availability, thermal power plant, MPI, FDT

Procedia PDF Downloads 773
2996 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea

Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das

Abstract:

This paper presents wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and the statistical nature of the electrocardiogram signals for different heart diseases has been compared with that of electrocardiograms from normal persons. Four heart diseases are considered: Myocardial Ischemia (MI), Congestive Heart Failure (CHF), Arrhythmia and Sleep Apnea. The statistical nature of the electrocardiograms in each case is characterized by the kurtosis values of two types of wavelet coefficients, approximation and detail, over nine wavelet decomposition levels. Kurtosis corresponding to both the approximation and detail coefficients is considered from decomposition level one to decomposition level nine. Based on significant differences, a few decomposition levels are then chosen and used for classification.
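The feature extraction pipeline can be sketched end-to-end with the simplest wavelet. The code below uses a Haar DWT (the paper does not state which wavelet family it uses, so this is an assumption) and computes the kurtosis of the approximation and detail coefficients at each level; the synthetic input is a stand-in for an ECG trace.

```python
import random
import statistics

def haar_dwt(signal):
    """One level of the Haar DWT: approximation and detail coefficients."""
    s2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s2
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s2
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def kurtosis(x):
    """Population kurtosis m4 / var^2 (3.0 for a Gaussian)."""
    m = statistics.fmean(x)
    var = statistics.fmean([(v - m) ** 2 for v in x])
    if var == 0.0:
        return 0.0    # degenerate (constant) coefficient set
    return statistics.fmean([(v - m) ** 4 for v in x]) / var ** 2

def level_kurtoses(signal, levels):
    """Kurtosis features (approximation, detail) per decomposition level."""
    feats, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append((kurtosis(a), kurtosis(d)))
    return feats

rng = random.Random(7)
ecg_like = [rng.gauss(0.0, 1.0) for _ in range(64)]   # stand-in for an ECG trace
features = level_kurtoses(ecg_like, levels=4)
```

In the paper's setting, nine levels of such (approximation, detail) kurtosis pairs form the feature vector, and the levels showing significant between-class differences are retained for the classifier.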

Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea

Procedia PDF Downloads 114