Search results for: failure probability
2421 The Effect of Peer Pressure and Leisure Boredom on Substance Use Among Adolescents in Low-Income Communities in Cape Town
Authors: Gaironeesa Hendricks, Shazly Savahl, Maria Florence
Abstract:
The aim of the study is to determine whether peer pressure and leisure boredom influence substance use among adolescents in low-income communities in Cape Town. Non-probability sampling was used to select 296 adolescents aged 16–18 from schools located in two low-income communities. The measurement tools included the Drug Use Disorders Identification Test, the Resistance to Peer Influence Scale, and the Leisure Boredom Scale. Multiple regression revealed that the combined influence of peer pressure and leisure boredom predicted substance use, with peer pressure emerging as a stronger predictor of substance use among adolescents than leisure boredom.
Keywords: substance use, peer pressure, leisure boredom, adolescents, multiple regression
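As a sketch of the multiple-regression setup (on simulated data, not the study's), ordinary least squares with two predictors can be run as follows; the invented effect sizes simply mirror the finding that peer pressure is the stronger predictor:

```python
import numpy as np

# Simulated stand-in for the study's variables (all values invented).
rng = np.random.default_rng(0)
n = 296  # sample size reported in the abstract
peer = rng.normal(50.0, 10.0, n)    # peer-pressure scale score
bored = rng.normal(30.0, 8.0, n)    # leisure-boredom scale score
use = 0.6 * peer + 0.2 * bored + rng.normal(0.0, 5.0, n)  # substance-use score

# Design matrix with an intercept column; OLS via least squares.
X = np.column_stack([np.ones(n), peer, bored])
beta, *_ = np.linalg.lstsq(X, use, rcond=None)
print(beta)  # [intercept, peer coefficient, boredom coefficient]
```

With both predictors in the model, comparing the fitted coefficients (on standardized scales in a real analysis) is what identifies the stronger predictor.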
Procedia PDF Downloads 599
2420 InfoMiracles in the Qur’an and a Mathematical Proof to the Existence of God
Authors: Mohammad Mahmoud Mandurah
Abstract:
The existence of InfoMiracles in scripture is evidence that the scripture has a divine origin. It is also evidence of the existence of God. An InfoMiracle is an information-based miracle. The basic component of an InfoMiracle is a piece of information that could not have been obtained by a human except through a divine channel. The existence of a sufficient number of convincing InfoMiracles in a scripture necessitates the existence of a divine source for these InfoMiracles. A mathematical equation is developed to prove that the Qur’an has a divine origin and, hence, prove the existence of God. The equation depends on a single variable only: the number of InfoMiracles in the Qur’an. The Qur’an is rich with InfoMiracles. It is shown that the existence of fewer than 30 InfoMiracles in the Qur’an is sufficient proof of the existence of God and that the Qur’an is a revelation from God.
Keywords: InfoMiracle, God, mathematical proof, miracle, probability
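The abstract does not reproduce the equation itself, but the underlying probability argument can be sketched: if each of n independent pieces of information has at most probability p of arising by chance, their joint chance probability is p^n, which shrinks geometrically with n. A minimal numeric illustration (the value of p is purely an assumption of this sketch, not the paper's):

```python
# Joint chance probability of n independent items, each with
# per-item chance probability p (both values illustrative only).
p = 0.1
n = 30  # the threshold count mentioned in the abstract
joint = p ** n
print(joint)  # approximately 1e-30 under these assumed values
```

The argument then treats a sufficiently small joint probability as grounds for rejecting the chance hypothesis.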
Procedia PDF Downloads 218
2419 Sand Production Modelled with Darcy Fluid Flow Using Discrete Element Method
Authors: M. N. Nwodo, Y. P. Cheng, N. H. Minh
Abstract:
In the process of recovering oil from weak sandstone formations, the strength of the sandstone around the wellbore is weakened due to the increase in effective stress/load from the completion activities around the cavity. The weakened and de-bonded sandstone may be eroded away by the produced fluid, which is termed sand production. It is one of the major trending subjects in the petroleum industry because of its significant negative impacts, as well as some observed positive impacts. For efficient sand management, therefore, there has been a need for a reliable study tool to understand the mechanism of sanding. One method of studying sand production is the use of the widely recognized Discrete Element Method (DEM) code Particle Flow Code (PFC3D), which represents sand as granular individual elements bonded together at contact points. However, there is limited knowledge of the particle-scale behavior of weak sandstone and of the parameters that affect sanding. This paper aims to investigate the reliability of using PFC3D and a simple Darcy flow in understanding the sand production behavior of a weak sandstone. An isotropic tri-axial test on a weak oil sandstone sample was first simulated at a confining stress of 1 MPa to calibrate and validate the parallel bond model of PFC3D, using a solid cylindrical model 10 m in height and 10 m in diameter. The effect of the confining stress on the number of bond failures was studied using this cylindrical model. With the calibrated data and sample material properties obtained from the tri-axial test, simulations without and with fluid flow were carried out to check the effect of Darcy flow on bond failures using the same model geometry. The fluid flow network comprised tetrahedral flow pipes connecting every four particles around a central pore or flow domain. Parametric studies included the effects of confining stress and fluid pressure, as well as validation of the flow rate–permeability relationship to verify Darcy’s fluid flow law.
The effect of model size scaling on sanding was also investigated using a 4 m high, 2 m diameter model. The parallel bond model successfully calibrated the sample’s strength of 4.4 MPa, showing a sharp peak strength before strain-softening, similar to the behavior of real cemented sandstones. The number of broken bonds appears to increase exponentially with confining stress for the bigger model, but follows a curvilinear shape for the smaller model. The presence of the Darcy flow induced tensile forces and increased the number of broken bonds. In the parametric studies, flow rate had a linear relationship with permeability at constant pressure head. The higher the fluid flow pressure, the higher the number of broken bonds/sanding. The DEM code PFC3D is a promising tool for studying the micromechanical behavior of cemented sandstones.
Keywords: discrete element method, fluid flow, parametric study, sand production/bonds failure
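The flow rate–permeability check is a direct consequence of Darcy's law, q = kAΔP/(μL): at a constant pressure head, q is linear in k. A minimal numeric sketch (all parameter values invented for illustration):

```python
def darcy_flow_rate(k, area, dp, mu, length):
    """Volumetric flow rate from Darcy's law: q = k * A * dP / (mu * L)."""
    return k * area * dp / (mu * length)

# Water-like viscosity and arbitrary geometry, purely illustrative.
mu = 1e-3       # Pa*s
area = 1.0      # m^2
dp = 1e6        # Pa, held constant (constant pressure head)
length = 10.0   # m

q1 = darcy_flow_rate(1e-13, area, dp, mu, length)
q2 = darcy_flow_rate(2e-13, area, dp, mu, length)
print(q2 / q1)  # 2.0: doubling permeability doubles the flow rate
```

Observing this proportionality in the simulated flow network is what validates the pipe-domain scheme against Darcy's law.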
Procedia PDF Downloads 323
2418 Cross-Country Mitigation Policies and Cross Border Emission Taxes
Authors: Massimo Ferrari, Maria Sole Pagliari
Abstract:
Pollution is a classic example of an economic externality: the agents who produce it do not face direct costs from emissions. Therefore, there are no direct economic incentives for reducing pollution. One way to address this market failure would be to tax emissions directly. However, because emissions are global, governments may find it optimal to wait and let foreign countries tax emissions, so that they can enjoy the benefits of lower pollution without facing its direct costs. In this paper, we first document the empirical relation between pollution and economic output with static and dynamic regression methods. We show that there is a negative relation between aggregate output and the stock of pollution (measured as the stock of CO₂ emissions). This relationship is also highly non-linear, increasing at an exponential rate. In the second part of the paper, we develop and estimate a two-country, two-sector model for the US and the euro area. With this model, we aim to analyze how the public sector should respond to higher emissions and what direct costs these policies might have. In the model, there are two types of firms: brown firms (which use a polluting technology) and green firms. Brown firms also produce an externality, CO₂ emissions, which has detrimental effects on aggregate output. As brown firms do not face direct costs from polluting, they have no incentive to reduce emissions. Notably, emissions in our model are global: the stock of CO₂ in the economy affects all countries, independently of where it is produced. This simplified economy captures the main trade-off between emissions and production, generating a classic market failure. According to our results, the current level of emissions reduces output by between 0.4 and 0.75%. Notably, these estimates lie at the upper bound of the distribution of those delivered by studies in the early 2000s.
To address this market failure, governments should step in by introducing taxes on emissions. With the tax, brown firms pay a cost for polluting and hence face an incentive to move to green technologies. Governments, however, might also adopt a beggar-thy-neighbour strategy. Reducing emissions is costly, as it moves production away from the 'optimal' mix of brown and green technology. Because emissions are global, a government could just wait for the other country to tackle climate change, reaping the benefits without facing any costs. We study how this strategic game unfolds and show three important results: first, cooperation is first-best optimal from a global perspective; second, countries face incentives to deviate from the cooperative equilibrium; third, tariffs on imported brown goods (the only retaliation policy in case of deviation from the cooperative equilibrium) are ineffective because the exchange rate would move to compensate. We finally study monetary policy when the costs of climate change rise and show that the monetary authority should react more strongly to deviations of inflation from its target.
Keywords: climate change, general equilibrium, optimal taxation, monetary policy
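The free-riding trade-off described above has the structure of a prisoner's dilemma, which a stylized two-country payoff sketch makes concrete (all payoff numbers are invented for illustration; this is not the paper's estimated model):

```python
# Each country chooses to tax emissions (1) or wait (0).
# Abatement cost is private; the benefit of lower pollution is shared
# globally, accruing per taxing country -- the free-riding structure.
COST = 3.0     # private abatement cost of taxing (invented)
BENEFIT = 2.0  # each country's benefit per taxing country (invented)

def payoff(own_tax, other_tax):
    return BENEFIT * (own_tax + other_tax) - COST * own_tax

coop = payoff(1, 1)       # both tax: 1.0 each
free_ride = payoff(0, 1)  # wait while the other taxes: 2.0
print(coop, free_ride)    # deviating beats cooperating for each country
```

With these numbers, joint taxation maximizes world welfare, yet each country gains by unilaterally deviating, so the cooperative equilibrium is not self-enforcing.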
Procedia PDF Downloads 160
2417 Multi-Criteria Evolutionary Algorithm to Develop Efficient Schedules for Complex Maintenance Problems
Authors: Sven Tackenberg, Sönke Duckwitz, Andreas Petz, Christopher M. Schlick
Abstract:
This paper introduces an extension to the well-established Resource-Constrained Project Scheduling Problem (RCPSP) to apply it to complex maintenance problems. The problem is to assign technicians to a team that has to process several tasks with multi-level skill requirements during a work shift. Here, several alternative activities for a task allow both the temporal shifting of activities and the reallocation of technicians and tools. As a result, switches from one valid work process variant to another can be considered and may be selected by the developed evolutionary algorithm based on the present skill levels of the technicians or the available tools. An additional complication of the observed scheduling problem is that the locations of the construction sites are only temporarily accessible during the day. Due to intensive rail traffic, the available time slots for maintenance and repair works are extremely short and are often distributed throughout the day. To identify efficient working periods, a first concept of a Bayesian network is introduced and integrated into the extended RCPSP with pre-emptive and non-pre-emptive tasks. Thereby, the Bayesian network is used to calculate the probability of a maintenance task being processed during a specific period of the shift. Focusing on the domain of maintenance of railway infrastructure in metropolitan areas, where the implementation process at the construction site is most unproductive, the paper illustrates how the extended RCPSP can be applied for maintenance planning support. A multi-criteria evolutionary algorithm with a problem representation is introduced that is capable of revising technician-task allocations even when task durations are stochastic. The approach uses a novel activity-list representation to ensure easily describable and modifiable elements that can be converted into detailed shift schedules.
Thereby, the main objective is to develop a shift plan that maximizes the utilization of each technician by minimizing the waiting times caused by rail traffic. The results of the already implemented core algorithm show fast convergence towards an optimal team composition for a shift, an efficient sequence of tasks, and a high probability of successful subsequent implementation despite the stochastic durations of the tasks. In the paper, the algorithm for the extended RCPSP is analyzed in an experimental evaluation using real-world example problems of various sizes, resource complexities, and degrees of tightness.
Keywords: maintenance management, scheduling, resource constrained project scheduling problem, genetic algorithms
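The activity-list representation can be illustrated with a toy serial schedule-generation scheme: a chromosome is a precedence-feasible permutation of tasks that decodes deterministically into start times. Tasks, durations, and precedences below are invented, and a single technician (one unit renewable resource) is assumed:

```python
# Invented toy instance: four tasks, deterministic durations, precedences.
durations = {"A": 2, "B": 3, "C": 1, "D": 2}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

def decode(activity_list):
    """Serial schedule-generation: walk the list, start each task as early
    as precedence and the single shared technician allow."""
    finish = {}
    resource_free = 0  # time at which the lone technician becomes free
    for task in activity_list:
        earliest = max([finish[p] for p in preds[task]], default=0)
        start = max(earliest, resource_free)
        finish[task] = start + durations[task]
        resource_free = finish[task]
    return finish

print(decode(["A", "B", "C", "D"]))  # finish of D gives the makespan
```

In the evolutionary algorithm, crossover and mutation would operate on such lists (repaired to stay precedence-feasible), and the decoded makespan or waiting time would feed the fitness evaluation.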
Procedia PDF Downloads 231
2416 ML-Based Blind Frequency Offset Estimation Schemes for OFDM Systems in Non-Gaussian Noise Environments
Authors: Keunhong Chae, Seokho Yoon
Abstract:
This paper proposes frequency offset (FO) estimation schemes robust to non-Gaussian noise for orthogonal frequency division multiplexing (OFDM) systems. A maximum-likelihood (ML) scheme and a low-complexity estimation scheme are proposed by applying the probability density function of the cyclic prefix of OFDM symbols to the ML criterion. Simulation results confirm that the proposed schemes offer a significant FO estimation performance improvement over the conventional estimation scheme in non-Gaussian noise environments.
Keywords: frequency offset, cyclic prefix, maximum-likelihood, non-Gaussian noise, OFDM
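The cyclic-prefix idea behind the ML criterion can be sketched with the classical CP correlation estimator: in the received signal, the prefix and the symbol tail it copies differ only by a phase rotation proportional to the frequency offset. The sketch below is noiseless and does not reproduce the paper's non-Gaussian-noise weighting; the FFT size, CP length, and offset value are assumed:

```python
import numpy as np

rng = np.random.default_rng(1)
N, Ncp = 64, 16      # FFT size and cyclic-prefix length (assumed)
eps_true = 0.12      # normalized carrier frequency offset (assumed)

# One OFDM symbol: random QPSK subcarriers, IFFT, prepend cyclic prefix.
qpsk = rng.choice(np.array([1+1j, 1-1j, -1+1j, -1-1j]), N)
sym = np.fft.ifft(qpsk)
tx = np.concatenate([sym[-Ncp:], sym])

# Apply the carrier frequency offset.
n_idx = np.arange(N + Ncp)
rx = tx * np.exp(2j * np.pi * eps_true * n_idx / N)

# CP samples and the symbol tail differ by a phase of -2*pi*eps:
corr = np.sum(rx[:Ncp] * np.conj(rx[N:N + Ncp]))
eps_hat = -np.angle(corr) / (2 * np.pi)
print(eps_hat)  # recovers 0.12 in this noiseless sketch
```

The paper's contribution replaces the implicit Gaussian assumption in this correlation with the CP's probability density under non-Gaussian (e.g., impulsive) noise, which reweights the terms of the sum.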
Procedia PDF Downloads 476
2415 Photoactivated Chromophore for Keratitis-Cross Linking Window Absorption Alone versus Combined Pack-CXL Window Absorption and Standard Anti-microbial Therapy for Treatment of Infectious Keratitis: A Prospective Study
Authors: Mohammed M. Mahdy Tawfeek
Abstract:
Objective: The aim of this work is to compare the outcome of photoactivated chromophore for keratitis-cross linking (PACK-CXL) window absorption (WA) alone with combined PACK-CXL WA and standard anti-microbial therapy (SAT) for the treatment of infectious keratitis. Patients and Methods: This is a randomized prospective comparative clinical trial. Thirty eyes with clinically suspected infectious keratitis were randomly assigned to two equal groups of 15 eyes each: group (A) was treated by PACK-CXL WA alone, and group (B) was treated by PACK-CXL WA combined with SAT. Organisms were identified by laboratory study before treatment. Corneal healing was evaluated by corneal examination and anterior segment OCT (AS-OCT). Written informed consent was obtained from all participants, and the study was approved by the research ethics committee of the Faculty of Medicine, Zagazig University. The work has been carried out in accordance with the Code of Ethics of the World Medical Association (Declaration of Helsinki) for studies involving humans. Results: Complete healing and resolution (successful treatment) were observed in 10 eyes (66.7%) of group (A) and 14 eyes (93.3%) of group (B), while failure was observed in 5 eyes (33.3%) of group (A) and one eye (6.7%) of group (B). The differences between the two groups were statistically significant (P = 0.042 and 0.003 for success and failure of treatment, respectively). Complete corneal healing was reported in the third month postoperatively in 10 eyes (66.7%) of group (A) and 14 eyes (93.3%) of group (B). Complications were absent in 12 patients (80%) of group (A) and 14 patients (93.3%) of group (B); however, perforation and impending perforation were found in 3 patients of group (A) and only one patient of group (B). Conclusion: PACK-CXL is a promising, non-invasive treatment option for infectious keratitis, especially when performed with the window absorption (WA) technique, either alone or combined with SAT.
It has a synergistic effect with standard antimicrobial treatment, giving good outcomes in the treatment of infectious keratitis. It also avoids the antibiotic resistance that is spreading rapidly worldwide.
Keywords: corneal cross linking, infectious keratitis, PACK-CXL, window absorption
Procedia PDF Downloads 140
2414 Influence of Climate Change on Landslides in Northeast India: A Case Study
Authors: G. Vishnu, T. V. Bharat
Abstract:
Rainfall plays a major role in the stability of natural slopes in tropical and subtropical regions. These slopes usually have high slope angles and are stable during the dry season. The critical rainfall intensity that triggers a landslide may not be the highest rainfall. In addition to geological discontinuities and anthropogenic factors, water content, suction, and hydraulic conductivity also play a role. A thorough geotechnical investigation based on the principles of unsaturated soil mechanics is required to predict failures in these cases. The study discusses three landslide events that occurred in the residual hills of Guwahati, India. Rainfall data, historic images, and land-use and slope maps of the region were analyzed and discussed. The landslides occurred on June 24, 26, and 28, 2020 at the respective sites, but the highest rainfall fell on June 6 and 17, 2020. The landslides were triggered by a combination of critical events initiated by rainfall, which caused a reduction in suction. The sites consist of a mixture of rock and soil. Slope failure occurs when saturation of the soil layer leads to a loss of soil strength, resulting in flow of the entire soil-rock mass. Land-use change, construction activities, and other human and natural activities that lead to faster disintegration of the rock mass may accelerate landslide events. Landslides on these slopes are inevitable, and the development of an early warning system (EWS) to save human lives and resources is a feasible response. The actual time of failure of a slope can be better predicted by considering all these factors rather than depending solely on rainfall intensities. An effective EWS with fewer false alarms requires proper slope instrumentation and appropriate climatic downscaling.
Keywords: early warning system, historic image analysis, slope instrumentation, unsaturated soil mechanics
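The suction-dependent stability argument can be illustrated with a simplified infinite-slope factor of safety that includes a matric-suction strength term, in the spirit of the extended Mohr-Coulomb criterion of unsaturated soil mechanics. All parameter values and the slope geometry below are invented for the sketch (seepage and pore-air effects are ignored):

```python
import math

def factor_of_safety(c_eff, suction, phi_b, gamma, depth, beta, phi_eff):
    """Infinite-slope FoS with a matric-suction term:
    strength = c' + (ua - uw) * tan(phi_b) + sigma_n * tan(phi')."""
    b = math.radians(beta)
    sigma_n = gamma * depth * math.cos(b) ** 2        # normal stress, kPa
    tau = gamma * depth * math.sin(b) * math.cos(b)   # mobilized shear, kPa
    strength = (c_eff
                + suction * math.tan(math.radians(phi_b))
                + sigma_n * math.tan(math.radians(phi_eff)))
    return strength / tau

# Invented values: 45-degree residual-soil slope, 2 m failure depth.
dry = factor_of_safety(5.0, 50.0, 15.0, 18.0, 2.0, 45.0, 30.0)  # with suction
wet = factor_of_safety(5.0, 0.0, 15.0, 18.0, 2.0, 45.0, 30.0)   # suction lost
print(dry, wet)  # stable while suction persists, unstable once it vanishes
```

This captures why the triggering rainfall need not be the heaviest one: what matters is whether infiltration has driven the suction term toward zero.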
Procedia PDF Downloads 114
2413 Monotonicity of the Jensen Functional for f-Divergences via the Zipf-Mandelbrot Law
Authors: Neda Lovričević, Đilda Pečarić, Josip Pečarić
Abstract:
The Jensen functional in its discrete form is brought into relation with the Csiszar divergence functional, this time via its monotonicity property. This approach generalizes previously obtained results that made use of interpolating Jensen-type inequalities. The monotonicity property is thus combined with the Zipf-Mandelbrot law and applied to f-divergences for probability distributions that derive from the Csiszar divergence functional: the Kullback-Leibler divergence, Hellinger distance, Bhattacharyya distance, chi-square divergence, and total variation distance. The Zipf-Mandelbrot and Zipf laws are widely used across scientific and interdisciplinary fields; here the focus is on the mathematical-inequalities aspect.
Keywords: Jensen functional, monotonicity, Csiszar divergence functional, f-divergences, Zipf-Mandelbrot law
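As a concrete instance of the objects involved, the Kullback-Leibler divergence (one of the listed f-divergences) between two Zipf-Mandelbrot laws can be computed directly; the Zipf-Mandelbrot probability mass is proportional to 1/(i+q)^s, and the parameter values below are arbitrary:

```python
import math

def zipf_mandelbrot(N, q, s):
    """Zipf-Mandelbrot law on {1..N}: p_i proportional to 1/(i+q)^s."""
    w = [1.0 / (i + q) ** s for i in range(1, N + 1)]
    Z = sum(w)  # generalized harmonic normalizer
    return [x / Z for x in w]

def kl_divergence(p, r):
    """Kullback-Leibler divergence D(p||r), assuming full support."""
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r))

p = zipf_mandelbrot(100, 1.0, 1.2)  # arbitrary (q, s) choices
r = zipf_mandelbrot(100, 2.0, 1.0)
print(kl_divergence(p, r))  # nonnegative; zero iff the two laws coincide
```

The paper's inequalities bound such divergences via the monotonicity of the Jensen functional rather than computing them pointwise.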
Procedia PDF Downloads 142
2412 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms
Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee
Abstract:
Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of aging materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques to assess the condition of the composites and prevent continual growth of fiber damage are required. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by detecting damage or defects from static or dynamic responses induced by external loading. A variety of techniques based on detecting changes in the static or dynamic behavior of isotropic structures have been developed over the last two decades. These analytically based methods are limited in dealing with complex systems, primarily because of difficulties in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristics and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA), and neural networks (NN), and have promisingly applied these methods to the field of structural identification.
Among them, GAs attract our attention because they do not require a considerable amount of data in advance when dealing with complex problems and, as opposed to classical gradient-based optimization techniques, make a global solution search possible. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect fiber property variation in laminated composite plates from the micromechanical point of view. A finite element model is used to study the free vibrations of laminated composite plates with fiber stiffness degradation. In order to solve the inverse problem using the combined method, this study uses only the first mode shapes of the structure as the measured frequency data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences
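The bivariate-Gaussian stiffness-degradation idea can be pictured with a plain (unmodified) bivariate Gaussian damage patch: local stiffness is reduced most at the patch center and recovers toward the pristine value away from it. The center, spreads, and damage depth below are invented; in the paper's inverse problem, parameters of this kind would be the unknowns the GA identifies from the measured mode shapes:

```python
import math

def degraded_stiffness(x, y, E0=70e9, depth=0.4,
                       cx=0.5, cy=0.5, sx=0.10, sy=0.15):
    """Local stiffness E(x, y) reduced by a Gaussian-shaped damage patch.
    depth = peak fractional stiffness loss; (cx, cy) = patch center;
    (sx, sy) = patch spreads. All values are illustrative assumptions."""
    damage = depth * math.exp(-((x - cx) ** 2 / (2 * sx ** 2)
                                + (y - cy) ** 2 / (2 * sy ** 2)))
    return E0 * (1.0 - damage)

center = degraded_stiffness(0.5, 0.5)  # most degraded point of the plate
corner = degraded_stiffness(0.0, 0.0)  # nearly pristine far from the patch
print(center, corner)
```

A GA would evaluate candidate parameter sets by comparing the finite-element mode shapes of the degraded plate with the measured ones.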
Procedia PDF Downloads 274
2411 Pentax Airway Scope Video Laryngoscope for Orotracheal Intubation in Children: A Randomized Controlled Trial
Authors: In Kyong Yi, Yun Jeong Chae, Jihoon Hwang, Sook-Young Lee, Jong-Yeop Kim
Abstract:
Background: The Pentax airway scope (AWS) is a recently developed video laryngoscope for use in both normal and difficult airways, providing a good laryngeal view. The purpose of this randomized noninferiority study was to evaluate the efficacy of the Pentax-AWS regarding intubation time, laryngeal view, and ease of intubation in pediatric patients with normal airways, compared to the Macintosh laryngoscope. Method: A total of 136 pediatric patients aged 1 to 10 years with American Society of Anesthesiologists physical status I or II undergoing general anesthesia and requiring orotracheal intubation were randomly allocated to two groups: Macintosh laryngoscope (n=68) and Pentax AWS (n=68). Anesthesia was induced with propofol, rocuronium, and sevoflurane. The primary outcome was intubation time. Cormack-Lehane laryngeal view grade, application of optimal laryngeal external manipulation (OELM), intubation difficulty scale (IDS), intubation failure rate, and adverse events were also measured. Result: No significant difference was observed between the two groups regarding intubation time (Macintosh 23 [22-26] sec vs. Pentax 23.5 [22-27.75] sec, p=0.713). As for the laryngeal view grade, the Pentax group showed fewer cases of grade 2a or higher compared to the Macintosh group (1/2a/2b/3: 52.9%/41.2%/4.4%/1.5% vs. 98.5%/1.5%/0%/0%, p=0.000). No OELM was required in the Pentax group (38.2% vs. 0%, p=0.000). The intubation difficulty scale yielded lower values for the Pentax group (0 [0-2] vs. 0 [0-0.55], p=0.001). The failure rate was not different between the two groups (1.5% vs. 4.4%, p=0.619). Regarding adverse events, a slightly higher incidence of bleeding (1.5% vs. 5.9%, p=0.172) and of teeth injury (0% vs. 5.9%, p=0.042) occurred in the Pentax group. Conclusion: Pentax-AWS provided a better laryngeal view, similar intubation time, and a similar success rate compared with the Macintosh laryngoscope in children with normal airways.
However, the risk of teeth injury might increase and warrants special attention.
Keywords: Pentax-AWS, pediatric, video laryngoscope, intubation
Procedia PDF Downloads 202
2410 Evaluating Daylight Performance in an Office Environment in Malaysia, Using Venetian Blind System: Case Study
Authors: Fatemeh Deldarabdolmaleki, Mohamad Fakri Zaky Bin Ja'afar
Abstract:
Having a daylit space together with a view results in a pleasant and productive environment for office employees. A daylit space is a space that utilizes daylight as the basic source of illumination to fulfill users' visual demands while minimizing electric energy consumption. Malaysian weather is hot and humid throughout the year because of the country's location in the equatorial belt. However, because most commercial buildings in Malaysia are air-conditioned, huge glass windows are normally installed in order to keep the physical and visual relation between inside and outside. As a result of the climatic situation and this trend, an ordinary office has huge heat gain, glare, and discomfort for occupants. Balancing occupant comfort and energy conservation in a tropical climate is a real challenge. This study concentrates on evaluating a venetian blind system using per-pixel analysis tools based on the cut-out metrics suggested in the literature. The workplace area in a private office room was selected as a case study. An eight-day measurement experiment was conducted to investigate the effect of different venetian blind angles in an office area under daylight conditions in Serdang, Malaysia. The study goal was to explore the daylight comfort of a commercially available venetian blind system, its daylight sufficiency and excess (8:00 AM to 5:00 PM), as well as glare. Recently developed software for analyzing High Dynamic Range Images (HDRI, captured by a CCD camera), such as the Radiance-based Evalglare and hdrscope, helps to investigate luminance-based metrics. The main key factors are illuminance and luminance levels, mean and maximum luminance, daylight glare probability (DGP), and the luminance ratios of the selected mask regions. The findings show that in most cases, the morning session needs artificial lighting in order to achieve daylight comfort. However, in some conditions (e.g., 10° and 40° slat angles), in the second half of the day the workplane illuminance level exceeds the maximum of 2000 lx. Generally, a rising trend is discovered in mean window luminance, and the most unpleasant cases occur after 2 P.M. Considering the luminance criteria rating, the uncomfortable conditions occur in the afternoon session. Surprisingly, in the no-blind condition, extreme cases of the window/task ratio are not common. Studying the daylight glare probability, no DGP value higher than 0.35 was observed in this experiment.
Keywords: daylighting, energy simulation, office environment, Venetian blind
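Since the study's glare judgment hinges on DGP staying below 0.35, a small sketch of the comfort bands commonly used with Evalglare may help; the threshold values are the ones usually quoted in the DGP literature and are treated here as assumptions rather than the study's own definitions:

```python
def glare_class(dgp):
    """Map a daylight glare probability value to the commonly used
    comfort bands (threshold values assumed from the DGP literature)."""
    if dgp < 0.35:
        return "imperceptible"
    if dgp < 0.40:
        return "perceptible"
    if dgp < 0.45:
        return "disturbing"
    return "intolerable"

# All DGP values measured in this study fell below 0.35:
print(glare_class(0.30))
```

Under these bands, every measured condition in the experiment would classify as imperceptible glare, consistent with the abstract's conclusion.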
Procedia PDF Downloads 259
2409 The Lopsided Burden of Non-Communicable Diseases in India: Evidences from the Decade 2004-2014
Authors: Kajori Banerjee, Laxmi Kant Dwivedi
Abstract:
India is part of the ongoing globalization, contemporary convergence, industrialization, and technical advancement taking place worldwide. Among the manifestations of this evolution are rapid demographic, socio-economic, epidemiological, and health transitions. There has been a considerable increase in non-communicable diseases due to changes in lifestyle. This study aims to assess the direction of the burden of disease and to compare the pressure of infectious diseases against cardio-vascular, endocrine, metabolic, and nutritional diseases. The change in prevalence over the ten-year period (2004-2014) is further decomposed to determine the net contribution of various socio-economic and demographic covariates. The present study uses the recent 71st (2014) and 60th (2004) rounds of the National Sample Survey. The pressure of infectious diseases against cardio-vascular (CVD) and endocrine, metabolic, and nutritional (EMN) diseases during 2004-2014 is assessed through prevalence rates (PR), hospitalization rates (HR), and case fatality rates (CFR). The prevalence of non-communicable diseases is further used as the dependent variable in a logit regression to estimate the effect of various social, economic, and demographic factors on the odds of suffering from a particular disease. The multivariate decomposition technique further assists in determining the net contribution of socio-economic and demographic covariates. This paper presents evidence of stagnation in the burden of communicable diseases (CD) and a rapid increase in the burden of non-communicable diseases (NCD) uniformly across all population sub-groups in India. The CFR for CVD increased drastically over 2004-2014. Logit regression indicates that the chances of suffering from CVD and EMN diseases are significantly higher among urban residents, older age groups, females, and widowed, divorced, and separated individuals.
Decomposition provides ample evidence that improvements in quality-of-life markers such as education, urbanization, and longevity have contributed positively to the increase in NCD prevalence. In India's current epidemiological phase, the compression theory of morbidity is at work, as a significant rise in the probability of contracting NCDs among older age groups is observed over the period. Age is found to be a vital contributor to the increasing probability of having CVD and EMN diseases over the study decade 2004-2014 in the nationally representative National Sample Survey sample.
Keywords: cardio-vascular disease, case-fatality rate, communicable diseases, hospitalization rate, multivariate decomposition, non-communicable diseases, prevalence rate
Procedia PDF Downloads 313
2408 Modeling Search-And-Rescue Operations by Autonomous Mobile Robots at Sea
Authors: B. Kriheli, E. Levner, T. C. E. Cheng, C. T. Ng
Abstract:
During the last decades, research interest in planning, scheduling, and control of emergency response operations, especially the rescue and evacuation of people from the dangerous zones of marine accidents, has increased dramatically. Until the survivors (called 'targets') are found and saved, losses or damage may accrue, with an extent that depends on the location of the targets and the search duration. The problem is to efficiently search for and detect/rescue the targets as soon as possible with the help of intelligent mobile robots, so as to maximize the number of people saved and/or minimize the search cost under restrictions on the number of people saved within the allowable response time. We consider the special situation in which the autonomous mobile robots (AMRs), e.g., unmanned aerial vehicles and remote-controlled robo-ships, have no operator on board, being guided and completely controlled by on-board sensors and computer programs. We construct a mathematical model for the search process in an uncertain environment and provide a new fast algorithm for scheduling the activities of the autonomous robots during search-and-rescue missions after an accident at sea. We presume that in unknown environments, the AMR's search-and-rescue activity is subject to two types of error: (i) a 'false-negative' detection error, in which a target object is not discovered ('overlooked') by the AMR's sensors even though the AMR is in a close neighborhood of the target, and (ii) a 'false-positive' detection error, also known as a 'false alarm', in which a clean place or area is wrongly classified by the AMR's sensors as a correct target. As the general resource-constrained discrete search problem is NP-hard, we restrict our study to finding locally optimal strategies.
A specificity of the considered operational research problem, in comparison with the traditional Kadane-De Groot-Stone search models, is that in our model the probability of a successful search outcome depends not only on the cost/time/probability parameters assigned to each individual location but also on parameters characterizing the entire history of (unsuccessful) search before any next location is selected. We provide a fast approximation algorithm for finding the AMR route that adopts a greedy search strategy: at each step, the on-board computer computes a current search effectiveness value for each location in the zone and searches the location with the highest search effectiveness value next. Extensive experiments with random and real-life data provide strong evidence in favor of the suggested operations research model and the corresponding algorithm.
Keywords: disaster management, intelligent robots, scheduling algorithm, search-and-rescue at sea
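A minimal sketch of such a greedy strategy, under stated assumptions (invented prior probabilities, overlook probabilities, and costs; false alarms are not modeled): each step picks the location with the highest detection probability per unit cost, then performs the Bayesian update for an unsuccessful inspection, so the effectiveness values indeed depend on the whole history of unsuccessful looks:

```python
def greedy_search(prior, overlook, cost, steps):
    """Greedy route: repeatedly inspect the location with the highest
    current detection probability per unit cost, updating the posterior
    target-location probabilities after each unsuccessful look."""
    p = list(prior)  # posterior probability that the target is at each location
    route = []
    for _ in range(steps):
        # search effectiveness: chance of detecting the target per unit cost
        scores = [p[i] * (1 - overlook[i]) / cost[i] for i in range(len(p))]
        i = max(range(len(p)), key=scores.__getitem__)
        route.append(i)
        # Bayes update, conditioning on no detection at location i
        denom = 1 - p[i] * (1 - overlook[i])
        p = [(p[j] * overlook[j] if j == i else p[j]) / denom
             for j in range(len(p))]
    return route

# Three locations; the cheap, likely ones get revisited as beliefs shift.
route = greedy_search([0.5, 0.3, 0.2], [0.2, 0.1, 0.1], [1.0, 1.0, 2.0], 4)
print(route)
```

The history dependence shows up in the posterior vector: after each unsuccessful look, probability mass flows away from the searched location, which can make a previously inspected location the best choice again later.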
Procedia PDF Downloads 172
2407 Multi-Level Security Measures in Cloud Computing
Authors: Shobha G. Ranjan
Abstract:
Cloud computing is an emerging, on-demand, internet-based technology. A variety of services, such as software, hardware, data storage, and infrastructure, can be shared through cloud computing. This technology is highly reliable, cost-effective, and scalable in nature. It is essential that only authorized users access these services. Further, the time granted to access these services should be recorded for proper accounting purposes. Currently, many organizations implement security measures in many different ways to provide the best cloud infrastructure to their clients, but these measures alone are not sufficient. This paper presents a multi-level security measure technique that is in accordance with the OSI model. Details of the proposed multi-level security measure technique are presented along with the architecture, activities, algorithms, and the probability of success in breaking authentication.
Keywords: cloud computing, cloud security, integrity, multi-tenancy, security
Procedia PDF Downloads 501
2406 Effects of Polyvictimization in Suicidal Ideation among Children and Adolescents in Chile
Authors: Oscar E. Cariceo
Abstract:
In Chile, there is a lack of evidence about the impact of polyvictimization on the emergence of suicidal thoughts among children and young people. Thus, this study aims to explore the association between the episodes of polyvictimization suffered by Chilean children and young people and the manifestation of signs related to suicidal tendencies. To achieve this purpose, secondary data from the First Polyvictimization Survey on Children and Adolescents of 2017 were analyzed, and a binomial logistic regression model was applied to estimate the probability that young people experience episodes of suicidal ideation. The main findings show that women between the ages of 13 and 15 years, in seventh grade of primary or second grade of secondary education in subsidized schools, are more likely to express suicidal ideas, and this likelihood increases if they have suffered different types of victimization, particularly physical violence, psychological aggression, and sexual abuse.
Keywords: Chile, polyvictimization, suicidal ideation, youth
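A binomial logistic regression of this kind can be sketched as follows. The data, feature names, and coefficients below are invented for illustration; only the model family matches the abstract.

```python
import numpy as np

# Synthetic example (NOT the survey data): binary victimization indicators
# predicting suicidal-ideation probability via a binomial logistic model.
rng = np.random.default_rng(0)
n = 2000
# columns: physical violence, psychological aggression, sexual abuse
X = rng.integers(0, 2, size=(n, 3)).astype(float)
true_beta = np.array([0.9, 0.6, 1.2])          # assumed effect sizes
logits = -1.5 + X @ true_beta
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# fit by plain gradient ascent on the log-likelihood
beta, intercept = np.zeros(3), 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(intercept + X @ beta)))
    beta += 0.5 * (X.T @ (y - p) / n)
    intercept += 0.5 * np.mean(y - p)

odds_ratios = np.exp(beta)  # OR > 1 means the factor raises ideation odds
print(np.round(odds_ratios, 2))
```

With positive true coefficients, all three estimated odds ratios come out above 1, mirroring the direction of the reported associations.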
Procedia PDF Downloads 178
2405 Efficacy of In-Situ Surgical vs. Needle Revision on Late Failed Trabeculectomy Blebs
Authors: Xie Xiaobin, Zhang Yan, Shi Yipeng, Sun Wenying, Chen Shuang, Cai Zhipeng, Zhang Hong, Zhang Lixia, Xie Like
Abstract:
Objective: To compare the efficacy of late in-situ surgical revision augmented with continuous infusion versus needle revision on failed trabeculectomy blebs. Methods: From December 2018 to December 2021, a prospective randomized controlled trial was performed on 44 glaucoma patients with blebs that had failed ≥ 6 months earlier and medically uncontrolled IOP at the Eye Hospital, China Academy of Chinese Medical Sciences. They were randomly divided into two groups: 22 eyes of 22 patients underwent late in-situ surgical revision with continuous anterior chamber infusion (study group), and 22 eyes of 22 patients were treated with needle revision (control group). Main outcome measures included preoperative and postoperative intraocular pressure (IOP), the number of anti-glaucoma medicines, the operation success rate, and postoperative complications. Results: The postoperative IOP values decreased significantly from baseline in both groups (both P<0.05). IOP was significantly lower in the study group than in the control group at 1 week, 1 month, and 3 months postoperatively (all P<0.05), and IOP reductions in the study group were substantially more prominent than in the control group at all postoperative time points (all P<0.05). The complete success rate in the study group was significantly higher than in the control group (71.4% vs. 33.3%, P<0.05), while the complete failure rate was significantly lower in the study group (0% vs. 28.5%, P<0.05). According to Cox proportional hazards regression analysis, high IOP at baseline was independently associated with an increased risk of complete failure (adjusted hazard ratio = 1.141, 95% confidence interval = 1.021-1.276, P<0.05). There was no significant difference in the incidence of postoperative complications between the two groups (P>0.05).
Conclusion: Both in-situ surgical and needle revision have acceptable success rates and safety for late failed trabeculectomy blebs, with the former likely offering a higher level of efficacy. Needle revision may be insufficient for eyes with a low target IOP.
Keywords: glaucoma, trabeculectomy blebs, in-situ surgical revision, needle revision
Procedia PDF Downloads 84
2404 The Impact of the Great Irish Famine on Irish Mass Migration to the United States at the Turn of the Twentieth Century
Authors: Gayane Vardanyan, Gaia Narciso, Battista Severgnini
Abstract:
This paper investigates the long-run impact of the Great Irish Famine on emigration from Ireland at the turn of the twentieth century. To do so, we combine the 1901 and 1911 Irish Census data sets with the Ellis Island administrative records on Irish migrants to the United States. We find that the migrants were more likely to be Catholic, literate, unmarried, young, and Gaelic-speaking compared to those who stayed. Running individual-level specifications, our preliminary findings suggest that being born in a place where the Famine was more severe increases the probability of becoming a migrant in the long run. We also intend to explore the mechanisms through which this impact occurs.
Keywords: Great Famine, mass migration, long-run impact, mechanisms
Procedia PDF Downloads 238
2403 Trip Reduction in Turbo Machinery
Authors: Pranay Mathur, Carlo Michelassi, Simi Karatha, Gilda Pedoto
Abstract:
Industrial plant uptime is of the utmost importance for reliable, profitable, and sustainable operation. Trips and failed starts have a major impact on plant reliability, and all plant operators focus on the effort required to minimise them. The performance of these CTQs is measured with two metrics, MTBT (mean time between trips) and SR (starting reliability). These metrics help to identify the top failure modes and the units that need more effort to improve plant reliability. The Baker Hughes trip reduction program is structured to reduce these unwanted trips:
1. Real-time machine operational parameters available remotely, capturing the signature of malfunctions including the related boundary conditions.
2. A real-time alerting system based on analytics, available remotely.
3. Remote access to trip logs and alarms from the control system to identify the cause of events.
4. Continuous support to field engineers by remotely connecting them with subject matter experts.
5. Live tracking of key CTQs.
6. Benchmarking against the fleet.
7. Breaking down the cause of failure to component level.
8. Investigating top contributors and identifying design and operational root causes.
9. Implementing corrective and preventive actions.
10. Assessing the effectiveness of implemented solutions using reliability growth models.
11. Developing analytics for predictive maintenance.
With this approach, the Baker Hughes team is able to support customers in achieving their reliability key performance indicators for monitored units, with large cost savings for plant operators. This presentation explains the approach and provides successful case studies, in particular where 12 LNG and pipeline operators with about 140 gas compression line-ups have adopted these techniques, significantly reduced the number of trips, and improved MTBT.
Keywords: reliability, availability, sustainability, digital infrastructure, Weibull, effectiveness, automation, trips, failed starts
Procedia PDF Downloads 76
2402 Optimal Diversification and Bank Value Maximization
Authors: Chien-Chih Lin
Abstract:
This study argues that the optimal diversification for the maximization of bank value is asymmetrical; it depends on the business cycle. During times of expansion, systematic risks are relatively low, and hence there is only a slight effect from raising them with a diversified portfolio. Consequently, the benefit of reducing individual risks dominates any loss from raising systematic risks, leading to a higher value for a bank holding a diversified portfolio of assets. On the contrary, in times of recession, systematic risks are relatively high, and it is more likely that the loss from raising systematic risks surpasses the benefit of reducing individual risks through portfolio diversification. Consequently, more diversification leads to lower bank values. Finally, some empirical evidence from banks in Taiwan is provided.
Keywords: diversification, default probability, systemic risk, banking, business cycle
Procedia PDF Downloads 437
2401 A New Approach to Retrofit Steel Moment Resisting Frame Structures after Mainshock
Authors: Amir H. Farivarrad, Kiarash M. Dolatshahi
Abstract:
During earthquake events, aftershocks can significantly increase the probability of collapse of buildings, especially those damaged during the mainshock. In this paper, a practical approach is proposed for the seismic rehabilitation of mainshock-damaged buildings that can be implemented within a few days after the mainshock. To show the efficacy of the proposed method, a nine-story steel moment frame building designed to pre-Northridge codes is chosen as a case study. The collapse fragility curve for the aftershock is presented for both the retrofitted and non-retrofitted structures. Comparison of the collapse fragility curves shows that the proposed method is indeed applicable for reducing the seismic collapse risk.
Keywords: aftershock, collapse fragility curve, seismic rehabilitation, seismic retrofitting
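A collapse fragility curve of the kind compared here is conventionally modeled as a lognormal CDF in the ground-motion intensity measure. The parameter values below are illustrative assumptions, not results from the case-study frame.

```python
import math

# Generic lognormal fragility: P(collapse | Sa) = Phi(ln(Sa / theta) / beta),
# where theta is the median collapse capacity (g) and beta the dispersion.
def collapse_fragility(sa, theta, beta):
    """Probability of collapse at spectral acceleration sa (in g)."""
    z = math.log(sa / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Retrofitting typically raises the median capacity theta, shifting the
# fragility curve to the right and lowering collapse probability at a given Sa.
p_original = collapse_fragility(sa=0.5, theta=0.6, beta=0.4)
p_retrofit = collapse_fragility(sa=0.5, theta=0.9, beta=0.4)
print(round(p_original, 3), round(p_retrofit, 3))
```

Comparing the two curves at a fixed intensity is exactly how the paper's retrofitted versus non-retrofitted comparison is read.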
Procedia PDF Downloads 433
2400 A Parametric Study on Effects of Internal Factors on Carbonation of Reinforced Concrete
Authors: Kunal Tongaria, Abhishek Mangal, S. Mandal, Devendra Mohan
Abstract:
The carbonation of concrete is a phenomenon governed by various interdependent parameters; therefore, despite the extensive literature and databases, useful generalization is not an easy task. These interdependent parameters can be grouped into internal and external factors. This paper focuses on the internal parameters that govern and increase the probability of the ingress of deleterious substances into concrete. The mechanisms by which internal parameters act, such as the microstructure with and without supplementary cementing materials (SCMs), the water/binder ratio, and the age of the concrete, are discussed. This is followed by a comparison of various proposed mathematical models for the deterioration of concrete. Based on existing laboratory experiments as well as field results, this paper summarizes the present understanding of the mechanisms, the modeling, and future research needs in this field.
Keywords: carbonation, diffusion coefficient, microstructure of concrete, reinforced concrete
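The simplest of the mathematical models typically compared is the diffusion-based square-root-of-time law for carbonation depth; the carbonation coefficients below are illustrative assumptions, not values from the paper.

```python
import math

# Classical diffusion model: carbonation depth x(t) = K * sqrt(t), where the
# coefficient K (mm/year^0.5) bundles the internal factors discussed above
# (water/binder ratio, SCM content, microstructure).
def carbonation_depth(K, years):
    return K * math.sqrt(years)

# A higher water/binder ratio (coarser pore structure) means a larger K.
depth_low_wb = carbonation_depth(K=3.0, years=50)   # mm, dense concrete
depth_high_wb = carbonation_depth(K=6.0, years=50)  # mm, porous concrete
print(round(depth_low_wb, 1), round(depth_high_wb, 1))
```

Doubling K doubles the predicted depth at any age, which is why the internal factors that set K dominate long-term durability predictions.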
Procedia PDF Downloads 409
2399 Engineering Properties of Different Lithological Varieties of a Singapore Granite
Authors: Louis Ngai Yuen Wong, Varun Maruvanchery
Abstract:
The Bukit Timah Granite, which is a major rock formation in Singapore, encompasses different rock types such as granite, adamellite, and granodiorite, with various hybrid rocks. The present study focuses on the Central Singapore Granite found in the Mandai area. Even within this small areal extent, lithological variations with respect to composition, texture, and grain size have been recognized in this igneous body. Over the years, research effort on the Bukit Timah Granite has focused on achieving a better understanding of its engineering properties in association with civil engineering projects. To the best of our knowledge, few studies have systematically investigated the influence of grain size, mineral composition, texture, etc., on the strength of Bukit Timah Granite rocks in a comprehensive manner. In typical local industry practice, the different lithological varieties are not differentiated; all are grouped under Bukit Timah Granite during core logging and the subsequent determination of engineering properties. To address this major gap in local engineering geological practice, a preliminary study is conducted on the variation of uniaxial compressive strength (UCS) across seven distinctly different lithological varieties found in the Bukit Timah Granite. Other physical properties, including Young’s modulus, P-wave velocity, and dry density determined from laboratory testing, are also discussed. The study is supplemented by a petrographic thin section examination. In addition, the specimen failure mode is classified and further correlated with the lithological varieties by carefully observing the details of crack initiation, propagation, and coalescence in specimens undergoing loading tests, using a high-speed camera.
The outcome of this research, which is the first of its type in Singapore, will have a direct implication on sampling and design practices in the field of civil engineering, and particularly on underground space development in Singapore.
Keywords: Bukit Timah Granite, lithological variety, thin section study, high speed video, failure mode
Procedia PDF Downloads 322
2398 Seismic Analysis of Vertical Expansion Hybrid Structure by Response Spectrum Method Concern with Disaster Management and Solving the Problems of Urbanization
Authors: Gautam, Gurcharan Singh, Mandeep Kaur, Yogesh Aggarwal, Sanjeev Naval
Abstract:
The present state of human suffering reflects the consequences of wrong decisions taken throughout history; a strong and responsible will is needed to shape a resilient civilization, since the built environment affects both its own society and the wider world. India's unplanned development and rapid population growth leave it poorly prepared to face disasters: the country still suffers from earthquakes, floods, droughts, tsunamis, etc., and has faced countless disaster deaths from the beginning of recorded history to the present. In this research paper, our focus is a disaster-resistant structure that addresses the problem of densely populated urban areas through high vertical expansion: a hybrid structure. We analyse a reinforced concrete hybrid frame structure in different seismic zones; the concrete frames were analysed using the response spectrum method to calculate and compare seismic displacement and drift. Seismic analysis by this method is generally based on the dynamic analysis of the building. The analysis results show that the reinforced concrete building in seismic zone V has the maximum peak storey shear, base shear, drift, and node displacement compared with the analytical results for the same building in seismic zones III and IV. These results indicate that structural drawings must be followed strictly on the construction site to realise a hybrid structure. The case study deals with a 10-storey vertical-expansion hybrid frame structure in zones III, IV, and V, with columns of 0.45 x 0.36 m and beams of 0.6 x 0.36 m,
and a total height of 30 m; to make the structure more stable, bracing techniques such as mega bracing and V-shaped bracing shall be applied. If this kind of structural drawing were followed by builders and contractors, lives could be saved during earthquake disasters such as the one at Bhuj (Gujarat State, India) on 26th January 2001, which resulted in more than 19,000 deaths. Such a disaster-resistant structure has the capability to solve the problems of densely populated city areas through the utilization of area in a vertical-expansion hybrid structure. We request the Government of India to make and implement new plans to save lives from future disasters, rather than pursuing development plans such as bullet trains.
Keywords: history, irresponsibility, unplanned social structure, humanity, hybrid structure, response spectrum analysis, drift, node displacement
Procedia PDF Downloads 211
2397 Outcomes of the Gastrocnemius Flap Performed by Orthopaedic Surgeons in Salvage Revision Knee Arthroplasty: A Retrospective Study at a Tertiary Orthopaedic Centre
Authors: Amirul Adlan, Robert McCulloch, Scott Evans, Michael Parry, Jonathan Stevenson, Lee Jeys
Abstract:
Background and Objectives: The gastrocnemius myofascial flap is used to manage soft-tissue defects over the anterior aspect of the knee in the context of a patient presenting with a sinus and periprosthetic joint infection (PJI) or extensor mechanism failure. The aim of this study was twofold: firstly, to evaluate the outcomes of gastrocnemius flaps performed by appropriately trained orthopaedic surgeons in the context of PJI and, secondly, to evaluate the infection-free survival of this patient group. Methods: We retrospectively reviewed 30 patients who underwent gastrocnemius flap reconstruction during staged revision total knee arthroplasty for prosthetic joint infection (PJI). All flaps were performed by an orthopaedic surgeon with orthoplastics training. Patients had a mean age of 68.9 years (range 50–84) and were followed up for a mean of 50.4 months (range 2–128 months). A total of 29 patients (97%) were categorized into Musculoskeletal Infection Society (MSIS) local extremity grade 3 (greater than two compromising factors), and 52% of PJIs were polymicrobial. The primary outcome measure was flap failure, and the secondary outcome measure was recurrent infection. Results: Flap survival was 100%, with no failures or early returns to theatre for flap problems such as necrosis or haematoma. Overall infection-free survival during the study period was 48% (13 of 27 infected cases). Using limb salvage as the outcome, 77% (23 of 30 patients) retained the limb. Infection recurrence occurred in 48% (10 patients) in the type B3 cohort and 67% (4 patients) in the type C3 cohort (p = 0.65). Conclusion: The surgical technique for a gastrocnemius myofascial flap is reliable and reproducible when performed by appropriately trained orthopaedic surgeons, even in high-risk groups.
However, the risks of recurrent infection and amputation remain high within our series due to poor host and extremity factors.
Keywords: gastrocnemius flap, limb salvage, revision arthroplasty, outcomes
Procedia PDF Downloads 111
2396 A Fuzzy Kernel K-Medoids Algorithm for Clustering Uncertain Data Objects
Authors: Behnam Tavakkol
Abstract:
Uncertain data mining algorithms consider uncertainty in data in different ways, such as by representing a data object as a sample of points or as a probability distribution. Fuzzy methods have long been used for clustering traditional (certain) data objects; they are used to produce non-crisp cluster labels. For uncertain data, however, besides some uncertain fuzzy k-medoids algorithms, not many other fuzzy clustering methods have been developed. In this work, we develop a fuzzy kernel k-medoids algorithm for clustering uncertain data objects. The developed algorithm is superior to existing fuzzy k-medoids algorithms in clustering data sets with non-linearly separable clusters.
Keywords: clustering algorithm, fuzzy methods, kernel k-medoids, uncertain data
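A minimal sketch of the fuzzy kernel k-medoids idea follows, assuming an RBF kernel and reducing each uncertain object to a single point representative for brevity (the paper's algorithm works on sample or distribution representations). Distances are computed in the kernel-induced feature space, which is what handles non-linearly separable clusters; memberships use the standard fuzzy (non-crisp) update.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fuzzy_kernel_kmedoids(X, k=2, m=2.0, gamma=1.0, iters=20):
    K = rbf_kernel(X, gamma)
    diag = K.diagonal()
    # kernel-induced squared distances between all pairs of objects
    D = np.maximum(diag[:, None] - 2 * K + diag[None, :], 1e-12)
    # deterministic farthest-point initialisation of the k medoids
    medoids = [0]
    for _ in range(k - 1):
        medoids.append(int(np.argmax(D[:, medoids].min(axis=1))))
    medoids = np.array(medoids)
    for _ in range(iters):
        d = D[:, medoids]                                  # distance to medoids
        ratio = (d[:, :, None] / d[:, None, :]) ** (1.0 / (m - 1.0))
        u = 1.0 / ratio.sum(axis=2)                        # fuzzy memberships
        for j in range(k):                                 # move medoid j to the
            costs = (u[:, j] ** m) @ D                     # min-cost object
            medoids[j] = int(np.argmin(costs))
    return medoids, u

# two tight, well-separated groups
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [3.0, 3.0], [3.1, 3.0], [3.0, 3.1]])
medoids, u = fuzzy_kernel_kmedoids(X, k=2)
labels = u.argmax(axis=1)   # hardened labels, for inspection only
print(labels)
```

The membership matrix `u` is the actual output of interest; hardening it with `argmax` is only done here to check the recovered grouping.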
Procedia PDF Downloads 215
2395 Comparison of the Effectiveness of Tree Algorithms in Classification of Spongy Tissue Texture
Authors: Roza Dzierzak, Waldemar Wojcik, Piotr Kacejko
Abstract:
Analysis of the texture of medical images consists of determining the parameters and characteristics of the examined tissue, with the main goal of assigning the analyzed area to one of two basic groups: healthy tissue or tissue with pathological changes. CT images of the thoracolumbar spine from 15 healthy patients and 15 patients with confirmed osteoporosis were used for the analysis, yielding 120 samples with dimensions of 50x50 pixels. The feature set was obtained from the histogram, gradient, run-length matrix, co-occurrence matrix, autoregressive model, and Haar wavelet, giving 290 textural feature descriptors. The dimension of the feature space was reduced by three selection methods: the Fisher coefficient (FC), mutual information (MI), and the minimization of classification error probability combined with average correlation coefficients between the chosen features (POE + ACC). Each method returned the ten features occupying the initial places in the ranking devised according to its own coefficient. The Fisher coefficient and mutual information selections yielded the same features arranged in a different order; in both rankings, the 50% percentile (Perc.50%) was in first place, and the next selected features came from the co-occurrence matrix. The sets of features selected in the selection process were evaluated using six classification tree methods: decision stump (DS), Hoeffding tree (HT), logistic model trees (LMT), random forest (RF), random tree (RT), and reduced error pruning tree (REPT).
In order to assess the accuracy of the classifiers, the following parameters were used: overall classification accuracy (ACC), true positive rate (TPR, classification sensitivity), true negative rate (TNR, classification specificity), positive predictive value (PPV), and negative predictive value (NPV). Taking the classification results into account, the best results were obtained for the Hoeffding tree and logistic model trees classifiers using the set of features selected by the POE + ACC method. For the Hoeffding tree classifier, the highest values of three parameters were obtained: ACC = 90%, TPR = 93.3%, and PPV = 93.3%; additionally, the values of the other two parameters, TNR = 86.7% and NPV = 86.6%, were close to the maximum values obtained for the LMT classifier. For the logistic model trees classifier, the same ACC value was obtained (ACC = 90%) together with the highest values of TNR = 88.3% and NPV = 88.3%, while the other two parameters remained close to the highest: TPR = 91.7% and PPV = 91.6%. The results obtained in the experiment show that the use of classification trees is an effective method for the classification of texture features, allowing the condition of spongy tissue to be identified for healthy cases and those with osteoporosis.
Keywords: classification, feature selection, texture analysis, tree algorithms
Procedia PDF Downloads 178
2394 A User Identification Technique to Access Big Data Using Cloud Services
Authors: A. R. Manu, V. K. Agrawal, K. N. Balasubramanya Murthy
Abstract:
Authentication is required in stored database systems so that only authorized users can access the data and the related cloud infrastructure. This paper proposes an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. The proposed technique is likely to be more robust, as the probability of breaking the password is extremely low. The framework uses a multi-modal biometric approach and SMS to enforce additional security measures on top of the conventional login/password system. The robustness of the technique is demonstrated mathematically using statistical analysis. This work presents the authentication system along with the user authentication architecture diagram, activity diagrams, data flow diagrams, sequence diagrams, and algorithms.
Keywords: design, implementation algorithms, performance, biometric approach
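The intuition behind the low breaking probability can be sketched as a back-of-envelope calculation: with independent factors, an attacker must defeat every level, so the per-factor probabilities multiply. The per-factor numbers below are assumptions for illustration, not figures from the paper.

```python
# Assumed per-factor compromise probabilities (independent factors):
p_password = 1e-4    # guessing the conventional password
p_sms = 1e-3         # intercepting the SMS one-time code
p_biometric = 1e-5   # spoofing the multi-modal biometric check

# With multi-level security, all factors must be broken, so probabilities multiply.
p_break_all = p_password * p_sms * p_biometric
print(f"{p_break_all:.1e}")
```

Even with generous per-factor odds, the combined probability is orders of magnitude smaller than any single factor's, which is the statistical argument for stacking levels.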
Procedia PDF Downloads 476
2393 A Markov Model for the Elderly Disability Transition and Related Factors in China
Authors: Huimin Liu, Li Xiang, Yue Liu, Jing Wang
Abstract:
Background: As a typical case among developing countries that are stepping into an aging society, China has a growing number of older people who may be unable to maintain a normal life due to functional disability. While the government makes efforts to build a long-term care system and carry out related policies, there is still a lack of strong evidence evaluating the profile of disability states in the elderly population and the transition rates between them. It has been shown that disability is a dynamic rather than an irreversible condition, which means it is possible to intervene in a timely manner in those at risk of severe disability. Objective: The aim of this study was to depict the disability transition status of older people in China and then identify the individual characteristics that change the disability state, in order to provide a theoretical basis for disability prevention and early intervention among elderly people. Methods: Data for this study came from the 2011 baseline survey and the 2013 follow-up survey of the China Health and Retirement Longitudinal Study (CHARLS). Normal ADL function, 1-2 ADL disability, 3 or more ADL disability, and death were defined as states 1 to 4. A multi-state Markov model was applied, and a four-state homogeneous model with discrete states and discrete times was constructed from the two-visit follow-up data to explore factors for the various progressive stages. We modeled the effect of explanatory variables, such as gender, on the transition rates by using a proportional intensities model with covariates. Results: In the total sample, state 2 accounts for nearly 17.0%, while state 3 accounts for 8.5%. Moreover, the difference in ADL disability statistics between the two years is not pronounced: about half of those in state 2 in 2011 improved to normal by 2013, even though they grew older.
However, the proportion of state 3 transferring to death increased markedly, close to the proportion moving back to state 2 or normal function. From the estimated intensities, older people are eleven times as likely to develop 1-2 ADL disability as to die. After disability onset (state 2), progression to state 3 is 30% more likely than recovery. Once in state 3, a mean of 0.76 years is spent before death or recovery. In this model, a typical person in state 2 has a probability of 0.5 of being disability-free one year from now, while a person with moderate or greater disability has a probability of 0.14 of being dead. Conclusion: From a long-term care cost perspective, preventive programs to delay disability progression in the elderly could be adopted based on the current disability state and the main factors of each stage; in general, elderly individuals with moderate or greater disability should be prioritized.
Keywords: Markov model, elderly people, disability, transition intensity
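A multi-state Markov model of this kind is driven by a transition intensity matrix Q, from which the t-year transition probabilities follow as P(t) = exp(Qt). The intensities below are illustrative assumptions with the qualitative features reported above (onset much likelier than death from the normal state, death absorbing), not the CHARLS estimates.

```python
import numpy as np

# States: 1 = normal ADL, 2 = 1-2 ADL disability, 3 = 3+ ADL disability, 4 = dead.
# Rows sum to zero; the death row is all zeros (absorbing state).
Q = np.array([
    [-0.30,  0.25,  0.03, 0.02],   # normal: onset much likelier than death
    [ 0.50, -1.00,  0.35, 0.15],   # mild: recovery vs. progression vs. death
    [ 0.10,  0.30, -1.30, 0.90],   # severe: high mortality
    [ 0.00,  0.00,  0.00, 0.00],   # death is absorbing
])

def transition_matrix(Q, t, n_terms=40):
    """P(t) = exp(Qt) via a plain Taylor series (adequate for small ||Qt||)."""
    P = np.eye(len(Q))
    term = np.eye(len(Q))
    for k in range(1, n_terms):
        term = term @ (Q * t) / k
        P = P + term
    return P

P1 = transition_matrix(Q, t=1.0)  # one-year transition probabilities
print(np.round(P1, 3))
```

Each row of P1 is a proper probability distribution over next-year states, which is the quantity behind statements like "a typical person in state 2 has a probability of 0.5 of being disability-free one year from now".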
Procedia PDF Downloads 290
2392 Input Data Balancing in a Neural Network PM-10 Forecasting System
Authors: Suk-Hyun Yu, Heeyong Kwon
Abstract:
Recently, PM-10 has become a social and global issue. It is one of the major air pollutants affecting human health; therefore, it needs to be forecast rapidly and precisely. However, PM-10 comes from various emission sources, and its concentration level depends largely on the meteorological and geographical factors of the local and global region, so forecasting PM-10 concentration is very difficult. A neural network model can be used in this case, but cases of high PM-10 concentration are few, which makes the learning of the neural network model difficult. In this paper, we suggest a simple input balancing method for when the data distribution is uneven, based on the probability of appearance of the data. Experimental results show that input balancing makes the neural networks’ learning easier and improves the forecasting rates.
Keywords: artificial intelligence, air quality prediction, neural networks, pattern recognition, PM-10
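The appearance-probability balancing idea can be sketched as resampling: estimate each training example's probability of appearance from a histogram of the target, then draw a balanced training set with weights inversely proportional to that probability. The bin count and synthetic data below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def balance_indices(y, n_bins=10, seed=0):
    """Resample indices so rare concentration bins appear as often as common ones."""
    rng = np.random.default_rng(seed)
    counts, edges = np.histogram(y, bins=n_bins)
    # map each sample to its histogram bin, then to its appearance probability
    bin_idx = np.clip(np.digitize(y, edges[1:-1]), 0, n_bins - 1)
    prob = counts[bin_idx] / len(y)
    weights = 1.0 / prob              # inverse appearance probability
    weights /= weights.sum()
    return rng.choice(len(y), size=len(y), replace=True, p=weights)

# skewed data: many low values, few high-concentration cases
rng = np.random.default_rng(1)
y = np.concatenate([rng.uniform(10, 50, 950), rng.uniform(100, 150, 50)])
idx = balance_indices(y)
print((y > 100).mean(), (y[idx] > 100).mean())  # high-case share before/after
```

After balancing, the rare high-concentration cases occupy a far larger share of the training batches, which is what makes the network's learning of those cases easier.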
Procedia PDF Downloads 232