Search results for: packet loss probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4652

4502 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

Mobile applications are increasing significantly in number, each addressing the requirements of many users. However, quick development and enhancement cycles result in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. It is difficult for a programmer to test this issue manually for all activities, which leads to the data loss problem: data entered by the user are not saved when an interruption occurs. This degrades the user experience, because the user must re-enter the information after every interruption. Automated testing to detect such data loss is therefore important. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with data loss issues. The approach proved highly accurate and reliable in finding apps with this defect, and can be used by Android developers to avoid such errors.

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 232
4501 Parameter Interactions in the Cumulative Prospect Theory: Fitting the Binary Choice Experiment Data

Authors: Elzbieta Babula, Juhyun Park

Abstract:

Tversky and Kahneman’s cumulative prospect theory assumes symmetric probability cumulation with regard to the reference point within decision weights. Theoretically, the model should be invariant under a change of the direction of probability cumulation. In the present study, this phenomenon is investigated by creating a reference model that allows verification of the parameter interactions in cumulative prospect theory specifications. Utility and weighting functions are fitted simultaneously to binary choice data from the experiment. The results show that the flexibility of the probability weighting function is a crucial characteristic for preventing parameter interactions when estimating cumulative prospect theory.
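
The probability weighting function at the heart of this flexibility can be sketched as follows. This is the standard Tversky–Kahneman (1992) one-parameter form, not necessarily the exact specification fitted in the paper; the value gamma = 0.61 is their published median estimate for gains, used here purely for illustration.

```python
def cpt_weight(p, gamma):
    """Tversky-Kahneman (1992) probability weighting function:
    w(p) = p^gamma / (p^gamma + (1-p)^gamma)^(1/gamma).
    gamma < 1 yields the inverse-S shape that overweights small
    probabilities and underweights moderate-to-large ones."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

# gamma = 1 recovers linear weighting w(p) = p; gamma = 0.61 is the
# median estimate reported by Tversky and Kahneman for gains.
w_small = cpt_weight(0.1, 0.61)   # above 0.1: small p overweighted
w_large = cpt_weight(0.9, 0.61)   # below 0.9: large p underweighted
```

In a joint fit, gamma is estimated together with the utility-curvature parameters, which is exactly where the parameter interactions studied in the abstract can arise.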

Keywords: binary choice experiment, cumulative prospect theory, decision weights, parameter interactions

Procedia PDF Downloads 212
4500 A Systematic Review with Meta-Analyses Investigating the Association between Binge Eating and Poor Weight Loss Outcomes in People with Obesity

Authors: Isabella Lobo Sasaoka, Felipe Q. da Luz, Zubeyir Salis, Phillipa Hay, Tamiris Gaeta, Paula Costa Teixeira, Táki Cordás, Amanda Sainsbury

Abstract:

Background: A significant number of people with obesity who seek weight loss treatment experience binge eating episodes. Nonetheless, it is unknown whether binge eating episodes hinder weight loss outcomes. Objective: To compare weight change in people with or without binge eating who underwent bariatric surgery, pharmacotherapy, nutritional orientation, and/or psychological therapies. Method: We conducted a systematic review with meta-analyses by searching for studies in PubMed, American Psychological Association (APA), and Embase. Results: Thirty-four studies were included in our systematic review, and 17 studies were included in the meta-analyses. Overall, we found no significant difference in weight loss between people with or without binge eating who underwent any type of weight loss treatment. Additionally, we found no statistically significant differences in body weight between people with or without binge eating at short- and long-term follow-up assessments following any type of weight loss treatment. We also examined changes in body weight in three additional meta-analyses categorized by the type of weight loss treatment (i.e., behavioural and/or nutritional interventions; bariatric surgery; pharmacotherapy alone or combined with behavioural interventions) and found no difference in weight loss. Eleven of the 17 studies that were assessed qualitatively (i.e., not included in the meta-analyses) showed no differences in weight loss between people with or without binge eating under any type of weight loss treatment. Conclusion: This systematic review with meta-analyses showed no difference in weight loss between people with or without binge eating across a variety of weight loss treatments. Nonetheless, specialized therapies may be required to address eating disorder psychopathology and recurrent binge eating in people with obesity who seek weight loss.

Keywords: obesity, binge eating, weight loss, systematic review, meta-analysis

Procedia PDF Downloads 150
4499 Prevalence and Patterns of Hearing Loss among the Elderly with Hypertension in Southwest, Nigeria

Authors: Ayo Osisanya, Promise Ebuka Okonkwo

Abstract:

Reduced hearing sensitivity among the elderly has been attributed to various risk factors and the influence of age-related degenerative conditions such as diabetes, cardiovascular disease, Alzheimer’s disease, bipolar disorder, and hypertension. Hearing loss, especially the age-related type (presbycusis), has been reported as one of the global burdens affecting the general well-being and quality of life of the elderly with hypertension. Hearing loss is associated with hypertension and functional decline in the elderly, as the condition leads to poor communication, fatigue, reduced social function, mood swings, and withdrawal syndrome. Emerging research outcomes indicate a strong relationship between hypertension and reduced auditory performance among the elderly. Therefore, this study determined the prevalence, types, and patterns of hearing loss associated with hypertension, with a view to suggesting comprehensive management strategies and a model for creating awareness towards promoting healthy living among the elderly in Nigeria. One hundred and seventy-two elderly people, aged 65–85 and undergoing treatment for hypertension in some tertiary hospitals in southwest Nigeria, were purposively selected for the study. Participants underwent Pure-Tone Audiometry (PTA) with a Maico 53 Diagnostic Audiometer to determine the degree, types, and patterns of hearing loss. Results showed that 148 (86.05%) of the elderly with hypertension presented with some degree, type, and pattern of hearing loss. Of this number, 123 (83.11%) presented with bilateral hearing loss, while 25 (16.89%) had unilateral hearing loss. In terms of degree, 74 cases were moderate, 118 moderately severe, and 50 severe. Of the audiometric configurations, 36% were flat, 24% sloping, 19% rising, and 21% trough-shaped. The findings show a high prevalence of hearing loss among the elderly with hypertension in Southwest Nigeria. Based on the findings, management of the elderly with hypertension should include regular audiological rehabilitation, total adherence to hearing conservation principles, otological management, regulation of blood pressure, and adequate counselling and follow-up services.

Keywords: auditory performance, elderly, hearing loss, hypertension

Procedia PDF Downloads 297
4498 Optimal Mitigation of Slopes by Probabilistic Methods

Authors: D. De-León-Escobedo, D. J. Delgado-Hernández, S. Pérez

Abstract:

A probabilistic formulation to assess slope safety under the hazard of strong storms is presented and illustrated through a slope in Mexico. The formulation is based on the classical safety factor (SF) used in practice to appraise slope stability, but it introduces the treatment of uncertainties, and the slope failure probability is calculated as the probability that SF < 1. As the main hazard is rainfall on the area, statistics of rainfall intensity and duration are considered and modeled with an exponential distribution. The expected life-cycle cost is assessed by assigning a monetary value to the consequences of slope failure. Alternative mitigation measures are simulated, and the formulation is used to identify the optimal one (minimum life-cycle cost). For the example, the optimal mitigation measure is a reduction of the slope inclination angle.
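
The core computation Pf = P(SF < 1) can be sketched with a small Monte Carlo loop. All numbers below (mean storm intensity, the lognormal scatter of SF, and the way intensity degrades the median SF) are invented placeholders, not the paper's slope data; only the structure — exponential rainfall hazard feeding an uncertain safety factor — follows the abstract.

```python
import math
import random

random.seed(42)

def simulate_failure_probability(n=100_000):
    """Monte Carlo estimate of Pf = P(SF < 1) for an illustrative slope.

    Storm intensity follows an exponential distribution (as assumed in
    the paper); the safety factor is modeled as lognormal with a median
    that degrades as the slope wets up.  All parameter values are
    hypothetical.
    """
    mean_intensity = 30.0                # mm/h, assumed exponential mean
    failures = 0
    for _ in range(n):
        i = random.expovariate(1.0 / mean_intensity)        # storm draw
        median_sf = 1.8 * math.exp(-0.004 * i)              # wetting effect
        sf = median_sf * math.exp(random.gauss(0.0, 0.15))  # lognormal scatter
        if sf < 1.0:
            failures += 1
    return failures / n

pf = simulate_failure_probability()
```

Multiplying Pf by the monetary consequence of failure and discounting over time gives the expected life-cycle cost that the optimization compares across mitigation measures.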

Keywords: expected life-cycle cost, failure probability, slopes failure, storms

Procedia PDF Downloads 157
4497 Processing and Characterization of (Pb0.55Ca0.45)(Fe0.5Nb0.5)O3 and (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3 Dielectric Ceramics

Authors: Shalini Bahel, Maalti Puri, Sukhleen Bindra Narang

Abstract:

Ceramic samples of (Pb0.55Ca0.45)(Fe0.5Nb0.5)O3 and (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3 were synthesized by the columbite precursor method and characterized for structural and dielectric properties. Both synthesized samples have a perovskite structure with tetragonal symmetry. The variations in relative permittivity and loss tangent were measured as functions of frequency at room temperature; both decreased with increasing frequency. A reasonably high relative permittivity of 63.46, a loss tangent of 0.0067 at 15 MHz, and a temperature coefficient of relative permittivity of -82 ppm/˚C were obtained for (Pb0.45Ca0.55)(Fe0.5Nb0.5)O3.

Keywords: loss tangent, perovskite, relative permittivity, X-ray diffraction

Procedia PDF Downloads 265
4496 Optical Fiber Data Throughput in a Quantum Communication System

Authors: Arash Kosari, Ali Araghi

Abstract:

A mathematical model of an optical-fiber communication channel is developed, yielding an expression for the throughput and loss of the corresponding link. The data are assumed to be transmitted using separate photons with different polarizations. The derived model also shows how data throughput depends on the length of the channel and the depolarization factor. It is observed that absorption of photons affects the throughput more strongly than depolarization does. In addition, the probabilities of depolarization and absorption of the radiated photons are obtained.
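
The length dependence described above can be illustrated with a toy throughput model: Beer–Lambert attenuation for absorption and an exponential survival term for polarization. The attenuation and depolarization coefficients below are assumptions for illustration, not the paper's derived values.

```python
import math

def photon_throughput(length_km, alpha_db_per_km=0.2, depol_per_km=0.01,
                      rate_photons_per_s=1e6):
    """Illustrative throughput of a polarization-encoded fiber link.

    A photon's bit is counted only if the photon survives absorption
    (Beer-Lambert law) and its polarization state is preserved end to
    end.  All coefficient values are hypothetical.
    """
    # Probability the photon survives absorption over the whole link.
    p_survive = 10.0 ** (-alpha_db_per_km * length_km / 10.0)
    # Probability the polarization state is not scrambled en route.
    p_polarized = math.exp(-depol_per_km * length_km)
    return rate_photons_per_s * p_survive * p_polarized
```

With these sample coefficients the per-kilometre survival against absorption (about 0.955) is lower than against depolarization (about 0.990), matching the abstract's observation that absorption dominates the throughput loss.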

Keywords: absorption, data throughput, depolarization, optical fiber

Procedia PDF Downloads 284
4495 A Statistical Model for the Dynamics of Single Cathode Spot in Vacuum Cylindrical Cathode

Authors: Po-Wen Chen, Jin-Yu Wu, Md. Manirul Ali, Yang Peng, Chen-Te Chang, Der-Jun Jan

Abstract:

The dynamics of cathode spots has become a major topic in vacuum arc discharge research, with high academic interest and wide application potential. In this article, using a three-dimensional statistical model, we simulate the distribution of the ignition probability of a new cathode spot occurring under different magnetic pressures on the old cathode spot surface and at different arcing times. The model for the ignition probability of a new cathode spot is proposed for two typical situations: one a purely isotropic random walk in the absence of an external magnetic field, the other retrograde motion in an external magnetic field parallel to the cathode surface. We mainly focus on the relationship between the ignition probability density distribution of a new cathode spot and the external magnetic field.
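
The two situations can be contrasted with a minimal two-dimensional walk over the cathode surface: an isotropic step for the field-free case, plus a constant bias standing in for retrograde motion under a transverse field. Step length, drift magnitude, and units are all illustrative, not the paper's model parameters.

```python
import math
import random

random.seed(1)

def simulate_spot_track(steps, step_len=1.0, drift=0.0):
    """Track of successive cathode-spot ignition points on the surface.

    drift = 0 reproduces the isotropic random walk (no magnetic field);
    drift > 0 adds a retrograde bias along -x, standing in for the
    transverse-field case.  Values are illustrative.
    """
    x = y = 0.0
    for _ in range(steps):
        theta = random.uniform(0.0, 2.0 * math.pi)
        x += step_len * math.cos(theta) - drift
        y += step_len * math.sin(theta)
    return x, y

def mean_final_x(n_tracks, steps, drift):
    """Average final x over many tracks: near 0 for the isotropic walk,
    clearly negative (retrograde migration) when drift > 0."""
    return sum(simulate_spot_track(steps, drift=drift)[0]
               for _ in range(n_tracks)) / n_tracks
```

Binning the final positions of many such tracks gives the empirical ignition probability density whose dependence on the field the abstract investigates.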

Keywords: cathode spot, vacuum arc discharge, transverse magnetic field, random walk

Procedia PDF Downloads 430
4494 Effect on Bandwidth of Using Double Substrates Based Metamaterial Planar Antenna

Authors: Smrity Dwivedi

Abstract:

The present paper reveals the effect of double substrates on the bandwidth performance of planar antennas. The choice of material is important for achieving minimum return loss and improved directivity. The author has used double substrates to enhance the efficiency of the antenna in terms of gain. The metamaterial-based antenna has a specific structure that increases antenna performance. The improved return loss is -20 dB and the voltage standing wave ratio (VSWR) is 1.2, better than the single-substrate design's return loss of -15 dB and VSWR of 1.4. All results were obtained using the commercial software CST Microwave Studio.
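
The quoted return-loss and VSWR pairs are linked by the standard reflection-coefficient relation, which is easy to check:

```python
def vswr_from_return_loss(rl_db):
    """Convert return loss (a negative dB value, e.g. -20 dB) to VSWR.

    |Gamma| = 10^(RL/20) when RL is given as a negative number, and
    VSWR = (1 + |Gamma|) / (1 - |Gamma|).
    """
    gamma = 10.0 ** (rl_db / 20.0)
    return (1.0 + gamma) / (1.0 - gamma)

# The abstract's figures are mutually consistent: -20 dB gives
# VSWR ~ 1.22 (reported as 1.2) and -15 dB gives ~ 1.43 (reported 1.4).
v_double = vswr_from_return_loss(-20.0)
v_single = vswr_from_return_loss(-15.0)
```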

Keywords: CST microwave studio, metamaterial, return loss, VSWR

Procedia PDF Downloads 384
4493 Exploring the Energy Model of Cumulative Grief

Authors: Masica Jordan Alston, Angela N. Bullock, Angela S. Henderson, Stephanie Strianse, Sade Dunn, Joseph Hackett, Alaysia Black Hackett, Marcus Mason

Abstract:

The Energy Model of Cumulative Grief was created in 2018. It draws on historic stage theories of grief and is additionally unique in its focus on cultural responsiveness. The model helps to train practitioners who work with clients dealing with grief and loss. This paper introduces the model and explores how it positively impacted a convenience sample of 140 practitioners and individuals experiencing grief and loss. Respondents participated in webinars provided by the National Grief and Loss Center of America (NGLCA). Participants in this cross-sectional study completed one of three Grief and Loss Surveys created by the Grief and Loss Centers of America. Data analysis was conducted via SPSS and Survey Hero. Results indicate that the Energy Model of Cumulative Grief was an effective resource for participants in addressing grief and loss: the majority found the webinars helpful and a conduit to higher levels of hope. The findings suggest that the model is effective in providing culturally responsive grief and loss resources to practitioners and clients, with far-reaching implications for using technology to provide hope to those suffering from grief and loss worldwide.

Keywords: grief, loss, grief energy, grieving brain

Procedia PDF Downloads 79
4492 Ultra-Low Chromatic Dispersion, Low Confinement Loss, and Low Nonlinear Effects Index-Guiding Photonic Crystal Fiber

Authors: S. Olyaee, M. Seifouri, A. Nikoosohbat, M. Shams Esfand Abadi

Abstract:

Photonic crystal fibers (PCFs) can be used as transmission lines in optical communications. For this purpose, PCFs with low confinement loss, low chromatic dispersion, and low nonlinear effects are highly suitable transmission media. In this paper, we introduce a new design of index-guiding photonic crystal fiber (IG-PCF) with ultra-low chromatic dispersion, low nonlinear effects, and low confinement loss. Relatively low dispersion is achieved in the wavelength range of 1200 to 1600 nm using the proposed design. With the new IG-PCF structure presented in this study, the chromatic dispersion slope is -30 ps/(km·nm) and the confinement loss falls below 10⁻⁷ dB/km, while over the same wavelength range an effective area of more than 50.2 μm² is obtained.
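
For context on why dispersion engineering over 1200–1600 nm matters, the material-dispersion contribution of bulk silica can be computed directly from its Sellmeier refractive index. This sketch shows only that material term; the fiber's total dispersion adds a waveguide contribution from the air-hole geometry, which is exactly what the proposed IG-PCF design tunes.

```python
import math

# Sellmeier coefficients for bulk fused silica (Malitson, 1965).
_B = (0.6961663, 0.4079426, 0.8974794)
_L2 = (0.0684043**2, 0.1162414**2, 9.896161**2)

def n_silica(lam_um):
    """Refractive index of fused silica at wavelength lam_um (microns)."""
    s = sum(b * lam_um**2 / (lam_um**2 - l2) for b, l2 in zip(_B, _L2))
    return math.sqrt(1.0 + s)

def material_dispersion(lam_um, h=1e-3):
    """D = -(lambda/c) * d^2 n / d lambda^2, in ps/(nm*km),
    with the second derivative taken by central differences."""
    d2n = (n_silica(lam_um + h) - 2.0 * n_silica(lam_um)
           + n_silica(lam_um - h)) / (h * h)       # in 1/um^2
    return -lam_um * d2n * 1e12 / 299792458.0      # unit conversion to ps/(nm*km)

# D changes sign near 1.27 um (silica's zero-dispersion wavelength);
# an ultra-flattened PCF keeps D low across the whole 1200-1600 nm band.
```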

Keywords: optical communication systems, index-guiding, dispersion, confinement loss, photonic crystal fiber

Procedia PDF Downloads 603
4491 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a good fit to failure data and provides appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant, and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents reliability computations and probability weighted moment estimation for the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small samples. The maximum likelihood estimation method is also applied in this study.
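
The sample side of probability weighted moment estimation is generic and can be sketched independently of the specific distribution. The unbiased estimators b0, b1, b2 below follow the standard Greenwood et al. form; equating them with their closed-form expressions under the three-parameter model (not reproduced here) is what yields the parameter estimates.

```python
def sample_pwms(data):
    """Unbiased sample probability weighted moments b0, b1, b2,
    where b_r estimates E[X * F(X)^r].  Because only order statistics
    with low-order weights are used (no high ordinary moments), the
    estimators remain stable for small samples."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(j * x[j] for j in range(1, n)) / (n * (n - 1))
    b2 = sum(j * (j - 1) * x[j] for j in range(2, n)) / (n * (n - 1) * (n - 2))
    return b0, b1, b2
```

As a sanity check, for a Uniform(0, 1) parent the theoretical values are b_r = 1/(r + 2), i.e. 1/2, 1/3, and 1/4.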

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 270
4490 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited to realistic situations (small contact tracing probability, or small probability of detecting index cases). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well, and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 75
4489 Determinants of Hospital Obstetric Unit Closures in the United States 2002-2013: Loss of Hospital Obstetric Care 2002-2013

Authors: Peiyin Hung, Katy Kozhimannil, Michelle Casey, Ira Moscovice

Abstract:

Background/Objective: The loss of obstetric services has been a pressing concern in urban and rural areas nationwide. This study aims to determine factors that contribute to the loss of obstetric care through closure of a hospital or its obstetric unit. Methods: Data from the 2002-2013 American Hospital Association annual surveys were used to identify hospitals providing obstetric services. We linked these data to Medicare Healthcare Cost Report Information for hospital financial indicators, the US Census Bureau’s American Community Survey for zip-code-level characteristics, and Area Health Resource files for county-level clinician supply measures. A discrete-time multinomial logit model was used to determine factors contributing to obstetric unit or hospital closures. Results: Of 3,551 hospitals providing obstetric services during 2002-2013, 82% kept units open, 12% stopped providing obstetric services, and 6% closed down completely, with state-level variations. Factors that significantly increased a hospital's probability of obstetric unit closure included an annual birth volume below 250 (adjusted marginal effect [95% confidence interval] = 34.1% [28%, 40%]), closer proximity to another hospital with obstetric services (per 10 miles: -1.5% [-2.4%, -0.5%]), being in a county with lower family physician supply (-7.8% [-15.0%, -0.6%]), being in a zip code with a higher percentage of non-white females (per 10%: 10.2% [2.1%, 18.3%]), and being in a zip code with lower income (per $1,000 of income: -0.14% [-0.28%, -0.01%]). Conclusions: Over the past 12 years, loss of obstetric services has disproportionately affected areas served by low-volume urban and rural hospitals, non-white and low-income communities, and counties with fewer family physicians, signaling a need to address maternity care access in these communities.

Keywords: access to care, obstetric care, service line discontinuation, hospital, obstetric unit closures

Procedia PDF Downloads 219
4488 High Power Low Loss CMOS SPDT Antenna Switch for LTE-A Front End Module

Authors: Ki-Jin Kim, Suk-Hui Lee, Sanghoon Park, K. H. Ahn

Abstract:

A high-power, low-loss asymmetric single-pole double-throw (SPDT) antenna switch for an LTE-A front-end module (FEM), implemented in CMOS technology, is presented in this paper. For LTE-A applications, low loss and high linearity are the key features, and both are very challenging to achieve in a CMOS process. To improve insertion loss (IL) and power handling capability, this paper adopts an asymmetric transmitter (TX) and receiver (RX) structure, a floating-body technique, a multi-stacked structure, and a feed-forward capacitor technique. The designed SPDT switch shows a TX IL of 0.34 dB, RX IL of 0.73 dB, and P1dB of 38.9 dBm at 0.9 GHz, and a TX IL of 0.37 dB, RX IL of 0.95 dB, and P1dB of 39.1 dBm at 2.5 GHz.

Keywords: CMOS switch, SPDT switch, high power CMOS switch, LTE-A FEM

Procedia PDF Downloads 362
4487 CFD Prediction of the Round Elbow Fitting Loss Coefficient

Authors: Ana Paula P. dos Santos, Claudia R. Andrade, Edson L. Zaparoli

Abstract:

Pressure loss in ductwork is an important factor to be considered in the design of engineering systems such as power plants, refineries, and HVAC systems, in order to reduce energy costs. Ductwork can be composed of straight ducts and different types of fittings (elbows, transitions, converging and diverging tees and wyes). Duct fittings are significant sources of pressure loss in fluid distribution systems; fitting losses can be even more significant than equipment components such as coils, filters, and dampers. In the present work, a conventional 90° round elbow under turbulent incompressible airflow is studied. Mass, momentum, and k-ε turbulence model equations are solved employing the finite volume method. The SIMPLE algorithm is used for the pressure-velocity coupling. To validate the numerical tool, the elbow pressure loss coefficient is determined under the same conditions and compared with the ASHRAE database. Furthermore, the effect of Reynolds number variation on the elbow pressure loss coefficient is investigated. These results can be useful for better preliminary design of air distribution ductwork in air conditioning systems.
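
The fitting loss coefficient being validated is simply the pressure drop normalized by the dynamic pressure. The example numbers below (air density, velocity, and pressure drop) are illustrative, not the paper's CFD results.

```python
def loss_coefficient(delta_p_pa, rho_kg_m3, velocity_m_s):
    """Fitting loss coefficient C = dp / (rho * V^2 / 2), the quantity
    compared against the ASHRAE duct-fitting database."""
    dynamic_pressure = 0.5 * rho_kg_m3 * velocity_m_s ** 2
    return delta_p_pa / dynamic_pressure

# Example: air (rho ~ 1.2 kg/m^3) at 10 m/s with a 13 Pa drop across
# the elbow gives C ~ 0.22, in the range ASHRAE tabulates for smooth
# 90-degree round elbows (illustrative numbers, not the paper's data).
c_elbow = loss_coefficient(13.0, 1.2, 10.0)
```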

Keywords: duct fitting, pressure loss, elbow, thermodynamics

Procedia PDF Downloads 385
4486 A Succinct Method for Allocation of Reactive Power Loss in Deregulated Scenario

Authors: J. S. Savier

Abstract:

Real power is the component of power that is converted into useful energy, whereas reactive power is the component that cannot be converted to useful energy but is required for the magnetization of various electrical machinery. If reactive power is compensated at the consumer end, the flow of reactive power from generators to the load can be avoided and hence the overall power loss reduced. In this context, this paper presents a succinct method, called the JSS method, for allocating reactive power losses to consumers connected to radial distribution networks in a deregulated environment. The proposed method has the advantage that no assumptions are made while deriving the reactive power loss allocation.

Keywords: deregulation, reactive power loss allocation, radial distribution systems, succinct method

Procedia PDF Downloads 370
4485 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs

Authors: Lokesh Varshney, R. K. Saket

Abstract:

This paper presents a reliability indices evaluation of rotor core magnetization for an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. Parallel capacitors with the calculated minimum capacitive value were connected across the terminals of the induction motor operating as a SEIG with unregulated shaft speed during the experimental study. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine to operate as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success, and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
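
The reliability functions listed above are related in a standard way: the survivor function is R(t) = 1 - F(t), and the probability of failure at a mission time t is just F(t). A Monte Carlo sketch with placeholder Weibull failure times (the shape and scale are illustrative, not the tested machine's values) shows how they are estimated empirically:

```python
import math
import random

random.seed(7)

def weibull_failure_times(n, shape=2.0, scale=1000.0):
    """Simulated failure times by inverse-transform sampling;
    shape and scale are illustrative placeholder values."""
    return [scale * (-math.log(1.0 - random.random())) ** (1.0 / shape)
            for _ in range(n)]

def survivor(times, t):
    """Empirical survivor function R(t) = P(unit still works at t);
    the cumulative failure distribution is F(t) = 1 - R(t)."""
    return sum(1 for x in times if x > t) / len(times)

times = weibull_failure_times(10_000)
r_at_scale = survivor(times, 1000.0)   # theoretical value: exp(-1) ~ 0.368
```

The hazard function follows the same pattern: it is the failure density in a short window divided by R(t) at the start of the window.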

Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation

Procedia PDF Downloads 554
4484 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. Long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data for a coastal area in Manhattan, New York City (Central Park and Battery Park stations), to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal, and log-Pearson distributions; goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in the frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study therefore performs flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated, to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600, and 1000 years was determined and compared with the severity of individual events. The trend analysis showed an increase in the long-term averages of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function for the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
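
The effect of a copula on joint extremes can be shown with one of the standard extreme-value families. The Gumbel copula below is an illustrative choice (the abstract does not name its fitted family), and theta = 2 is an arbitrary dependence level for demonstration.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v) for theta >= 1; theta = 1 is independence."""
    return math.exp(-((-math.log(u)) ** theta
                      + (-math.log(v)) ** theta) ** (1.0 / theta))

def joint_exceedance(u, v, theta):
    """P(U > u, V > v), e.g. rainfall and surge both above their marginal
    quantiles u and v, via inclusion-exclusion on the copula."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

# Two events that each have a 10% marginal exceedance probability:
p_dep = joint_exceedance(0.9, 0.9, 2.0)   # positively dependent extremes
p_ind = joint_exceedance(0.9, 0.9, 1.0)   # independence: 0.1 * 0.1 = 0.01
```

Under dependence the joint exceedance is several times the independence value, which is exactly why separate frequency analyses of rainfall and water level can understate flood risk.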

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 356
4483 Long-Term Otitis Media with Effusion and Related Hearing Loss and Its Impact on Developmental Outcomes

Authors: Aleema Rahman

Abstract:

Introduction: This study aims to estimate the prevalence of long-term otitis media with effusion (OME) and hearing loss in a prospective longitudinal cohort study, and to examine the relationship between the condition and educational and psychosocial outcomes. Methods: Data from the Avon Longitudinal Study of Parents and Children (ALSPAC) will be analysed. ALSPAC is a longitudinal birth cohort study carried out in the UK, which has collected detailed measures of hearing on ~7000 children from the age of seven. A descriptive analysis of the data will estimate the prevalence of OME and hearing loss (defined as average hearing levels > 20 dB and a type B tympanogram) at 7, 9, 11, and 15 years, as well as that of long-term OME and hearing loss. Logistic and linear regression analyses will examine associations between long-term OME and hearing loss and educational outcomes (grades obtained from standardised national attainment tests) and psychosocial outcomes such as anxiety, social fears, and depression at ages 10-11 and 15-16 years. Results: Results will be presented in terms of the prevalence of OME and hearing loss in the population at each age. The prevalence of long-term OME and hearing loss, defined as having OME and hearing loss at two or more time points, will also be reported, along with any associations between long-term OME and hearing loss and the educational and psychosocial outcomes. Analyses will take into account demographic factors such as sex and social deprivation, and relevant confounders including socioeconomic status, ethnicity, and IQ. Discussion: Findings from this study will provide new epidemiological information on the prevalence of long-term OME and hearing loss, and new knowledge on the impact of OME for the small group of children who do not grow out of the condition by age 7 but continue to have hearing loss and need clinical care through later childhood. The study could have clinical implications and may influence service delivery for this group of children.

Keywords: educational attainment, hearing loss, otitis media with effusion, psychosocial development

Procedia PDF Downloads 132
4482 Graphical Modeling of High Dimension Processes with an Environmental Application

Authors: Ali S. Gargoum

Abstract:

Graphical modeling plays an important role in providing computationally efficient probability calculations for high-dimensional problems. In this paper, we address one such problem: we discuss fragmenting puff models and some distributional assumptions concerning models for the instantaneous emission readings and for the fragmenting process. A graphical representation, in terms of a junction tree, of the conditional probability breakdown of puffs and puff fragments is proposed.

Keywords: graphical models, influence diagrams, junction trees, Bayesian nets

Procedia PDF Downloads 394
4481 A Generalized Weighted Loss for Support Vector Classification and Multilayer Perceptron

Authors: Filippo Portera

Abstract:

Standard algorithms usually employ a loss in which each error is simply the absolute difference between the true value and the prediction, in the case of a regression task. Here we present several error weighting schemes that generalize this consolidated routine. We study both a binary classification model for support vector classification and a regression network based on a multilayer perceptron. The results show that the weighted loss is never worse than the standard procedure, and several times it is better.
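
The idea of a per-error weighting scheme can be sketched for the regression case. The particular weight function below (weight growing with the residual) is one illustrative choice, not necessarily one of the schemes studied in the paper; the constant weight recovers the standard loss.

```python
def weighted_squared_loss(y_true, y_pred, weight_fn=None):
    """Mean squared error with a per-residual weighting scheme.

    weight_fn maps each absolute residual to a weight; the default
    (constant 1) recovers the standard unweighted loss.
    """
    if weight_fn is None:
        weight_fn = lambda r: 1.0
    total = 0.0
    for t, p in zip(y_true, y_pred):
        r = abs(t - p)
        total += weight_fn(r) * r * r
    return total / len(y_true)

y, yhat = [1.0, 2.0, 3.0], [1.1, 2.0, 4.0]
mse = weighted_squared_loss(y, yhat)                        # standard loss
emph = weighted_squared_loss(y, yhat, lambda r: 1.0 + r)    # penalize big errors
```

Plugging such a weight into the SVC hinge loss or the perceptron's training objective follows the same pattern: each example's contribution is scaled by a function of its own error.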

Keywords: loss, binary-classification, MLP, weights, regression

Procedia PDF Downloads 89
4480 Blocking of Random Chat Apps at Home Routers for Juvenile Protection in South Korea

Authors: Min Jin Kwon, Seung Won Kim, Eui Yeon Kim, Haeyoung Lee

Abstract:

Numerous anonymous chat apps that help people connect with random strangers have been released in South Korea. However, they have become a serious problem for young people, who often use them as channels for prostitution or sexual violence. Although ISPs in South Korea are responsible for making inappropriate content inaccessible on their networks, they do not block the traffic of random chat apps since (1) the use of random chat apps is entirely legal, and (2) they reportedly use HTTP proxy blocking, so non-HTTP traffic cannot be blocked. In this paper, we propose a service model that can block random chat apps at home routers. A service provider manages a blacklist that contains blocked apps’ information. Home routers that subscribe to the service filter out the traffic of the apps using deep packet inspection. We have implemented a prototype of the proposed model, including a centralized server providing the blacklist, a Raspberry Pi-based home router that can filter out the traffic of the apps, and an Android app used by the router’s administrator to locally customize the blacklist.
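A minimal sketch of the payload-matching step such a router-side filter might perform (the hostnames and the substring-matching rule are hypothetical; a real deep packet inspection engine would parse HTTP Host headers and TLS SNI fields properly):

```python
# Hypothetical blacklist entries distributed by the service provider.
BLACKLIST = {b"randomchat.example.com", b"stranger-talk.example.net"}

def should_drop(payload: bytes) -> bool:
    """Naive DPI check: drop the packet if any blacklisted hostname
    appears in its payload (e.g., in an HTTP Host header)."""
    return any(host in payload for host in BLACKLIST)

http_req = b"GET / HTTP/1.1\r\nHost: randomchat.example.com\r\n\r\n"
other = b"GET / HTTP/1.1\r\nHost: example.org\r\n\r\n"
```

Because the check inspects packet payloads rather than URLs at an HTTP proxy, it also applies to non-HTTP traffic, which is the motivation for router-side filtering in the proposed model.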

Keywords: deep packet inspection, internet filtering, juvenile protection, technical blocking

Procedia PDF Downloads 345
4479 Cost Analysis of Optimized Fast Network Mobility in IEEE 802.16e Networks

Authors: Seyyed Masoud Seyyedoshohadaei, Borhanuddin Mohd Ali

Abstract:

To support group mobility, the NEMO Basic Support Protocol has been standardized as an extension of Mobile IP that enables an entire network to change its point of attachment to the Internet. Using NEMO in IEEE 802.16e (WiMAX) networks causes latency in the handover procedure and affects seamless communication for real-time applications. To decrease handover latency and service disruption time, an integrated scheme named Optimized Fast NEMO (OFNEMO) was introduced by the authors of this paper. OFNEMO uses a pre-established multi-tunnel concept, cross-function optimization, and cross-layer design. In this paper, an analytical model is developed to evaluate the total cost, consisting of the signaling and packet delivery costs, of OFNEMO compared with RFC 3963. Results show that OFNEMO increases the probability of predictive mode compared with RFC 3963 due to its smaller handover latency. Even though OFNEMO needs extra signaling to pre-establish multiple tunnels, it has a lower total cost thanks to its optimized algorithm. OFNEMO can minimize handover latency to support real-time applications in moving networks.
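The trade-off claimed above (extra signaling per handover, but a lower expected total cost thanks to a higher predictive-mode probability) can be sketched with a generic expected-cost calculation; the function and the numbers are illustrative, not the paper's analytical model:

```python
def total_cost(p_predictive, c_sig_pred, c_sig_react, c_delivery):
    """Expected handover cost: signaling cost weighted by the probability of
    predictive vs. reactive mode, plus packet delivery cost (illustrative)."""
    c_sig = p_predictive * c_sig_pred + (1 - p_predictive) * c_sig_react
    return c_sig + c_delivery

# A scheme that pays more signaling in predictive mode but raises the
# predictive-mode probability can still have a lower expected total cost:
baseline = total_cost(0.6, 10, 30, 50)   # 0.6*10 + 0.4*30 + 50 = 68.0
optimized = total_cost(0.9, 14, 30, 50)  # 0.9*14 + 0.1*30 + 50 = 65.6
```

This mirrors the qualitative result: because reactive handovers are much costlier, shifting probability mass toward predictive mode can outweigh the extra per-tunnel signaling.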

Keywords: fast mobile IPv6, handover latency, IEEE802.16e, network mobility

Procedia PDF Downloads 192
4478 Reliability Analysis of Glass Epoxy Composite Plate under Low Velocity Impact

Authors: Shivdayal Patel, Suhail Ahmad

Abstract:

Safety assurance and failure prediction of a composite material component of an offshore structure under low velocity impact is essential for the associated risk assessment. It is important to incorporate the uncertainties associated with material properties and the load due to an impact. The likelihood of this hazard causing a chain of failure events plays an important role in risk assessment. The material properties of composites mostly exhibit scatter due to their inhomogeneity and anisotropic characteristics, the brittleness of the matrix and fiber, and manufacturing defects. In fact, the probability of occurrence of such a scenario is governed by the large uncertainties arising in the system. Probabilistic finite element analysis of composite plates under low-velocity impact is carried out considering uncertainties in material properties and initial impact velocity. Impact-induced damage of a composite plate is a probabilistic phenomenon due to the wide range of uncertainties in material and loading behavior. A typical failure crack initiates and propagates further into the interface, causing delamination between dissimilar plies. Since individual cracks in a ply are difficult to track, a progressive damage model is implemented in the FE code through a user-defined material subroutine (VUMAT). The limit state function g(x) is established accordingly, such that the laminate is safe when g(x) > 0. The Gaussian process response surface method is adopted to determine the probability of failure. A comparative study is also carried out for different combinations of impactor masses and velocities. A sensitivity-based probabilistic design optimization procedure is investigated to achieve better strength and lighter weight of composite structures. A chain of failure events due to different modes of failure is considered to estimate the consequences of a failure scenario.
Frequencies of occurrence of specific impact hazards yield the expected risk due to economic loss.
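A crude Monte Carlo estimate of the probability of failure P(g(X) ≤ 0) conveys the underlying idea (this is a plain sampling sketch, not the Gaussian process response surface method used in the paper; the distributions and the simplified demand model are illustrative):

```python
import random

def failure_probability(n=50_000, seed=0):
    """Monte Carlo estimate of P(g(X) <= 0), with g = strength - demand.
    Distributions and the demand model are illustrative, not calibrated."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        strength = rng.gauss(500.0, 50.0)   # MPa, uncertain material strength
        velocity = rng.gauss(5.0, 1.0)      # m/s, uncertain impact velocity
        demand = 15.0 * velocity ** 2       # simplified impact-induced stress
        if strength - demand <= 0.0:
            failures += 1
    return failures / n
```

Response surface methods replace the expensive limit-state evaluation (here a one-line formula, in the paper a full FE impact simulation) with a cheap surrogate, so that many such samples become affordable.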

Keywords: composites, damage propagation, low velocity impact, probability of failure, uncertainty modeling

Procedia PDF Downloads 272
4477 On Virtual Coordination Protocol towards 5G Interference Mitigation: Modelling and Performance Analysis

Authors: Bohli Afef

Abstract:

Fifth-generation (5G) wireless systems feature extreme densities of cell stations to meet higher future demand. Hence, interference management is a crucial challenge in 5G ultra-dense cellular networks. In contrast to the classical inter-cell interference coordination approach, which is no longer suited to the high density of cell tiers, this paper proposes a novel virtual coordination scheme based on a dynamic common cognitive monitor channel protocol to deal with the inter-cell interference issue. A tractable and flexible model for the coverage probability of a typical user is developed using stochastic geometry. The performance of the suggested protocol is analyzed both analytically and numerically in terms of coverage probability.
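The coverage probability of a typical user can be estimated numerically by Monte Carlo simulation over a Poisson field of interferers (an interference-limited sketch with Rayleigh fading on every link; the density, distances, and path-loss exponent are illustrative, not the paper's model):

```python
import math
import random

def _poisson(rng, mean):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    L = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def coverage_probability(lam=1e-6, radius=2000.0, d_serv=100.0,
                         alpha=4.0, theta=1.0, trials=2000, seed=1):
    """Monte Carlo estimate of P(SIR > theta) for a user at distance d_serv
    from its serving station, with interferers drawn from a Poisson point
    process of density lam (per m^2) on a disk; noise is neglected."""
    rng = random.Random(seed)
    area = math.pi * radius ** 2
    covered = 0
    for _ in range(trials):
        interference = 0.0
        for _ in range(_poisson(rng, lam * area)):
            r = radius * math.sqrt(rng.random())  # uniform point on the disk
            interference += rng.expovariate(1.0) * r ** (-alpha)
        signal = rng.expovariate(1.0) * d_serv ** (-alpha)
        if interference == 0.0 or signal / interference > theta:
            covered += 1
    return covered / trials
```

A coordination protocol that silences nearby interferers would enter such a model by thinning the interfering point process, raising the estimated coverage probability.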

Keywords: ultra dense heterogeneous networks, dynamic common channel protocol, cognitive radio, stochastic geometry, coverage probability

Procedia PDF Downloads 324
4476 Effect of Crystallographic Characteristics on Toughness of Coarse Grain Heat Affected Zone for Different Heat Inputs

Authors: Trishita Ray, Ashok Perka, Arnab Karani, M. Shome, Saurabh Kundu

Abstract:

Line pipe steels are used for long distance transportation of crude oil and gas under extreme environmental conditions. Welding is necessary to lay large-scale pipelines. The Coarse Grain Heat Affected Zone (CGHAZ) of a welded joint exhibits the worst toughness because of excessive grain growth and brittle microstructures like bainite and martensite, leading to early failure. Therefore, it is necessary to investigate the microstructures and properties of the CGHAZ for different welding heat inputs. In the present study, CGHAZs for two heat inputs of 10 kJ/cm and 50 kJ/cm were simulated in a Gleeble 3800, and the microstructures were investigated in detail by means of Scanning Electron Microscopy (SEM) and Electron Backscattered Diffraction (EBSD). Charpy impact tests were also performed to evaluate the impact properties. The high heat input condition was characterized by very low toughness and massive prior austenite grains. With the crystallographic information from EBSD, the area of a single prior austenite grain was traced out for both welding conditions. Analysis of the prior austenite grains showed the formation of high-angle boundaries between the crystallographic packets. The effect of these packet boundaries on secondary cleavage crack propagation is discussed. It was observed that in the low heat input condition, the formation of finer packets with a criss-cross morphology inside prior austenite grains was effective in crack arrest, whereas in the high heat input condition, the formation of larger packets with a higher fraction of low-angle boundaries failed to resist crack propagation, resulting in a brittle fracture. Thus, the characteristics of a crystallographic packet and the impact properties are related and should be controlled to obtain optimum properties.

Keywords: coarse grain heat affected zone, crystallographic packet, toughness, line pipe steel

Procedia PDF Downloads 243
4475 Transient Simulation Using SPACE for ATLAS Facility to Investigate the Effect of Heat Loss on Major Parameters

Authors: Suhib A. Abu-Seini, Kyung-Doo Kim

Abstract:

A heat loss model for the ATLAS facility was introduced using SPACE code predefined correlations and various dialing factors. All previous simulations were carried out using a heat-loss-free input: the facility was considered to be completely insulated, and the core power was reduced by the experimentally measured values of heat loss to compensate for the loss of heat. This study, in contrast, considers heat loss throughout the simulation. The new heat loss model will affect the SPACE code simulation, as heat leaking out of the system throughout a transient will alter many parameters related to temperature and temperature difference. For that purpose, a Station Blackout followed by a multiple Steam Generator Tube Rupture accident will be simulated using both the insulated-system approach and the newly introduced heat loss input of the steady state. Major parameters such as system temperatures, pressure values, and flow rates will be compared, and various analyses will be suggested, as the experimental values will not be the reference used to validate the expected outcome. This study will not only show the significance of heat loss consideration in the prevention and mitigation of various incidents and in design-basis and beyond-design-basis accidents, as it will give a detailed behavior of the ATLAS facility during both steady state and a major transient, but will also present a verification of how credible the data acquired from ATLAS are, since heat loss values for steady state were already mismatched between the SPACE simulation results and the ATLAS data acquisition system. Acknowledgement: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea.

Keywords: ATLAS, heat loss, simulation, SPACE, station blackout, steam generator tube rupture, verification

Procedia PDF Downloads 220
4474 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, the security of identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the types of loss functions and optimizers. The types of CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach to compare generalization power. It is important to note that we use a subset of LivDet and that the database is the same across all training and testing for each model. This way, we can compare the performance, in terms of generalization, on unseen data across all the different models. The best CNN (AlexNet), with the appropriate loss function and optimizer, results in a performance gain of more than 3% over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports parameter counts and mean average error rates for the high-accuracy models, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has less complexity than other CNN models, it is proven to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
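The point that different loss functions score the same predictions differently, and hence drive training toward different solutions, can be seen with two plain NumPy implementations (a generic sketch, not the paper's training setup):

```python
import numpy as np

def cross_entropy(y, p, eps=1e-12):
    """Binary cross-entropy; y in {0, 1}, p = predicted probability of class 1."""
    p = np.clip(p, eps, 1.0 - eps)
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

def hinge(y, s):
    """Hinge loss; y in {-1, +1}, s = raw classifier score."""
    return float(np.mean(np.maximum(0.0, 1.0 - y * s)))

y01 = np.array([1, 0, 1, 1])          # labels for cross-entropy
ypm = 2 * y01 - 1                     # same labels mapped to {-1, +1}
p = np.array([0.9, 0.2, 0.6, 0.55])   # predicted probabilities
s = 2 * p - 1                         # same predictions as scores in [-1, 1]
```

Identical predictions thus receive different loss values and different gradients, which is why pairing each CNN with a suitable loss and optimizer changes anti-spoofing performance.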

Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer

Procedia PDF Downloads 134
4473 Exact Solutions for Steady Response of Nonlinear Systems under Non-White Excitation

Authors: Yaping Zhao

Abstract:

In the present study, the exact solutions for the steady response of quasi-linear systems under non-white wide-band random excitation are considered by means of the stochastic averaging method. The nonlinearity of the systems comprises power-law damping and the cross-product term of power-law damping and displacement. The drift and diffusion coefficients of the Fokker-Planck-Kolmogorov (FPK) equation after averaging are obtained by a succinct approach. After solving the averaged FPK equation, the joint probability density function and the marginal probability density function in the steady state are attained. In the solution process, the eigenvalue problem of the ordinary differential equation is handled by an integral equation method. Some new results are acquired, and a novel method for dealing with problems in nonlinear random vibration is proposed.
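For reference, a one-dimensional averaged FPK equation with drift coefficient a(A) and diffusion coefficient b(A) admits the familiar stationary solution (a generic form; the paper's specific drift and diffusion coefficients follow from its averaging step):

```latex
\frac{\partial}{\partial A}\!\left[a(A)\,p(A)\right]
-\frac{1}{2}\,\frac{\partial^{2}}{\partial A^{2}}\!\left[b(A)\,p(A)\right]=0
\quad\Longrightarrow\quad
p(A)=\frac{C}{b(A)}\exp\!\left(\int^{A}\frac{2\,a(u)}{b(u)}\,du\right),
```

where C is the normalization constant fixed by requiring p(A) to integrate to one.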

Keywords: random vibration, stochastic averaging method, FPK equation, transition probability density

Procedia PDF Downloads 501