Search results for: Fuzzy Logic estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2978

1568 Detection and Classification of Strabismus Using a Convolutional Neural Network and Spatial Image Processing

Authors: Anoop T. R., Otman Basir, Robert F. Hess, Eileen E. Birch, Brooke A. Koritala, Reed M. Jost, Becky Luu, David Stager, Ben Thompson

Abstract:

Strabismus refers to a misalignment of the eyes. Early detection and treatment of strabismus in childhood can prevent the development of permanent vision loss due to abnormal development of visual brain areas. We developed a two-stage method for strabismus detection and classification based on photographs of the face. The first stage detects the presence or absence of strabismus, and the second stage classifies the type of strabismus. The first stage comprises face detection using a Haar cascade, facial landmark estimation, face alignment, aligned face landmark detection, segmentation of the eye region, and detection of strabismus using a VGG 16 convolutional neural network. Face alignment transforms the face to a canonical pose to ensure consistency in subsequent analysis. Using facial landmarks, the eye region is segmented from the aligned face and fed into a VGG 16 CNN model, which has been trained to classify strabismus. The CNN determines whether strabismus is present and classifies the type of strabismus (exotropia, esotropia, or vertical deviation). If stage 1 detects strabismus, the eye region image is fed into stage 2, which starts with the estimation of pupil center coordinates using a mask R-CNN deep neural network. Then, the distance between the pupil coordinates and the eye landmarks is calculated, along with the angle that the pupil coordinates make with the horizontal and vertical axes. The distance and angle information is used to characterize the degree and direction of the strabismic eye misalignment. This model was tested on 100 clinically labeled images of children with (n = 50) and without (n = 50) strabismus. The True Positive Rate (TPR) and False Positive Rate (FPR) of the first stage were 94% and 6%, respectively. The classification stage produced a TPR of 94.73%, 94.44%, and 100% for esotropia, exotropia, and vertical deviations, respectively.
This method also had an FPR of 5.26%, 5.55%, and 0% for esotropia, exotropia, and vertical deviation, respectively. The addition of one more feature related to the location of corneal light reflections may reduce the FPR, which was primarily due to children with pseudo-strabismus (the appearance of strabismus due to a wide nasal bridge or skin folds on the nasal side of the eyes).
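The stage-2 geometry described above, distances and angles between a pupil centre and the eye landmarks, can be sketched in a few lines. The function name and the landmark convention below are illustrative, not taken from the paper:

```python
import math

def misalignment_features(pupil, inner_corner, outer_corner):
    """Distance/angle of a pupil centre relative to the midpoint of two eye-corner
    landmarks -- a minimal stand-in for the paper's stage-2 feature extraction."""
    eye_centre = ((inner_corner[0] + outer_corner[0]) / 2.0,
                  (inner_corner[1] + outer_corner[1]) / 2.0)
    dx = pupil[0] - eye_centre[0]
    dy = pupil[1] - eye_centre[1]
    distance = math.hypot(dx, dy)          # degree of misalignment
    angle_deg = math.degrees(math.atan2(dy, dx))  # direction of misalignment
    return distance, angle_deg

# Pupil displaced purely horizontally from the eye centre -> angle of 0 degrees
d, a = misalignment_features((12.0, 5.0), (0.0, 5.0), (20.0, 5.0))
```

A classifier for exotropia versus esotropia versus vertical deviation would then threshold on the sign and magnitude of these two features.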

Keywords: strabismus, deep neural networks, face detection, facial landmarks, face alignment, segmentation, VGG 16, mask R-CNN, pupil coordinates, angle deviation, horizontal and vertical deviation

Procedia PDF Downloads 96
1567 Modified Estimating Equations for Derivation of the Causal Effect on Survival Time with Time-Varying Covariates

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

Survival data are systematic observations from a defined time of origin up to a failure or censoring event. Survival analysis is a major area of interest in biostatistics and biomedical research, and at the heart of most scientific and medical research inquiries lies causality analysis. Thus, the main concern of this study is to investigate the causal effect of treatment on survival time conditional on possibly time-varying covariates. The theory of causality differs from the simple association between the response variable and predictors: a causal estimand is a scientific quantity for comparing a pragmatic effect between two or more experimental arms. To evaluate the average treatment effect on the survival outcome, the estimating equation was adjusted for time-varying covariates under semiparametric transformation models. The proposed model yields consistent estimators for the unknown parameters and the unspecified monotone transformation function. In this article, the proposed method estimates an unbiased average causal effect of treatment on the survival time of interest. The modified estimating equations of semiparametric transformation models have the advantage of including time-varying effects in the model. The finite-sample performance of the estimators is demonstrated through simulation and the Stanford heart transplant data. To this end, the average effect of a treatment on survival time is estimated after adjusting for bias arising from the high correlation between left truncation and the possibly time-varying covariates. The bias in the covariates is corrected by estimating a density function for the left-truncation times, and, to relax the independence assumption between failure time and truncation time, the model incorporates the left-truncation variable as a covariate. Moreover, an expectation-maximization (EM) algorithm iteratively obtains the unknown parameters and the unspecified monotone transformation function. In summary, the ratio of the cumulative hazard functions between the treated and untreated groups serves as the average causal effect for the entire population.
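The closing idea, reading off a treatment effect as a ratio of cumulative hazards between arms, can be illustrated with a plain Nelson-Aalen estimator. This sketch deliberately ignores the paper's covariate adjustment, truncation correction, and transformation model:

```python
import numpy as np

def nelson_aalen(times, events):
    """Nelson-Aalen cumulative hazard: sum of d_i / n_i over the ordered times
    (events[i] = 1 for an observed failure, 0 for a censored observation)."""
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    events = np.asarray(events, dtype=int)[order]
    n = len(times)
    cumhaz, path = 0.0, []
    for i in range(n):
        if events[i]:                 # censored points only shrink the risk set
            cumhaz += 1.0 / (n - i)
        path.append((times[i], cumhaz))
    return path

# Crude effect sketch: ratio of final cumulative hazards of two arms
treated = nelson_aalen([2.0, 4.0, 6.0], [1, 1, 0])
control = nelson_aalen([1.0, 2.0, 3.0], [1, 1, 1])
ratio = treated[-1][1] / control[-1][1]
```

A ratio below one indicates a lower accumulated hazard in the treated arm over the observation window.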

Keywords: a modified estimation equation, causal effect, semiparametric transformation models, survival analysis, time-varying covariate

Procedia PDF Downloads 177
1566 A Real Time Expert System for Decision Support in Nuclear Power Plants

Authors: Andressa dos Santos Nicolau, João P. da S.C Algusto, Claudio Márcio do N. A. Pereira, Roberto Schirru

Abstract:

In case of abnormal situations, nuclear power plant (NPP) operators must follow written procedures to check the condition of the plant and to classify the type of emergency. In this paper, we propose a Real Time Expert System to improve operator performance in case of a transient or accident with reactor shutdown. The expert system’s knowledge is based on the sequence of events (SoE) of known accidents and on two emergency procedures of the Brazilian Pressurized Water Reactor (PWR) NPP, and it uses two kinds of knowledge representation: rules and logic trees. The results show that the system was able to classify the response of the automatic protection systems, as well as to evaluate the conditions of the plant, diagnosing the type of occurrence, indicating the recovery procedure to be followed and the shutdown root cause, and classifying the emergency level.
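A rule-based diagnosis of this kind can be sketched as an ordered rule list evaluated against plant facts. The rules and fact names below are invented placeholders, not the actual SoE knowledge base of the paper:

```python
# Minimal forward-chaining sketch: first matching rule wins.
# All conditions and diagnoses are illustrative, not real PWR procedure logic.
RULES = [
    (lambda f: f["reactor_tripped"] and f["sg_pressure_low"],
     "steam generator tube rupture"),
    (lambda f: f["reactor_tripped"] and f["containment_pressure_high"],
     "loss of coolant accident"),
    (lambda f: f["reactor_tripped"],
     "uncomplicated reactor trip"),
]

def diagnose(facts):
    """Return the diagnosis of the first rule whose condition holds."""
    for condition, diagnosis in RULES:
        if condition(facts):
            return diagnosis
    return "no emergency classification"

result = diagnose({"reactor_tripped": True,
                   "sg_pressure_low": False,
                   "containment_pressure_high": True})
```

A real system would combine such rules with logic trees over the sequence of events rather than a flat first-match list.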

Keywords: emergency procedure, expert system, operator support, PWR nuclear power plant

Procedia PDF Downloads 333
1565 Fine Characterization of Glucose-Modified Human Serum Albumin by Different Biophysical and Biochemical Techniques over a Range of Glucose Concentrations

Authors: Neelofar, Khursheed Alam, Jamal Ahmad

Abstract:

Protein modification in diabetes mellitus may lead to early glycation products (EGPs), or Amadori products, as well as advanced glycation end products (AGEs). Early glycation involves the reaction of glucose with N-terminal and lysyl side-chain amino groups to form a Schiff’s base, which undergoes rearrangements to form the more stable early glycation product known as the Amadori product. After the Amadori stage, the reactions become more complicated, leading to the formation of AGEs that interact with various AGE receptors, thereby playing an important role in the long-term complications of diabetes. The Maillard reaction, or nonenzymatic glycation, accelerates in diabetes due to hyperglycemia and alters the structure and normal functions of serum proteins, which leads to micro- and macrovascular complications in diabetic patients. In this study, Human Serum Albumin (HSA) at a constant concentration was incubated with different concentrations of glucose at 37°C for a week. By the fourth day, the Amadori product had formed, as confirmed by the colorimetric NBT and TBA assays, both of which authenticate the early glycation product. Conformational changes in native HSA, as well as in all samples of Amadori albumin prepared with different concentrations of glucose, were investigated by various biophysical and biochemical techniques. The main biophysical techniques used were hyperchromicity, quenching of fluorescence intensity, FTIR, CD, and SDS-PAGE. Further conformational changes were observed by biochemical assays, mainly HMF formation, fructosamine, reduction of fructosamine with NaBH4, carbonyl content estimation, lysine and arginine residue estimation, ANS binding, and thiol group estimation. This study found structural and biochemical changes in Amadori-modified HSA across a normal-to-chronic range of glucose concentrations with respect to native HSA. As the glucose concentration increased from the normal to the chronic range, the biochemical and structural changes also increased. 
The greatest alteration in secondary and tertiary structure and conformation of glycated HSA was observed at the highest chronic concentration of glucose (75 mM). Although Amadori-modified proteins, like AGEs, have been found to be involved in secondary complications of diabetes, very few studies have analyzed the conformational changes in Amadori-modified proteins due to early glycation. Most studies have examined structural changes in Amadori proteins at a single glucose concentration, but no study has compared the biophysical and biochemical changes in HSA due to early glycation over a range of glucose concentrations at a constant incubation time. This study therefore provides information about the biochemical and biophysical changes that occur in Amadori-modified albumin over a normal-to-chronic range of glucose in diabetes. Many interventions are currently in use, i.e., glycaemic control, insulin treatment, and other chemical therapies, that can control many aspects of diabetes. However, even with intensive use of current antidiabetic agents, more than 50% of patients with type 2 diabetes suffer poor glycaemic control and 18% develop serious complications within six years of diagnosis. Experimental evidence related to diabetes suggests that preventing the nonenzymatic glycation of relevant proteins, or blocking their biological effects, might beneficially influence the evolution of vascular complications in diabetic patients. Moreover, quantitation of the Amadori adduct of HSA by authentic antibodies against HSA-EGPs can be used as a marker for early detection of the initiation and progression of secondary complications of diabetes. This research work may be helpful toward that end.

Keywords: diabetes mellitus, glycation, albumin, amadori, biophysical and biochemical techniques

Procedia PDF Downloads 273
1564 Assessment of Petrophysical Parameters Using Well Log and Core Data

Authors: Khulud M. Rahuma, Ibrahim B. Younis

Abstract:

Assessment of petrophysical parameters is essential for the reservoir engineer. Three techniques can be used to predict reservoir properties: well logging, well testing, and core analysis. The cementation factor and saturation exponent are required for the calculations, and their values have a strong effect on water saturation estimation. In this study, a sensitivity analysis was performed to investigate the influence of cementation factor and saturation exponent variation, using logs and core analysis. The resulting water saturation estimates differed by a maximum of around fifteen percent.
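The sensitivity being probed can be reproduced directly from Archie's equation, where the cementation factor m and saturation exponent n enter as exponents. The numeric values below are illustrative, not the study's data:

```python
def archie_sw(phi, rw, rt, a=1.0, m=2.0, n=2.0):
    """Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n),
    with porosity phi, formation water resistivity Rw, true resistivity Rt,
    tortuosity factor a, cementation factor m, saturation exponent n."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Sensitivity of Sw to the cementation factor at fixed porosity and resistivities
sw_base = archie_sw(0.25, rw=0.03, rt=3.0, m=2.0)
sw_low_m = archie_sw(0.25, rw=0.03, rt=3.0, m=1.8)
```

Sweeping m and n over core-derived ranges and comparing the resulting Sw values is exactly the kind of sensitivity analysis the abstract describes.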

Keywords: porosity, cementation factor, saturation exponent, formation factor, water saturation

Procedia PDF Downloads 694
1563 New Technique of Estimation of Charge Carrier Density of Nanomaterials from Thermionic Emission Data

Authors: Dilip K. De, Olukunle C. Olawole, Emmanuel S. Joel, Moses Emetere

Abstract:

A good number of electronic properties such as electrical and thermal conductivities depend on charge carrier densities of nanomaterials. By controlling the charge carrier densities during the fabrication (or growth) processes, the physical properties can be tuned. In this paper, we discuss a new technique of estimating the charge carrier densities of nanomaterials from the thermionic emission data using the newly modified Richardson-Dushman equation. We find that the technique yields excellent results for graphene and carbon nanotube.
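The paper's modified Richardson-Dushman equation is not given in the abstract. As a baseline, the standard form and its linearisation, from which the emission constant and work function (and hence, in the authors' treatment, the carrier density) are extracted from emission data, look like this:

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def rd_current_density(T, A, W):
    """Standard Richardson-Dushman law: J = A * T^2 * exp(-W / (kB * T))."""
    return A * T ** 2 * math.exp(-W / (K_B * T))

def work_function_from_two_points(T1, J1, T2, J2):
    """Recover W from the linearised form ln(J / T^2) = ln A - W / (kB * T):
    the slope of ln(J/T^2) against 1/T is -W / kB."""
    y1, y2 = math.log(J1 / T1 ** 2), math.log(J2 / T2 ** 2)
    slope = (y2 - y1) / (1.0 / T2 - 1.0 / T1)
    return -slope * K_B

# Round trip: synthesise emission data for tungsten-like values, then recover W
J1 = rd_current_density(1500.0, A=120.17, W=4.5)
J2 = rd_current_density(1800.0, A=120.17, W=4.5)
w_hat = work_function_from_two_points(1500.0, J1, 1800.0, J2)
```

In the modified treatment the pre-exponential factor carries the material's carrier density, so fitting it from thermionic data is what enables the estimation described above.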

Keywords: charge carrier density, nanomaterials, new technique, thermionic emission

Procedia PDF Downloads 322
1562 The Accuracy of Small Firms at Predicting Their Employment

Authors: Javad Nosratabadi

Abstract:

This paper investigates the difference between firms' actual and expected employment along with the amount of loans invested by them. In addition, it examines the relationship between the amount of loans received by firms and wages. Empirically, using a causal effect estimation and firm-level data from a province in Iran between 2004 and 2011, the results show that there is a range of the loan amount for which firms' expected employment meets their actual one. In contrast, there is a gap between firms' actual and expected employment for any other loan amount. Furthermore, the result shows that there is a positive and significant relationship between the amount of loan invested by firms and wages.

Keywords: expected employment, actual employment, wage, loan

Procedia PDF Downloads 161
1561 Comprehensive Evaluation of Thermal Environment and Its Countermeasures: A Case Study of Beijing

Authors: Yike Lamu, Jieyu Tang, Jialin Wu, Jianyun Huang

Abstract:

With the development of the economy and of science and technology, the urban heat island effect is becoming more and more serious. Taking Beijing as an example, this paper grades the value of each index influencing heat island intensity and establishes a mathematical model, a neural network system based on a fuzzy comprehensive evaluation index of the heat island effect. After data preprocessing, the weights of the factors affecting the heat island effect are generated; data on the six indexes affecting heat island intensity for Shenyang, Shanghai, Beijing, and Hangzhou are input; and the result is output automatically by the neural network system. Showing the intensity of the heat island effect by a visual method is of practical significance: it is simple, intuitive, and can be dynamically monitored.
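The fuzzy comprehensive evaluation step, weighting index memberships into a grade vector, reduces to a single matrix product B = W · R. The six weights and the membership matrix below are invented for illustration only:

```python
import numpy as np

# Illustrative weights for six heat-island indexes (must sum to 1)
weights = np.array([0.25, 0.20, 0.20, 0.15, 0.10, 0.10])

# Membership of each index in three intensity grades (weak, moderate, strong)
R = np.array([[0.1, 0.3, 0.6],
              [0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2],
              [0.3, 0.5, 0.2],
              [0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2]])

B = weights @ R                         # fuzzy comprehensive evaluation vector
grade = ["weak", "moderate", "strong"][int(np.argmax(B))]
```

In the paper's pipeline, a neural network learns the weights from data rather than having them fixed by hand.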

Keywords: heat island effect, neural network, comprehensive evaluation, visualization

Procedia PDF Downloads 134
1560 Comparison of ANFIS Update Methods Using Genetic Algorithm, Particle Swarm Optimization, and Artificial Bee Colony

Authors: Michael R. Phangtriastu, Herriyandi Herriyandi, Diaz D. Santika

Abstract:

This paper presents a comparison of the implementation of metaheuristic algorithms to train the antecedent and consequent parameters of an adaptive network-based fuzzy inference system (ANFIS). The algorithms compared are the genetic algorithm (GA), particle swarm optimization (PSO), and artificial bee colony (ABC). The objective of this paper is to benchmark these well-known metaheuristic algorithms. The algorithms are applied to several data sets of different natures, and combinations of the algorithms' parameters are tested. In all algorithms, different population sizes are tested; in PSO, combinations of velocity parameters are tested; and in ABC, different abandonment limits are tested. The experiments find that ABC is more reliable than the other algorithms: ABC achieves a better mean square error (MSE) than the other algorithms on all data sets.
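As one concrete example of the optimisers compared, a bare-bones PSO loop is sketched below. In the paper's setting the objective would be the ANFIS training MSE over the antecedent and consequent parameters; here a sphere function stands in for it:

```python
import random

def pso(objective, dim, n_particles=20, iters=150, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimiser (inertia w, cognitive c1, social c2)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:           # update personal and global bests
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```

GA and ABC plug into the same interface: each takes the MSE as a black-box objective over the ANFIS parameter vector.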

Keywords: ANFIS, artificial bee colony, genetic algorithm, metaheuristic algorithm, particle swarm optimization

Procedia PDF Downloads 353
1559 Applying Theory of Self-Efficacy in Intelligent Transportation Systems by Potential Usage of Vehicle as a Sensor

Authors: Aby Nesan Raj, Sumil K. Raj, Sumesh Jayan

Abstract:

The objective of the study is to formulate a self-regulation model that enhances the usage of Intelligent Transportation Systems by applying the theory of self-efficacy. The core logic of the self-regulation model monitors driver behavior based on situations related to the various sources of self-efficacy, namely enactive mastery, vicarious experience, verbal persuasion, and physiological arousal, in addition to the vehicle data. For this study, four kinds of vehicle data are considered: speed, drowsiness, diagnostic data, and surround camera views. These data are given to the self-regulation model for evaluation. The oddness, which is the output of the self-regulation model, is fed to Intelligent Transportation Systems, where appropriate actions are taken. These actions include warnings to the user as well as inputs to the related transportation systems. It is also observed that using the vehicle as a sensor reduces wasteful resource duplication. Altogether, this approach enhances the intelligence of transportation systems, especially in safety, productivity, and environmental performance.

Keywords: emergency management, intelligent transportation system, self-efficacy, traffic management

Procedia PDF Downloads 246
1558 Error Estimation for the Reconstruction Algorithm with Fan Beam Geometry

Authors: Nirmal Yadav, Tanuja Srivastava

Abstract:

Shannon sampling theory is an exact method to recover a band-limited signal from its sampled values in a discrete implementation, using sinc interpolators. But sinc-based results are not very satisfactory for band-limited calculations, so convolution with a window function having compact support has been introduced. The convolution backprojection algorithm with a window function is an approximation algorithm. In this paper, the error arising from the approximate nature of this reconstruction algorithm is calculated. The result is derived for fan beam projection data, whose acquisition is faster than that of parallel beam projection.
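The windowing idea can be made concrete: apodise the ramp filter with a compactly supported window before filtering each projection prior to backprojection. A Hamming window is used here as one common choice; the paper's specific window is not stated in the abstract:

```python
import numpy as np

def windowed_ramp_filter(n):
    """Ramp filter |w| apodised by a Hamming window in the frequency domain."""
    freqs = np.fft.fftfreq(n)                         # cycles/sample in [-0.5, 0.5)
    ramp = np.abs(freqs)
    hamming = 0.54 + 0.46 * np.cos(2.0 * np.pi * freqs)  # 1 at DC, 0.08 at band edge
    return ramp * hamming

def filter_projection(projection):
    """Filter one projection (fan- or parallel-beam) before backprojection."""
    n = len(projection)
    spectrum = np.fft.fft(projection) * windowed_ramp_filter(n)
    return np.real(np.fft.ifft(spectrum))
```

The window's taper is exactly where the approximation error analysed in the paper comes from: it trades sinc ringing for a controlled loss of high frequencies.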

Keywords: computed tomography, convolution backprojection, radon transform, fan beam

Procedia PDF Downloads 493
1557 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights

Authors: Olga Kokoulina

Abstract:

Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise to provide increased economic efficiency and fuel solutions for pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred with mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited venues to mitigate the impact of potentially unfair decision-making practice is a so-called 'right to explanation'. In essence, the overall right is derived from the provisions of the General Data Protection Regulation (‘GDPR’) ensuring the right of data subjects to access and mandating the obligation of data controllers to provide the relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the specific GDPR provision on automated decision-making, the debates mainly focus on the efficacy and the exact scope of the 'right to explanation'. The underlying logic of the argued remedy lies in a transparency imperative. Allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and, often, heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient. 
Mandating the disclosure of technical specifications of employed algorithms in the name of transparency for and empowerment of data subjects potentially encroaches on the interests and rights of IPR holders, i.e., the business entities behind the algorithms. The study aims at pushing the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits placed on the transparency requirement and the right to access by intellectual property law, namely by copyrights and trade secrets. It is asserted that trade secrets, in particular, present an often-insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of the protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions from such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public interest exception in trade secrets law as well as the introduction of a strict liability regime for non-transparent decision-making.

Keywords: algorithms, public interest, trade secrets, transparency

Procedia PDF Downloads 125
1556 Survival Data with Incomplete Missing Categorical Covariates

Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar

Abstract:

Survival-censored data with incomplete covariate information are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome for the class of generalized linear models is applied, and this method requires the estimation of the parameters of the distribution of the covariates. In this paper, we consider clinical trial data with five covariates, four of which have missing values subject to censoring.

Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull Distribution

Procedia PDF Downloads 406
1555 A Generalisation of Pearson's Curve System and Explicit Representation of the Associated Density Function

Authors: S. B. Provost, Hossein Zareamoghaddam

Abstract:

A univariate density approximation technique is introduced whereby the derivative of the logarithm of a density function is assumed to be expressible as a rational function. This approach, which extends Pearson’s curve system, is based solely on the moments of a distribution up to a determinable order. Upon solving a system of linear equations, the coefficients of the polynomial ratio can readily be identified. An explicit solution to the integral representation of the resulting density approximant is then obtained. It will be explained that, when utilised in conjunction with sample moments, this methodology lends itself to the modelling of ‘big data’. Applications to sets of univariate and bivariate observations will be presented.
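The "system of linear equations" step can be illustrated for the classical quadratic-denominator case: writing f'/f = -(a + x)/(b0 + b1 x + b2 x^2) and integrating against powers of x (assuming vanishing boundary terms) gives the linear moment identities k·b0·m[k-1] + (k+1)·b1·m[k] + (k+2)·b2·m[k+1] - a·m[k] = m[k+1]. The sign convention here is ours, not necessarily the paper's:

```python
import numpy as np

def pearson_coefficients(m):
    """Solve for (a, b0, b1, b2) in f'/f = -(a + x)/(b0 + b1*x + b2*x^2)
    from raw moments m[0..4], via the identities for k = 0..3."""
    A = np.zeros((4, 4))
    rhs = np.zeros(4)
    for k in range(4):
        A[k, 0] = -m[k]                            # coefficient of a
        A[k, 1] = k * m[k - 1] if k > 0 else 0.0   # coefficient of b0
        A[k, 2] = (k + 1) * m[k]                   # coefficient of b1
        A[k, 3] = (k + 2) * m[k + 1]               # coefficient of b2
        rhs[k] = m[k + 1]
    return np.linalg.solve(A, rhs)

# Standard normal raw moments 1, 0, 1, 0, 3 -> f'/f = -x, i.e. a=0, b0=1, b1=b2=0
coeffs = pearson_coefficients([1.0, 0.0, 1.0, 0.0, 3.0])
```

The generalisation in the paper allows higher-degree numerators and denominators, enlarging the linear system with higher-order moments in the same way.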

Keywords: density estimation, log-density, moments, Pearson's curve system

Procedia PDF Downloads 282
1554 Improving the Quantification Model of Internal Control Impact on Banking Risks

Authors: M. Ndaw, G. Mendy, S. Ouya

Abstract:

Risk management in the banking sector is a key issue linked to financial system stability, and its importance has been elevated by technological developments and the emergence of new financial instruments. In this paper, we improve the model previously defined for quantifying the impact of internal control on banking risks by automating the residual criticality estimation step of FMECA. For this, we defined three equations and a maturity coefficient to obtain a mathematical model that was tested on all banking processes and types of risk. The new model allows an optimal assessment of residual criticality and improves the correlation rate to 98%.
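The structure of such a quantification can be sketched as follows. The scaling by a maturity coefficient is our illustrative stand-in: the paper's actual three equations are not reproduced in the abstract:

```python
def residual_criticality(severity, occurrence, detection, maturity):
    """Illustrative residual criticality: the classic FMECA risk priority number
    (severity x occurrence x detection) scaled down by a control maturity
    coefficient in [0, 1]. Fully mature controls (1.0) suppress the risk."""
    rpn = severity * occurrence * detection
    return rpn * (1.0 - maturity)
```

Automating this step means computing such a score for every process/risk pair instead of eliciting residual criticality from experts.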

Keywords: risk, control, banking, FMECA, criticality

Procedia PDF Downloads 334
1553 Cost Overrun Causes in Public Construction Projects in Saudi Arabia

Authors: Ibrahim Mahamid, A. Al-Ghonamy, M. Aichouni

Abstract:

This study was conducted to identify the causes of cost deviations in public construction projects in Saudi Arabia from the contractors’ perspective. 41 factors that might affect cost estimating accuracy were identified through a literature review and discussions with construction experts. The factors were tabulated in a questionnaire form, and a field survey of 51 contractors from the Northern Province of Saudi Arabia was performed. The results show that the top five important causes are: wrong estimation method, long period between design and time of implementation, cost of labor, cost of machinery, and absence of construction-cost data.

Keywords: cost deviation, public construction, cost estimating, Saudi Arabia, contractors

Procedia PDF Downloads 480
1552 Analysis of Nanoscale Materials and Devices for Future Communication and Telecom Networks in the Gas Refinery

Authors: Mohamad Bagher Heidari, Hefzollah Mohammadian

Abstract:

New discoveries in materials on the nanometer length scale are expected to play an important role in addressing ongoing and future challenges in the field of communication. Devices and systems for ultra-high-speed short- and long-range communication links, portable and power-efficient computing devices, high-density memory and logic, ultra-fast interconnects, and autonomous and robust energy scavenging devices for accessing ambient intelligence and needed information will critically depend on the success of next-generation emerging nanomaterials and devices. This article presents some exciting recent developments in nanomaterials that have the potential to play a critical role in the development and transformation of future intelligent communication and telecom networks in the gas refinery. The industry is benefiting from nanotechnology advances, with numerous applications including those in smarter sensors, logic elements, computer chips, memory storage devices, and optoelectronics.

Keywords: nanomaterials, intelligent communication, nanoscale, nanophotonics, telecom

Procedia PDF Downloads 334
1551 A Wireless Feedback Control System as a Base of Bio-Inspired Structure System to Mitigate Vibration in Structures

Authors: Gwanghee Heo, Geonhyeok Bang, Chunggil Kim, Chinok Lee

Abstract:

This paper attempts to develop a wireless feedback control system as a primary step toward a bio-inspired structure system in which an inanimate structure behaves autonomously like a life form. It is a standalone wireless control system designed to measure externally caused structural responses, analyze the structural state from the acquired data, and take its own action on the basis of the analysis with an embedded logic. For an experimental examination of its effectiveness, we applied it to a model of a two-span bridge and performed a wireless control test. Experimental tests were conducted for comparison on both the wireless and the wired system under the conditions of un-control, passive-off, passive-on, and the Lyapunov control algorithm. By proving the congruence of the test results of the wireless feedback control system with those of the wired control system, its control performance was proven to be effective. Besides, it was found to be economical in energy consumption and also autonomous by means of a command algorithm embedded into it, which proves its basic capacity as a bio-inspired system.

Keywords: structural vibration control, wireless system, MR damper, feedback control, embedded system

Procedia PDF Downloads 213
1550 Single Event Transient Tolerance Analysis in 8051 Microprocessor Using Scan Chain

Authors: Jun Sung Go, Jong Kang Park, Jong Tae Kim

Abstract:

As semiconductor manufacturing technology evolves, the single event transient problem becomes a more significant issue. Single event transients have a critical impact on both combinational and sequential logic circuits, so it is important to evaluate the soft error tolerance of circuits at the design stage. In this paper, we present a soft error detection simulation using a scan chain. The simulation model generates a single event transient randomly in the circuit and detects the soft error during the execution of the test patterns. We verified this model by inserting a scan chain into an 8051 microprocessor using 65 nm CMOS technology. While the test patterns generated by an ATPG program pass through the scan chain, we insert a single event transient and count the number of soft errors per sub-module. The experiments show that the soft error rate per cell area of the SFR module is 277% larger than that of the other modules.
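The injection-and-count loop can be sketched at the bit level. The observability mask below is an invented abstraction for whether a flipped flip-flop propagates to a scanned output; the real simulation works on gate-level netlists:

```python
def inject_and_check(golden, observed_mask, flip_index):
    """Flip one flip-flop bit (one single event transient) and report a soft
    error if the flipped bit is visible through the scan-observable mask."""
    faulty = list(golden)
    faulty[flip_index] ^= 1
    return any(g != f and m
               for g, f, m in zip(golden, faulty, observed_mask))

def soft_error_rate(golden, observed_mask):
    """Exhaustive injection over every flip-flop position."""
    hits = sum(inject_and_check(golden, observed_mask, i)
               for i in range(len(golden)))
    return hits / len(golden)

# 8-bit state where only the first four flip-flops are scan-observable
rate = soft_error_rate([0] * 8, [1, 1, 1, 1, 0, 0, 0, 0])
```

Normalising such per-module counts by cell area gives the comparison reported above for the SFR module.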

Keywords: scan chain, single event transient, soft error, 8051 processor

Procedia PDF Downloads 348
1549 Adaptive Nonparametric Approach for Guaranteed Real-Time Detection of Targeted Signals in Multichannel Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

An adaptive nonparametric method is proposed for stable real-time detection of seismoacoustic sources in multichannel C-OTDR systems with a significant number of channels. This method guarantees given upper bounds on the probabilities of Type I and Type II errors. The properties of the proposed method are rigorously proved, and the results of its practical application in a real C-OTDR system are presented.

Keywords: guaranteed detection, multichannel monitoring systems, change point, interval estimation, adaptive detection

Procedia PDF Downloads 449
1548 Teaching Computer Programming to Diverse Students: A Comparative, Mixed-Methods, Classroom Research Study

Authors: Almudena Konrad, Tomás Galguera

Abstract:

Lack of motivation and interest is a serious obstacle to students’ learning computing skills. A need exists for a knowledge base on effective pedagogy and curricula to teach computer programming. This paper presents results from research evaluating a six-year project designed to teach complex concepts in computer programming collaboratively, while supporting students to continue developing their computer thinking and related coding skills individually. Utilizing a quasi-experimental, mixed methods design, the pedagogical approaches and methods were assessed in two contrasting groups of students with different socioeconomic status, gender, and age composition. Analyses of quantitative data from Likert-scale surveys and an evaluation rubric, combined with qualitative data from reflective writing exercises and semi-structured interviews yielded convincing evidence of the project’s success at both teaching and inspiring students.

Keywords: computational thinking, computing education, computer programming curriculum, logic, teaching methods

Procedia PDF Downloads 316
1547 Modeling the Impact of Controls on Information System Risks

Authors: M. Ndaw, G. Mendy, S. Ouya

Abstract:

Information system risk management helps to reduce or eliminate risk by implementing appropriate controls. In this paper, we propose a model for quantifying the impact of controls on information system risks by automating the residual criticality estimation step of FMECA, which is based on inductive reasoning. For this, we defined three equations based on the type and maturity of the controls. For testing, the values obtained with the model were compared to values estimated by practitioners during different working sessions, and the results are satisfactory. This model allows an optimal assessment of control maturity and facilitates risk analysis of the information system.

Keywords: information system, risk, control, FMECA method

Procedia PDF Downloads 355
1546 Confidence Intervals for Quantiles in the Two-Parameter Exponential Distributions with Type II Censored Data

Authors: Ayman Baklizi

Abstract:

Based on type II censored data, we consider interval estimation of the quantiles of the two-parameter exponential distribution and of the difference between the quantiles of two independent two-parameter exponential distributions. We derive asymptotic intervals, Bayesian intervals, and intervals based on the generalized pivot variable, and we include some bootstrap intervals in our comparisons. The performance of these intervals is investigated in terms of their coverage probabilities and expected lengths.
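As a minimal sketch of one of the compared interval types, a percentile parametric bootstrap for a quantile under Type II censoring might look like the following. The maximum likelihood estimators used here are the standard ones for the two-parameter exponential based on the r smallest of n observations; the paper's exact bootstrap variant is not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def mle_two_param_exp(x_sorted, n):
    """MLEs of location mu and scale theta from the r smallest order
    statistics of a sample of size n (Type II censoring)."""
    r = len(x_sorted)
    mu_hat = x_sorted[0]
    theta_hat = (x_sorted.sum() + (n - r) * x_sorted[-1] - n * mu_hat) / r
    return mu_hat, theta_hat

def bootstrap_quantile_ci(x_sorted, n, p=0.5, B=2000, alpha=0.05):
    """Percentile parametric-bootstrap CI for the p-th quantile
    q_p = mu + theta * (-ln(1 - p))."""
    r = len(x_sorted)
    mu, theta = mle_two_param_exp(x_sorted, n)
    q_boot = np.empty(B)
    for b in range(B):
        sample = mu + rng.exponential(theta, size=n)
        xs = np.sort(sample)[:r]                    # re-censor at rank r
        mu_b, theta_b = mle_two_param_exp(xs, n)
        q_boot[b] = mu_b - theta_b * np.log(1 - p)  # bootstrap quantile
    return np.quantile(q_boot, [alpha / 2, 1 - alpha / 2])
```

Coverage and expected length, the paper's comparison criteria, would then be estimated by repeating this over many simulated samples.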

Keywords: asymptotic intervals, Bayes intervals, bootstrap, generalized pivot variables, two-parameter exponential distribution, quantiles

Procedia PDF Downloads 417
1545 Ultrasonographic Manifestation of Periventricular Leukomalacia in Preterm Neonates at Teaching Hospital Peradeniya, Sri Lanka

Authors: P. P. Chandrasekera, P. B. Hewavithana, S. Rosairo, M. H. M. N. Herath, D. M. R. D. Mirihella

Abstract:

Periventricular Leukomalacia (PVL) is a white matter injury (WMI) of the preterm neonatal brain. The objectives of the study were to assess the neuro-developmental outcome at one year of age and to determine a good protocol of cranial ultrasonography for detecting PVL. Two hundred and sixty-four preterm neonates were included in the study. Serial cranial ultrasound scans were performed using a dedicated 4-10 MHz neonatal head probe on a Logic e portable ultrasound scanner. Clinical history of seizures, abnormal head growth (hydrocephalus or microcephaly), and developmental milestones were assessed, and neurological examinations were performed until one year of age. Among surviving neonates, 57% of those with cystic PVL (Grades 2 and 3) developed cerebral palsy. In conclusion, cystic PVL results in permanent neurological disabilities such as cerebral palsy. A good protocol of real-time cranial ultrasonography for detecting PVL is to perform scans at least once a week until one month of age and again at term (40 weeks of gestation).

Keywords: cerebral palsy, cranial ultrasonography, Periventricular Leukomalacia, preterm neonates

Procedia PDF Downloads 394
1544 Employment Mobility and the Effects of Wage Level and Tenure

Authors: Idit Kalisher, Israel Luski

Abstract:

One result of the growing dynamism of labor markets in recent decades is a wider scope of employment mobility, i.e., transitions between employers, either within or between careers. Employment mobility decisions are primarily affected by the current employment status of the worker, which is reflected in wage and tenure. Using 34,328 observations from the National Longitudinal Survey of Youth 1979 (NLSY79), drawn from the USA population between 1990 and 2012, this paper investigates the effects of wage and tenure on employment mobility choices, and additionally examines the effects of other personal characteristics, individual labor market characteristics, and macroeconomic factors. The estimation strategy was designed to address two challenges that arise from the combination of the model and the data: (a) endogeneity of the wage and the tenure in the choice equation; and (b) unobserved heterogeneity, as the data are longitudinal. To address (a), estimation was performed using a two-stage limited dependent variable procedure (2SLDV); to address (b), the second stage was estimated using femlogit, an implementation of the multinomial logit model with fixed effects. Among workers who had experienced at least one turnover, the wage was found to have a main effect on the career turnover likelihood of all workers, whereas the wage effect on job turnover likelihood depended on individual characteristics. The wage negatively affected turnover likelihood, and the effect varied across wage levels: high-wage workers were more affected than low-wage workers. Tenure had a main positive effect on the likelihood of both turnover types, though the effect was moderated by the wage. The findings also reveal that as their wage increases, women are more likely than men to experience a turnover, and academically educated workers are more likely to turn over within careers. Minorities were found to be as likely as Caucasians to turn over after a wage increase, but less likely to turn over with each additional year of tenure. The wage and tenure effects also varied between careers. Differences in attitudes towards money, labor market opportunities, and risk aversion could explain these findings. Additionally, the likelihood of a turnover was affected by previous unemployment spells, age, and other labor market and personal characteristics. The results of this research could assist policymakers as well as business owners and employers. The former may be able to encourage the employment of women and older workers by considering the effects of gender and age on the probability of a turnover, and the latter may be able to assess their employees’ likelihood of a turnover by considering the effects of their personal characteristics.
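The choice structure underlying femlogit can be sketched as a standard multinomial logit over the three outcomes (stay, job turnover, career turnover). The covariates and coefficients below are hypothetical placeholders, and the fixed-effects conditioning that distinguishes femlogit from plain multinomial logit is omitted for brevity.

```python
import numpy as np

def mnl_probabilities(X, betas):
    """Multinomial logit choice probabilities over J alternatives.
    X: (n, k) covariates, e.g. wage and tenure.
    betas: (k, J-1) coefficients; the base alternative ('stay') is
    normalized to zero utility."""
    utilities = np.column_stack([np.zeros(len(X)), X @ betas])
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(utilities - utilities.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)


# Two hypothetical workers: (wage, tenure) rows; columns are the two
# non-base alternatives (job turnover, career turnover).
X = np.array([[2.5, 3.0],
              [1.0, 10.0]])
betas = np.array([[0.2, -0.4],
                  [-0.1, 0.05]])
print(mnl_probabilities(X, betas))  # rows sum to 1
```

In the paper's setting, the coefficients would be estimated by maximum likelihood within worker fixed effects, which is what femlogit provides.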

Keywords: employment mobility, endogeneity, femlogit, turnover

Procedia PDF Downloads 152
1543 Breast Cancer Incidence Estimation in Castilla-La Mancha (CLM) from Mortality and Survival Data

Authors: C. Romero, R. Ortega, P. Sánchez-Camacho, P. Aguilar, V. Segur, J. Ruiz, G. Gutiérrez

Abstract:

Introduction: Breast cancer is a leading cause of death in CLM, accounting for 2.8% of all deaths in women and 13.8% of deaths from tumors in women. It is the most frequently diagnosed tumor in the CLM region, accounting for 26.1% of all tumors excluding nonmelanoma skin cancer (Cancer Incidence in Five Continents, Volume X, IARC). Cancer registries are a good source of information for estimating cancer incidence; however, their data are usually available with a lag, which makes them difficult for health managers to use. By contrast, mortality and survival statistics are available with less delay. To serve resource planning and respond to this problem, a method is presented to estimate incidence from mortality and survival data. Objectives: To estimate the incidence of breast cancer by age group in CLM in the period 1991-2013, and to compare the estimates obtained from the model with current incidence data. Sources: Annual number of women by single year of age (National Statistics Institute); annual number of deaths from all causes and from breast cancer (Mortality Registry CLM); breast cancer relative survival probability (EUROCARE, Spanish registries data). Methods: A parametric Weibull survival model is fitted to the EUROCARE data. From this survival model and the population and mortality data, the Mortality and Incidence Analysis MODel (MIAMOD) regression model is used to estimate cancer incidence by age (1991-2013). Results: The resulting model is: Ix,t = Logit[const + age1*x + age2*x² + coh1*(t-x) + coh2*(t-x)²], where Ix,t is the incidence at age x in period (year) t, and the parameter estimates are: const (constant term) = -7.03; age1 = 3.31; age2 = -1.10; coh1 = 0.61; and coh2 = -0.12. It is estimated that 662 cases of breast cancer were diagnosed in CLM in 1991 (81.51 per 100,000 women) and 1,152 cases in 2013 (112.41 per 100,000 women), representing an increase of 40.7% in the crude incidence rate (1.9% per year). The average annual increases in incidence by age were: 2.07% in women aged 25-44 years, 1.01% (45-54 years), 1.11% (55-64 years), and 1.24% (65-74 years). The cancer registries in Spain that send data to the IARC reported an average annual incidence rate of 98.6 cases per 100,000 women for 2003-2007; our model yields an incidence of 100.7 cases per 100,000 women. Conclusions: A sharp and steady increase in the incidence of breast cancer is observed over the period 1991-2013. The increase is seen in all age groups considered, although it appears more pronounced in young women (25-44 years). This method provides a good estimate of incidence.
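Reading Logit[·] in the reported model as the inverse-logit link (MIAMOD models the logit of incidence as an age-cohort polynomial), the fitted equation can be evaluated as below. Note the hedge: x and t are assumed here to be the standardized age and period covariates used in the fit, not raw calendar values, so the example only illustrates the functional form.

```python
import math

# Parameter estimates reported in the abstract.
CONST, AGE1, AGE2, COH1, COH2 = -7.03, 3.31, -1.10, 0.61, -0.12

def incidence_rate(x: float, t: float) -> float:
    """Inverse-logit of the age-cohort polynomial. x and t are assumed
    to be MIAMOD's transformed age and period covariates."""
    c = t - x  # cohort term
    eta = CONST + AGE1 * x + AGE2 * x**2 + COH1 * c + COH2 * c**2
    return 1.0 / (1.0 + math.exp(-eta))  # incidence as a probability per person
```

Multiplying the resulting per-person rate by the female population in each age stratum would reproduce case counts of the kind reported (662 in 1991, 1,152 in 2013).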

Keywords: breast cancer, incidence, cancer registries, castilla-la mancha

Procedia PDF Downloads 312
1542 Multimodal Discourse, Logic of the Analysis of Transmedia Strategies

Authors: Bianca Suárez Puerta

Abstract:

Multimodal discourse refers to a method of studying the media continuum between reality, screens as devices, audience, author, and media as a production of the audience. For this study we used the semantic differential, a method proposed in the sixties by Osgood, Suci, and Tannenbaum, which starts from the assumption that under each particular way of perceiving the world, in each singular idea, there is a common cultural meaning that organizes experiences. In relation to this shared symbolic dimension, the method has yielded significant results, as it breaks down the meaning of certain significant acts into series of statements that place the subjects in front of particular concepts. In Colombia, in 2016, a tool was designed to measure the meaning of multimodal productions, especially the acts of sense of transmedia productions that received funds from the Ministry of ICT of Colombia, and also to analyze predictable patterns in the calls and funds aimed at the production of culture in Colombia in the context of the peace agreement, understood as a request for expressions from a hegemonic position seeking to impose a worldview.

Keywords: semantic differential, semiotics, transmedia, critical analysis of discourse

Procedia PDF Downloads 207
1541 Efficacy of Agrobacterium Tumefaciens as a Possible Entomopathogenic Agent

Authors: Fouzia Qamar, Shahida Hasnain

Abstract:

The objective of the present study was to evaluate the possible role of Agrobacterium tumefaciens as an insect biocontrol agent. The pests selected for the present challenge were adult males of Periplaneta americana and last-instar larvae of Pieris brassicae and Spodoptera litura. Different ranges of bacterial doses were selected and tested, and insect mortalities were scored after 24 hours for the lethal dose estimation studies. The bacteria were inoculated by microinjection. Evaluation of the possible entomopathogenic attribute carried by the bacterial Ti plasmid led to the conclusion that loss of the plasmid was associated with loss of virulence against the target insects.
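Lethal dose estimation from dose-mortality counts of this kind is commonly done by fitting a logit (or probit) dose-response curve. The sketch below fits a logit model on log-dose by gradient ascent on the binomial log-likelihood and reads off the LD50; the dose-mortality counts are hypothetical, and the paper's actual estimation method is not given in the abstract.

```python
import numpy as np

def fit_ld50(log_doses, n_tested, n_dead, iters=20000, lr=0.1):
    """Fit mortality ~ logistic(b0 + b1 * log_dose) by gradient ascent
    on the binomial log-likelihood, then return LD50 = exp(-b0 / b1),
    the dose at which predicted mortality is 50%."""
    b0, b1 = 0.0, 0.0
    n_total = n_tested.sum()
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * log_doses)))
        resid = n_dead - n_tested * p          # score residuals
        b0 += lr * resid.sum() / n_total       # gradient step, intercept
        b1 += lr * (resid * log_doses).sum() / n_total  # gradient step, slope
    return np.exp(-b0 / b1)


# Hypothetical dose-mortality data: 20 insects per dose, mortality rising
# with dose, roughly 50% mortality near a dose of 4 units.
log_doses = np.log(np.array([1.0, 2.0, 4.0, 8.0, 16.0]))
n_tested = np.full(5, 20.0)
n_dead = np.array([1.0, 4.0, 10.0, 16.0, 19.0])
print(fit_ld50(log_doses, n_tested, n_dead))  # LD50 near 4
```

A probit link or a packaged GLM fit would be the more usual choice in practice; the hand-rolled ascent here just keeps the sketch self-contained.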

Keywords: agrobacterium tumefaciens, toxicity assessment, biopesticidal attribute, entomopathogenic agent

Procedia PDF Downloads 380
1540 Kinetic Studies on CO₂ Gasification of Low and High Ash Indian Coals in Context of Underground Coal Gasification

Authors: Geeta Kumari, Prabu Vairakannu

Abstract:

Underground coal gasification (UCG) is an efficient and economical in-situ clean coal technology that converts unmineable coals into gases of useful calorific value. The technology avoids the problems of ash disposal, coal mining, and storage. CO₂ can be a potential gasifying medium for UCG. CO₂ is a greenhouse gas, and its release to the atmosphere from thermal power plants contributes to global warming; hence, the capture and reutilization of CO₂ are crucial for clean energy production. However, the reactivity of high ash Indian coals with CO₂ needs to be assessed. In the present study, two varieties of Indian coal (low ash and high ash) are used for thermogravimetric analysis (TGA). Two low ash north-east Indian coals (LAC) and a typical high ash Indian coal (HAC) were procured from Indian coal mines: low ash coals with 9% ash (LAC-1) and 4% ash (LAC-2), and a high ash coal (HAC) with 42% ash. TGA studies are carried out to evaluate the activation energy for the pyrolysis and gasification of coal under N₂ and CO₂ atmospheres. The Coats and Redfern method is used to estimate the activation energy over different temperature regimes, assuming a volumetric reaction model. The inherent properties of the coals play a major role in their reactivity. The results show that the activation energy decreases with decreasing ash content, owing to the hindrance of the ash layer. The reverse trend is observed with volatile matter: coals with high volatile matter yield lower estimated activation energies. The activation energy under a CO₂ atmosphere at 400-600°C is lower than under an inert N₂ atmosphere; in this temperature range, a 15-23% reduction in activation energy under CO₂ is estimated. This indicates the reactivity of CO₂ with the higher hydrocarbons of the coal volatile matter, which might occur through dry reforming reactions in which CO₂ reacts with the higher hydrocarbons and aromatics of the tar content. The observed trend of Ea in the temperature ranges of 150-200°C and 400-600°C is HAC > LAC-1 > LAC-2 in both N₂ and CO₂ atmospheres. In the range 850-1000°C, higher activation energies are estimated than in the range 400-600°C: above 800°C, char gasification proceeds through the Boudouard reaction under a CO₂ atmosphere, and the activation energy increases by 8-20 kJ/mol relative to volatile matter pyrolysis at 400-600°C. The overall activation energy of the coals over 30-1000°C is higher under N₂ than under CO₂. It can be concluded that higher hydrocarbons such as tar effectively undergo cracking and reforming reactions in the presence of CO₂. Thus, CO₂ is beneficial for producing high calorific value syngas from high ash Indian coals.
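The Coats-Redfern estimation with the volumetric (first-order) model amounts to a linear fit of ln[-ln(1-α)/T²] against 1/T, whose slope is -Ea/R. A minimal sketch follows; in practice T and the conversion α would come from the TGA mass-loss curve over the chosen temperature regime, whereas the test data here are synthetic.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def coats_redfern_ea(T, alpha):
    """Estimate the activation energy (J/mol) by the Coats-Redfern method
    with the volumetric model g(alpha) = -ln(1 - alpha).
    T: temperatures in kelvin; alpha: conversion (0 < alpha < 1) from the
    TGA mass loss over the selected temperature range."""
    y = np.log(-np.log(1.0 - alpha) / T**2)  # Coats-Redfern ordinate
    x = 1.0 / T
    slope, _intercept = np.polyfit(x, y, 1)  # slope = -Ea / R
    return -slope * R
```

Fitting separate lines over, e.g., 400-600°C and 850-1000°C segments gives the regime-wise activation energies compared in the abstract.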

Keywords: clean coal technology, CO₂ gasification, activation energy, underground coal gasification

Procedia PDF Downloads 172
1539 Exploring Open Innovation Practice in Start-Ups within an Innovation Ecosystem

Authors: Yassine Mehros, Jean-Michel Degeorge, Abdelaziz Elabjani

Abstract:

Innovation has long been considered the key to the survival, development, and growth of companies. It is a process in which start-ups play a key role, but they suffer from a structural lack of resources, which hinders the development of new innovations and their commercialization. Using alternative channels to access resources is therefore becoming a necessity to overcome this constraint and identify opportunities. This is why start-ups can be part of large communities of interdependent actors, namely innovation ecosystems, which follow a logic of sharing and open innovation (OI). This research aims to explore and better understand OI in start-ups within an innovation ecosystem. We present an exploratory qualitative study of start-ups and other actors in the Saint-Étienne innovation ecosystem. Our paper explores the characteristics and main actors of the Saint-Étienne innovation ecosystem, focusing on start-ups. We identified the motivations of start-ups adopting OI, its difficulties, its risks, and its impact on their growth. Our results also show the existence of strong links between the different actors in the ecosystem. In addition, strong trust has been established between these actors thanks to geographical proximity; start-ups manage to get in touch with the different actors of their innovation ecosystem by practicing OI. The actors collaborate on projects involving companies and, in particular, start-ups.

Keywords: open innovation, start-ups, innovation ecosystem, actors

Procedia PDF Downloads 78