Search results for: parametric survival models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8161

6421 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models

Authors: Benbiao Song, Yan Gao, Zhuo Liu

Abstract:

Geostatistical modeling is a key technique for reservoir characterization, and the quality of geological models greatly influences the prediction of reservoir performance, yet few studies have quantified the factors that affect geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different levels of geological complexity, and six cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models using different methodologies, including SIS, object-based, and MPFS algorithms, combined with different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling conditions and parameter combination. In total, 5,760 simulations were run to quantify the relative contribution of each factor to simulation accuracy, and the results can serve as a strategy guide for facies modeling under similar conditions. It was found that data density, geological trend, and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach 90% when channel sand width reaches 1.5 times the well spacing, under any condition, with the SIS and MPFS methods. When well density is low, incorporating a geological trend may increase modeling accuracy from 40% to 70%, while using a proper variogram contributes very little for the SIS method. This implies that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model; when geobodies are complex and data are insufficient, it is better to construct a robust geological trend than to rely on a reliable variogram function. For the object-based method, modeling accuracy does not increase with data density as markedly as for the SIS method, but the models keep a rational appearance when data density is low.
The MPFS method shows a similar trend to the SIS method, but using a proper geological trend together with a rational variogram may achieve better modeling accuracy than the MPFS method. This implies that the geological modeling strategy for a real reservoir case needs to be optimized by evaluating the dataset, geological complexity, geological constraint information, and the modeling objective.
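The abstract does not define the "modeling accuracy ratio" explicitly; a minimal sketch of one plausible definition (the fraction of grid cells whose simulated facies code matches the prototype, averaged over realizations; the function name and signature are assumptions) could look like:

```python
import numpy as np

def accuracy_ratio(realizations, prototype):
    """Hypothetical accuracy ratio: fraction of grid cells whose
    simulated facies code matches the prototype model, averaged
    over the supplied realizations (the abstract averages ten)."""
    accs = [np.mean(np.asarray(sim) == np.asarray(prototype))
            for sim in realizations]
    return float(np.mean(accs))
```

Under this reading, each of the ten realizations per case contributes one match fraction, and their mean is the reported ratio for that factor combination.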

Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram

Procedia PDF Downloads 252
6420 Construction of Ovarian Cancer-on-Chip Model by 3D Bioprinting and Microfluidic Techniques

Authors: Zakaria Baka, Halima Alem

Abstract:

Cancer is a major worldwide health problem that caused around ten million deaths in 2020. In addition, efforts to develop new anti-cancer drugs still face a high failure rate, partly due to the lack of preclinical models that recapitulate in-vivo drug responses. Indeed, the conventional cell culture approach (known as 2D cell culture) is far from reproducing the complex, dynamic, and three-dimensional environment of tumors. To set up more in-vivo-like cancer models, 3D bioprinting is a promising technology due to its ability to produce 3D scaffolds containing different cell types with controlled distribution and precise architecture. Moreover, the introduction of microfluidic technology makes it possible to simulate in-vivo dynamic conditions through so-called “cancer-on-chip” platforms. Whereas several cancer types, such as lung cancer and breast cancer, have been modeled through the cancer-on-chip approach, only a few ovarian cancer models have been described. The aim of this work is to combine 3D bioprinting and microfluidic techniques to set up a 3D dynamic model of ovarian cancer. In the first phase, an alginate-gelatin hydrogel containing SKOV3 cells was used to produce tumor-like structures with an extrusion-based bioprinter. The desired form of the tumor-like mass was first designed in 3D CAD software. The hydrogel composition was then optimized to ensure good and reproducible printability. Cell viability in the bioprinted structures was assessed using the Live/Dead and WST1 assays. In the second phase, these bioprinted structures will be included in a microfluidic device that allows simultaneous testing of different drug concentrations. This microfluidic device was first designed through computational fluid dynamics (CFD) simulations to fix its precise dimensions, and was then manufactured by a molding method based on a 3D-printed template.
To confirm the results of the CFD simulations, doxorubicin (DOX) solutions were perfused through the device and the DOX concentration in each culture chamber was determined. Once fully characterized, this model will be used to assess the efficacy of anti-cancer nanoparticles developed at the Jean Lamour Institute.

Keywords: 3D bioprinting, ovarian cancer, cancer-on-chip models, microfluidic techniques

Procedia PDF Downloads 184
6419 The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models

Authors: Tony Mann

Abstract:

This paper sets out the case for involving and engaging employees, workers, stakeholders, and staff in any significant change being considered by the senior executives of an organization. It establishes the rationale, the approach, the methodology of engagement, and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people produces better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular, the Process Iceberg® Organizational Change model, which has proven instrumental in developing effective change. Its use in various forms is demonstrated, explaining why so much change fails to address the key elements and how change can be managed more productively. ‘Participation’ in change is too often seen as negative, expensive, and unwieldy. The paper aims to show that another model, UIA = O + E, can offset the difficulties and, in fact, produce much more positive and effective change.

Keywords: facilitation, stakeholders, buy-in, digital workshops

Procedia PDF Downloads 91
6418 Investigating the Use of Advanced Manufacturing Technologies in the Assembly Type Manufacturing Companies in Trinidad and Tobago

Authors: Nadine Sangster, Akil James, Rondell Duke, Aaron Ameerali, Terrence Lalla

Abstract:

The marketplace of the 21st century is evolving into one of merging national markets, fragmented consumer markets, and rapidly changing product technologies. The use of new technologies has become vital to the survival and sustainability of the manufacturing industry. This work focused on the assembly-type industry in a small developing country and aimed at identifying the use of advanced manufacturing technologies and their impact on this sector of the manufacturing industry. It was found that some technologies were being used and had improved the effectiveness of those companies, but there was still considerable room for improvement. The recommendations included benchmarking against international standards, the adoption of a “made in TT” campaign, and the effective utilisation of the technologies to improve manufacturing effectiveness and thus strengthen competitive advantages and strategies.

Keywords: advanced manufacturing technology, Trinidad and Tobago, manufacturing, industrial engineering

Procedia PDF Downloads 483
6417 Maintaining the Tension between the Classic Seduction Theory and the Role of Unconscious Fantasies

Authors: Galit Harel

Abstract:

This article describes the long-term psychoanalytic psychotherapy of a young woman who had experienced trauma during her childhood. The details of the trauma were unknown, as all memory of it had been repressed. Past trauma can be analyzed through the prism of transference, dreaming and dreams, mental states, and thinking processes, which offer an opportunity to explore and analyze the influence of both reality and fantasy on the patient. The presented case describes a therapeutic process that strives to discover hidden meanings through the unconscious system and illustrates the movement from unconscious to conscious during exploration of the patient’s personal trauma in treatment. The author discusses the importance of classical and contemporary psychoanalytic models of childhood sexual trauma through the discovery of manifest and latent content, unconscious fantasies, and actual events of trauma. It is suggested that the complexity of trauma is clarified by the tension between these models and by the inclusion of aspects of both for a complete understanding.

Keywords: dreams, psychoanalytic psychotherapy, thinking processes, transference, trauma

Procedia PDF Downloads 75
6416 Quality of the Ruin Probabilities Approximation Using the Regenerative Processes Approach regarding to Large Claims

Authors: Safia Hocine, Djamil Aïssani

Abstract:

Risk models recently studied in the literature are becoming increasingly complex, and it is rare to find explicit analytical relations for calculating the ruin probability. Indeed, the stability issue arises naturally in ruin theory, since the parameters of a risk model cannot be estimated without uncertainty. Moreover, in most cases there are no explicit formulas for the ruin probability; hence the interest in obtaining explicit stability bounds for these probabilities in different risk models. In this paper, we are interested in the stability bounds of the univariate classical risk model established using the regenerative processes approach. Adopting an algorithmic approach, we implement this approximation and numerically determine the bounds of the ruin probability in the case of large claims (heavy-tailed distributions).

Keywords: heavy-tailed distribution, large claims, regenerative process, risk model, ruin probability, stability

Procedia PDF Downloads 348
6415 Message Passing Neural Network (MPNN) Approach to Multiphase Diffusion in Reservoirs for Well Interconnection Assessments

Authors: Margarita Mayoral-Villa, J. Klapp, L. Di G. Sigalotti, J. E. V. Guzmán

Abstract:

Automated learning techniques are widely applied in the energy sector to address challenging problems from a practical point of view. To this end, we discuss the implementation of a message passing neural network (MPNN) within a graph neural network (GNN) framework to leverage the neighborhood of a set of nodes during the aggregation process. This approach enables the characterization of multiphase diffusion processes in the reservoir, such that the flow paths underlying the interconnections between multiple wells may be inferred from previously available data on flow rates and bottomhole pressures. The results thus obtained compare favorably with the predictions produced by reduced-order capacitance-resistance models (CRM) and suggest the potential of MPNNs to enhance the robustness of the forecasts while improving computational efficiency.

Keywords: multiphase diffusion, message passing neural network, well interconnection, interwell connectivity, graph neural network, capacitance-resistance models

Procedia PDF Downloads 134
6414 Daily Stress, Family Functioning, and Mental Health among Palestinian Couples in Israel During COVID-19: A Moderated Mediation Model

Authors: Niveen M. Hassan-Abbas

Abstract:

The COVID-19 pandemic created a range of stressors, among them difficulties related to work conditions, financial changes, lack of childcare, and confinement or isolation due to social distancing. Among families and married individuals, these stressors were often expressed in additional daily hassles, with an influence on mental health. This study examined two moderated mediation models based on Bodenmann’s systemic-transactional stress model. Specifically, the models tested the hypothesis that intra-dyadic stress mediates the association between extra-dyadic stress and mental health, while two measures of family functioning, cohesion and flexibility, moderate the relationship between extra- and intra-dyadic stress. Participants were 480 heterosexual married Palestinians from Israel who completed self-report questionnaires. The results showed partial mediation patterns supporting both models, indicating that family cohesion and flexibility weakened the mediating effect of intra-dyadic stress on the relationship between extra-dyadic stress and mental health. These findings increase our understanding of the variables that affected mental health during the pandemic and suggest that, when faced with extra-dyadic stress, married individuals with good family environments are less likely to experience high levels of intra-dyadic stress, which is in turn associated with preserved mental health. Limitations and implications for planning interventions for couples and families during the pandemic are discussed.

Keywords: Palestinian families in Israel, COVID-19 pandemic, family cohesion and flexibility, extra-dyadic stress, intra-dyadic stress, mental health

Procedia PDF Downloads 84
6413 Modeling the Cyclic Behavior of High Damping Rubber Bearings

Authors: Donatello Cardone

Abstract:

Bilinear hysteresis models are usually used to describe the cyclic behavior of high damping rubber bearings. However, they neglect a number of phenomena (such as the interaction between axial load and shear force, buckling and post-buckling behavior, cavitation, scragging effects, etc.) that can significantly influence the dynamic behavior of such isolation devices. In this work, an advanced hysteresis model is examined and properly calibrated using consolidated procedures. Results of preliminary numerical analyses, performed in OpenSees, are shown and compared with the results of experimental tests on high damping rubber bearings and with simulation analyses using alternative nonlinear models. The findings of this study can provide a useful tool for the accurate evaluation of the seismic response of structures with rubber-based isolation systems.

Keywords: seismic isolation, high damping rubber bearings, numerical modeling, axial-shear force interaction

Procedia PDF Downloads 116
6412 TNF-Kinoid® in Autoimmune Diseases

Authors: Yahia Massinissa, Melakhessou Med Akram, Mezahdia Mehdi, Marref Salah Eddine

Abstract:

Cytokines are natural proteins that act as true intercellular communication signals in immune and inflammatory responses. The signaling pathways activated by cytokines help regulate different functions in the target cell, causing its activation, proliferation, differentiation, survival, or death. It has been shown that malfunctioning of cytokine regulation, particularly over-expression, contributes to the onset and development of certain serious diseases such as chronic rheumatoid arthritis, Crohn's disease, psoriasis, and lupus. The mode of action of Kinoid® technology is based on the vaccine principle: the patient's immune system is activated so that it itself neutralizes the factor responsible for the disease. Applied specifically to autoimmune diseases, therapeutic vaccination allows the body to neutralize the overproduced cytokines (proteins) through highly targeted stimulation of the immune system.

Keywords: cytokines, Kinoid tech, auto-immune diseases, vaccination

Procedia PDF Downloads 322
6411 A Decision Support Framework for Introducing Business Intelligence to Midlands Based SMEs

Authors: Amritpal Slaich, Mark Elshaw

Abstract:

This paper explores the development of a decision support framework for the introduction of business intelligence (BI) through operational research techniques for application by SMEs. Aligned with the goals of the new Midlands Enterprise Initiative of improving the skill levels of the Midlands workforce and addressing high levels of regional unemployment, we have developed a framework to increase the level of business intelligence used by SMEs to improve business decision-making. Many SMEs in the Midlands fail due to a lack of high-quality decision-making. Our framework outlines how universities can: engage with SMEs in the use of BI through operational research techniques; develop appropriate and easy-to-use Excel spreadsheet models; and make use of a process that allows SMEs to feed back their findings from the models. Future work will determine how well the framework performs in getting SMEs to apply BI to improve their decision-making performance.

Keywords: SMEs, decision support framework, business intelligence, operational research techniques

Procedia PDF Downloads 454
6410 Planning a Supply Chain with Risk and Environmental Objectives

Authors: Ghanima Al-Sharrah, Haitham M. Lababidi, Yusuf I. Ali

Abstract:

The main objective of the current work is to introduce sustainability factors into the optimization of supply chain models for process industries. Supply chain models are normally based on purely economic considerations related to costs and profits. To account for sustainability, two additional factors have been introduced: environment and risk. A supply chain for an entire petroleum organization has been considered for implementing and testing the proposed optimization models. The environmental and risk factors were introduced as indicators reflecting the anticipated impact of the optimal production scenarios on sustainability. The aggregation method used to extend the single objective function to a multi-objective function proved quite effective in balancing the contribution of each objective term. The results indicate that introducing sustainability factors would slightly reduce the economic benefit while improving the environmental and risk-reduction performance of the process industries.

Keywords: environmental indicators, optimization, risk, supply chain

Procedia PDF Downloads 338
6409 A Unique Immunization Card for Early Detection of Retinoblastoma

Authors: Hiranmoyee Das

Abstract:

Aim: Due to late presentation and delayed diagnosis, the mortality rate of retinoblastoma is more than 50% in developing countries. To facilitate diagnosis, decrease the disease and treatment burden, and increase the survival rate, an attempt was made at early diagnosis of retinoblastoma by including fundus examination in routine immunization programs. Methods: A unique immunization card is followed in a tertiary health care center, where examination of the pupillary reflex is made mandatory at each visit of the child for routine immunization. In case of any abnormality, the child is referred to the ophthalmology department. Conclusion: Early detection is the key to the management of retinoblastoma. Every child is brought to the health care system at least five times before the age of 2 years for routine immunization. We should not miss this golden opportunity for early detection of retinoblastoma.

Keywords: retinoblastoma, immunization, unique, early

Procedia PDF Downloads 185
6408 Multidimensional Sports Spectators Segmentation and Social Media Marketing

Authors: B. Schmid, C. Kexel, E. Djafarova

Abstract:

Understanding consumers is elementary for practitioners in marketing. Consumers of sports events, the sports spectators, are a particularly complex consumer crowd. To identify and define their profiles, different segmentation approaches can be found in the literature, one of them being multidimensional segmentation. Unlike earlier models, multidimensional segmentation models correspond to the broad range of attitudes, behaviours, motivations, and beliefs of sports spectators. Moreover, in sports there are some well-researched disciplines (e.g., football or North American sports) where consumer profiles and marketing strategies are elaborate, and others where no research at all can be found; for example, there is almost no research on athletics spectators. This paper explores the current state of research on sports spectator segmentation. An in-depth literature review provides the framework for a spectator segmentation in athletics. On this basis, additional potential consumer groups and implications for social media marketing will be explored. The findings are the basis for further research.

Keywords: multidimensional segmentation, social media, sports marketing, sports spectators segmentation

Procedia PDF Downloads 295
6407 Prediction of Solidification Behavior of Al Alloy in a Cube Mold Cavity

Authors: N. P. Yadav, Deepti Verma

Abstract:

This paper focuses on mathematical modeling of the solidification of an Al alloy in a cube mold cavity to study the solidification behavior of the casting process. A parametric investigation of the solidification process inside the cavity was performed using a computational solidification/melting model coupled with a volume of fluid (VOF) model. An implicit filling algorithm is used in this study to follow the overall process from the filling stage to solidification in a model metal casting process. The model is validated against past studies under the same conditions. The solidification process is analyzed by including the effects of pouring velocity and temperature of the liquid metal, the effect of wall temperature as well as natural convection from the wall, and the geometry of the cavity. These studies show the possibility of various defects arising during the solidification process.

Keywords: buoyancy driven flow, natural convection driven flow, residual flow, secondary flow, volume of fluid

Procedia PDF Downloads 410
6406 Gene Names Identity Recognition Using Siamese Network for Biomedical Publications

Authors: Micheal Olaolu Arowolo, Muhammad Azam, Fei He, Mihail Popescu, Dong Xu

Abstract:

As the quantity of biological articles rises, so does the number of biological pathway figures. Each pathway figure shows gene names and relationships, and annotating pathway diagrams manually is time-consuming. Advanced image understanding models could speed up curation, but they need to become more precise. Biological pathway figures contain rich information, and the first step in understanding these figures is to recognize gene names automatically. Classical optical character recognition methods have been employed for gene name recognition, but they are not optimized for literature-mining data. This study devised a method to recognize the bounding box of a gene name in an image using deep Siamese neural network models built on ResNet, DenseNet, and Inception architectures, outperforming existing methods; the results reached about 84% accuracy.

Keywords: biological pathway, gene identification, object detection, Siamese network

Procedia PDF Downloads 269
6405 A 7 Dimensional-Quantitative Structure-Activity Relationship Approach Combining Quantum Mechanics Based Grid and Solvation Models to Predict Hotspots and Kinetic Properties of Mutated Enzymes: An Enzyme Engineering Perspective

Authors: R. Pravin Kumar, L. Roopa

Abstract:

Enzymes are molecular machines used in various industries such as pharmaceuticals, cosmetics, food and animal feed, paper and leather processing, and biofuel. Nevertheless, this has been possible only through the breath-taking efforts of chemists and biologists to evolve and engineer these mysterious biomolecules to do what is needed. The main agenda of this enzyme engineering project is to derive screening and selection tools to obtain focused libraries of enzyme variants with desired qualities. The methodologies for this research include the well-established directed evolution, rational redesign, and the relatively less established yet much faster and more accurate in-silico methods. This concept was initiated as a Receptor Dependent 4-Dimensional Quantitative Structure-Activity Relationship (RD-4D-QSAR) approach to predict kinetic properties of enzymes and is extended here to study a transaminase by a 7D-QSAR approach. Induced-fit scenarios were explored using quantum mechanics/molecular mechanics (QM/MM) simulations, which were then placed in a grid that stores interaction energies derived from QM parameters (QMgrid). In this study, the mutated enzymes were immersed completely inside the QMgrid, which was combined with solvation models to predict descriptors. After statistical screening of descriptors, the QSAR models showed > 90% specificity and > 85% sensitivity towards the experimental activity. Mapping descriptors onto the enzyme structure revealed hotspots important for enhancing the enantioselectivity of the enzyme.

Keywords: QMgrid, QM/MM simulations, RD-4D-QSAR, transaminase

Procedia PDF Downloads 128
6404 The Use of Thermal Infrared Wavelengths to Determine the Volcanic Soils

Authors: Levent Basayigit, Mert Dedeoglu, Fadime Ozogul

Abstract:

In this study, an application was carried out to determine volcanic soils using remote sensing. The study area was located on the Golcuk formation in Isparta, Turkey. The thermal bands of a Landsat 7 image were used for processing. A climate model based on the water index was implemented in ERDAS Imagine software together with pixel-based image classification. The Soil Moisture Index (SMI) was modeled using the surface temperature (Ts) obtained from the thermal bands and a vegetation index (NDVI) derived from Landsat 7. Surface moisture values were grouped and classified using a scoring system, and the thematic layers were compared with field studies. Consequently, the different moisture levels of volcanic soils served as indicators for their determination and separation, and these thermal wavelengths are preferable bands for separating volcanic soils using moisture and temperature models.
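The abstract does not give the exact SMI formulation; a common Ts-NDVI "triangle" construction (assumed here, with hypothetical function and variable names) fits linear dry and wet edges to the Ts-NDVI scatter and scales each pixel's temperature between them:

```python
import numpy as np

def soil_moisture_index(ts, ndvi, n_bins=20):
    """Sketch of a Ts/NDVI 'triangle' Soil Moisture Index (0 = dry, 1 = wet).

    Per NDVI bin, the warmest pixels trace the dry edge and the coolest
    the wet edge; lines are fitted to both edges and each pixel's Ts is
    rescaled between them. This formulation is an assumption, not taken
    from the study itself.
    """
    ts = np.asarray(ts, dtype=float).ravel()
    ndvi = np.asarray(ndvi, dtype=float).ravel()
    edges = np.linspace(ndvi.min(), ndvi.max(), n_bins + 1)
    mids, t_dry, t_wet = [], [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (ndvi >= lo) & (ndvi <= hi)
        if m.any():
            mids.append(0.5 * (lo + hi))
            t_dry.append(ts[m].max())   # warmest pixels: dry edge
            t_wet.append(ts[m].min())   # coolest pixels: wet edge
    a_d, b_d = np.polyfit(mids, t_dry, 1)   # dry edge: Ts = a_d*NDVI + b_d
    a_w, b_w = np.polyfit(mids, t_wet, 1)   # wet edge
    ts_dry = a_d * ndvi + b_d
    ts_wet = a_w * ndvi + b_w
    smi = (ts_dry - ts) / np.maximum(ts_dry - ts_wet, 1e-9)
    return np.clip(smi, 0.0, 1.0)
```

The resulting per-pixel values could then be grouped by a scoring system into the moisture classes the abstract describes.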

Keywords: Landsat 7, soil moisture index, temperature models, volcanic soils

Procedia PDF Downloads 292
6403 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available for modelling data collected with reference to location in space, from the classical spatial econometrics approaches to the recent developments in spatial econometrics for count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modelling and analysis of spatial data, in order to look for possible new directions in the processing of count data in a spatial hierarchical Bayesian econometric context.

Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data

Procedia PDF Downloads 580
6402 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluation of the goodness-of-fit and the comparison of several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, the data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice (once for model estimation and once for testing), a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method used for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive variant among the CV methods, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to the exact LOO-CV and utilise the existing MCMC results, avoiding expensive computation. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV for each observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly; in contrast, the larger weights are replaced by their truncated or smoothed counterparts in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV cannot reflect goodness-of-fit in an absolute sense, the differences between them can be used to measure the relative performance of the models of interest.
However, the use of these measures is valid only under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of the exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of the pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for the models, conditional on equal posterior variances in the lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, together with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 382
6401 Building a Blockchain-based Internet of Things

Authors: Rob van den Dam

Abstract:

Today’s Internet of Things (IoT) comprises more than a billion intelligent devices, connected via wired/wireless communications. The expected proliferation of hundreds of billions more places us at the threshold of a transformation sweeping across the communications industry. Yet, we found that the IoT architecture and solutions that currently work for billions of devices won’t necessarily scale to tomorrow’s hundreds of billions of devices because of high cost, lack of privacy, lack of future-proofing, lack of functional value, and broken business models. As the IoT scales exponentially, decentralized networks have the potential to reduce infrastructure and maintenance costs for manufacturers. Decentralization also promises increased robustness by removing the single points of failure that can exist in traditional centralized networks. By shifting the power in the network from the center to the edges, devices gain greater autonomy and can become points of transactions and economic value creation for owners and users. To validate the underlying technology vision, IBM jointly developed with Samsung Electronics an autonomous decentralized peer-to-peer proof-of-concept (PoC). The primary objective of this PoC was to establish a foundation on which to demonstrate several capabilities that are fundamental to building a decentralized IoT. Though many commercial systems in the future will exist as hybrid centralized-decentralized models, the PoC demonstrated a fully distributed proof. The PoC (a) validated the future vision for decentralized systems to extensively augment today’s centralized solutions, (b) demonstrated foundational IoT tasks without the use of centralized control, and (c) proved that empowered devices can engage autonomously in marketplace transactions.
The PoC opens the door for the communications and electronics industry to further explore the challenges and opportunities of potential hybrid models that can address the complexity and variety of requirements posed by the internet that continues to scale. Contents: (a) The new approach for an IoT that will be secure and scalable, (b) The three foundational technologies that are key for the future IoT, (c) The related business models and user experiences, (d) How such an IoT will create an 'Economy of Things', (e) The role of users, devices, and industries in the IoT future, (f) The winners in the IoT economy.

Keywords: IoT, internet, wired, wireless

Procedia PDF Downloads 327
6400 Comprehensive Study of Probability Distributions to Enhance Controllability Simulations to Introduce Autonomous Emergency Braking (AEBS) Feature in India

Authors: Nedunuri Kartheek, Mattupalli Chandra Sekhar

Abstract:

India is a diverse country in terms of road conditions, road maintenance, traffic conditions, traffic density, and traffic quality, which implies the presence of agricultural tractors, bullock carts, three-wheelers, motorbikes, oncoming traffic in the same lane, vulnerable road users (VRUs) crossing roads without using pedestrian crossings, etc., as additional traffic quality deterrents in comparison to developed countries. The driving pattern of such varied road users may not be on par with the global approximations adopted in developing features like AEBS (Autonomous Emergency Braking). For developing a complex feature like AEBS for Indian traffic conditions, one must adopt methodologies different from the conventions that exist as global practice. The paper provides reaction time and time gap data for Indian roads across various categories of vehicles. The paper deals with the mathematical approximation of different bivariate probability models to closely represent the data, which was acquired by collecting and analyzing actual vehicle data sampled at random on Indian roads. A case study demonstrating the adoption of different probability models based on Monte Carlo simulations is provided to calculate controllability by identifying a better fit for simulating the driving pattern of Indian road users.
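The Monte Carlo controllability idea described above can be sketched as follows: sample driver reaction times and available time gaps from fitted probability distributions, and estimate the fraction of cases in which the driver can react in time. The distribution families and every parameter below are illustrative placeholders, not values measured on Indian roads:

```python
import random

def controllability(n_samples=100_000, seed=42):
    """Monte Carlo estimate of the share of drivers who can react in time.

    Reaction times are drawn from a log-normal distribution and time gaps
    from a gamma distribution; both families and all parameters here are
    hypothetical stand-ins for distributions fitted to real road data.
    """
    rng = random.Random(seed)
    controllable = 0
    for _ in range(n_samples):
        reaction = rng.lognormvariate(0.0, 0.35)   # median ~1.0 s (assumed)
        time_gap = rng.gammavariate(4.0, 0.5)      # mean ~2.0 s (assumed)
        if reaction < time_gap:
            controllable += 1
    return controllable / n_samples
```

In practice, the fitted marginal (or bivariate) distributions would replace the placeholder parameters, and the resulting controllability estimate feeds the AEBS feasibility assessment.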

Keywords: autonomous emergency braking, Monte Carlo simulations, bivariate probability models, vulnerable road users

Procedia PDF Downloads 13
6399 Induction Machine Design Method for Aerospace Starter/Generator Applications and Parametric FE Analysis

Authors: Wang Shuai, Su Rong, K. J. Tseng, V. Viswanathan, S. Ramakrishna

Abstract:

The More-Electric-Aircraft concept in the aircraft industry places increasing demands on embedded starter/generators (ESG). The high-speed and high-temperature environment within an engine poses great challenges to the operation of such machines. In view of these challenges, squirrel cage induction machines (SCIM) have shown advantages due to their simple rotor structure, absence of temperature-sensitive components, and low torque ripple. The tight operating constraints arising from typical ESG applications, together with the detailed operating principles of SCIMs, have been exploited to derive a mathematical interpretation of the ESG-SCIM design process. The resultant non-linear mathematical treatment yielded a unique solution to the SCIM design problem for each configuration of pole pair number p, slots/pole/phase q, and conductors/slot zq, easily implemented via loop patterns. It was also found that not all configurations lead to feasible solutions, and the corresponding observations have been elaborated. The developed mathematical procedures also provided an effective framework for optimization among electromagnetic, thermal, and mechanical aspects by allocating corresponding degree-of-freedom variables. Detailed 3D FEM analysis has been conducted to validate the resultant machine performance against the design specifications. To obtain higher power ratings, electrical machines often have to increase the slot area to accommodate more windings. Since the available space for embedding such machines inside an engine is usually short in length, an axial air-gap arrangement appears more appealing than its radial-gap counterpart. The aforementioned approach has been adopted in case studies designing series of AFIMs and RFIMs with increasing power ratings, yielding the following observations. Under the strict rotor diameter limitation, the AFIM extended axially to gain slot area, while the RFIM expanded radially at the same axial length.
Beyond certain power ratings, the AFIM led to a long cylindrical geometry, while the RFIM topology retained the desired short disk shape. Besides the different dimension growth patterns, AFIMs and RFIMs also exhibited dissimilar performance degradation in power factor, torque ripple, and rated slip as power ratings increased. Parametric response curves were plotted to better illustrate these influences of increased power ratings. The case studies may provide a basic guideline to assist potential users in deciding between AFIM and RFIM for relevant applications.
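The loop-pattern enumeration over (p, q, zq) configurations mentioned in the abstract can be illustrated with a simplified sketch. The slot-count relation used here is the standard Ns = 2pqm for an integral-slot three-phase winding; the slot limit and the even conductors-per-slot constraint are illustrative feasibility checks, not the paper's full non-linear design procedure:

```python
def enumerate_scim_configs(max_p=4, max_q=4, max_zq=20, stator_slot_limit=72):
    """Enumerate candidate (p, q, zq) stator configurations for a 3-phase SCIM.

    p: pole pair number, q: slots/pole/phase, zq: conductors/slot.
    Ns = 2 * p * q * m for m = 3 phases. The limit on Ns stands in for the
    rotor-diameter constraint; real feasibility checks would also cover
    electromagnetic, thermal, and mechanical criteria.
    """
    m = 3
    configs = []
    for p in range(1, max_p + 1):
        for q in range(1, max_q + 1):
            ns = 2 * p * q * m
            if ns > stator_slot_limit:
                continue  # too many slots for the available diameter
            for zq in range(2, max_zq + 1, 2):  # double-layer winding: even zq
                configs.append((p, q, zq))
    return configs
```

Each surviving tuple would then be evaluated against the design specifications (e.g., by FEM) before a final configuration is selected.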

Keywords: axial flux induction machine, electrical starter/generator, finite element analysis, squirrel cage induction machine

Procedia PDF Downloads 448
6398 Forecasting Container Throughput: Using Aggregate or Terminal-Specific Data?

Authors: Gu Pang, Bartosz Gebka

Abstract:

We forecast the demand for total container throughput at Indonesia’s largest seaport, Tanjung Priok Port. We propose four univariate forecasting models: SARIMA, the additive Seasonal Holt-Winters, the multiplicative Seasonal Holt-Winters, and the Vector Error Correction Model. Our aim is to provide insights into whether forecasting the total container throughput from the historical aggregated port throughput time series is superior to forecasts of the total throughput obtained by summing up the best individual terminal forecasts. We test the monthly port and individual terminal container throughput time series between 2003 and 2013. The performance of the forecasting models is evaluated based on Mean Absolute Error and Root Mean Squared Error. Our results show that the multiplicative Seasonal Holt-Winters model produces the most accurate forecasts of total container throughput, whereas SARIMA generates the worst in-sample model fit. The Vector Error Correction Model provides the best model fits and forecasts for individual terminals. Our results show that total container throughput forecasts based on modelling the total throughput time series are consistently better than those obtained by combining the forecasts generated by terminal-specific models. The forecasts of total throughput until the end of 2018 provide essential insight into strategic decision-making on the expansion of the port’s capacity and the construction of new container terminals at Tanjung Priok Port.
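The best-performing model for total throughput, the multiplicative Seasonal Holt-Winters, can be sketched in a few lines. This is a minimal textbook implementation with an additive trend; the smoothing parameters below are illustrative, whereas in practice they are chosen by minimising in-sample error:

```python
def holt_winters_multiplicative(y, m, alpha=0.3, beta=0.1, gamma=0.2, h=12):
    """Multiplicative Seasonal Holt-Winters forecast (additive trend).

    y: observed series (e.g. monthly container throughput, all positive),
    m: season length (12 for monthly data), h: forecast horizon.
    """
    # Initialise level, trend, and seasonal indices from the first two seasons.
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] / level for i in range(m)]
    for t in range(len(y)):
        s = season[t % m]
        last_level = level
        level = alpha * y[t] / s + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * y[t] / level + (1 - gamma) * s
    # h-step-ahead forecasts: extrapolated trend times the seasonal index
    n = len(y)
    return [(level + (k + 1) * trend) * season[(n + k) % m] for k in range(h)]
```

The additive variant replaces the ratios y[t] / s and y[t] / level with differences; comparing the two on held-out months is how the relative accuracy reported above would be established.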

Keywords: SARIMA, Seasonal Holt-Winters, Vector Error Correction Model, container throughput

Procedia PDF Downloads 494
6397 Predicting Resistance of Commonly Used Antimicrobials in Urinary Tract Infections: A Decision Tree Analysis

Authors: Meera Tandan, Mohan Timilsina, Martin Cormican, Akke Vellinga

Abstract:

Background: In general practice, many infections are treated empirically without microbiological confirmation. Understanding the susceptibility of antimicrobials during empirical prescribing can help reduce inappropriate prescribing. This study aims to apply a prediction model using a decision tree approach to predict antimicrobial resistance (AMR) in urinary tract infections (UTI) based on non-clinical features of patients over 65 years. Decision tree models are a novel approach for predicting AMR at an initial stage. Method: Data were extracted from the database of the microbiological laboratory of the University Hospitals Galway on all antimicrobial susceptibility testing (AST) of urine specimens from patients over the age of 65 from January 2011 to December 2014. The primary endpoint was resistance to common antimicrobials used to treat UTI (nitrofurantoin, trimethoprim, ciprofloxacin, co-amoxiclav, and amoxicillin). A classification and regression tree (CART) model was generated with the outcome ‘resistant infection’. The importance of each predictor (the number of previous samples, age, gender, location (nursing home, hospital, community), and causative agent) for antimicrobial resistance was estimated. Sensitivity, specificity, negative predictive value (NPV), and positive predictive value (PPV) were used to evaluate the performance of the model. Seventy-five percent (75%) of the data were used as a training set, and validation of the model was performed with the remaining 25% of the dataset. Results: A total of 9805 UTI patients over 65 years had a urine sample submitted for AST at least once over the four years. E. coli, Klebsiella, and Proteus species were the most commonly identified pathogens among UTI patients without a catheter, whereas Serratia, Staphylococcus aureus, and Enterobacter were common among patients with a catheter.
The validated CART model shows slight differences in sensitivity, specificity, PPV, and NPV between the models with and without the causative organism. The sensitivity, specificity, PPV, and NPV for the model with non-clinical predictors were between 74% and 88%, depending on the antimicrobial. Conclusion: The CART models developed using non-clinical predictors perform well when predicting antimicrobial resistance. These models predict which antimicrobial may be the most appropriate based on non-clinical factors. Further CART models, prospective data collection and validation, and a larger number of non-clinical factors will improve model performance. The presented model provides an alternative approach to decision-making on antimicrobial prescribing for UTIs in older patients.
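At its core, CART greedily selects the split that most reduces impurity in the resistance labels. The sketch below shows one such step for a single categorical non-clinical feature using the Gini criterion; the feature values and labels are invented for illustration, and a full CART implementation would recurse on both branches:

```python
def gini(labels):
    """Gini impurity of a list of 0/1 resistance labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_categorical_split(rows, feature_idx, labels):
    """Find the category of one non-clinical feature that best separates
    resistant (1) from susceptible (0) isolates, CART-style.

    rows: list of feature tuples, e.g. (location, gender, n_prev_samples).
    Returns (category, weighted impurity after the split); a split is kept
    only if its weighted impurity beats the unsplit impurity.
    """
    n = len(rows)
    best = (None, gini(labels))  # baseline: no split
    for cat in set(r[feature_idx] for r in rows):
        left = [labels[i] for i in range(n) if rows[i][feature_idx] == cat]
        right = [labels[i] for i in range(n) if rows[i][feature_idx] != cat]
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best[1]:
            best = (cat, score)
    return best
```

Recursing this step and pruning on the 25% validation set yields the kind of tree whose sensitivity and specificity are reported above.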

Keywords: antimicrobial resistance, urinary tract infection, prediction, decision tree

Procedia PDF Downloads 245
6396 Variability Management of Contextual Feature Model in Multi-Software Product Line

Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz

Abstract:

The Software Product Line (SPL) paradigm is used for the development of a family of software products that share common and variable features. The feature model is a domain artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs, such as those for mobile phones and tablets, contain a number of similar common and variable features. Reusing common and variable features across different SPL domains is a complex task due to the external relationships and constraints among features in the feature model. To increase the reusability of feature model resources from domain engineering, the commonality of features must be managed at the level of SPL application development. In this research, we propose an approach that combines multiple SPLs into a single domain and converts them into a common feature model. Extracting the common features from different feature models is more effective and reduces the cost and time to market of application development. To extract features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints. This approach increases the reusability of features across multiple feature models. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
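The three-step merge described above can be sketched with a toy data representation. Here each feature model is a dict of feature names and constraint triples; this simplified structure and the example features are assumptions for illustration, not the paper's actual notation:

```python
def merge_feature_models(models):
    """Combine several SPL feature models into one common feature model.

    Each model is a dict: {"features": set of feature names,
    "constraints": set of (feature_a, relation, feature_b) tuples}.
    Features present in every model are common; the remainder form the
    variation points (step 1); constraints are pooled (step 2); both are
    returned in a single merged model (step 3).
    """
    all_features = [m["features"] for m in models]
    common = set.intersection(*all_features)             # shared across SPLs
    variation_points = set.union(*all_features) - common # variable features
    constraints = set.union(*(m["constraints"] for m in models))
    return {"common": common,
            "variation_points": variation_points,
            "constraints": constraints}
```

A real framework would also check the pooled constraints for contradictions (e.g., one SPL requiring a feature another excludes) before emitting the merged model.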

Keywords: software product line, feature model, variability management, multi-SPLs

Procedia PDF Downloads 58
6395 Modeling Football Penalty Shootouts: How Improving Individual Performance Affects Team Performance and the Fairness of the ABAB Sequence

Authors: Pablo Enrique Sartor Del Giudice

Abstract:

Penalty shootouts often decide the outcome of important soccer matches. Although usually referred to as 'lotteries', there is evidence that some national teams and clubs consistently perform better than others. The outcomes are therefore not explained by mere luck, and so there are ways to improve the average performance of players, naturally at the expense of some sort of effort. In this article, we study the payoff of improvements in individual player performance in terms of the performance of the team as a whole. To do so, we develop an analytical model with static individual performances, as well as Monte Carlo models that take into account the known influence of the partial score and round number on individual performances. We find that, within a range of usual values, team performance improves over 70% faster than individual performances do. Using these models, we also estimate that the new ABBA penalty shootout ordering under test reduces almost all of the known bias in favor of the first-shooting team under the current ABAB system.
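A minimal Monte Carlo sketch of the ordering comparison follows. Under purely static performances the kicking order is irrelevant, so the score-dependence is modeled here with a hypothetical "pressure" penalty for the trailing team's kicker; both the base probability and the penalty are illustrative assumptions, not the paper's fitted values, and sudden-death rounds are omitted (draws are simply discarded):

```python
import random

def shootout(order, p=0.76, pressure=0.08, rng=None):
    """Simulate one 5-round shootout under the given 10-kick order string.

    Both teams share base scoring probability p; a kicker whose team trails
    on aggregate scores with probability p - pressure (hypothetical).
    Returns 'A', 'B', or 'draw'.
    """
    rng = rng or random.Random()
    score = {"A": 0, "B": 0}
    for team in order:
        opponent = "B" if team == "A" else "A"
        prob = p - pressure if score[team] < score[opponent] else p
        if rng.random() < prob:
            score[team] += 1
    if score["A"] == score["B"]:
        return "draw"
    return "A" if score["A"] > score["B"] else "B"

def first_team_win_share(order, trials=40_000, seed=7):
    """Share of decided shootouts won by team A, the first kicker overall."""
    rng = random.Random(seed)
    results = [shootout(order, rng=rng) for _ in range(trials)]
    decided = [r for r in results if r != "draw"]
    return sum(1 for r in decided if r == "A") / len(decided)
```

Comparing `first_team_win_share("ABABABABAB")` against `first_team_win_share("ABBAABBAAB")` reproduces the qualitative finding: the ABAB sequence favors the first kicker because the second team's kickers trail more often, while alternating the first kicker per round pair (ABBA) shrinks that advantage.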

Keywords: football, penalty shootouts, Monte Carlo simulation, ABBA

Procedia PDF Downloads 152
6394 Revolving Ferrofluid Flow in Porous Medium with Rotating Disk

Authors: Paras Ram, Vikas Kumar

Abstract:

The transmission of malaria with seasonality was studied through the use of mathematical models. Data on the annual number of malaria cases reported to the Division of Epidemiology, Ministry of Public Health, Thailand during the period 1997-2011 were analyzed. The seasonal transmission of malaria was studied by formulating a mathematical model, which was modified to describe the different situations encountered in malaria transmission. In our model, the population was separated into two groups, humans and vectors, for which we constructed a system of nonlinear differential equations. The human group was divided into susceptible, infectious in the hot season, infectious in the rainy season, infectious in the cool season, and recovered classes. The vector population was separated into two classes only: susceptible and infectious vectors. The analysis of the models was given by standard dynamical modeling.
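A host-vector compartment model of this kind can be integrated numerically with a simple Euler scheme. In this sketch, the rates are illustrative placeholders rather than values fitted to the Thai surveillance data, and the three season-specific infectious classes are collapsed into a single class with a seasonally varying biting rate:

```python
import math

def simulate_malaria(days=365, dt=0.1):
    """Euler integration of a minimal host-vector malaria model with a
    seasonally varying biting rate. Returns the time series of the
    infectious human fraction I_h. All rates are hypothetical.
    """
    s_h, i_h, r_h = 0.99, 0.01, 0.0      # human fractions (sum to 1)
    s_v, i_v = 0.95, 0.05                # vector fractions (sum to 1)
    gamma_h = 1 / 14.0                   # human recovery rate (per day)
    mu_v = 1 / 10.0                      # vector death rate (per day)
    history = []
    for k in range(int(days / dt)):
        t = k * dt
        b = 0.3 * (1 + 0.5 * math.sin(2 * math.pi * t / 365))  # seasonal biting
        new_h = b * s_h * i_v            # humans newly infected by vectors
        new_v = b * s_v * i_h            # vectors newly infected by humans
        # derivatives at the current state, then one Euler step
        ds_h, di_h, dr_h = -new_h, new_h - gamma_h * i_h, gamma_h * i_h
        ds_v, di_v = mu_v * i_v - new_v, new_v - mu_v * i_v  # dead vectors reborn susceptible
        s_h += dt * ds_h; i_h += dt * di_h; r_h += dt * dr_h
        s_v += dt * ds_v; i_v += dt * di_v
        history.append(i_h)
    return history
```

The full model in the abstract would carry three separate infectious human classes, one per season, with transitions driven by the seasonal calendar rather than a single sinusoidal biting rate.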

Keywords: ferrofluid, magnetic field, porous medium, rotating disk, Neuringer-Rosensweig Model

Procedia PDF Downloads 411
6393 Contextual Toxicity Detection with Data Augmentation

Authors: Julia Ive, Lucia Specia

Abstract:

Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term for a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These previous studies have analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They have also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better datasets. In our study, we start by further analysing human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, in this case, previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the available contextual data does not provide sufficient evidence that context is indeed important (even for humans). This data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swear words, racist slurs, etc.), and thus context is not needed for a decision, or are ambiguous, vague, or unclear even in the presence of context; in addition, the data contains labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious (i.e., covert cases) without context, or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). On the contextual detection models, we posit that their poor performance is due to limitations of both the data they are trained on (the problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on this, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our models against previous ones on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure.

Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing

Procedia PDF Downloads 159
6392 Emancipation through the Inclusion of Civil Society in Contemporary Peacebuilding: A Case Study of Peacebuilding Efforts in Colombia

Authors: D. Romero Espitia

Abstract:

Research on peacebuilding has taken a critical turn toward examining the neoliberal and hegemonic conception of peace operations. Alternative peacebuilding models have been analyzed, but the scholarly discussion fails to bring them together or form connections between them. The objective of this paper is to rethink peacebuilding by extracting the positive aspects of the various peacebuilding models and connecting them with the local context, thereby promoting emancipation in contemporary peacebuilding efforts. Moreover, local ownership has been widely labelled as one of the core principles, if not the core principle, necessary for a successful peacebuilding project. Yet, definitions of what constitutes the 'local' remain debated. Through a qualitative review of the literature, this paper unpacks the contemporary conception of peacebuilding in nexus with 'local ownership' as manifested through civil society. Using Colombia as a case study, this paper argues that a new peacebuilding framework, one that reconsiders the terms of engagement between international and national actors, is needed in order to foster effective peacebuilding efforts in contested transitional states.

Keywords: civil society, Colombia, emancipation, peacebuilding

Procedia PDF Downloads 124