Search results for: multi-scale computational modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3719

1589 Coronavirus Anxiety and Job Burnout of Polish Front-Line Health-Care Workers. Mediation Effect of Insomnia

Authors: Lukasz Baka

Abstract:

Objective. The study aimed to investigate the direct and indirect (mediated through insomnia) effect of coronavirus anxiety on exhaustion from the perspective of Hobfoll's Conservation of Resources (COR) theory. According to COR theory, critical events (e.g. the coronavirus epidemic) make people fearful of losing their valuable resources. A prolonged state of anxiety may lead to sleep troubles, which, over time, result in an increase in exhaustion. Materials and Methods: Data were collected among 440 Polish healthcare providers, including nurses and midwives, doctors, paramedics, medical assistants, and wardens. Three measures were used: the Coronavirus Anxiety Scale (CAS), the Copenhagen Psychosocial Questionnaire (COPSOQ, sleep trouble subscale) and the Oldenburg Burnout Inventory (OLBI, exhaustion subscale). Hypotheses were tested using Structural Equation Modelling (SEM). Results: The obtained results fully support the hypotheses. Both direct and indirect relationships between coronavirus anxiety and exhaustion were observed. Specifically, high coronavirus anxiety increased insomnia, which in turn contributed to the development of exhaustion. Conclusion: The results are consistent with COR theory. Prolonged coronavirus anxiety and sleep problems depleted healthcare providers' resources and made them feel exhausted. Exhaustion among these workers can have serious consequences not only for themselves but also for the health of their patients; therefore, research into effective ways to deal with coronavirus anxiety is needed.
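The direct/indirect decomposition described above can be illustrated with a minimal Baron-Kenny-style sketch using ordinary least squares on synthetic data (not full SEM, and not the study's data; the variable names and effect sizes below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 440  # same sample size as the study; the data here are synthetic

# Simulated causal structure: anxiety -> insomnia -> exhaustion, plus a direct path
anxiety = rng.normal(size=n)
insomnia = 0.5 * anxiety + rng.normal(size=n)
exhaustion = 0.3 * anxiety + 0.4 * insomnia + rng.normal(size=n)

def ols(y, cols):
    """Least-squares coefficients [intercept, b1, b2, ...]."""
    X = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(insomnia, [anxiety])[1]                        # path anxiety -> insomnia
b, c_direct = ols(exhaustion, [insomnia, anxiety])[1:]  # insomnia -> exhaustion, and direct path
indirect = a * b            # mediated (indirect) effect of anxiety via insomnia
total = c_direct + indirect  # total effect of anxiety on exhaustion
```

A full SEM analysis would estimate both regressions simultaneously and provide standard errors for the indirect effect, but the product-of-paths logic is the same.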

Keywords: coronavirus anxiety, front-line health-care workers, insomnia, job burnout

Procedia PDF Downloads 182
1588 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways in which digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and of modelling meaning plurality is yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis, together with new IT-analysis approaches based on context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi-)automated processes and its corresponding epistemological reflection. Among discourse analysis approaches, the sociology of knowledge approach is characterised by reconstructive, accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg.
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 225
1587 Risk Analysis in Off-Site Construction Manufacturing in Small to Medium-Sized Projects

Authors: Atousa Khodadadyan, Ali Rostami

Abstract:

The objective of off-site construction manufacturing is to utilise the workforce and machinery in a controlled environment, without external interference, for higher productivity and quality. The use of prefabricated components can save up to 14% of the total energy consumption compared with the equivalent number of cast-in-place components. Despite the benefits of prefabricated construction, its current project practices encompass technical and managerial issues. Building design, precast component production, logistics, and prefabrication installation processes are still mostly disconnected and fragmented. Furthermore, collaboration among prefabrication manufacturers, transportation parties, and on-site assemblers relies on real-time information such as the status of precast components, delivery progress, and the location of components. From the technical point of view, geometric variability is still prevalent in this industry and can arise during the transportation or production of components. These issues indicate that there are still many aspects of prefabricated construction that can be improved using disruptive technologies. Practical real-time risk analysis can be used to address these issues as well as the management of safety, quality, and construction environment issues. On the other hand, the lack of research on risk assessment and the absence of standards and tools hinder risk management modelling in prefabricated construction. It is essential to note that no risk management standard has been established explicitly for prefabricated construction projects, and most software packages do not provide tailor-made functions for this type of project.

Keywords: project risk management, risk analysis, risk modelling, prefabricated construction projects

Procedia PDF Downloads 171
1586 Conceptual Design of Gravity Anchor Focusing on Anchor Towing and Lowering

Authors: Vinay Kumar Vanjakula, Frank Adam, Nils Goseberg

Abstract:

Wind power is one of the leading renewable energy generation methods. Due to the abundance of higher wind speeds far from shore, the construction of offshore wind turbines began in recent decades. However, the installation of bottom-fixed (monopile) offshore wind turbines in deep waters is often associated with technical and financial challenges. To overcome such challenges, the concept of floating wind turbines has been adopted from the oil and gas industry. This research work develops a universal heavyweight gravity anchor (UGA) for floating Tension Leg Platform (TLP) sub-structures. It is funded by the German Federal Ministry of Education and Research within a three-year (2019-2022) research program called “Offshore Wind Solutions Plus (OWSplus) - Floating Offshore Wind Solutions Mecklenburg-Vorpommern,” a consortium of German universities, laboratories, and consulting companies. This part of the project focuses on the numerical modeling of the gravity anchor, which involves analyzing and solving fluid flow problems. In contrast to gravity-based torpedo anchors, the UGA will be towed and lowered by controlled machines (tug boats) at lower speeds. This kind of UGA installation is new to the offshore wind industry, particularly for TLPs, and very few research works have been carried out on it in recent years. Conventional methods for transporting an anchor require a large crane vessel, which involves greater cost. The conceptual UGA consists of ballasting chambers that exploit buoyancy forces: the chambers are filled with just enough water that the anchor can float for towing. After reaching the installation site, the chambers are fully ballasted with water for lowering.
At the end of its lifetime, the UGA can be unballasted (for erection or replacement), resulting in self-rising to the sea surface; the buoyancy chambers thus allow a UGA to be recovered without heavy machinery. However, while being lowered towards or raised away from the seabed, the UGA experiences a harsh marine environment due to the interaction of waves and currents. This leads to drifting of the anchor from the desired installation position and damage to the lowering machines. To overcome such problems, a numerical model is built to investigate the influence of different outer contours and other flow-governing shapes that can be installed on the UGA to mitigate turbulence and drifting. The presentation will highlight the importance of the Computational Fluid Dynamics (CFD) numerical model in OpenFOAM, an open-source software package.
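The float-for-towing / ballast-to-sink concept above is pure hydrostatics and can be sketched as follows; the structure mass and displaced volume used in the example are assumed placeholder values, not project data:

```python
RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density

def ballast_water_to_sink(structure_mass_kg, displaced_volume_m3):
    """Minimum ballast water mass (kg) so the anchor's weight exceeds its
    buoyancy and it sinks for lowering. Illustrative statics only."""
    buoyant_mass = RHO_SEAWATER * displaced_volume_m3  # mass of displaced seawater
    return max(0.0, buoyant_mass - structure_mass_kg)

def floats(structure_mass_kg, ballast_kg, displaced_volume_m3):
    """True if the partially ballasted anchor still floats (towing condition)."""
    return structure_mass_kg + ballast_kg < RHO_SEAWATER * displaced_volume_m3

# Hypothetical 500 t structure displacing 600 m^3 floats when empty and
# needs 115 t of ballast water to begin sinking
need = ballast_water_to_sink(500_000.0, 600.0)
```

Dynamic effects from waves and currents, which the CFD model addresses, are deliberately outside this static balance.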

Keywords: anchor lowering, towing, waves, currents, computational fluid dynamics

Procedia PDF Downloads 165
1585 Energy Recovery Potential from Food Waste and Yard Waste in New York and Montréal

Authors: T. Malmir, U. Eicker

Abstract:

Landfilling of organic waste is still the predominant waste management method in the USA and Canada. Strategic plans for waste diversion from landfills are needed to increase material recovery and energy generation from waste. In this paper, we carried out a statistical survey of waste flows in the two cities of New York and Montréal and estimated the energy recovery potential for each case. Data collection and analysis of the organic waste (food waste, yard waste, etc.), paper and cardboard, metal, glass, plastic, carton, textile, electronic products, and other materials were based on the reports published by the Department of Sanitation in New York and the Service de l'Environnement in Montréal. In order to calculate the gas generation potential of the organic waste, the Buswell equation was used, in which the molar mass of the elements was calculated based on their atomic weights and the amount of organic waste in New York and Montréal. The higher and lower calorific values of the organic waste (solid basis) and the biogas (gas basis) were also calculated. According to the results, only 19% (598 kt) of New York's and 45% (415 kt) of Montréal's waste was diverted from landfills in 2017. The biogas generation potential of the generated food waste and yard waste amounted to 631 million m3 in New York and 173 million m3 in Montréal. The higher and lower calorific values of food waste were 3482 and 2792 GWh in New York and 441 and 354 GWh in Montréal, respectively. In the case of yard waste, they were 816 and 681 GWh in New York and 636 and 531 GWh in Montréal, respectively. Considering the higher calorific value, this would amount to a contribution of around 2.5% of the energy in these cities.
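The Buswell stoichiometry used above can be written as a short function; the glucose example below is only an illustration, not one of the paper's waste compositions:

```python
def buswell(a, b, c, d=0.0):
    """Moles of CH4 and CO2 per mole of substrate CaHbOcNd, from the Buswell
    equation: CaHbOcNd + (a - b/4 - c/2 + 3d/4) H2O ->
    (a/2 + b/8 - c/4 - 3d/8) CH4 + (a/2 - b/8 + c/4 + 3d/8) CO2 + d NH3."""
    ch4 = a / 2 + b / 8 - c / 4 - 3 * d / 8
    co2 = a / 2 - b / 8 + c / 4 + 3 * d / 8
    return ch4, co2

# Example: glucose C6H12O6 splits evenly into methane and carbon dioxide
ch4, co2 = buswell(6, 12, 6)
methane_fraction = ch4 / (ch4 + co2)
```

Multiplying the molar methane yield by the waste tonnage (via the substrate's molar mass) gives the biogas volumes reported in the abstract.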

Keywords: energy recovery, organic waste, urban energy modelling with INSEL, waste flow

Procedia PDF Downloads 135
1584 A Discrete Element Method Centrifuge Model of Monopile under Cyclic Lateral Loads

Authors: Nuo Duan, Yi Pik Cheng

Abstract:

This paper presents data from a series of two-dimensional Discrete Element Method (DEM) simulations of a large-diameter rigid monopile subjected to cyclic loading under a high gravitational force. At present, monopile foundations are widely used to support tall and heavy wind turbines, which are also subjected to significant loads from wind and wave actions. A safe design must address issues such as rotations and changes in soil stiffness under these loading conditions. Design guidance on the issue is limited, as is the availability of laboratory and field test data. The interpretation of these results in sand, such as the relation between loading and displacement, relies mainly on empirical correlations to pile properties. Regarding numerical models, most available data come from the Finite Element Method (FEM). They are not comprehensive, and most FEM results are sensitive to input parameters. Micro-scale behaviour could change the mechanism of the soil-structure interaction. A DEM model was used in this paper to study behaviour under cyclic lateral loads. A non-dimensional framework is presented and applied to interpret the simulation results. The DEM data compare well with various sets of published experimental centrifuge model test data in terms of lateral deflection. The accumulated permanent lateral pile displacements induced by the cyclic lateral loads were found to depend on the characteristics of the applied cyclic load, such as its magnitude and direction.

Keywords: cyclic loading, DEM, numerical modelling, sands

Procedia PDF Downloads 319
1583 An Optimization Model for Maximum Clique Problem Based on Semidefinite Programming

Authors: Derkaoui Orkia, Lehireche Ahmed

Abstract:

This article explores the potential of a powerful optimization technique, namely semidefinite programming, for solving NP-hard problems. This approach provides tight relaxations of combinatorial and quadratic problems. In this work, we solve the maximum clique problem using such a relaxation. The clique problem is the computational problem of finding cliques in a graph and is widely acknowledged for its many applications to real-world problems. We implement a primal-dual interior-point algorithm to solve the semidefinite relaxation of the problem, which can be solved in polynomial time. The numerical results show that the relaxation recovers a maximum clique for the tested instances.
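For context, exact maximum clique is NP-hard, which is what motivates polynomial-time SDP relaxations; on tiny graphs the exact answer can still be obtained by exhaustive search, as in this sketch (a reference baseline, not the paper's SDP method):

```python
from itertools import combinations

def max_clique(n, edges):
    """Exact maximum clique by exhaustive search over vertex subsets.
    Exponential time, so usable only on tiny graphs; larger instances are
    exactly where the polynomial-time SDP relaxation becomes attractive."""
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    for size in range(n, 0, -1):  # try largest subsets first
        for nodes in combinations(range(n), size):
            if all(v in adj[u] for u, v in combinations(nodes, 2)):
                return set(nodes)  # first clique found at this size is maximum
    return set()

# Triangle {0, 1, 2} plus a pendant vertex 3
clique = max_clique(4, [(0, 1), (1, 2), (0, 2), (2, 3)])
```

The SDP relaxation instead bounds the clique number from above (as in the Lovász theta function) and can then be rounded to an actual clique.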

Keywords: semidefinite programming, maximum clique problem, primal-dual interior point method, relaxation

Procedia PDF Downloads 219
1582 Modelling the Impacts of Geophysical Parameters on Deforestation and Forest Degradation in Pre and Post Ban Logging Periods in Hindu Kush Himalayas

Authors: Alam Zeb, Glen W. Armstrong, Muhammad Qasim

Abstract:

Loss of forest cover is one of the most important land cover changes and has been of great concern to policy makers. This study quantified forest cover changes over the pre logging ban (1973-1993) and post logging ban (1993-2015) periods to examine the role of geophysical factors and spatial attributes of land in the two periods. We show that despite a complete ban on green felling, forest cover decreased by 28% and was mostly converted to rangeland. Nevertheless, the logging ban was completely effective in controlling agriculture expansion. The binary logistic regression revealed that south-facing aspects at low elevation witnessed more deforestation in the pre-ban period compared to the post-ban period. In contrast to deforestation, forest degradation was more prominent on northern aspects at higher elevation during the policy period. Agriculture expansion was widespread in low-elevation flat areas with gentle slopes, while during the policy period agriculture contraction in the form of regeneration was observed in low-elevation areas on north-facing slopes. All proximity variables, except distance to the administrative boundary, showed a similar trend across the two periods and were important explanatory variables in understanding forest and agriculture expansion. The changes in the determinants of forest and agriculture expansion and contraction over the two periods might be attributed to the influence of policy and a general decrease in resource availability.
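Binary logistic regression of the kind used here relates a 0/1 outcome (e.g. deforested or not) to predictors such as elevation; a minimal from-scratch sketch on synthetic data (the study would use standard statistical software, and the coefficients below are invented for illustration):

```python
import numpy as np

def fit_logistic(X, y, lr=1.0, steps=5000):
    """Binary logistic regression fitted by gradient descent on the mean
    negative log-likelihood. Minimal illustrative implementation."""
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)       # gradient step
    return w

# Toy data: "deforested" is more likely at low (normalised) elevation,
# mimicking a south-facing low-elevation effect with true coefficients (2, -4)
rng = np.random.default_rng(1)
elev = rng.uniform(0.0, 1.0, 500)
y = (rng.uniform(size=500) < 1.0 / (1.0 + np.exp(-(2.0 - 4.0 * elev)))).astype(float)
w = fit_logistic(elev.reshape(-1, 1), y)  # w[1] recovers a negative elevation effect
```

The sign and magnitude of the fitted coefficients are what support statements such as "more deforestation at low elevation".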

Keywords: forest conservation, wood harvesting ban, logistic regression, deforestation, forest degradation, agriculture expansion, Chitral, Pakistan

Procedia PDF Downloads 228
1581 Integer Programming Model for the Network Design Problem with Facility Dependent Shortest Path Routing

Authors: Taehan Lee

Abstract:

We consider a network design problem with a shortest path routing restriction based on the values determined by the facilities installed on each arc. In the conventional multicommodity network design problem, a commodity can be routed through any possible path when capacity is available. In contrast, we consider a problem in which the commodity between two nodes must be routed on a path with the shortest metric value, where the link metric value is determined by the facilities installed on the link. This routing restriction gives the problem a distinct characteristic. We present an integer programming formulation containing the primal-dual optimality conditions for the shortest path routing. We give some computational results for the model.
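The routing restriction can be made concrete with a small sketch: each arc carries a facility, each facility induces a metric value, and commodities must follow shortest paths under that induced metric. The encoding below (facility ids, metric values) is a hypothetical illustration, not the paper's formulation:

```python
import heapq

def facility_shortest_dists(n, arcs, metric, src=0):
    """Dijkstra over a directed graph whose arc lengths are the metric values
    induced by the facility installed on each arc.
    arcs: {(u, v): facility_id}; metric: {facility_id: nonnegative length}."""
    adj = {}
    for (u, v), f in arcs.items():
        adj.setdefault(u, []).append((v, metric[f]))
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Installing the cheaper facility "A" on arcs (0,1) and (1,2) makes the two-hop
# path shorter than the direct arc (0,2), which carries facility "B"
dist = facility_shortest_dists(3, {(0, 1): "A", (1, 2): "A", (0, 2): "B"},
                               {"A": 1.0, "B": 3.0})
```

In the integer program, the facility choices are decision variables, so the shortest paths themselves depend on the design; the primal-dual optimality conditions enforce exactly this coupling.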

Keywords: integer programming, multicommodity network design, routing, shortest path

Procedia PDF Downloads 418
1580 An Improved Genetic Algorithm for Traveling Salesman Problem with Precedence Constraint

Authors: M. F. F. Ab Rashid, A. N. Mohd Rose, N. M. Z. Nik Mohamed, W. S. Wan Harun, S. A. Che Ghani

Abstract:

Traveling salesman problem with precedence constraints (TSPPC) is one of the most complex problems in combinatorial optimization. Existing algorithms for TSPPC require large computational times to find the optimal solution. The purpose of this paper is to present an efficient genetic algorithm that reaches the optimal solution in fewer generations and less iteration time. Unlike existing algorithms that encode a priority factor as the chromosome, the proposed algorithm directly generates a sequence of the solution as the chromosome. As a result, the proposed algorithm is capable of producing the optimal solution in a smaller number of generations and less iteration time compared to existing algorithms.
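A direct-sequence chromosome must respect the precedence constraints; one way to sample such a chromosome is to repeatedly pick a random "ready" city whose predecessors have all been placed. This is an illustrative sketch of that idea, not the paper's exact operator, and it assumes the precedence relation is acyclic:

```python
import random

def random_feasible_sequence(n, precedence, rng=None):
    """Sample a visiting sequence of cities 0..n-1 respecting precedence pairs
    (i, j) meaning 'i must be visited before j'. In the direct sequence
    encoding, such a permutation is one chromosome."""
    rng = rng or random.Random(42)
    preds = {j: set() for j in range(n)}
    for i, j in precedence:
        preds[j].add(i)
    done, seq = set(), []
    while len(seq) < n:
        # cities not yet placed whose predecessors are all already placed
        ready = [c for c in range(n) if c not in done and preds[c] <= done]
        city = rng.choice(ready)  # random choice keeps the population diverse
        done.add(city)
        seq.append(city)
    return seq

seq = random_feasible_sequence(6, [(0, 2), (1, 3), (3, 5)])
```

Crossover and mutation operators then only need to preserve this feasibility, avoiding the repair or decoding steps that priority-factor encodings require.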

Keywords: traveling salesman problem, sequencing, genetic algorithm, precedence constraint

Procedia PDF Downloads 557
1579 An Investigation of the Relationship Between Privacy Crisis, Public Discourse on Privacy, and Key Performance Indicators at Facebook (2004–2021)

Authors: Prajwal Eachempati, Laurent Muzellec, Ashish Kumar Jha

Abstract:

We use Facebook as a case study to investigate the complex relationship between a firm's public discourse (and actions) surrounding data privacy and the performance of a business model based on monetizing users' data. We do so by looking at the evolution of public discourse over time (2004–2021) and relating topics to revenue and stock market evolution, drawing from archival sources such as those of Zuckerberg. We use the LDA topic modelling algorithm to reveal 19 topics regrouped into 6 major themes. We first show how, by using persuasive and convincing language that promises better protection of consumer data usage but also emphasizes greater user control over their own data, the privacy issue is being reframed as one of greater user control and responsibility. Second, we aim to understand and put a value on the extent to which privacy disclosures have a potential impact on the financial performance of social media firms. We found a significant relationship between the topics pertaining to privacy and social media/technology, sentiment score, and stock market prices. Revenue is found to be impacted by topics pertaining to politics and to new product and service innovations, while the number of active users is not impacted by the topics unless moderated by external control variables such as Return on Assets and Brand Equity.

Keywords: public discourses, data protection, social media, privacy, topic modeling, business models, financial performance

Procedia PDF Downloads 92
1578 Review of Theories and Applications of Genetic Programing in Sediment Yield Modeling

Authors: Adesoji Tunbosun Jaiyeola, Josiah Adeyemo

Abstract:

Sediment yield can be considered the total sediment load that leaves a drainage basin. Knowledge of the quantity of sediment present in a river at a particular time can lead to better flood capacity in reservoirs and consequently help to control over-bank flooding. Furthermore, as sediment accumulates in a reservoir, the reservoir gradually loses its ability to store water for the purposes for which it was built. The development of hydrological models to forecast the quantity of sediment present in a reservoir helps planners and managers of water resources systems to understand the system better in terms of its problems and alternative ways to address them. The application of artificial intelligence models and techniques to such real-life situations has proven to be an effective approach to solving complex problems. This paper makes an extensive review of literature relevant to the theories and applications of evolutionary algorithms, most especially genetic programming. Successful applications of genetic programming as a soft computing technique in sediment modelling and other branches of knowledge are reviewed. Some fundamental issues such as benchmarks, generalization ability, bloat and over-fitting, and other open issues relating to the working principles of GP, which need to be addressed by the GP community, are also highlighted. This review aims to give GP theoreticians, researchers, and the wider GP community clear research directions and a valuable guide, and to keep all stakeholders abreast of the issues that need attention during the next decade for the advancement of GP.

Keywords: benchmark, bloat, generalization, genetic programming, over-fitting, sediment yield

Procedia PDF Downloads 446
1577 Reliability Analysis of Dam under Quicksand Condition

Authors: Manthan Patel, Vinit Ahlawat, Anshh Singh Claire, Pijush Samui

Abstract:

This paper focuses on the analysis of the quicksand condition for a dam foundation. The quicksand condition occurs in cohesionless soil when the effective stress of the soil becomes zero. In a dam, the saturated sediment may appear quite solid until a sudden change in pressure or a shock initiates liquefaction. This causes the sand to form a suspension and lose strength, resulting in failure of the dam. A soil profile shows different properties at different points, and the values obtained are uncertain; thus, reliability analysis is performed. Reliability is defined as the probability of safety of a system in a given environment and loading condition, and it is assessed as a reliability index. The reliability analysis of dams under the quicksand condition is carried out by Gaussian Process Regression (GPR). The reliability index and factor of safety relating to liquefaction of the soil are analysed using GPR. The results of the reliability analysis by GPR are compared to those of the conventional method, and it is demonstrated that applying GPR reduces the computational time and effort of the probabilistic analysis.
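The zero-effective-stress condition has a standard closed form: quicksand develops when the upward seepage gradient reaches the critical hydraulic gradient i_c = (Gs - 1) / (1 + e). A small sketch, with typical sand values assumed for the defaults (not the paper's site data):

```python
def critical_hydraulic_gradient(Gs=2.65, e=0.65):
    """i_c = (Gs - 1) / (1 + e): the upward seepage gradient at which the
    effective stress of a cohesionless soil drops to zero (quicksand).
    Gs: specific gravity of solids; e: void ratio (typical sand defaults)."""
    return (Gs - 1.0) / (1.0 + e)

def is_quick(i, Gs=2.65, e=0.65):
    """True if an applied upward hydraulic gradient i triggers quicksand."""
    return i >= critical_hydraulic_gradient(Gs, e)
```

In a reliability analysis, Gs, e, and the applied gradient become random variables, and the probability that `is_quick` holds is what the reliability index summarises.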

Keywords: factor of safety, GPR, reliability index, quicksand

Procedia PDF Downloads 480
1576 Numerical Investigation of Improved Aerodynamic Performance of a NACA 0015 Airfoil Using Synthetic Jet

Authors: K. Boualem, T. Yahiaoui, A. Azzi

Abstract:

Numerical investigations are performed to analyze the flow behaviour over a NACA 0015 airfoil and to evaluate the efficiency of a synthetic jet as an active control device. The second objective of this work is to investigate the influence of the momentum coefficient of the synthetic jet on the flow behaviour. The unsteady Reynolds-averaged Navier-Stokes equations of the turbulent flow are solved using the k-ω SST model provided by the ANSYS CFX CFD code. The model presented in this paper is a comprehensive representation of the information found in the literature. Comparison of the obtained numerical flow parameters with the experimental ones shows that the adopted computational procedure closely reflects the real flow nature. Numerical results also indicate that the use of synthetic jet devices has positive effects on flow separation and, thus, on the aerodynamic performance of the NACA 0015 airfoil. It is also observed that the synthetic jet increases the lift coefficient by about 13.3% and reduces the drag coefficient by about 52.7%.

Keywords: active control, synthetic jet, NACA airfoil, CFD

Procedia PDF Downloads 311
1575 Evaluating Mechanical Properties of CoNiCrAlY Coating from Miniature Specimen Testing at Elevated Temperature

Authors: W. Wen, G. Jackson, S. Maskill, D. G. McCartney, W. Sun

Abstract:

CoNiCrAlY alloys have been widely used as bond coats for thermal barrier coating (TBC) systems because of their low cost, improved control of composition, and the feasibility of tailoring the coating microstructures. Coatings are in general very thin structures, and it is therefore impossible to characterize the mechanical responses of these materials via conventional mechanical testing methods. For this reason, miniature specimen testing methods, such as the small punch test (SPT) technique, have been developed. This paper presents some of the recent research in evaluating the mechanical properties of CoNiCrAlY coatings at room and high temperatures through the use of small punch testing and a newly developed miniature specimen tensile testing method, applicable to a range of temperatures, to investigate the elastic-plastic and creep behavior as well as the ductile-brittle transition temperature (DBTT) behavior. An inverse procedure was developed to derive the mechanical properties of the coating materials from such tests. A two-layer specimen test method is also described. The key findings include: 1) the temperature-dependent coating properties can be accurately determined by miniature tensile testing within a wide range of temperatures; 2) consistent DBTTs can be identified by both the SPT and miniature tensile tests (~650 °C); and 3) the FE SPT modelling has shown a good capability of simulating the early local cracking. In general, the temperature-dependent material behaviors of the CoNiCrAlY coating have been effectively characterized using miniature specimen testing and the inverse method.

Keywords: CoNiCrAlY coatings, mechanical properties, DBTT, miniature specimen testing

Procedia PDF Downloads 168
1574 Knowledge Based Behaviour Modelling and Execution in Service Robotics

Authors: Suraj Nair, Aravindkumar Vijayalingam, Alexander Perzylo, Alois Knoll

Abstract:

In the last decade, robotics research and development activities have grown rapidly, especially in the domain of service robotics. Integrating service robots into human-occupied spaces such as homes, offices, and hospitals has received increasing attention. The primary motive is to ease the daily lives of humans by taking over some of the household/office chores. However, several challenges remain in systematically integrating such systems into human-shared workspaces. In addition to sensing and indoor navigation challenges, the programmability of such systems is a major hurdle, since the potential user cannot be expected to have knowledge of robotics or similar mechatronic systems. In this paper, we propose a cognitive system for service robotics which allows non-expert users to easily model system behaviour in an underspecified manner through abstract tasks and the objects associated with them. The system uses domain knowledge expressed in the form of an ontology, along with logical reasoning mechanisms, to infer all the missing pieces of information required for executing the tasks. Furthermore, the system is also capable of recovering from tasks that fail due to on-line disturbances by using the knowledge base to infer alternate methods of executing the same tasks. The system is demonstrated through a coffee-fetching scenario in an office environment, using a mobile robot equipped with sensors and software capabilities for autonomous navigation and human interaction through natural language.

Keywords: cognitive robotics, reasoning, service robotics, task based systems

Procedia PDF Downloads 242
1573 A Robust Software for Advanced Analysis of Space Steel Frames

Authors: Viet-Hung Truong, Seung-Eock Kim

Abstract:

This paper presents a robust software package for the practical advanced analysis of space steel framed structures. The pre- and post-processors of the presented software package are coded in C++, while the solver is written in FORTRAN. A user-friendly graphical interface is developed to facilitate the modeling process and the interpretation of results. The solver employs stability functions to capture second-order effects and to minimize modeling and computational time. Both plastic-hinge and fiber-hinge beam-column elements are available in the presented software. The generalized displacement control method is adopted to solve the nonlinear equilibrium equations.

Keywords: advanced analysis, beam-column, fiber-hinge, plastic hinge, steel frame

Procedia PDF Downloads 305
1572 Modelling the Tensile Behavior of Plasma Sprayed Freestanding Yttria Stabilized Zirconia Coatings

Authors: Supriya Patibanda, Xiaopeng Gong, Krishna N. Jonnalagadda, Ralph Abrahams

Abstract:

Yttria stabilized zirconia (YSZ) is used as a top coat in thermal barrier coatings in high-temperature turbine/jet engine applications. The mechanical behaviour of YSZ depends on microstructural features such as crack density and porosity, which are a result of the coating method. However, experimentally ascertaining their individual effects is difficult due to the inherent challenges involved, such as material synthesis and handling. The current work deals with the development of a phenomenological model to replicate the tensile behavior of air plasma sprayed YSZ obtained from experiments. Initially, uniaxial tensile experiments were performed on freestanding YSZ coatings ~300 µm thick for different crack densities and porosities. The coatings exhibited nonlinear behavior and a large variation in strength values. With the obtained experimental tensile curve as a base, and crack density and porosity as the prime variables, a phenomenological model was developed in ABAQUS with a new user material defined via a VUMAT subroutine. The relation between the tensile stress and the crack density was empirically established. Further, a parametric study was carried out to investigate the effect of the individual features on the non-linearity of these coatings. This work makes it possible to generate new coating designs by varying the key parameters and to predict the mechanical properties with the help of simulation, thereby minimizing experiments.

Keywords: crack density, finite element method, plasma sprayed coatings, VUMAT

Procedia PDF Downloads 147
1571 Evaluation of Quasi-Newton Strategy for Algorithmic Acceleration

Authors: T. Martini, J. M. Martínez

Abstract:

An algorithmic acceleration strategy based on quasi-Newton (or secant) methods is presented to address the practical problem of accelerating the convergence of the Newton-Lagrange method in the case of convergence to critical multipliers. Since the Newton-Lagrange iteration converges locally at a linear rate, it is natural to conjecture that quasi-Newton methods, based on the so-called secant equation and some minimal variation principle, could converge superlinearly, thus restoring the convergence properties of Newton's method. The strategy can also be applied to accelerate the convergence of algorithms for fixed-point problems. Computational experience is reported illustrating the efficiency of this strategy in solving fixed-point problems with a linear convergence rate.
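
A minimal sketch of the underlying idea (not the authors' implementation): a Steffensen-type secant acceleration of a linearly convergent fixed-point iteration x = g(x), here applied to the illustrative map g = cos:

```python
import math

def fixed_point(g, x0, tol=1e-12, max_iter=1000):
    """Plain Picard iteration x_{k+1} = g(x_k), linearly convergent."""
    x = x0
    for k in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

def secant_accelerated(g, x0, tol=1e-12, max_iter=1000):
    """Steffensen-type acceleration: two Picard steps build a secant model
    of g, which is then solved for the fixed point (superlinear locally)."""
    x = x0
    for k in range(max_iter):
        x1 = g(x)
        x2 = g(x1)
        denom = x2 - 2.0 * x1 + x
        # Fall back to a plain step if the secant model degenerates.
        x_new = x1 if abs(denom) < 1e-30 else x - (x1 - x) ** 2 / denom
        if abs(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

root_plain, n_plain = fixed_point(math.cos, 1.0)
root_fast, n_fast = secant_accelerated(math.cos, 1.0)
```

For cos, the plain iteration contracts at rate roughly 0.67 per step, while the accelerated version reaches the same tolerance in a handful of iterations, mirroring the restoration of fast convergence the abstract conjectures for the quasi-Newton variants.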

Keywords: algorithmic acceleration, fixed-point problems, nonlinear programming, quasi-Newton method

Procedia PDF Downloads 487
1570 Informing the Implementation of Career Conversations in Secondary Schools for the Building of Student Career Competencies: The Case of Portugal

Authors: Cristina Isabel de Oliveira Santos

Abstract:

The study aims to investigate how transferrable and effective career conversations could be in the context of general-track Portuguese secondary schools, with a view to improving students' career competencies. It does so by analysing: 1) the extent to which students' perceptions of career conversations relate to their existing career competencies, 2) the extent to which each of the two parameters (perceptions of career conversations and student career competencies) relates to student situational and personal characteristics, and 3) how patterns in the perceptions of headteachers and teachers at a school regarding the implementation of career conversations correlate with the views of students on career conversations and with school contextual characteristics. Data were collected from 27 of the 32 secondary schools in the district of Aveiro, Portugal. Interviews were performed individually with 27 headteachers, and in groups with a total of 10 teacher groups and 11 student groups. Survey responses were also collected from 742 students and 310 teachers. Interview responses were coded and analysed using grounded theory principles. Data from the questionnaires are currently being scrutinised through descriptive statistics with SPSS and Structural Equation Modelling (SEM). Triangulation during different stages of data analysis uses the principles of retroduction and abduction from the realist evaluation framework. Conclusions from the pilot study indicate that student perception scores on content and relationship in career conversations vary with their career competencies and the type of school. Statistically significant differences in perceptions of career conversations were found for subgroups based on gender and parental educational level.

Keywords: career conversations, career competencies, secondary education, teachers

Procedia PDF Downloads 140
1569 Three-Dimensional Off-Line Path Planning for Unmanned Aerial Vehicle Using Modified Particle Swarm Optimization

Authors: Lana Dalawr Jalal

Abstract:

This paper addresses the problem of offline path planning for Unmanned Aerial Vehicles (UAVs) in a complex three-dimensional environment with obstacles, modelled by a 3D Cartesian grid system. Path planning for UAVs requires computational intelligence methods to move aerial vehicles along the flight path effectively to the target while avoiding obstacles. In this paper, a Modified Particle Swarm Optimization (MPSO) algorithm is applied to generate the optimal collision-free 3D flight path for a UAV. The simulation results clearly demonstrate the effectiveness of the proposed algorithm in guiding the UAV to its final destination by providing an optimal, feasible path quickly and effectively.
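
A basic PSO (not the paper's modified variant) can be sketched for this setting: three free 3D waypoints between a fixed start and goal, with a penalty for waypoints entering a spherical obstacle. The environment, constants and penalty weight are all assumptions for illustration:

```python
import math, random

random.seed(0)
START, GOAL = (0.0, 0.0, 0.0), (10.0, 10.0, 10.0)
OBST_C, OBST_R = (5.0, 5.0, 5.0), 2.0   # assumed spherical obstacle
DIM = 9                                  # 3 waypoints x 3 coordinates

def cost(x):
    """Path length through the waypoints plus an obstacle penetration penalty."""
    pts = [START] + [tuple(x[i:i + 3]) for i in range(0, DIM, 3)] + [GOAL]
    length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    penalty = sum(max(0.0, OBST_R - math.dist(p, OBST_C)) for p in pts[1:-1])
    return length + 100.0 * penalty

def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    X = [[random.uniform(0, 10) for _ in range(DIM)] for _ in range(n)]
    V = [[0.0] * DIM for _ in range(n)]
    P = [x[:] for x in X]                       # personal bests
    Pc = [cost(x) for x in X]
    g = min(range(n), key=lambda i: Pc[i])
    G, Gc = P[g][:], Pc[g]                      # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(DIM):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (P[i][d] - X[i][d])
                           + c2 * random.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            c = cost(X[i])
            if c < Pc[i]:
                P[i], Pc[i] = X[i][:], c
                if c < Gc:
                    G, Gc = X[i][:], c
    return G, Gc

best, best_cost = pso()
```

Since the obstacle sits on the straight line between start and goal, the swarm must find a detour: the best cost should land slightly above the straight-line distance of about 17.32.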

Keywords: obstacle avoidance, particle swarm optimization, three-dimensional path planning, unmanned aerial vehicles

Procedia PDF Downloads 408
1568 Simulation of Glass Breakage Using Voronoi Random Field Tessellations

Authors: Michael A. Kraus, Navid Pourmoghaddam, Martin Botz, Jens Schneider, Geralt Siebert

Abstract:

Fragmentation analysis of tempered glass gives insight into the quality of the tempering process and also defines a certain degree of safety. Different standards, such as the European EN 12150-1 or the American ASTM C 1048/CPSC 16 CFR 1201, define the minimum number of fragments required for soda-lime safety glass classification on the basis of fragmentation test results. This work presents an approach to glass breakage pattern prediction using a Voronoi tessellation over random fields. The random Voronoi tessellation is trained with, and validated against, data from several breakage patterns. Fragments in observation areas of 50 mm x 50 mm were used for training and validation. All glass specimens used in this study were commercially available soda-lime glasses at three thickness levels of 4 mm, 8 mm and 12 mm. The results of this work form a Bayesian framework for the training and prediction of breakage patterns of tempered soda-lime glass using a Voronoi random field tessellation. Uncertainties occurring in this process can be well quantified, and several statistical measures of the pattern can be preserved with this method. Within this work it was found that different random fields as the basis for the Voronoi tessellation lead to differing quality of fit to the statistical properties of the glass breakage patterns. As the methodology is derived and kept general, the framework could also be applied to other random tessellations and crack pattern modelling purposes.
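
A minimal sketch of the tessellation idea (not the authors' Bayesian framework): a discrete Voronoi tessellation of a 50 mm x 50 mm observation area, with uniformly random seed points standing in for fragment centres. The seed count and raster resolution are assumed values:

```python
import random

random.seed(1)
SIZE, RES = 50.0, 100            # 50 mm field, rasterised at 100 x 100
N_SEEDS = 40                     # assumed fragment count in the window

seeds = [(random.uniform(0, SIZE), random.uniform(0, SIZE))
         for _ in range(N_SEEDS)]

def nearest(px, py):
    """Index of the seed closest to point (px, py)."""
    return min(range(N_SEEDS),
               key=lambda i: (seeds[i][0] - px) ** 2 + (seeds[i][1] - py) ** 2)

# Rasterise: each pixel belongs to the Voronoi cell of its nearest seed,
# so the pixel counts approximate the fragment areas.
areas = [0] * N_SEEDS
pixel = SIZE / RES
for ix in range(RES):
    for iy in range(RES):
        areas[nearest((ix + 0.5) * pixel, (iy + 0.5) * pixel)] += 1

areas_mm2 = [a * pixel * pixel for a in areas]
mean_area = sum(areas_mm2) / N_SEEDS   # = 2500 / 40 = 62.5 mm^2
```

In the paper's setting, the seed points would instead be driven by a trained random field, so that the statistics of the resulting cell areas match the observed fragmentation patterns.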

Keywords: glass breakage prediction, Voronoi random field tessellation, fragmentation analysis, Bayesian parameter identification

Procedia PDF Downloads 159
1567 Bayesian Flexibility Modelling of the Conditional Autoregressive Prior in a Disease Mapping Model

Authors: Davies Obaromi, Qin Yongsong, James Ndege, Azeez Adeboye, Akinwumi Odeyemi

Abstract:

The basic model usually used in disease mapping is the Besag, York and Mollie (BYM) model, which combines spatially structured and spatially unstructured priors as random effects. The Bayesian conditional autoregressive (CAR) model is commonly used in disease mapping for smoothing the relative risk of a disease, as in the BYM model. The CAR model, usually assigned as a prior to one of the spatial random effects in the BYM model, successfully uses information from adjacent sites to improve estimates for individual sites. To our knowledge, the CAR prior has some unrealistic or counter-intuitive consequences for the posterior covariance matrix of the spatial random effects. In the conventional BYM model, the spatially structured and unstructured random components cannot be identified independently, which complicates the prior definitions for the hyperparameters of the two random effects. Therefore, the main objective of this study is to construct and utilize an extended Bayesian spatial CAR model for studying tuberculosis patterns in the Eastern Cape Province of South Africa, and then compare its flexibility with some existing CAR models. The results revealed the flexibility and robustness of this extended model relative to the commonly used CAR models, by comparison using the deviance information criterion. The extended Bayesian spatial CAR model proves to be a useful and robust tool for disease modeling, and as a prior for the structured spatial random effects, because of the inclusion of an extra hyperparameter.
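
A sketch under simplifying assumptions (a proper CAR prior on a tiny lattice, not the authors' extended model): the precision matrix takes the form Q = tau * (D - rho * W), where W is the adjacency matrix and D its diagonal of neighbour counts, and each site's conditional mean given its neighbours is rho times their average:

```python
tau, rho = 1.0, 0.9   # assumed precision and spatial-dependence parameters

# 2x2 lattice, rook adjacency: edges 0-1, 0-2, 1-3, 2-3
W = [[0, 1, 1, 0],
     [1, 0, 0, 1],
     [1, 0, 0, 1],
     [0, 1, 1, 0]]
n = len(W)
D = [sum(row) for row in W]   # neighbour counts

# Proper CAR precision matrix Q = tau * (D - rho * W)
Q = [[tau * ((D[i] if i == j else 0) - rho * W[i][j]) for j in range(n)]
     for i in range(n)]

# Full conditional of site i given its neighbours: mean rho * (neighbour avg),
# which follows from the CAR specification.
phi = [0.5, -0.2, 0.1, 0.4]   # illustrative spatial random effects
i = 0
cond_mean = rho * sum(W[i][j] * phi[j] for j in range(n)) / D[i]
```

Setting rho = 1 gives the intrinsic CAR used in the standard BYM model; the extra hyperparameter discussed in the abstract is precisely what loosens this kind of rigid dependence structure.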

Keywords: Besag2, CAR models, disease mapping, INLA, spatial models

Procedia PDF Downloads 277
1566 Spin-Polarized Structural, Electronic and Magnetic Properties of Intermetallic Dy2Ni2Pb from Computational Study

Authors: O. Arbouche, Y. Benallou, K. Amara

Abstract:

We report a first-principles study of the structural, electronic and magnetic properties of the ternary plumbide (rare earth-transition metal-lead compound) Dy2Ni2Pb, which crystallizes in the orthorhombic Mn2AlB2-type structure (space group Cmmm). The properties were studied by means of the fully relativistic version of the full-potential augmented plane wave plus local orbital method within the framework of spin-polarized density functional theory (SP-DFT). The electronic exchange-correlation energy is described by the generalized gradient approximation (GGA). We have calculated the lattice parameters, bulk moduli and the first pressure derivatives of the bulk moduli, total densities of states and magnetic properties. The calculated total magnetic moment is found to be equal to 9.52 μB.

Keywords: spin-polarized, magnetic properties, Dy2Ni2Pb, density functional theory

Procedia PDF Downloads 300
1565 Impact of the African Continental Free Trade Area on Ghana: A Computable General Equilibrium Approach

Authors: Gordon Newlove Asamoah

Abstract:

This study’s objective is to determine the impact of the African Continental Free Trade Area (AfCFTA) on Ghana using computable general equilibrium (CGE) modelling. The trade data for the simulation were drawn from the standard GTAP database, version 10. The study estimated the ad valorem equivalents (AVEs) of non-tariff measures (NTMs) for the Ghanaian sectors, which were used for the analysis. Simulations were performed to remove import tariffs and export taxes for 90% of tariff lines, as well as 50% of the NTMs, for all AfCFTA participating countries. The NTM reduction was simulated through two mechanisms: iceberg costs, also known as import-augmenting technological change (AMS), and exporter costs (AXS). The study finds that removing the tariffs and NTMs in the AfCFTA region has a positive impact on Ghana’s GDP, export and import volumes, terms of trade, and welfare as measured by the equivalent variation. However, Ghana recorded a trade-balance deficit of US$4,766.69 million due to its high import bill. This is not by chance, as Ghana imports high-value-added goods but exports basic agricultural raw materials with low export earnings. The study also finds much larger positive impacts across the AfCFTA region, for both importers and exporters, when the NTMs that act as iceberg costs and export costs are reduced. It further finds that reducing the export costs that raise the cost of intermediate inputs enhances trade among the AfCFTA members (intra-AfCFTA trade).

Keywords: impact, AfCFTA, NTMs, Ghana, CGE

Procedia PDF Downloads 9
1564 Network Meta-Analysis to Identify the Most Effective Dressings to Treat Pressure Injury

Authors: Lukman Thalib, Luis Furuya-Kanamori, Rachel Walker, Brigid Gillespie, Suhail Doi

Abstract:

Background and objectives: Many topical treatments are available for pressure injury (PI) treatment, yet there is a lack of evidence regarding the most effective one. The objective of this study was to compare the effects of various topical treatments and identify the best treatment choice(s) for PI healing. Methods: Network meta-analysis of published randomized controlled trials that compared two or more of the following dressing groups: basic, foam, active, hydroactive, and other wound dressings. The outcome was complete healing following treatment, and the generalised pairwise modelling framework was used to generate mixed treatment effects against hydroactive wound dressing, currently the standard treatment for PIs. All treatments were then ranked by their point estimates. Main results: 40 studies (1,757 participants) comparing 5 dressing groups were included in the analysis. All dressing groups ranked better than basic dressing (i.e. saline gauze or a similar inert dressing). Foam (RR 1.18; 95%CI 0.95-1.48) and active wound dressings (RR 1.16; 95%CI 0.92-1.47) ranked better than hydroactive wound dressing in terms of healing of PIs when the latter was used as the reference group. Conclusion and recommendations: There was considerable uncertainty around the estimates; yet the use of hydroactive wound dressings appears to perform better than basic dressings. Foam and active wound dressing groups show promise and need further investigation. High-quality research on the clinical effectiveness of topical treatments is warranted to identify whether foam and active wound dressings do provide advantages over hydroactive dressings.
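
The core mechanism of such an analysis can be sketched on a toy three-node network (this is a hedged illustration with invented numbers, not the authors' generalised pairwise model): an indirect contrast is formed through a common comparator, and direct and indirect evidence are pooled by inverse-variance weighting:

```python
import math

# Direct comparison A vs B: log relative risk and its variance (invented)
d_AB, v_AB = math.log(1.18), 0.013

# Indirect comparison via common comparator C:
#   logRR_AB(indirect) = logRR_AC - logRR_BC, variances add.
d_AC, v_AC = math.log(1.30), 0.010
d_BC, v_BC = math.log(1.10), 0.009
d_ind = d_AC - d_BC
v_ind = v_AC + v_BC

# Inverse-variance (fixed-effect) pooling of direct and indirect evidence.
w_dir, w_ind = 1.0 / v_AB, 1.0 / v_ind
d_mixed = (w_dir * d_AB + w_ind * d_ind) / (w_dir + w_ind)
v_mixed = 1.0 / (w_dir + w_ind)
rr_mixed = math.exp(d_mixed)   # mixed-evidence relative risk, A vs B
```

Pooling tightens the estimate: the mixed variance is smaller than either source alone, which is the sense in which a network "borrows strength" across comparisons.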

Keywords: network meta-analysis, pressure injury, dressing, pressure ulcer

Procedia PDF Downloads 113
1563 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models

Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue

Abstract:

Time-series data are useful for modelling as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often utilised, as knowledge of the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Thus, such reactions are often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which, given time-series phosphoproteomic datasets, can generate κ models. The incompleteness and uncertainty in the generated model and reactions are presented clearly to the user through visualisation. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated against the model, leading to the highlighting of the nodes causing unsatisfiability (i.e. error-causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user visually. Thus, in this research we present a framework that enables a user to explore phosphoproteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. Such a tool enables an end-user to determine which empirical analysis to perform to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.

Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation

Procedia PDF Downloads 252
1562 Radical Web Text Classification Using a Composite-Based Approach

Authors: Kolade Olawande Owoeye, George R. S. Weir

Abstract:

The spread of terrorist and extremist activity on the internet has become a major threat to governments and national security because of its potential dangers, which has necessitated intelligence gathering via the web and real-time monitoring of potential websites for extremist activities. However, manual classification of such content is practically difficult and time-consuming. In response to this challenge, an automated classification system called the composite technique was developed. This is a computational framework that explores the combination of both semantic and syntactic features of the textual contents of a web page. We implemented the framework on a set of extremist webpages that had been subjected to a manual classification process. Therein, we developed a classification model on the data using the J48 decision tree algorithm to generate a measure of how well each page can be classified into its appropriate class. The classification result obtained from our method, when compared with other state-of-the-art approaches, indicated a 96% success rate in classifying webpages overall when matched against the manual classification.
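
The flavour of a composite semantic-plus-syntactic classifier can be sketched as a toy (the paper uses J48 with Posit-derived features; this stand-in combines a keyword-based "semantic" score with a crude "syntactic" proxy and a hand-picked threshold, all of which are assumptions):

```python
# Toy composite classifier - illustrative only, not the authors' pipeline.
RADICAL_TERMS = {"attack", "jihad", "martyr", "infidel"}

def semantic_score(text):
    """Fraction of words matching an assumed radical-term lexicon."""
    words = text.lower().split()
    return sum(w.strip(".,!") in RADICAL_TERMS for w in words) / max(len(words), 1)

def syntactic_score(text):
    """Crude syntactic proxy: density of exclamatory punctuation."""
    return text.count("!") / max(len(text), 1)

def classify(text, w_sem=0.8, w_syn=0.2, threshold=0.05):
    """Weighted combination of the two feature families, then a cutoff."""
    score = w_sem * semantic_score(text) + w_syn * syntactic_score(text)
    return "extremist" if score >= threshold else "benign"

print(classify("Join the attack, become a martyr!"))   # extremist
print(classify("The committee will meet on Tuesday.")) # benign
```

In the actual framework, a learned decision tree replaces the fixed weights and threshold, and the syntactic features come from part-of-speech profiling rather than punctuation counts.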

Keywords: extremist, web pages, classification, semantics, posit

Procedia PDF Downloads 143
1561 The Influence of Strategic Networks and Logistics Integration on Company Performance among Small and Medium Enterprises

Authors: Jeremiah Madzimure

Abstract:

In order to stay competitive and improve performance, small and medium enterprises (SMEs) need to make use of business networking and logistics integration. Strategic networking and logistics integration have become critical for companies, as they allow supplier partnering, the exchange of vital information, and access to valuable resources, enabling innovation, sharing risks and costs, and thereby enhancing company performance. The purpose of this study was to examine the influence of strategic networks and logistics integration on company performance among small and medium enterprises in South Africa. A quantitative research design was adopted, and 137 SME owners and managers completed and returned the survey questionnaire. Confirmatory factor analysis (CFA) was conducted using the Analysis of Moment Structures (AMOS) software, version 24.0, to assess the psychometric properties of the measurement scales. Path modelling techniques were used to test the proposed hypotheses. Three research hypotheses were postulated. The results indicate that strategic networks had a positive and significant influence on logistics integration and company performance. Likewise, logistics integration had a strong, positive and significant influence on company performance. This study provides a useful model for analysing the relationship between strategic networks, logistics integration and company performance. Moreover, the findings provide useful insights into how SMEs should benefit from business networking and logistics integration so as to improve their performance. The implications of the study are discussed, and finally, limitations and recommendations are indicated.

Keywords: strategic networking, logistics integration, company performance, SMEs

Procedia PDF Downloads 297
1560 The Use of Degradation Measures to Design Reliability Test Plans

Authors: Stephen V. Crowder, Jonathan W. Lane

Abstract:

With short production development times, there is an increased need to demonstrate product reliability relatively quickly with minimal testing. In such cases there may be few if any observed failures. Thus it may be difficult to assess reliability using the traditional reliability test plans that measure only time (or cycles) to failure. For many components, degradation measures will contain important information about performance and reliability. These measures can be used to design a minimal test plan, in terms of number of units placed on test and duration of the test, necessary to demonstrate a reliability goal. In this work we present a case study involving an electronic component subject to degradation. The data, consisting of 42 degradation paths of cycles to failure, are first used to estimate a reliability function. Bootstrapping techniques are then used to perform power studies and develop a minimal reliability test plan for future production of this component.
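
The bootstrap step can be sketched minimally (the cycles-to-failure sample and reliability goal below are invented, not the paper's 42 measured degradation paths): resample the observed failure cycles with replacement, re-estimate reliability at a target cycle count each time, and read off a lower confidence bound:

```python
import random

random.seed(7)
# Assumed sample of cycles to failure derived from degradation paths.
cycles_to_failure = [520, 610, 480, 700, 650, 590, 560, 630, 540, 680,
                     600, 575, 615, 505, 660]
TARGET = 500  # demonstrate reliability at 500 cycles

def reliability(sample, t):
    """Empirical fraction of units surviving beyond t cycles."""
    return sum(c > t for c in sample) / len(sample)

B = 2000
boot = sorted(
    reliability(random.choices(cycles_to_failure, k=len(cycles_to_failure)),
                TARGET)
    for _ in range(B)
)
lower = boot[int(0.05 * B)]                    # one-sided 95% lower bound
point = reliability(cycles_to_failure, TARGET) # point estimate
```

Repeating this power calculation for different sample sizes and test durations is what yields the minimal test plan: the smallest configuration whose lower bound still clears the reliability goal.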

Keywords: degradation measure, time to failure distribution, bootstrap, computational science

Procedia PDF Downloads 531