Search results for: probabilistic roadmap
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 419

119 Comparative Fragility Analysis of Shallow Tunnels Subjected to Seismic and Blast Loads

Authors: Siti Khadijah Che Osmi, Mohammed Ahmad Syed

Abstract:

Underground structures are crucial components that require detailed analysis and design. Tunnels, for instance, are massively constructed as transportation infrastructure and utility networks, especially in urban environments. Considering their prime importance to the economy and to public safety, any instability in these tunnels will be highly detrimental to their performance. Recent experience suggests that tunnels become vulnerable during earthquakes and blast scenarios. However, only a very limited number of studies have been carried out to understand the dynamic response and performance of underground tunnels under such unpredictable extreme hazards. In view of the importance of enhancing the resilience of these structures, the overall aim of the study is to evaluate the probabilistic future performance of shallow tunnels subjected to seismic and blast loads by developing a detailed fragility analysis. Critical nonlinear time-history numerical analyses using the finite element software Midas GTS NX are presented alongside current methods of analysis, taking into consideration structural typology, ground motion and explosive characteristics, the effect of soil conditions, and other associated uncertainties on tunnel integrity, which may ultimately lead to catastrophic failure of the structure. The proposed fragility curves for both extreme loadings are discussed and compared, providing significant information on the performance of the tunnel under extreme hazards that may be beneficial for future risk assessment and loss estimation.
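
The abstract does not give the functional form of its fragility curves; a common choice in the fragility literature is the lognormal CDF with median capacity θ and dispersion β. A minimal sketch under that assumption, with entirely hypothetical parameter values:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """P(damage state exceeded | intensity measure im), lognormal form."""
    return norm.cdf(np.log(im / theta) / beta)

# Hypothetical parameters: median capacity 0.45 g, dispersion 0.5
ims = np.linspace(0.05, 2.0, 50)          # PGA values in g
p_exceed = fragility(ims, theta=0.45, beta=0.5)
print(np.round(p_exceed[:5], 4))
```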

Keywords: fragility analysis, seismic loads, shallow tunnels, blast loads

Procedia PDF Downloads 318
118 Bottom-up Quantification of Mega Inter-Basin Water Transfer Vulnerability to Climate Change

Authors: Enze Zhang

Abstract:

Large numbers of inter-basin water transfer (IBWT) projects are constructed or proposed all around the world as solutions to water distribution and supply problems. Nowadays, as climate change warms the atmosphere, alters the hydrologic cycle, and perturbs water availability, large-scale IBWTs, which are sensitive to these water-related changes, may carry significant risk. Given this reality, IBWTs have elicited great controversy, and assessments of their vulnerability to climate change are urgently needed worldwide. In this paper, we consider the South-to-North Water Transfer Project (SNWTP) in China as a case study and introduce a bottom-up vulnerability assessment framework. Key hazards and risks related to climate change that threaten future water availability for the SNWTP are first identified. Then a performance indicator is presented to quantify the vulnerability of the IBWT by taking three main elements (i.e., sensitivity, adaptive capacity, and exposure degree) into account. A probabilistic Budyko model is adapted to estimate water availability responses to a wide range of possible future climate conditions in each region of the study area. After quantifying the vulnerability bottom-up based on the estimated water availability, our findings confirm that the SNWTP would greatly alleviate geographical imbalances in water availability under some moderate climate change scenarios, but they raise questions about whether it is a long-term solution, because the donor basin has a high level of vulnerability under extreme climate change.

Keywords: vulnerability, climate change, inter-basin water transfer, bottom-up

Procedia PDF Downloads 377
117 Probabilistic Models to Evaluate Seismic Liquefaction In Gravelly Soil Using Dynamic Penetration Test and Shear Wave Velocity

Authors: Nima Pirhadi, Shao Yong Bo, Xusheng Wan, Jianguo Lu, Jilei Hu

Abstract:

Although gravels and gravelly soils are often assumed to be non-liquefiable because of their high conductivity and small modulus, the occurrence of this phenomenon in some historical earthquakes, especially the recent 2008 Wenchuan (Mw = 7.9), 2014 Cephalonia, Greece (Mw = 6.1), and 2016 Kaikoura, New Zealand (Mw = 7.8) events, has prompted essential consideration of risk assessment and hazard analysis of seismic gravelly soil liquefaction. Due to the limitations in sampling and laboratory testing of this type of soil, in situ tests and site exploration of case histories are the most accepted procedures. Of all in situ tests, the dynamic penetration test (DPT), well known as the Chinese dynamic penetration test, and the shear wave velocity (Vs) test have demonstrated high performance in evaluating seismic gravelly soil liquefaction. However, the lack of a sufficient number of case histories imposes an essential limitation on developing new models. This study first investigates recent earthquakes that caused liquefaction in gravelly soils in order to collect new data. It then adds these data to the dataset available in the literature to extend it, and finally develops new models to assess seismic gravelly soil liquefaction. To validate the presented models, their results are compared to those of other available models. The results show the reasonable performance of the proposed models and the critical effect of gravel content (GC, %) on the assessment.
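
The abstract does not specify the model form; probabilistic liquefaction-triggering models are commonly fitted as logistic regressions on in situ measurements, so a minimal sketch under that assumption, with entirely hypothetical case-history data and features, might look like:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical case histories: [DPT blow count, Vs (m/s), cyclic stress ratio]
X = np.array([[4.0, 150.0, 0.25], [12.0, 280.0, 0.18],
              [6.0, 180.0, 0.30], [15.0, 320.0, 0.22]])
y = np.array([1, 0, 1, 0])  # 1 = liquefied, 0 = not liquefied

model = LogisticRegression().fit(X, y)
# Probability of liquefaction for a new site (hypothetical values)
print(model.predict_proba([[8.0, 200.0, 0.27]])[0, 1])
```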

Keywords: liquefaction, gravel, dynamic penetration test, shear wave velocity

Procedia PDF Downloads 180
116 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques

Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba

Abstract:

The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for safety-related decisions. Expert judgements are required in most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and the analysis of options. This paper presents a fault tree analysis (FTA), a probabilistic failure analysis, applied to the leakage of oil in a subsea production system. In standard FTA, the failure probabilities of the items of a system are treated as exact values when evaluating the failure probability of the top event. However, there is rarely sufficient data for estimating the failure probabilities of components in the drilling industry, so fuzzy set theory can be used as a solution to this issue. The aim of this paper is to examine the leaks from the Zafiro West subsea oil facility by using fuzzy fault tree analysis (FFTA). As a result, the research makes theoretical and practical contributions to maritime safety and environmental protection. FFTA has also traditionally been an effective strategy for identifying hazards in nuclear installations and the power industry.
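
As a rough illustration of the FFTA idea (not the paper's actual tree or numbers), basic-event probabilities can be represented as triangular fuzzy numbers and propagated through AND/OR gates componentwise:

```python
import numpy as np

# Triangular fuzzy probabilities (low, mode, high) -- hypothetical values
leak_valve  = np.array([1e-4, 5e-4, 1e-3])
leak_flange = np.array([2e-4, 8e-4, 2e-3])
detect_fail = np.array([1e-2, 5e-2, 1e-1])

def gate_or(*events):
    """OR gate: 1 - prod(1 - p), applied componentwise (approximation)."""
    out = np.ones(3)
    for p in events:
        out *= 1.0 - p
    return 1.0 - out

def gate_and(*events):
    """AND gate: prod(p), applied componentwise (approximation)."""
    out = np.ones(3)
    for p in events:
        out *= p
    return out

# Top event: (valve leak OR flange leak) AND detection failure
top = gate_and(gate_or(leak_valve, leak_flange), detect_fail)
print("Top event fuzzy probability (low, mode, high):", top)
```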

Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry

Procedia PDF Downloads 164
115 Downscaling Seasonal Sea Surface Temperature Forecasts over the Mediterranean Sea Using Deep Learning

Authors: Redouane Larbi Boufeniza, Jing-Jia Luo

Abstract:

This study assesses the suitability of deep learning (DL) for downscaling sea surface temperature (SST) over the Mediterranean Sea in the context of seasonal forecasting. We design a set of experiments that compare different DL configurations and deploy the best-performing architecture to downscale one-month-lead forecasts of June–September (JJAS) SST from the Nanjing University of Information Science and Technology Climate Forecast System version 1.0 (NUIST-CFS1.0) for the period 1982–2020. We also introduce predictors over a larger area to include information about the main large-scale circulations that drive SST over the Mediterranean Sea region, which improves the downscaling results. Finally, we validate the raw model and downscaled forecasts in terms of both deterministic and probabilistic verification metrics, as well as their ability to reproduce the observed extreme SST and spell indicator indices. The results show that the convolutional neural network (CNN)-based downscaling consistently improves the raw model forecasts, with lower bias and more accurate representations of the observed mean and extreme SST spatial patterns. In addition, the CNN-based downscaling yields a much more accurate forecast of extreme SST and spell indicators and reduces the significant biases exhibited by the raw model predictions. Moreover, the CNN-based downscaling yields better skill scores than the raw model forecasts over most portions of the Mediterranean Sea. These results demonstrate the potential usefulness of CNNs in downscaling seasonal SST predictions over the Mediterranean Sea, particularly in providing improved forecast products.
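
The paper's exact architecture is not given in the abstract; a minimal sketch of the general idea, a CNN that maps coarse large-scale predictor fields to a finer SST grid (all shapes, channel counts, and layer sizes hypothetical), could be:

```python
import torch
import torch.nn as nn

class DownscalingCNN(nn.Module):
    """Map coarse large-scale predictors to a finer SST field (sketch)."""
    def __init__(self, n_predictors=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_predictors, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),  # downscaled SST
        )

    def forward(self, x):
        return self.net(x)

# One batch of 8 forecasts on a hypothetical 16x16 coarse grid
x = torch.randn(8, 5, 16, 16)
print(DownscalingCNN()(x).shape)  # -> torch.Size([8, 1, 64, 64])
```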

Keywords: Mediterranean Sea, sea surface temperature, seasonal forecasting, downscaling, deep learning

Procedia PDF Downloads 56
114 Application of Reliability Methods for Concrete Dams

Authors: Mustapha Kamel Mihoubi, Mohamed Essadik Kerkar

Abstract:

Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of works, including when calculating the stability of large structures against major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods commonly used in engineering, in our case level 2 methods via the study of a limit state. The probability of failure is thus estimated by analytical methods of the first-order reliability method (FORM) and second-order reliability method (SORM) type. By way of comparison, a level 3 method was also used, which generates a full analysis of the problem and involves integrating the probability density function of the random variables, extended to the safety domain, using the Monte Carlo simulation method. Taking into account the change in stress under the normal, exceptional, and extreme load combinations acting on the dam, the calculations provided acceptable failure probability values which largely corroborate the theory: the probability of failure tends to increase with increasing load intensity, causing a significant decrease in strength, and shear forces then induce sliding that threatens the reliability of the structure with intolerable values of the probability of failure, especially when uplift increases under a hypothetical failure of the drainage system.
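
As a minimal sketch of the level 3 (Monte Carlo) approach described above, with a deliberately toy limit state g = R - S and hypothetical distributions for resistance R and load effect S:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical random variables: resistance R and load effect S
R = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)  # kN
S = rng.normal(loc=200.0, scale=30.0, size=n)              # kN

g = R - S              # limit state function: failure when g < 0
pf = np.mean(g < 0.0)  # Monte Carlo estimate of failure probability
beta = np.mean(g) / np.std(g)  # crude first-two-moments reliability index
print(f"Pf = {pf:.2e}, beta = {beta:.2f}")
```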

Keywords: dam, failure, limit state, Monte Carlo, reliability, probability, simulation, sliding, Taylor

Procedia PDF Downloads 303
113 Development and Optimization of German Diagnostic Tests in Mathematics for Vocational Training

Authors: J. Thiele

Abstract:

Teachers working at vocational colleges are often confronted with the problem that many students graduated from different schools and therefore each had a different education. Especially in mathematics, many students lack fundamentals or had different priorities at their previous schools. Furthermore, these vocational colleges have to provide graduations for many different working fields with different core themes. The colleges are interested in measuring the different education levels of their students and in providing assistance to those who need to catch up. The project mathe-meistern was initiated to remedy this problem at vocational colleges. For this purpose, online tests were developed whose aim is to evaluate the basic mathematical abilities of the students. The tests are online multiple-choice tests with a total of 65 items, accessed online with a unique transaction number (TAN) for each participant. The content is divided into several categories (arithmetic, algebra, fractions, geometry, etc.). After each test, the student receives a personalized summary depicting their strengths and weaknesses in mathematical basics, and teachers can visit a special website to examine the results of their classes or of single students. In total, 5830 students have participated so far. For standardization and optimization purposes, the tests have been evaluated annually since 2015, using classical and probabilistic test theory with regard to objectivity, reliability, and validity. This paper is about the optimization process, considering the Rasch scaling and standardization of the tests. Additionally, current results using the standardized tests are discussed. To this end, competence levels and types of errors of students attending vocational colleges in Nordrhein-Westfalen, Germany, were determined using descriptive data and distractor evaluations.
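
For reference, the Rasch model underlying the scaling gives the probability that a student of ability θ answers an item of difficulty b correctly, P = 1 / (1 + exp(-(θ - b))); a minimal sketch with hypothetical parameter values:

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch model: P(correct | ability theta, item difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Hypothetical item difficulties for five fractions items
difficulties = np.array([-1.2, -0.4, 0.0, 0.7, 1.5])
for theta in (-1.0, 0.0, 1.0):
    print(theta, np.round(rasch_prob(theta, difficulties), 2))
```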

Keywords: diagnostic tests in mathematics, distractor evaluation, test optimization, test theory

Procedia PDF Downloads 105
112 Evaluation of Different Waste Management Planning Strategies in an Industrial City

Authors: Leila H. Khiabani, Mohammadreza Vafaee, Farshad Hashemzadeh

Abstract:

Industrial waste management regulates the different stages of production, storage, transfer, recycling, and disposal of waste. There are several common practices for industrial waste management; however, due to various local health, economic, social, environmental, and aesthetic considerations, the optimal principles and measures often vary at each specific industrial zone. In addition, waste management strategies are heavily impacted by local administrative, legal, and financial regulations. In this study, a hybrid qualitative and quantitative research methodology has been designed for waste management planning in an industrial city. Firstly, following a qualitative research methodology, the most relevant waste management strategies for the specific industrial city were identified through interviews with environmental planning and waste management experts; forty experts participated in this study. Alborz industrial city in Iran, which hosts more than one thousand industrial units on nine hundred acres, was chosen as the sample industrial city. The findings from the expert interviews in the first phase were then used to design a quantitative questionnaire for the second phase of the study, whose aim was to quantify the relative impact of different waste management strategies in the sample industrial city. Eight waste management strategies and three implementation policies were included in the questionnaire. The experts were asked to rank the relative effectiveness of each strategy for the environmental planning of the sample industrial city, as well as the relative effectiveness of each planning policy on each of the waste management strategies. In the end, the weighted average of all the responses was calculated to identify the most effective waste management strategy and planning policies for the sample industrial city. The results suggested that among the eight waste management strategies, industrial composting is the most effective (31%) strategy based on the collective evaluation of the local experts. Additionally, the results suggested that the most effective policy (58%) in the city's environmental planning is to reduce waste generation by prolonging the effective life of industrial products through higher-quality and recyclable materials. These findings can provide useful expert guidelines for prioritization among different waste management strategies in the city's overall environmental planning roadmap. The findings may also be applicable to similar industrial cities, and a similar methodology can be utilized in the environmental planning of other industrial cities.

Keywords: environmental planning, industrial city, quantitative research, waste management

Procedia PDF Downloads 117
111 A Three Elements Vector Valued Structure’s Ultimate Strength-Strong Motion-Intensity Measure

Authors: A. Nicknam, N. Eftekhari, A. Mazarei, M. Ganjvar

Abstract:

This article presents an alternative collapse capacity intensity measure in three-element form, which is influenced by the spectral ordinates at periods longer than the first-mode period at near- and far-source sites. A parameter, denoted by β, is defined by which the effects of spectral ordinates up to the effective period (2T_1) on the intensity measure are taken into account. The methodology permits meeting the hazard-levelled target extreme event in probabilistic and deterministic forms. A MATLAB code involving OpenSees is developed to calculate the collapse capacities of 8 archetype RC structures of 2 to 20 stories for the regression process. The incremental dynamic analysis (IDA) method is used to calculate the structures' collapse values, accounting for element stiffness and strength deterioration. The general near-field set presented by FEMA is used in a series of nonlinear analyses. Eight linear relationships are developed for the 8 structures, with correlation coefficients of up to 0.93. A near-field collapse capacity prediction equation is then developed, taking into account the results of the regression processes obtained from the 8 structures. The proposed prediction equation is validated against a set of actual near-field records with good agreement. Implementation of the proposed equation on four archetype RC structures demonstrated collapse capacities at near-field sites different from those of FEMA; the differences are believed to be due to accounting for the spectral shape effects.
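
The abstract does not print the measure itself; one established member of this family (the Cordova-type measure), which weights the spectral ordinate at the elongated period 2T_1 by an exponent, can serve as a hedged illustration of what a β-weighted, spectral-shape-aware intensity measure looks like (the spectrum below is fake):

```python
import numpy as np

def spectral_shape_im(sa, periods, t1, beta):
    """IM = Sa(T1) * (Sa(2*T1) / Sa(T1))**beta  -- Cordova-type form,
    used here only to illustrate a beta-weighted intensity measure."""
    sa_t1 = np.interp(t1, periods, sa)
    sa_2t1 = np.interp(2.0 * t1, periods, sa)
    return sa_t1 * (sa_2t1 / sa_t1) ** beta

# Hypothetical response spectrum
periods = np.linspace(0.05, 4.0, 80)
sa = 1.2 * np.exp(-periods)           # fake spectral shape, in g
print(spectral_shape_im(sa, periods, t1=1.0, beta=0.5))
```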

Keywords: collapse capacity, fragility analysis, spectral shape effects, IDA method

Procedia PDF Downloads 210
110 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining

Authors: Mohsen Farhadloo, Majid Farhadloo

Abstract:

Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products/services from the customer's point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both aspects and sentiments simultaneously. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: they fail to directly model the associations among topics. Indeed, in many text corpora, it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model, called focus-LDA, to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
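
focus-LDA itself is the authors' model and no public implementation is assumed here; a minimal standard-LDA baseline of the kind it extends, sketched with the gensim library on toy review snippets, looks like:

```python
from gensim import corpora
from gensim.models import LdaModel

# Toy tokenized review snippets (hypothetical data)
texts = [["battery", "life", "short"], ["screen", "bright", "sharp"],
         ["battery", "charge", "slow"], ["screen", "colors", "vivid"]]

dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(t) for t in texts]

# Two latent aspects; focus-LDA would additionally model topic associations
lda = LdaModel(corpus, num_topics=2, id2word=dictionary,
               passes=20, random_state=0)
for topic in lda.print_topics():
    print(topic)
```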

Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis

Procedia PDF Downloads 79
109 Navigating Disruption: Key Principles and Innovations in Modern Management for Organizational Success

Authors: Ahmad Haidar

Abstract:

This research paper investigates the concept of modern management, concentrating on the development of managerial practices and the adoption of innovative strategies in response to the fast-changing business landscape brought about by Artificial Intelligence (AI). The study begins by examining the historical context of management theories, tracing the progression from classical to contemporary models and identifying key drivers of change. Through a comprehensive review of existing literature and case studies, this paper provides valuable insights into the principles and practices of modern management, offering a roadmap for organizations aiming to navigate the complexities of the contemporary business world. The paper examines the growing role of digital technology in modern management, focusing on the incorporation of AI, machine learning, and data analytics to streamline operations and facilitate informed decision-making. Moreover, the research highlights the emergence of new principles, such as adaptability, flexibility, public participation, trust, transparency, and a digital mindset, as crucial components of modern management. The role of business leaders is also investigated by studying contemporary leadership styles, such as transformational, situational, and servant leadership, emphasizing the significance of emotional intelligence, empathy, and collaboration in fostering a healthy organizational culture. Furthermore, the research delves into the crucial roles of environmental sustainability, corporate social responsibility (CSR), and corporate digital responsibility (CDR) as organizations strive to balance economic growth with ethical considerations and long-term viability. The primary research question for this study is: "What are the key principles, practices, and innovations that define modern management, and how can organizations effectively implement these strategies to thrive in the rapidly changing business landscape?" The research contributes to a comprehensive understanding of modern management by examining its historical context, the impact of digital technologies, the importance of contemporary leadership styles, and the role of CSR and CDR in today's business landscape.

Keywords: modern management, digital technology, leadership styles, adaptability, innovation, corporate social responsibility, organizational success, corporate digital responsibility

Procedia PDF Downloads 45
108 From Responses of Macroinvertebrate Metrics to the Definition of Reference Thresholds

Authors: Hounyèmè Romuald, Mama Daouda, Argillier Christine

Abstract:

The present study focused on the use of benthic macrofauna to define the reference state of an anthropized lagoon (Nokoué, Benin) from the responses of relevant metrics to pressure proxies. The approach used is a combination of a joint species distribution model and Bayesian networks. The joint species distribution model was used to select the relevant metrics and generate posterior probabilities that were then converted into posterior response probabilities for each of the quality classes (pressure levels); these constitute the conditional probability tables of the probabilistic graph representing the different causal relationships between metrics and pressure proxies. For the definition of the reference thresholds, the predicted responses for low pressure levels were read via probability density diagrams. Observations collected during high- and low-water periods spanning three consecutive years (2004-2006), sampling 33 macroinvertebrate taxa present in all seasons and at all sampling points, together with measurements of 14 environmental parameters, were used as application data. The study demonstrated reliable inferences, the selection of seven relevant metrics, and the definition of quality thresholds for each environmental parameter. The relevance of the metrics, as well as of the reference thresholds for ecological assessment despite the small sample size, suggests the potential for wider applicability of the approach in aquatic ecosystem monitoring and assessment programs in developing countries, which are generally characterized by a lack of monitoring data.
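
A toy sketch of the kind of pressure-to-metric Bayesian network described here, with hypothetical states and conditional probability tables, using the pgmpy library:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Pressure proxy with 3 levels -> one binary metric response (toy numbers)
model = BayesianNetwork([("Pressure", "Metric")])
cpd_pressure = TabularCPD("Pressure", 3, [[0.3], [0.4], [0.3]])
cpd_metric = TabularCPD(
    "Metric", 2,
    [[0.9, 0.5, 0.1],   # P(Metric = good | low, medium, high pressure)
     [0.1, 0.5, 0.9]],  # P(Metric = degraded | ...)
    evidence=["Pressure"], evidence_card=[3],
)
model.add_cpds(cpd_pressure, cpd_metric)
assert model.check_model()

# Predicted metric response under low pressure: basis for a reference threshold
print(VariableElimination(model).query(["Metric"], evidence={"Pressure": 0}))
```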

Keywords: pressure proxies, Bayesian inference, bioindicators, acadjas, functional traits

Procedia PDF Downloads 62
107 The Impact of Window Opening Occupant Behavior Models on Building Energy Performance

Authors: Habtamu Tkubet Ebuy

Abstract:

Purpose: Conventional dynamic energy simulation tools go beyond the static dimension of simplified methods by providing better and more accurate predictions of building performance. However, their ability to forecast actual performance is undermined by a poor representation of human interactions. The purpose of this study is to examine the potential benefits of incorporating information on occupant diversity into the occupant behavior models used to simulate building performance; co-simulating the stochastic behavior of the occupants substantially increases the accuracy of the simulation. Design/methodology/approach: In this article, probabilistic models of the window opening and closing behavior of inhabitants have been developed on a separate multi-agent platform, SimOcc, and implemented in the building simulation tool TRNSYS, in such a way that the window behavior and its interconnectivity can be reflected in the simulation analysis of the building. Findings: The results of the study show that applying complex behavior models is important for predicting actual building performance, and they help identify the gap between reality and existing simulation methods. We hope this study and its results will serve as a guide for researchers interested in investigating occupant behavior in the future. Research limitations/implications: Further case studies involving multi-user behavior in complex commercial buildings are needed to better understand the impact of occupant behavior on building performance. Originality/value: This study offers a suitable tool to help stakeholders in the design phase of new or retrofitted buildings to improve the performance of office buildings, in support of the national strategy.
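
As a generic illustration of such a stochastic window model (not SimOcc's actual formulation), the opening probability can be driven by indoor temperature through a logistic function and sampled at each timestep; all parameters below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def p_open(t_indoor, t0=26.0, k=0.8):
    """Hypothetical logistic opening probability vs indoor temperature."""
    return 1.0 / (1.0 + np.exp(-k * (t_indoor - t0)))

temps = 22.0 + 6.0 * np.sin(np.linspace(0, np.pi, 24))  # fake daily profile
window_open = rng.random(24) < p_open(temps)            # one stochastic day
print(window_open.astype(int))
```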

Keywords: occupant behavior, co-simulation, energy consumption, thermal comfort

Procedia PDF Downloads 78
106 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of an optimization, and this optimization is central to learning theory. One approach to complex systems, in which the dynamics of the system are inferred by a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool to detect deterministic chaos, other approaches have emerged; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; this model may yet reveal another insight into quantum chaos.
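
As a classical starting point for the transition the paper describes, here is a minimal sketch of estimating the latent state of a noisy AR(1) series with a scalar Kalman filter (all parameters hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
phi, q, r, n = 0.9, 0.5, 1.0, 200   # AR coefficient, state/obs noise variances

# Simulate latent AR(1) state x and noisy observations y
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), size=n)

# Scalar Kalman filter
x_hat, p = np.zeros(n), 1.0
for t in range(1, n):
    x_pred, p_pred = phi * x_hat[t - 1], phi**2 * p + q   # predict
    k = p_pred / (p_pred + r)                             # Kalman gain
    x_hat[t] = x_pred + k * (y[t] - x_pred)               # update
    p = (1 - k) * p_pred

print("RMSE filtered vs latent:", np.sqrt(np.mean((x_hat - x) ** 2)))
```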

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 442
105 Climate Change Adaptation Strategy Recommended for the Conservation of Biodiversity in Western Ghats, India

Authors: Mukesh Lal Das, Muthukumar Muthuchamy

Abstract:

A climate change adaptation strategy (AS) is a scientific approach to dealing with the impacts of climate change (CC). Efforts are being made to contain global greenhouse gas emissions within threshold limits, thereby limiting the rise of global temperature to an optimal level. Global climate change is a spontaneous process, and reversing the damage would take decades; the climate change adaptation strategies recommended by various stakeholders could therefore be a key to resilience for biodiversity. The Indian Government has constituted a panel to synthesize the climate change action reports at the federal and state levels. This review surveys the published literature on the Western Ghats hotspot and highlights the adaptation strategies recommended by diverse scientific actors to conserve biodiversity. It also reviews the grey literature adopted by state and federal governments and its effectiveness in mitigating the impacts on biodiversity. We have narrowed the scope of interest to the state action reports of the six Indian states that host the Western Ghats global biodiversity hotspot: Gujarat, Maharashtra, Goa, Karnataka, Kerala, and Tamil Nadu. The Western Ghats (WGs) act as the water tower of peninsular India, and their extensive watersheds cater to the water demand of industry, agriculture, and urban communities; conservation of the WGs is therefore key to the prosperity of peninsular India. The global scientific community has suggested more than 600 climate change adaptation strategies for policymakers, stakeholders, and other state actors to act on proactively. Our preliminary analysis of the federal and state action plans on climate change indicates that implementation falls short of the recommended scientific adaptation strategies. Tamil Nadu and Kerala have instituted nine effective adaptation strategies out of the 40+ recommended for Western Ghats conservation, while the adaptation strategies of the other four states are deficient, confusing, and vague. The resilience capacity of the Western Ghats will soon reach, or may already have reached, its threshold, and the frequency of severe droughts and flash floods may surge manifold in the decades to come. The lack of a clear roadmap for climate change adaptation strategies in the federal and state actions prompted us to identify this gap and address it by offering a holistic approach to WGs biodiversity conservation.

Keywords: adaptation strategy, biodiversity conservation, climate change, resilience, Western Ghats

Procedia PDF Downloads 86
104 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the two-parameter Weibull distribution (its trend is proportional to the two-parameter Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who have considered it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model: the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown by the Ricciardi theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyze, with simulated data, the computational problems associated with the parameters, an issue of great importance in applications to real data, with the use of convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler: given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
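
The paper's explicit process is not reproduced in the abstract; purely as an illustration of a diffusion whose trend follows a two-parameter Weibull density, one can simulate an SDE of the form dX = μ(t)X dt + σX dW with μ(t) proportional to the Weibull pdf (all parameters hypothetical), via Euler-Maruyama:

```python
import numpy as np

rng = np.random.default_rng(3)
k, lam, sigma = 2.0, 3.0, 0.15          # Weibull shape/scale, diffusion coeff
dt, n = 0.01, 1000
t = np.arange(1, n + 1) * dt

def weibull_pdf(t, k, lam):
    return (k / lam) * (t / lam) ** (k - 1) * np.exp(-((t / lam) ** k))

# Euler-Maruyama: dX = mu(t) * X dt + sigma * X dW   (illustrative trend)
x = np.empty(n)
x[0] = 1.0
for i in range(1, n):
    mu = weibull_pdf(t[i - 1], k, lam)
    x[i] = (x[i - 1] + mu * x[i - 1] * dt
            + sigma * x[i - 1] * np.sqrt(dt) * rng.normal())
print(x[-1])
```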

Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trend functions, two-parameter Weibull density function

Procedia PDF Downloads 278
103 Seismic Loss Assessment for Peruvian University Buildings with Simulated Fragility Functions

Authors: Jose Ruiz, Jose Velasquez, Holger Lovon

Abstract:

Peruvian university buildings are critical structures for which very little research about their seismic vulnerability is available. This paper develops a probabilistic methodology that predicts seismic loss for university buildings with simulated fragility functions. Two university buildings located in the city of Cusco were analyzed. Fragility functions were developed considering seismic and structural parameter uncertainty. The fragility functions were generated with the Latin hypercube technique, an improved Monte Carlo-based method, which optimizes the sampling of structural parameters and provides at least 100 reliable samples for every level of seismic demand. Concrete compressive strength, maximum concrete strain, and yield stress of the reinforcing steel were considered the key structural parameters. The seismic demand is defined by synthetic records compatible with the elastic Peruvian design spectrum. Acceleration records are scaled based on the peak ground acceleration on rigid soil (PGA), which goes from 0.05 g to 1.00 g. A total of 2000 structural models were considered to account for both structural and seismic variability. These functions represent the overall building behavior because they give rational information regarding damage ratios for defined levels of seismic demand. The university buildings show expected mean damage factors of 8.80% and 19.05%, respectively, for the 0.22 g PGA scenario, which was amplified by the soil type coefficient to 0.26 g PGA. These ratios were computed considering a seismic demand with 10% probability of exceedance in 50 years, as required by the Peruvian seismic code. These results show an acceptable seismic performance for both buildings.
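
A minimal sketch of Latin hypercube sampling of the three structural parameters named above, using scipy; the marginal distributions and their parameters are hypothetical, not the paper's:

```python
import numpy as np
from scipy.stats import qmc, norm

sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=100)               # 100 stratified samples in [0, 1)^3

# Map to hypothetical marginals: f'c (MPa), max concrete strain, fy (MPa)
fc  = norm.ppf(u[:, 0], loc=21.0, scale=2.5)
ecu = norm.ppf(u[:, 1], loc=0.004, scale=0.0005)
fy  = norm.ppf(u[:, 2], loc=420.0, scale=25.0)

models = np.column_stack([fc, ecu, fy])  # one row per structural model
print(models.shape, models[0])
```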

Keywords: fragility functions, university buildings, loss assessment, Monte Carlo simulation, Latin hypercube

Procedia PDF Downloads 116
102 Influence of Random Fibre Packing on the Compressive Strength of Fibre Reinforced Plastic

Authors: Y. Wang, S. Zhang, X. Chen

Abstract:

The longitudinal compressive strength of fibre-reinforced plastic (FRP) possesses a large stochastic variability, which limits the efficient application of composite structures. This study aims to address how random fibre packing affects the uncertainty of FRP compressive strength. A novel approach is proposed to generate random fibre packing states by a combination of Latin hypercube sampling and random sequential expansion. A 3D nonlinear finite element model is built which incorporates both matrix plasticity and fibre geometrical instability. The matrix is modeled by isotropic ideal elasto-plastic solid elements, and the fibres are modeled by linear-elastic rebar elements. Composites with a series of different nominal fibre volume fractions are studied. Premature fibre waviness of different magnitudes and directions is introduced in the finite element model. Compressive tests on uni-directional CFRP (carbon fibre reinforced plastic) are conducted following ASTM D6641. A comparison of the 3D FE models and compressive tests clearly shows that the stochastic variation of compressive strength is partly caused by the random fibre packing, and a normal or lognormal distribution tends to be a good fit for the probabilistic compressive strength. Furthermore, it is also observed that different random fibre packing can trigger two different fibre micro-buckling modes under longitudinal compression: out-of-plane buckling and twisted buckling. The out-of-plane buckling mode results in a much larger compressive strength, and this is the major reason why random fibre packing produces a large uncertainty in FRP compressive strength. This study contributes to new approaches to the quality control of FRP targeting higher compressive strength or lower uncertainty.
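
A short sketch of the distribution-fitting step described above, comparing normal and lognormal fits to a set of strengths with scipy; the strength values here are made up for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical compressive strengths from repeated random-packing FE runs (MPa)
strengths = np.array([912, 1045, 988, 1102, 951, 1020, 976, 1067, 934, 1008])

mu, sigma = stats.norm.fit(strengths)
shape, loc, scale = stats.lognorm.fit(strengths, floc=0)

# Compare fits via log-likelihood (higher is better)
ll_norm = np.sum(stats.norm.logpdf(strengths, mu, sigma))
ll_logn = np.sum(stats.lognorm.logpdf(strengths, shape, loc, scale))
print(f"normal LL = {ll_norm:.1f}, lognormal LL = {ll_logn:.1f}")
```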

Keywords: compressive strength, FRP, micro-buckling, random fibre packing

Procedia PDF Downloads 255
101 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis

Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro

Abstract:

If risk management of company-owned assets, risk assessment of real estate portfolios, and risk identification for an entire region are to be implemented, it is necessary to consider simultaneous damage to multiple buildings. This research focuses on the Sagami Trough earthquake tsunami, which could have a significant effect on the Japanese capital region, and proposes a method for simultaneous damage assessment using copulas that can take into consideration the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at two sites were simulated by using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depths (five cases) for each of the 10 sources of the Sagami Trough. For each source, the frequency distribution of the tsunami inundation depth was evaluated by using the response surface method. Then, Monte Carlo simulation was conducted, and the frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough; these are the marginal distributions. Kendall's tau for the tsunami inundation simulation at the two sites was 0.83. Based on this value, Gaussian, t, Clayton, and Gumbel copulas (n = 10,000) were generated. Then, the joint distributions of the damage rate were evaluated using the marginal distributions and the copulas. When the correlation of the tsunami inundation depths at the two sites was considered, the expected value hardly changed compared with the uncorrelated case, but the damage rate at the ninety-ninth percentile was approximately 2%, and the maximum value was approximately 6% when using the Gumbel copula.
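
A minimal sketch of one step above: generating Gaussian-copula samples consistent with the reported Kendall's tau of 0.83, using the standard relation rho = sin(pi*tau/2) for the Gaussian copula. The marginal distributions below are hypothetical lognormals, not the paper's:

```python
import numpy as np
from scipy import stats

tau = 0.83
rho = np.sin(np.pi * tau / 2.0)  # Kendall tau -> Pearson rho (Gaussian copula)

rng = np.random.default_rng(4)
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
u = stats.norm.cdf(z)            # copula samples in [0, 1]^2

# Push through hypothetical marginal inundation-depth distributions (m)
depth_site1 = stats.lognorm.ppf(u[:, 0], s=0.5, scale=2.0)
depth_site2 = stats.lognorm.ppf(u[:, 1], s=0.6, scale=1.5)
print(stats.kendalltau(depth_site1, depth_site2)[0])  # ~0.83
```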

Keywords: copulas, Monte Carlo simulation, probabilistic risk assessment, tsunamis

Procedia PDF Downloads 118
100 Nonlinear Finite Element Modeling of Deep Beam Resting on Linear and Nonlinear Random Soil

Authors: M. Seguini, D. Nedjar

Abstract:

An accurate nonlinear analysis of a deep beam resting on elastic perfectly plastic soil is carried out in this study. In fact, nonlinear finite element modeling for large deflection and moderate rotation of an Euler-Bernoulli beam resting on linear and nonlinear random soil is investigated. The geometric nonlinear analysis of the beam is based on the theory of von Kármán, where the Newton-Raphson incremental iteration method is implemented in a Matlab code to solve the nonlinear equation of the soil-beam interaction system. Two analyses (deterministic and probabilistic) are proposed to verify the accuracy and the efficiency of the proposed model, where the theory of the local average based on the Monte Carlo approach is used to analyze the effect of the spatial variability of the soil properties on the nonlinear beam response. The effects of six main parameters are investigated: the external load, the length of the beam, the coefficient of subgrade reaction of the soil, the Young's modulus of the beam, and the coefficient of variation and the correlation length of the soil's coefficient of subgrade reaction. A comparison between the beam resting on linear and nonlinear soil models is presented for different beam lengths and external loads. Numerical results have been obtained for the combination of the geometric nonlinearity of the beam and the material nonlinearity of the random soil. This comparison highlights the need to include the material nonlinearity and the spatial variability of the soil in the geometric nonlinear analysis when the beam undergoes large deflections.
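
As a schematic of the Newton-Raphson incremental iteration used to solve such a nonlinear soil-beam system, here reduced to a single-DOF toy residual R(u) = K·u + c·u³ - F with hypothetical coefficients:

```python
import numpy as np

K, c = 50.0, 400.0       # linear stiffness and cubic (nonlinear-soil) term

def residual(u, f):
    return K * u + c * u**3 - f

def tangent(u):
    return K + 3.0 * c * u**2

# Incremental loading with Newton-Raphson correction at each load step
u = 0.0
for f in np.linspace(0.0, 20.0, 11)[1:]:
    for _ in range(20):
        du = -residual(u, f) / tangent(u)
        u += du
        if abs(du) < 1e-10:
            break
print("converged displacement:", u)
```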

Keywords: finite element method, geometric nonlinearity, material nonlinearity, soil-structure interaction, spatial variability

Procedia PDF Downloads 393
99 Modeling the Reliability of a Fuel Cell and the Influence of Mechanical Aspects on the Production of Electrical Energy

Authors: Raed Kouta

Abstract:

A fuel cell is a multi-physical system: its electrical performance depends on chemical, electrochemical, fluid, and mechanical parameters. Many studies focus on the physical and chemical aspects; our study contributes to the evaluation of the influence of mechanical aspects on the performance of a fuel cell. This study is carried out as part of a reliability approach, where reliability modeling makes it possible to consider the uncertainties of the incoming parameters and the probabilistic modeling of the outgoing parameters. The fuel cell studied is the one often used in land, sea, or air transport: the low-temperature proton exchange membrane fuel cell (PEMFC), which can provide the required power level. One of the main scientific and technical challenges in mastering the design and production of a fuel cell is to know its behavior in its actual operating environment. The study proposes to highlight the influence on the production of electrical energy of mechanical design and manufacturing parameters and their uncertainties (Young's modulus, GDL porosity, permeability, etc.). The influence of the geometry of the bipolar plates is also considered. An experimental design is proposed with two types of materials as well as three geometric shapes and three joining pressures. Other experimental designs are also proposed to study the influence of the uncertainties of mechanical parameters on cell performance, under mechanical (static, dynamic) and thermal loads (tightening-compression, vibrations from road rolling and tests on a vibration-climatic bench, etc.). This study is also carried out according to an experimental scheme on a fuel cell system for vibration loads recorded on a vehicle test track, with three temperatures and three expected performance levels. The work will improve the coupling between mechanical, physical, and chemical phenomena.

Keywords: fuel cell, mechanic, reliability, uncertainties

Procedia PDF Downloads 164
98 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods

Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim

Abstract:

Retaining slope structures are increasingly considered in geotechnical engineering projects due to extensive urban growth. These kinds of engineering constructions may present instabilities over time and may require reinforcement or even rebuilding of the structure. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study approaches the failure probability of the construction of a retaining wall over the debris of an old, collapsed one. The new solution will extend approximately 350 m and will be located on the margins of Lake Paranoá in Brasília, the capital of Brazil. The building process must also account for the utilization of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters, and a standard penetration test (SPT) defined the in situ soil stratigraphy. The parameters obtained were also verified using soil data from a collection of master's and doctoral works from the University of Brasília on similar local soils. Initial studies show that a concrete wall is the proper solution for this case, taking into account the technical, economic, and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two more initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.

Keywords: economic analysis, probability of failure, retaining walls, statistical analysis

Procedia PDF Downloads 388
97 Partition of Nonylphenol between Different Compartment for Mother-Fetus Pairs and Health Effects of Newborns

Authors: Chun-Hao Lai, Yu-Fang Huang, Pei-Wei Wang, Meng-Han Lin, Mei-Lien Chen

Abstract:

Nonylphenol (NP) is a degradation product of nonylphenol ethoxylates (NPEOs). It is a well-known endocrine disruptor which may cause estrogenic effects. Growing fetuses and infants are more vulnerable to NP exposure than adults, so it is important to know the levels and influences of prenatal exposure to NP. The aims of this study were (1) to determine the levels of prenatal exposure among Taiwanese mother-fetus pairs, (2) to evaluate the potential risk for infants who were breastfed and exposed to NP through milk, and (3) to investigate the correlation between birth outcomes and prenatal exposure to NP. We analyzed thirty-one pairs of maternal urine, placenta, and first-month breast milk samples by high-performance liquid chromatography coupled with a fluorescence detector. The questionnaire covered socio-demographics, lifestyle, delivery method, diet, and work history, and information about the birth outcomes was obtained from medical records. The daily intake of NP from breast milk was calculated using deterministic and probabilistic risk assessment methods. The geometric means (and geometric standard deviations) of NP levels in the placenta and in first-month breast milk were 31.2 (1.8) ng/g and 17.2 (1.6) ng/g, respectively. The median daily intake of NP from breast milk was 1.33 μg/kg-bw/day in the first month. We found a negative association between placental NP levels and birth height, and a negative correlation between maternal urine NP levels and birth weight. This study provides an NP exposure profile for Taiwanese pregnant women and the daily NP intake of Taiwanese infants. Prenatal exposure to higher levels of NP may increase the risk of lower birth weight and shorter birth height.

Keywords: nonylphenol, mother, fetus, placenta, breast milk, urine

Procedia PDF Downloads 215
96 An Empirical Study for the Data-Driven Digital Transformation of the Indian Telecommunication Service Providers

Authors: S. Jigna, K. Nanda Kumar, T. Anna

Abstract:

Being a major contributor to the Indian economy and a critical facilitator for the country's Digital India vision, the Indian telecommunications industry is also a major source of employment for the country. For the last few years, however, the Indian telecommunication service providers (TSPs) have been facing business challenges related to increasing competition, losses, debts, and decreasing revenue. The strategic use of digital technologies for a successful digital transformation has the potential to equip organizations to meet these business challenges. Despite an increased focus on digital transformation, telecom service providers globally, including Indian TSPs, have seen limited success so far. The purpose of this research was thus to identify the factors that are critical for digital transformation and the extent to which they influence the successful digital transformation of Indian TSPs. A literature review of more than 300 digital transformation-related articles, mostly from 2013-2019, demonstrated the lack of an empirical model of the factors for the successful digital transformation of TSPs. This study theorizes a research framework grounded in multiple theories and proposes a research model consisting of 7 constructs that may influence business success during the digital transformation of an organization. A questionnaire survey of senior managers in the Indian telecommunications industry sought to validate the research model. Based on 294 survey responses, the validation of the structural equation model using the statistical tool ADANCO 2.1.1 was found to be robust. Results indicate that Digital Capabilities, Digital Strategy, and Corporate-Level Data Strategy, in that order, have a strong influence on successful Business Performance, followed by IT Function Transformation, Digital Innovation, and Transformation Management, respectively. Even though Digital Organization did not have a direct significant effect on Business Performance outcomes, it had a strong influence on IT Function Transformation, thus affecting Business Performance outcomes indirectly. Among the numerous practical and theoretical contributions of the study, the main contribution for Indian TSPs is a validated reference for prioritizing the transformation initiatives in their strategic roadmap. The main contribution to theory is the possibility of using the research framework artifact of the present research for quantitative validation in different industries and geographies.

Keywords: corporate level data strategy, digital capabilities, digital innovation, digital strategy

Procedia PDF Downloads 105
95 Ground Response Analysis at the Rukni Irrigation Project Site Located in Assam, India

Authors: Tauhidur Rahman, Kasturi Bhuyan

Abstract:

In the present paper, ground response analysis at the Rukni irrigation project site has been thoroughly investigated. Surface-level seismic hazard is mainly used by practicing engineers for designing important structures, and it can be obtained by accounting for the soil factor. Structures on soft soil will experience more ground shaking than structures located on hard soil: the surface-level ground motion depends on the type of soil, density and shear wave velocity differ for different types of soil, and the intensity of soil amplification depends on the density and shear wave velocity of the soil. The Rukni irrigation project is located in the North Eastern region of India, near the Dauki fault (550 km long), which has produced earthquakes of magnitude Mw = 8.5 in the past; there is a probability of a similar earthquake occurring in the future, and several other faults are also located around the project site. There are 765 recorded strong ground motion time histories available for the region, and these data are used to determine the soil amplification factor by incorporating the engineering properties of the soil. With this in view, three soil boreholes have been studied at the project site, up to a depth of 30 m. It has been observed that in borehole 1, the shear wave velocity varies from 99.44 m/s to 239.28 m/s, while for boreholes 2 and 3, the shear wave velocity varies from 93.24 m/s to 241.39 m/s and from 93.24 m/s to 243.01 m/s, respectively. In the present work, the surface-level seismic hazard at the project site has been calculated based on the probabilistic seismic hazard approach, accounting for the soil factor.
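
A common way to summarize such 30 m borehole profiles for site classification is the time-averaged shear wave velocity Vs30 = 30 / Σ(d_i / Vs_i). A small sketch with hypothetical layer data in the velocity range reported above:

```python
import numpy as np

# Hypothetical borehole layers: thickness (m) and shear wave velocity (m/s)
thickness = np.array([3.0, 5.0, 7.0, 15.0])
vs        = np.array([99.44, 150.0, 190.0, 239.28])

assert thickness.sum() == 30.0
vs30 = 30.0 / np.sum(thickness / vs)   # time-averaged Vs over the top 30 m
print(f"Vs30 = {vs30:.1f} m/s")        # a soft-soil site in most codes
```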

Keywords: Ground Response Analysis, shear wave velocity, soil amplification, surface level seismic hazard

Procedia PDF Downloads 533
94 O-LEACH: The Problem of Orphan Nodes in the LEACH of Routing Protocol for Wireless Sensor Networks

Authors: Wassim Jerbi, Abderrahmen Guermazi, Hafedh Trabelsi

Abstract:

The optimal use of coverage in wireless sensor networks (WSNs) is very important. The LEACH protocol (Low-Energy Adaptive Clustering Hierarchy) presents a hierarchical clustering algorithm for wireless sensor networks that allows the formation of distributed clusters. In each cluster, LEACH randomly selects some sensor nodes called cluster heads (CHs); the selection of CHs is made by a probabilistic calculation, and each non-CH node is supposed to join a cluster and become a cluster member. Nevertheless, the CHs can be concentrated in a specific part of the network, so that several sensor nodes cannot reach any CH. To solve this problem, we created the O-LEACH (Orphan LEACH) protocol, whose role is to reduce the number of sensor nodes that do not belong to any cluster. A cluster member called a gateway receives messages from neighboring orphan nodes and informs its CH of the neighboring nodes that do not belong to any group. The gateway, acting as a CH' for the orphans, attaches the orphaned nodes to the cluster and then collects their data. O-LEACH enables the formation of a new kind of cluster and leads to a long network lifetime and minimal energy consumption; orphan nodes possess enough energy and seek to be covered by the network. The principal novel contribution of the proposed work is the O-LEACH protocol, which provides coverage of the whole network with a minimum number of orphaned nodes and has very high connectivity rates. As a result, the WSN application receives data from the entire network, including orphan nodes. The proper functioning of the application therefore requires management of the intelligent resources present within each network sensor. The simulation results show that O-LEACH performs better than LEACH in terms of coverage, connectivity rate, energy, and scalability.
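
For context on the probabilistic CH selection that O-LEACH inherits, the classical LEACH election threshold is T(n) = P / (1 - P * (r mod 1/P)) for nodes that have not served as CH in the current round cycle. A minimal sketch:

```python
import random

random.seed(5)
P = 0.05                       # desired fraction of cluster heads per round
ROUNDS_PER_CYCLE = round(1.0 / P)   # = 20

def leach_threshold(r):
    """Classical LEACH election threshold for eligible nodes in round r."""
    return P / (1.0 - P * (r % ROUNDS_PER_CYCLE))

def elect_cluster_heads(eligible_nodes, r):
    t = leach_threshold(r)
    return [n for n in eligible_nodes if random.random() < t]

nodes = list(range(100))       # node ids; all eligible in round 0
print("round-0 cluster heads:", elect_cluster_heads(nodes, 0))
```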

Keywords: WSNs, routing, LEACH, O-LEACH, orphan nodes, sub-cluster, gateway, CH'

Procedia PDF Downloads 348
93 Developing a Framework for Assessing and Fostering the Sustainability of Manufacturing Companies

Authors: Ilaria Barletta, Mahesh Mani, Björn Johansson

Abstract:

The concept of sustainability encompasses economic, environmental, social, and institutional considerations, and sustainable manufacturing (SM) is therefore a multi-faceted concept. It broadly implies the development and implementation of technologies, projects, and initiatives that are concerned with the life cycle of products and services and are able to bring positive impacts to the environment, company stakeholders, and profitability. Because of this, achieving SM-related goals requires a holistic, life-cycle-thinking approach from manufacturing companies; further, such an approach must rely on a logic of continuous improvement and ease of implementation in order to be effective. Currently, the academic literature offers no comprehensively structured framework that supports manufacturing companies in identifying the issues and capabilities that can either hinder or foster sustainability. This scarcity of support extends to difficulties in obtaining quantifiable measurements with which to objectively evaluate solutions and programs and identify improvement areas within SM for standards conformance. To bridge this gap, this paper proposes the concept of a framework for assessing and continuously improving the sustainability of manufacturing companies. The framework addresses strategies and projects for SM and operates in three sequential phases: analysis of the issues, design of solutions, and continuous improvement. A set of interviews, observations, and questionnaires are the research methods to be used for the implementation of the framework. Different decision-support methods, either already existing or novel, can be 'plugged into' each of the phases; these methods can assess anything from business capabilities to process maturity. In particular, the authors are working on the development of a sustainable manufacturing maturity model (SMMM) as decision support within the 'continuous improvement' phase. The SMMM, inspired by previous maturity models, is made up of four maturity levels ranging from 'non-existing' to 'thriving'. Aggregate findings from the use of the framework should ultimately reveal to managers and CEOs the roadmap for achieving SM goals and identify the maturity of their companies' processes and capabilities. Two cases from two manufacturing companies in Australia are currently being employed to develop and test the framework. The use of this framework will bring two main benefits: enabling visual, intuitive internal sustainability benchmarking and raising awareness of the improvement areas that lead companies towards an increasingly developed SM.

Keywords: life cycle management, continuous improvement, maturity model, sustainable manufacturing

Procedia PDF Downloads 233
92 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos

Authors: Dhanuja S. Patil, Sanjay B. Waykar

Abstract:

Event detection is one of the most fundamental components for many application domains of video data systems, and it has recently gained extensive interest from practitioners and academics in different areas. While video event detection has been the subject of broad research efforts, considerably fewer existing methods have considered multi-modal data and efficiency-related issues. During soccer matches, various doubtful situations arise that cannot easily be judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations caused by errors or by the high velocity of events. Bayesian networks provide a structure for dealing with this uncertainty through a simple graphical representation combined with probability calculus. We propose an efficient framework for analysing and summarizing soccer videos using object-based features. The proposed work utilizes the t-cherry junction tree, a very recent advancement in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable model. This approach offers several advantages: first, the t-cherry tree gives the best approximation within the class of junction trees; second, the construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, efficiency, and robustness of the proposed work on a comprehensive data set comprising several soccer videos captured at different venues.
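To make the Bayesian-network component concrete, the toy sketch below scores a candidate event from two binary object-based features by exact enumeration. Every variable name and probability here is invented for illustration; the paper's actual model is a t-cherry junction tree over far richer features, where enumeration would be intractable.

```python
# Toy Bayesian network for flagging a candidate "goal" event from
# object-based features.  Structure and probabilities are illustrative only.

P_goal = {True: 0.02, False: 0.98}                    # prior on the event
P_ball_near_goal = {True: {True: 0.95, False: 0.05},  # P(feature | goal)
                    False: {True: 0.10, False: 0.90}}
P_crowd_roar = {True: {True: 0.90, False: 0.10},
                False: {True: 0.15, False: 0.85}}

def posterior_goal(ball_near, roar):
    """P(goal | observed features) by direct enumeration -- feasible
    here only because this toy network is tiny."""
    def joint(goal):
        return (P_goal[goal]
                * P_ball_near_goal[goal][ball_near]
                * P_crowd_roar[goal][roar])
    num = joint(True)
    return num / (num + joint(False))

print(f"{posterior_goal(True, True):.3f}")    # ~0.538: far above the 0.02 prior
print(f"{posterior_goal(False, False):.3f}")  # ~0.000: features absent
```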

Keywords: summarization, detection, Bayesian network, t-cherry tree

Procedia PDF Downloads 299
91 Knowledge Creation and Diffusion Dynamics under Stable and Turbulent Environment for Organizational Performance Optimization

Authors: Jessica Gu, Yu Chen

Abstract:

Knowledge Management (KM) is undoubtedly crucial to organizational value creation, learning, and adaptation. Although the rapidly growing KM domain has been fueled with full-fledged methodologies and technologies, studies on KM evolution that bridge organizational performance and adaptation to the organizational environment are still rarely attempted. In particular, creation (or generation) and diffusion (or sharing/exchange) of knowledge are among an organization's primary concerns from a problem-solving perspective; however, the optimal distribution of effort between knowledge creation and diffusion remains unknown to knowledge workers. This research proposes an agent-based model of knowledge creation and diffusion in an organization, aiming to elucidate how intertwining knowledge flows at the microscopic level lead to optimized organizational performance at the macroscopic level through evolution, and to explore which exogenous interventions by the policy maker and which endogenous adjustments by the knowledge workers can better cope with different environmental conditions. With the developed model, a series of simulation experiments is conducted. Both long-term steady-state and time-dependent developmental results are obtained on organizational performance, network and structure, social interaction and learning among individuals, knowledge audit and stocktaking, and the likelihood of knowledge workers choosing knowledge creation over diffusion. One interesting finding reveals a non-monotonic effect on organizational performance under a turbulent environment but a monotonic effect under a stable environment. Hence, whether the environment is turbulent or stable, the most suitable exogenous KM policy and endogenous creation/diffusion adjustments can be identified for achieving optimized organizational performance. Additional influential variables are further discussed, and future work directions are finally elaborated. The proposed agent-based model generates evidence on how knowledge workers strategically allocate effort between knowledge creation and diffusion, how bottom-up interactions among individuals lead to emergent structure and optimized performance, and how environmental conditions challenge the organization as a system. Meanwhile, it serves as a roadmap and offers valuable macro-level, long-term insights to policy makers without interrupting real organizational operations, incurring huge overhead costs, or introducing undesired panic among employees.
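A stripped-down version of such an agent-based model can be sketched in a few dozen lines. All parameters, the update rules, and the turbulence mechanism below are assumptions for illustration; the authors' model is considerably richer (networks, learning, audits).

```python
import random

class Worker:
    """Agent that each round either creates a new knowledge item or
    diffuses one by learning it from a randomly chosen peer."""
    def __init__(self):
        self.knowledge = set()
        self.p_create = 0.5  # propensity to create vs. diffuse (assumed)

    def act(self, peers, next_idea):
        if random.random() < self.p_create:
            self.knowledge.add(next_idea)       # create new knowledge
            return next_idea + 1
        peer = random.choice(peers)             # diffuse: learn from a peer
        if peer.knowledge:
            self.knowledge.add(random.choice(sorted(peer.knowledge)))
        return next_idea

def simulate(n_workers=50, rounds=200, turbulence=0.0):
    """Organizational performance = still-relevant knowledge per worker.
    Turbulence randomly obsoletes existing knowledge items each round."""
    workers = [Worker() for _ in range(n_workers)]
    next_idea, obsolete = 0, set()
    for _ in range(rounds):
        for w in workers:
            peers = [p for p in workers if p is not w]
            next_idea = w.act(peers, next_idea)
        for idea in range(next_idea):           # environment shifts
            if random.random() < turbulence:
                obsolete.add(idea)
    return sum(len(w.knowledge - obsolete) for w in workers) / n_workers

print("stable   :", simulate(turbulence=0.0))
print("turbulent:", simulate(turbulence=0.01))
```

Sweeping `p_create` under each turbulence setting is the kind of experiment that would expose the monotonic versus non-monotonic performance patterns the abstract reports.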

Keywords: knowledge creation, knowledge diffusion, agent-based modeling, organizational performance, decision making evolution

Procedia PDF Downloads 217
90 Transformative Measures in Chemical and Petrochemical Industry Through Agile Principles and Industry 4.0 Technologies

Authors: Bahman Ghorashi

Abstract:

Immense awareness of global climate change has compelled traditional fossil-fuel companies to develop strategies to reduce their carbon footprint and simultaneously consider the production of various sources of clean energy in order to mitigate the environmental impact of their operations. Similarly, supply chain issues, the scarcity of certain raw materials, energy costs, market needs, and changing consumer expectations have forced the traditional chemical industry to re-examine its time-honored modes of operation. This study examines how such transformative change might occur through the application of agile principles as well as Industry 4.0 technologies. Clearly, such a transformation is complex, costly, and requires total commitment on the part of the top leadership and the entire management structure. Factors that need to be considered include organizational speed of change; a restructuring that lends itself toward collaboration and the selling of solutions to customers' problems, rather than just products; integrating 'along' as well as 'across' value chains; mastering change and uncertainty; a recognition of the importance of concept-to-cash time, i.e., the velocity of introducing new products to market; and the leveraging of people and information. At the same time, parallel to implementing such major shifts in the ethos and fabric of the organization, change leaders should remain mindful of the company's DNA while incorporating the necessary DNA-defying shifts. Furthermore, such strategic maneuvers should inevitably incorporate managing the upstream and downstream operations, harnessing future opportunities, preparing and training the workforce, implementing faster decision making and quick adaptation to change, managing accelerated response times, and forming autonomous, cross-functional teams. Moreover, leaders should establish the balance between high-value solutions and high-margin products, fully implement the digitization of operations and, when appropriate, incorporate the latest relevant technologies, such as AI, IIoT, ML, and immersive technologies. This study presents a summary of the agile principles and the relevant technologies, and draws lessons from some of the best practices already implemented within the chemical industry in order to establish a roadmap to agility. Finally, the critical role of educational institutions in preparing the future workforce for Industry 4.0 is addressed.

Keywords: agile principles, immersive technologies, industry 4.0, workforce preparation

Procedia PDF Downloads 89