Search results for: Probabilistic time-varying delays
112 A Bayesian Network Reliability Modeling for FlexRay Systems
Authors: Kuen-Long Leu, Yung-Yuan Chen, Chin-Long Wey, Jwu-E Chen, Chung-Hsien Hsu
Abstract:
The increasing importance of FlexRay systems in the automotive domain continues to inspire related research. One primary issue is verifying the reliability of FlexRay systems, either from the protocol aspect or from the system design aspect. However, research rarely discusses the effect of network topology on system reliability. In this paper, we illustrate how to model the reliability of FlexRay systems with various network topologies using a well-known probabilistic reasoning technique, the Bayesian network. In this illustration, we especially investigate the effectiveness of the error containment built into the star topology and of the fault-tolerant midpoint synchronization algorithm adopted in the FlexRay communication protocol. Through a FlexRay steer-by-wire case study, the influence of different topologies on the failure probability of the FlexRay steer-by-wire system is demonstrated. The notable value of this research is to show that Bayesian network inference is a powerful and feasible method for the reliability assessment of FlexRay systems.
Keywords: Bayesian network, FlexRay, fault tolerance, network topology, reliability.
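As a minimal illustration of Bayesian-network-style reliability inference for a networked system, the sketch below enumerates component states with independent failure probabilities and a deterministic system node. The component names, probabilities, and the simplified error-containment rule are hypothetical and are not taken from the paper's FlexRay model.

```python
# Illustrative sketch only: a minimal Bayesian-network-style reliability
# calculation in plain Python. Component names and failure probabilities
# are hypothetical, NOT the paper's FlexRay steer-by-wire model.
from itertools import product

# Hypothetical per-mission failure probabilities of the components.
p_fail = {"ecu_a": 1e-4, "ecu_b": 1e-4, "channel": 5e-5, "star_coupler": 2e-5}

def system_fails(state):
    """Deterministic 'system' node: the function fails if either ECU fails,
    or if the communication path fails. With a star coupler, a faulty channel
    is assumed to be contained unless the coupler itself has failed
    (a simplified stand-in for error containment)."""
    ecu_loss = state["ecu_a"] or state["ecu_b"]
    comm_loss = state["channel"] and state["star_coupler"]
    return ecu_loss or comm_loss

names = list(p_fail)
p_system_fail = 0.0
for outcome in product([False, True], repeat=len(names)):  # enumerate all states
    state = dict(zip(names, outcome))
    prob = 1.0
    for n in names:
        prob *= p_fail[n] if state[n] else (1.0 - p_fail[n])
    if system_fails(state):
        p_system_fail += prob

print(f"P(system failure) ~ {p_system_fail:.3e}")
```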
111 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge
Authors: M. F. Yilmaz, B. Ö. Çağlayan
Abstract:
The fragility curve is an effective and commonly used tool to determine the earthquake performance of structural and nonstructural components; it is also used to determine the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network, and the earthquake performance of these bridges needs to be investigated. To derive a fragility curve, intensity measures (IMs) and engineering demand parameters (EDPs) must be determined, and the relation between them must be derived. In this study, a typical simply supported, riveted steel girder railway bridge is studied. Fragility curves of this bridge are derived using a two-parameter lognormal distribution. Time history analyses are performed for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed. PGA, Sa(0.2s), and Sa(1s), the IMs most commonly used for fragility curves in the literature, are taken into consideration in terms of efficiency, practicality, and sufficiency.
Keywords: Railway bridges, earthquake performance, fragility analyses, selection of intensity measures.
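To illustrate the two-parameter lognormal fragility form mentioned above, the following sketch fits P(exceedance | IM) = Φ((ln IM − ln θ)/β) by maximum likelihood. The IM values and exceedance flags are synthetic, not the 60 earthquake records analysed in the paper.

```python
# Illustrative sketch: fitting a two-parameter lognormal fragility curve by
# maximum likelihood on synthetic data (not the paper's bridge analyses).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
im = rng.uniform(0.05, 1.5, size=60)             # e.g. PGA in g (synthetic)
true_theta, true_beta = 0.6, 0.5                 # assumed "true" parameters
exceed = rng.random(60) < norm.cdf(np.log(im / true_theta) / true_beta)

def neg_log_likelihood(params):
    ln_theta, beta = params
    p = norm.cdf((np.log(im) - ln_theta) / beta)
    p = np.clip(p, 1e-12, 1 - 1e-12)             # numerical safety
    return -np.sum(exceed * np.log(p) + (~exceed) * np.log(1 - p))

res = minimize(neg_log_likelihood, x0=[np.log(0.5), 0.4], method="Nelder-Mead")
theta_hat, beta_hat = np.exp(res.x[0]), res.x[1]
print(f"median capacity theta ~ {theta_hat:.2f} g, dispersion beta ~ {beta_hat:.2f}")
```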
110 Self-Sensing versus Reference Air Gaps
Authors: Alexander Schulz, Ingrid Rottensteiner, Manfred Neumann, Michael Wehse, Johann Wassermann
Abstract:
Self-sensing estimates the air gap within an electromagnetic path by analyzing the bearing coil current and/or voltage waveform. The self-sensing concept presented in this paper has been developed within the research project "Active Magnetic Bearings with Supreme Reliability" and is used for position sensor fault detection. Within this new concept, gap calculation is carried out by an all-digital analysis of the digitized coil current and voltage waveforms. For the analysis, those time periods within the PWM period which give the best results are used. Additionally, the concept allows the digital compensation of nonlinearities, for example magnetic saturation, without degrading signal quality. This increases the accuracy and robustness of the air gap estimation and additionally reduces phase delays. Besides an overview of the developed concept, first measurement results are presented which show the potential of this all-digital self-sensing concept.
Keywords: Digital signal analysis, active magnetic bearing, reliability, fault detection.
109 Seismic Fragility Curves for Shallow Circular Tunnels under Different Soil Conditions
Authors: Siti Khadijah Che Osmi, Syed Mohd Ahmad
Abstract:
This paper presents a methodology to develop fragility curves for shallow tunnels so as to describe a relationship between seismic hazard and tunnel vulnerability. Emphasis is given to the influence of the surrounding soil material properties, because the dynamic behaviour of the tunnel depends mostly on them. Four soil types, ranging from stiff to soft, are selected. A 3D nonlinear time history analysis is used to evaluate the seismic response of the tunnel when subjected to five real earthquake ground motions of different intensities. The derived curves show the future probabilistic performance of the tunnels based on the predicted level of damage states corresponding to the peak ground acceleration. A comparison of the obtained results with the previous literature is provided to validate the reliability of the proposed fragility curves. Results show the significant role of soil properties and input motions in evaluating the seismic performance and response of shallow tunnels.
Keywords: Fragility analysis, seismic performance, tunnel lining, vulnerability.
108 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and we then derive a predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classify a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
Keywords: Multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence.
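For orientation, the sketch below runs a multiclass Gaussian process classifier on a synthetic multivariate dataset. Note that it uses scikit-learn's Laplace approximation, not the variational Bayes / importance sampling scheme described in the abstract, and all data are synthetic.

```python
# Illustrative sketch: multiclass GP classification on synthetic data, using
# scikit-learn's Laplace approximation (not the authors' variational method).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=5, n_informative=4,
                           n_redundant=1, n_classes=3, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                multi_class="one_vs_rest", random_state=0)
gpc.fit(X_tr, y_tr)
print("test accuracy:", gpc.score(X_te, y_te))
print("predictive class probabilities (first sample):", gpc.predict_proba(X_te[:1]))
```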
107 Prediction of Reusability of Object Oriented Software Systems using Clustering Approach
Authors: Anju Shri, Parvinder S. Sandhu, Vikas Gupta, Sanyam Anand
Abstract:
In the literature, there are metrics for identifying the quality of reusable components, but a framework that makes use of these metrics to precisely predict the reusability of software components still needs to be worked out. If these reusability metrics are identified in the design phase, or even in the coding phase, they can help reduce rework by improving the quality of reuse of the software component and hence improve productivity due to a probabilistic increase in the reuse level. Since the CK metric suite is the most widely used set of metrics for extracting the structural features of object-oriented (OO) software, in this study a tuned CK metric suite, i.e. WMC, DIT, NOC, CBO, and LCOM, is used to obtain the structural analysis of OO-based software components. An algorithm is proposed in which the tuned metric values of the OO software components are given as inputs to a K-means clustering system, and a decision tree is formed with 10-fold cross validation of the data to evaluate the component in terms of a linguistic reusability value. The developed reusability model produced high precision results as desired.
Keywords: CK metrics, decision tree, K-means, reusability.
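The sketch below mirrors the clustering-plus-decision-tree idea: K-means groups components by their CK metric vectors (WMC, DIT, NOC, CBO, LCOM), and a decision tree is evaluated with 10-fold cross-validation on the resulting cluster labels. The metric values are synthetic, not measured components.

```python
# Illustrative sketch: K-means on synthetic CK metric vectors, followed by a
# 10-fold cross-validated decision tree on the cluster labels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# Columns: WMC, DIT, NOC, CBO, LCOM for 200 hypothetical OO components.
ck = np.column_stack([
    rng.integers(1, 50, 200),    # WMC
    rng.integers(0, 6, 200),     # DIT
    rng.integers(0, 10, 200),    # NOC
    rng.integers(0, 20, 200),    # CBO
    rng.integers(0, 100, 200),   # LCOM
]).astype(float)

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(ck)
labels = kmeans.labels_          # treated here as linguistic reusability levels

tree = DecisionTreeClassifier(random_state=0)
scores = cross_val_score(tree, ck, labels, cv=10)
print("10-fold CV accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```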
106 Assessing Traffic Calming Measures for Safe and Accessible Emergency Routes in Norrkoping City in Sweden
Authors: Ghazwan Al-Haji
Abstract:
Most accidents occur in urban areas, and most of the related casualties are vulnerable road users (pedestrians and cyclists). Traffic calming measures (TCMs) are widely used and considered successful in reducing speed and traffic volume. However, TCMs create unwanted effects, including noise, emissions, energy consumption, vehicle delays, and longer emergency response time (ERT). Different vertical and horizontal TCMs have already been applied nationally (in Sweden) and internationally with different impacts. It is a big challenge for traffic engineers, planners, and policy-makers to choose and prioritize the best TCMs to be implemented. This study will assess the existing guidelines for TCMs in relation to safety and ERT, with a focus on data from Norrkoping city in Sweden. The expected results will save lives, time, and money, particularly on Swedish roads. The study will also review new technologies and how they can improve safety and reduce ERT.
Keywords: Traffic safety, traffic calming measures, speeding, emergency response time.
105 Monte Carlo Analysis and Fuzzy Sets for Uncertainty Propagation in SIS Performance Assessment
Authors: Fares Innal, Yves Dutuit, Mourad Chebila
Abstract:
The objective of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e. the average probability of dangerous failure on demand (PFDavg) and the average frequency of dangerous failure (PFH), taking into account the uncertainties related to the different parameters that come into play: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC)... This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent to the safety function performed by such systems. This aim is in keeping with the requirements of the IEC 61508 standard with respect to handling uncertainty. To do this, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets. Indeed, the first method is appropriate where representative statistical data are available (using probability density functions of the relevant parameters), while the latter applies to cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported by a suitable computer code.
Keywords: Fuzzy sets, Monte Carlo simulation, Safety instrumented system, Safety integrity level.
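The Monte Carlo part of such an analysis can be sketched as below: uncertainty in λ and DC is propagated into PFDavg for a single-channel (1oo1) function using the simplified approximation PFDavg ≈ λ_DU·TI/2 with λ_DU = λ(1 − DC). The distributions, parameter values, and architecture are assumptions for illustration, not the paper's models.

```python
# Illustrative sketch: Monte Carlo propagation of parameter uncertainty into
# PFDavg for a 1oo1 function (simplified formula; values are assumed).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
TI = 8760.0                                                  # proof-test interval [h], assumed

lam = rng.lognormal(mean=np.log(2e-6), sigma=0.5, size=n)    # failure rate [1/h]
dc = rng.uniform(0.6, 0.9, size=n)                           # diagnostic coverage

pfd_avg = lam * (1.0 - dc) * TI / 2.0                        # PFDavg ~ lambda_DU * TI / 2

print("mean PFDavg  : %.2e" % pfd_avg.mean())
print("95th pct     : %.2e" % np.percentile(pfd_avg, 95))
# Rough SIL band check (IEC 61508 low-demand bands): SIL 2 is [1e-3, 1e-2).
print("P(PFDavg within SIL2 band):", np.mean((pfd_avg >= 1e-3) & (pfd_avg < 1e-2)))
```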
104 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground
Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane
Abstract:
Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to take into account the uncertainties related to hazards, to the properties of the materials used, and to the applied loads. However, the use of these safety factors in the design process does not ensure an optimal and reliable solution and can sometimes lead to a lack of robustness of the structure. Reliability theory, based on a probabilistic formulation of structural safety, can respond in an adapted manner. It allows the construction of a model in which uncertain data are represented by random variables, and therefore allows a better appreciation of safety margins through confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the concrete tank, considering the seismic acceleration as a random variable.
Keywords: Reliability approach, storage tanks, Monte Carlo simulation, seismic acceleration.
103 An Integrated Software Architecture for Bandwidth Adaptive Video Streaming
Authors: T. Arsan
Abstract:
Video streaming over lossy IP networks is an important issue due to the heterogeneous structure of networks. The infrastructure of the Internet exhibits variable bandwidths, delays, congestion, and time-varying packet losses. Because of these variable attributes, video streaming applications should not only have good end-to-end transport performance but also a robust rate control and, furthermore, a multipath rate allocation mechanism. To provide video streaming service quality, other components such as bandwidth estimation and an adaptive rate controller should also be taken into consideration. This paper gives an overview of the video streaming concept and bandwidth estimation tools and then introduces special architectures for bandwidth adaptive video streaming. A bandwidth estimation algorithm (pathChirp), optimized rate controllers, and a multipath rate allocation algorithm are considered as an all-in-one solution for the video streaming problem. This solution is directed and optimized by a decision center designed to obtain the maximum quality at the receiving side.
Keywords: Adaptive video streaming, bandwidth estimation, QoS, software architecture.
102 Applying 5S Lean Technology: An Infrastructure for Continuous Process Improvement
Authors: Raid A. Al-Aomar
Abstract:
This paper presents an application of 5S lean technology to a production facility. Due to increased demand, high product variety, and a push production system, the plant has suffered from excessive waste, unorganized workstations, and an unhealthy work environment. This has translated into increased production cost, frequent delays, and low worker morale. Under such conditions, it has become difficult, if not impossible, to implement effective continuous improvement studies. Hence, the lean project is aimed at diagnosing the production process, streamlining the workflow, removing/reducing process waste, cleaning the production environment, improving the plant layout, and organizing workstations. 5S lean technology is utilized for achieving the project objectives. The work was a combination of both culture changes and tangible/physical changes on the shop floor. The project has drastically changed the plant and developed the infrastructure for a successful implementation of continuous improvement as well as other best practices and quality initiatives.
Keywords: 5S Technique, Continuous Improvement, Kaizen, Lean Technology, Work Methods, Work Standards
101 Comparison of Different Neural Network Approaches for the Prediction of Kidney Dysfunction
Authors: Ali Hussian Ali AlTimemy, Fawzi M. Al Naima
Abstract:
This paper presents the prediction of kidney dysfunction using different neural network (NN) approaches. Self-Organizing Maps (SOM), Probabilistic Neural Networks (PNN), and Multi-Layer Perceptron Neural Networks (MLPNN) trained with the Back Propagation Algorithm (BPA) are used in this study. Six hundred and sixty-three sets of analytical laboratory tests were collected from one of the private clinical laboratories in Baghdad. For each subject, serum urea and serum creatinine levels were analyzed and tested using clinical laboratory measurements. The collected urea and creatinine levels are then used as inputs to the three NN models, in which the training is done by the different neural approaches. SOM is a class of unsupervised networks, whereas PNN and BPNN are classes of supervised networks. These networks are used as classifiers to predict whether a kidney is normal or dysfunctional. The accuracy of prediction, sensitivity, and specificity were found for each type of the proposed networks. We conclude that PNN gives faster and more accurate prediction of kidney dysfunction and works as a promising tool for predicting routine kidney dysfunction from clinical laboratory data.
Keywords: Kidney dysfunction, prediction, SOM, PNN, BPNN, urea and creatinine levels.
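The sketch below shows a probabilistic neural network in its simplest form, a Parzen-window classifier with Gaussian kernels over two inputs (serum urea and creatinine). The data, class boundaries, and smoothing parameter are synthetic assumptions; they do not come from the study's laboratory dataset.

```python
# Illustrative sketch: a minimal PNN (Parzen-window classifier) on synthetic
# urea/creatinine data; class 0 = normal, class 1 = dysfunction (assumed).
import numpy as np

rng = np.random.default_rng(7)
normal = rng.normal([30.0, 1.0], [8.0, 0.2], size=(100, 2))
dysfun = rng.normal([80.0, 3.0], [20.0, 1.0], size=(100, 2))
X = np.vstack([normal, dysfun])
y = np.array([0] * 100 + [1] * 100)

mu, sd = X.mean(axis=0), X.std(axis=0)
Xn = (X - mu) / sd                        # standardise both inputs

def pnn_predict(x, X_train, y_train, sigma=0.5):
    """Assign x to the class with the largest mean Gaussian-kernel response."""
    scores = []
    for c in np.unique(y_train):
        pattern_units = X_train[y_train == c]
        d2 = np.sum((pattern_units - x) ** 2, axis=1)
        scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
    return int(np.argmax(scores))

patient = (np.array([55.0, 1.8]) - mu) / sd              # hypothetical measurements
print("predicted class:", pnn_predict(patient, Xn, y))   # 0 = normal, 1 = dysfunction
```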
100 Comparison of Bayesian and Regression Schemes to Model Public Health Services
Authors: Sotirios Raptis
Abstract:
Bayesian reasoning (BR) or linear (auto) regression (AR/LR) can predict different sources of data using priors or other data, and can link social service demands in cohorts, while considering them in isolation (self-prediction) may lead to service misuse that ignores the context. The paper advocates that BR with Binomial (BD) or Normal (ND) models, or raw data (.D), as probabilistic updates can be compared to AR/LR to link services in Scotland and reduce cost by sharing healthcare (HC) resources. Clustering and cross-correlation, along with BR, LR, and AR, can better predict demand. Insurance companies and policymakers can link such services; examples include those offered to the elderly and to low-income people, smoking-related services linked to mental health services, or epidemiological weight in children. Twenty-two service packs published by Public Health Services (PHS) Scotland and the Scottish Government (SG) from 1981 to 2019 are used, broken into 110 year series (factors) and joined using LR, AR, and BR. Principal component analysis found 11 significant factors, while C-means (CM) clustering gave five major clusters.
Keywords: Bayesian probability, cohorts, data frames, regression, services, prediction.
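As a toy contrast of the two schemes on a synthetic annual service-demand series, the sketch below compares a Beta-Binomial (conjugate) Bayesian update of the uptake proportion with an ordinary linear-regression trend. The series and all numbers are invented and do not reproduce the PHS/SG service packs used in the paper.

```python
# Illustrative toy comparison: Beta-Binomial posterior update vs. a linear
# regression trend on a synthetic annual demand series (all values assumed).
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(3)
years = np.arange(2000, 2020)
population = 10_000
uptake = rng.binomial(population, 0.12 + 0.002 * (years - 2000))  # demand counts

# Bayesian scheme (Binomial model with a Beta prior): posterior uptake rate.
a0, b0 = 1.0, 1.0                                   # flat prior
a_post = a0 + uptake.sum()
b_post = b0 + population * len(years) - uptake.sum()
lo, hi = beta.ppf([0.025, 0.975], a_post, b_post)
print(f"posterior mean uptake rate: {a_post/(a_post+b_post):.3f} (95% CI {lo:.3f}-{hi:.3f})")

# Regression scheme on the same series: extrapolate next year's demand.
coef = np.polyfit(years, uptake, deg=1)
print("LR forecast for 2020: %.0f cases" % np.polyval(coef, 2020))
```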
99 A New Objective Weight on Interval Type-2 Fuzzy Sets
Authors: Nurnadiah Z., Lazim A.
Abstract:
The design of weights is one of the important parts of fuzzy decision making, as it has a deep effect on the evaluation results. Entropy is one of the weight measures based on objective evaluation. Non-probabilistic-type entropy measures for fuzzy sets and interval type-2 fuzzy sets (IT2FS) have been developed and applied to weight measurement. Since the entropy of IT2FS for decision making has yet to be explored, this paper proposes a new objective weight method using the entropy weight method for multiple attribute decision making (MADM). This paper utilizes the nature of the IT2FS concept in the evaluation process to assess the attribute weights based on the credibility of the data. An example is presented to demonstrate the feasibility of the new method in decision making. The entropy measure of interval type-2 fuzzy sets yields flexible judgments and can be applied in decision-making environments.
Keywords: Objective weight, entropy weight, multiple attribute decision making, type-2 fuzzy sets, interval type-2 fuzzy sets.
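For reference, the classical (crisp) entropy weight method on which such approaches build is sketched below; the paper's contribution replaces the crisp values with interval type-2 fuzzy sets, which is not reproduced here. The decision matrix values are assumed.

```python
# Illustrative sketch: classical crisp entropy weight method for a MADM
# decision matrix (the IT2FS extension in the paper is not shown).
import numpy as np

# Rows = alternatives, columns = attributes (hypothetical benefit scores).
D = np.array([[7.0, 4.0, 6.0],
              [5.0, 8.0, 5.0],
              [9.0, 6.0, 4.0],
              [6.0, 7.0, 8.0]])

P = D / D.sum(axis=0)                          # normalise each attribute column
k = 1.0 / np.log(D.shape[0])
E = -k * np.sum(P * np.log(P), axis=0)         # entropy per attribute
w = (1.0 - E) / np.sum(1.0 - E)                # higher information -> larger weight
print("attribute weights:", np.round(w, 3))
```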
98 Modeling and Analysis of an SIRS Epidemic Model with Effect of Awareness Programs by Media
Authors: Navjot Kaur, Mini Ghosh, S.S. Bhatia
Abstract:
This paper proposes and analyzes an SIRS epidemic model incorporating the effects of awareness programs driven by the media. Media and media-driven awareness programs play a promising role in disseminating information about the outbreak of any disease across the globe. This motivates people to take precautionary measures and guides infected individuals to get hospitalized. Timely hospitalization helps to reduce diagnostic delays and hence results in faster recovery of infected individuals. The aim of this study is to investigate the impact of the media on the spread and control of infectious diseases. The model is analyzed using the stability theory of differential equations. The sensitivity of the parameters is discussed, and it is found that the awareness programs driven by the media have a positive impact in reducing the infection prevalence of the infective population in the region under consideration.
Keywords: Infectious diseases, SIRS model, Media, Stability theory, Simulation.
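A generic SIRS system with a media-awareness compartment can be integrated numerically as sketched below. The functional form of the awareness effect and all parameter values are assumptions for illustration; they are not the exact equations analysed in the paper.

```python
# Illustrative sketch: SIRS dynamics where media awareness M reduces the
# effective transmission rate (functional form and parameters are assumed).
import numpy as np
from scipy.integrate import solve_ivp

beta0, gamma, delta = 0.5, 0.1, 0.05   # transmission, recovery, immunity-loss rates
mu_m, d_m, m = 0.2, 0.1, 0.8           # awareness growth/decay and max. reduction

def sirs_media(t, y):
    S, I, R, M = y
    beta_eff = beta0 * (1.0 - m * M / (1.0 + M))   # awareness lowers contacts
    dS = -beta_eff * S * I + delta * R
    dI = beta_eff * S * I - gamma * I
    dR = gamma * I - delta * R
    dM = mu_m * I - d_m * M                        # media activity tracks prevalence
    return [dS, dI, dR, dM]

sol = solve_ivp(sirs_media, [0, 300], [0.99, 0.01, 0.0, 0.0], dense_output=True)
t = np.linspace(0, 300, 301)
S, I, R, M = sol.sol(t)
print("peak infective fraction: %.3f at day %d" % (I.max(), t[I.argmax()]))
```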
97 An Effective Islanding Detection and Classification Method Using Neuro-Phase Space Technique
Authors: Aziah Khamis, H. Shareef
Abstract:
The purpose of planned islanding is to construct a power island during system disturbances, commonly formed for maintenance purposes. However, in most cases island-mode operation is not allowed, so distributed generators (DGs) must sense an unplanned disconnection from the main grid. Passive techniques are the most commonly used methods for this purpose, but they need improvement in order to identify the islanding condition reliably. In this paper, an effective method for identifying the islanding condition based on phase space and neural network techniques has been developed. The voltage waveforms captured at the coupling points of the DGs are processed to extract the required features using the phase space technique. Based on the extracted features, two neural network configurations, namely radial basis function and probabilistic neural networks, are trained to recognize the waveform class. According to the test results, the investigated technique can provide satisfactory identification of the islanding condition in the distribution system.
Keywords: Classification, islanding detection, neural network, phase space.
96 Performance Modeling and Availability Analysis of Yarn Dyeing System of a Textile Industry
Authors: P. C. Tewari, Rajiv Kumar, Dinesh Khanduja
Abstract:
This paper discusses the performance modeling and availability analysis of the yarn dyeing system of a textile industry, a complex and repairable engineering system. The yarn dyeing system consists of five subsystems arranged in a series configuration. For performance modeling and availability analysis, a performance-evaluating model has been developed using a mathematical formulation based on the Markov birth-death process. The differential equations have been developed on the basis of a probabilistic approach using a transition diagram. These equations have then been solved using the normalizing condition in order to obtain the steady-state availability, a performance measure of the system concerned. The system performance has been further analyzed with the help of decision matrices, which provide availability levels for different combinations of failure and repair rates of the various subsystems. The findings of this paper are therefore considered useful for the analysis of availability and the determination of the best possible maintenance strategies that can be implemented in the future to enhance system performance.
Keywords: Availability Analysis, Markov Process, Performance Modeling, Steady State Availability.
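The steady-state availability computation behind such Markov birth-death models can be sketched for a single subsystem as below: the steady-state probabilities solve πQ = 0 with the normalizing condition Σπ = 1. The two-unit structure and the failure and repair rates are assumptions for illustration, not the dyeing plant's data.

```python
# Illustrative sketch: steady-state availability of a hypothetical 2-unit
# subsystem via a Markov birth-death generator matrix (rates are assumed).
import numpy as np

lam, mu = 0.02, 0.5                      # per-hour failure and repair rates (assumed)

# Generator matrix Q for states {0, 1, 2} = number of failed units,
# with a single repair crew.
Q = np.array([[-2 * lam,      2 * lam,   0.0],
              [      mu, -(mu + lam),    lam],
              [     0.0,           mu,   -mu]])

# Solve pi Q = 0 with normalisation: replace one balance equation by sum(pi) = 1.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(3)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

availability = pi[0] + pi[1]             # assume the subsystem works with >= 1 unit up
print("steady-state probabilities:", np.round(pi, 4))
print("steady-state availability : %.4f" % availability)
```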
95 Design and Implementation of an AI-Enabled Task Assistance and Management System
Authors: Arun Prasad Jaganathan
Abstract:
In today's dynamic industrial world, traditional task allocation methods often fall short in adapting to evolving operational conditions. This paper presents an AI-enabled task assistance and management system designed to overcome the limitations of conventional approaches. By using artificial intelligence (AI) and machine learning (ML), the system intelligently interprets user instructions, analyzes tasks, and allocates resources based on real-time data and environmental factors. Additionally, geolocation tracking enables proactive identification of potential delays, ensuring timely interventions. With its transparent reporting mechanisms, the system provides stakeholders with clear insights into task progress, fostering accountability and informed decision-making. The paper presents a comprehensive overview of the system architecture, algorithm, and implementation, highlighting its potential to revolutionize task management across diverse industries.
Keywords: Artificial intelligence, machine learning, task allocation, operational efficiency, resource optimization.
94 Feature-Based Summarizing and Ranking from Customer Reviews
Authors: Dim En Nyaung, Thin Lai Lai Thein
Abstract:
Due to the rapid growth of the Internet, web opinion sources emerge dynamically; they are useful to both potential customers and product manufacturers for prediction and decision purposes. These are user-generated contents written in natural language in an unstructured, free-text form. Therefore, opinion mining techniques have become popular for automatically processing customer reviews to extract product features and the user opinions expressed about them. Since customer reviews may contain both opinionated and factual sentences, a supervised machine learning technique is applied for subjectivity classification to improve the mining performance. In this paper, our work is dedicated to the task of opinion summarization. Product feature and opinion extraction is critical to opinion summarization, because its effectiveness significantly affects the identification of semantic relationships. The polarity and numeric score of all the features are determined by the SentiWordNet lexicon. The problem of opinion summarization concerns how to relate the opinion words to a certain feature. A probabilistic supervised learning model improves the results and is more flexible and effective.
Keywords: Opinion Mining, Opinion Summarization, Sentiment Analysis, Text Mining.
93 GSA-Based Design of Dual Proportional Integral Load Frequency Controllers for Nonlinear Hydrothermal Power System
Authors: M. Elsisi, M. Soliman, M. A. S. Aboelela, W. Mansour
Abstract:
This paper considers the design of dual proportional-integral (DPI) load frequency control (LFC) using the gravitational search algorithm (GSA). The design is carried out for a nonlinear hydrothermal power system in which the generation rate constraint (GRC) and governor dead band are considered. Furthermore, time delays imposed by the governor-turbine, the thermodynamic process, and communication channels are investigated. GSA is utilized to search for the optimal controller parameters by minimizing a time-domain-based objective function. The GSA-based DPI controller has been compared to a Ziegler-Nichols-based PI controller and a genetic algorithm (GA) based PI controller in order to demonstrate the superior efficiency of the proposed design. Simulations are carried out for a wide range of operating conditions and system parameter variations.
Keywords: Gravitational Search Algorithm (GSA), Load Frequency Control (LFC), Dual Proportional-Integral (DPI) controller.
92 Seismic Base Shear Force Depending on Building Fundamental Period and Site Conditions: Deterministic Formulation and Probabilistic Analysis
Authors: S. Dorbani, M. Badaoui, D. Benouar
Abstract:
The aim of this paper is to investigate the effect of the fundamental period of reinforced concrete buildings (6, 9, and 12 storeys) with different floor plans: symmetric, mono-symmetric, and unsymmetric. These structures are erected at different epicentral distances. Using data from the Boumerdes, Algeria (2003) earthquake, we focus primarily on establishing a deterministic formulation linking the base shear force to two parameters: the fundamental period, which represents the numerical fingerprint of the structure, and the epicentral distance, which represents the impact of the earthquake on this force. In a second step, with a view to highlighting the effect of uncertainty in these parameters on the analyzed response, the parameters are modeled as random variables with a log-normal distribution. Varying the coefficients of variation of the chosen uncertain parameters showed that the effect of the fundamental-period uncertainty on the base shear force statistics is low compared to the influence of the epicentral-distance uncertainty.
Keywords: Base shear force, fundamental period, epicentral distance, uncertainty, lognormal variable, statistics.
91 Framework for Delivery Reliability in European Machinery and Equipment Industry
Authors: G. Schuh, A. Kampker, A. Hoeschen, T. Jasinski
Abstract:
Today's manufacturing companies are facing multiple and dynamic customer-supplier relationships embedded in non-hierarchical production networks. This complex environment leads to problems with delivery reliability and wasteful turbulences throughout the entire network. This paper describes an operational model, based on a theoretical framework, which improves the delivery reliability of each individual customer-supplier relationship within non-hierarchical production networks of the European machinery and equipment industry. By developing a non-centralized coordination mechanism based on determining the value of delivery reliability and deriving an incentive system for suppliers, the number of on-time deliveries can be increased and thus the turbulences in the production network smoothed. Comparable to an electronic stock exchange, the coordination mechanism transforms the manual and non-transparent process of determining penalties for delivery delays into an automated and transparent market mechanism that creates delivery reliability.
Keywords: Delivery reliability, machinery and equipment industry, non-hierarchical production networks, supply chain management.
90 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information
Authors: A. Preetha Priyadharshini, S. B. M. Priya
Abstract:
In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue with imperfect CSI is keeping the rate-outage probability of each user below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods are used to solve the transmit optimization problem under imperfect CSI. Here, decomposition-based large deviation inequality and Bernstein-type inequality convex restriction methods are used to perform the optimization under imperfect CSI. These methods achieve improved output quality with lower complexity and provide a safe, tractable approximation of the original rate outage constraints. Based on these implementations, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
Keywords: Imperfect channel state information, outage probability, multiuser multi-input single-output.
89 Experimental Investigation the Effectiveness of Using Heat Pipe on the Spacecraft Mockup Panel
Authors: M. Abdou, M. K. Khalil
Abstract:
The heat pipe is a thermal device which allows efficient transport of thermal energy. The experimental work of this research was split into two phases: phase 1 is the development of the facilities, materials, and test rig preparation, and phase 2 is the actual experiments and measurements of the thermal control mockup inside the modified vacuum chamber (MVC). Due to limited funds, the development of the thermal control subsystem was delayed, and experimental facilities such as a suitable thermal vacuum chamber with space-standard specifications were not available from the beginning of the research and had to be procured over a period of time. In all, these delays extended the project by one and a half years. The thermal control subsystem needs special facilities and equipment to be tested, and the available vacuum chamber was not suitable for the thermal tests; consequently, modification of the chamber was a must. A vacuum chamber has been modified to be used as a thermal vacuum chamber (TVC). The MVC is a vacuum chamber modified with a stainless mirror box with perfect reflectability and an infrared lamp connected to a voltage regulator to vary the lamp intensity, as will be illustrated throughout the paper.
Keywords: Heat pipe, thermal control, thermal vacuum chamber, satellite.
Keywords: Heat pipe, thermal control, thermal vacuum chamber, satellite.
88 Convergence Analysis of Training Two-Hidden-Layer Partially Over-Parameterized ReLU Networks via Gradient Descent
Authors: Zhifeng Kong
Abstract:
Over-parameterized neural networks have attracted a great deal of attention in recent deep learning theory research, as they challenge the classic perspective that models with excessive parameters over-fit, and they have gained empirical success in various settings. While a number of theoretical works have been presented to demystify properties of such models, their convergence properties are still far from being thoroughly understood. In this work, we study the convergence properties of training two-hidden-layer partially over-parameterized fully connected networks with the Rectified Linear Unit activation via gradient descent. To our knowledge, this is the first theoretical work to understand the convergence properties of deep over-parameterized networks without the equally-wide-hidden-layer assumption and other unrealistic assumptions. We provide a probabilistic lower bound on the widths of the hidden layers and prove a linear convergence rate for gradient descent. We also conduct experiments on synthetic and real-world datasets to validate our theory.
Keywords: Over-parameterization, Rectified Linear Units (ReLU), convergence, gradient descent, neural networks.
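The training setup the analysis refers to can be sketched as below: full-batch gradient descent on a two-hidden-layer fully connected ReLU network over a small synthetic regression set. The widths and step size are arbitrary; this only mirrors the setup and does not reflect the paper's over-parameterization conditions or convergence proof.

```python
# Illustrative sketch: full-batch gradient descent on a two-hidden-layer
# ReLU network with synthetic data (widths and learning rate are arbitrary).
import torch

torch.manual_seed(0)
n, d, width = 50, 10, 512
X = torch.randn(n, d)
y = torch.randn(n, 1)

net = torch.nn.Sequential(
    torch.nn.Linear(d, width), torch.nn.ReLU(),
    torch.nn.Linear(width, width), torch.nn.ReLU(),
    torch.nn.Linear(width, 1),
)
opt = torch.optim.SGD(net.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(net(X), y)
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(f"step {step:4d}  training loss {loss.item():.4f}")
```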
87 Syntactic Recognition of Distorted Patterns
Authors: Marek Skomorowski
Abstract:
In syntactic pattern recognition a pattern can be represented by a graph. Given an unknown pattern represented by a graph g, the problem of recognition is to determine whether the graph g belongs to a language L(G) generated by a graph grammar G. The so-called IE graphs have been defined in [1] for the description of patterns. The IE graphs are generated by so-called ETPL(k) graph grammars, also defined in [1]. An efficient parsing algorithm for ETPL(k) graph grammars for the syntactic recognition of patterns represented by IE graphs has been presented in [1]. In practice, structural descriptions may contain pattern distortions, so that the assignment of a graph g, representing an unknown pattern, to a graph language L(G) generated by an ETPL(k) graph grammar G is rejected by the ETPL(k)-type parsing. Therefore, there is a need for constructing effective parsing algorithms for the recognition of distorted patterns. The purpose of this paper is to present a new approach to the syntactic recognition of distorted patterns. To take into account all variations of a distorted pattern under study, a probabilistic description of the pattern is needed. A random IE graph approach is proposed here for such a description ([2]).
Keywords: Syntactic pattern recognition, distorted patterns, random graphs, graph grammars.
86 Structural Reliability of Existing Structures: A Case Study
Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol
Abstract:
A reliability-based methodology for the assessment and evaluation of reinforced concrete (R/C) structural elements of concrete structures is presented herein. The results of the reliability analysis and assessment of the R/C structural elements were verified against the results obtained through deterministic methods. The outcomes of the reliability-based analysis were compared against currently adopted safety limits, expressed as reliability indices β, according to international standards and codes. The methodology is based on probabilistic analysis using reliability concepts and the statistics of the main random variables that are relevant to the problem and that enter the performance-function equation(s) associated with the structural elements under study. These techniques yield the reliability index β, a reliability measure that can be utilized to assess and evaluate the safety, human risk, and functionality of the structural component. They can also yield revised partial safety factor values for given target reliability indices, which can be used for redesigning the R/C elements of the building and can assist in considering other remedial actions to improve the safety and functionality of the member.
Keywords: Concrete Structures, FORM, Monte Carlo Simulation, Structural Reliability.
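A reliability index of the kind discussed above can be estimated as sketched below for a generic limit state g = R − S (resistance minus load effect) with normal random variables: crude Monte Carlo gives the failure probability, and β = −Φ⁻¹(Pf) is checked against the closed-form value. The means and standard deviations are assumed, not taken from the paper's R/C case study.

```python
# Illustrative sketch: Monte Carlo estimate of the reliability index beta for
# a generic limit state g = R - S with normal variables (values assumed).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu_R, sig_R = 350.0, 35.0     # resistance (e.g. kN.m), assumed
mu_S, sig_S = 220.0, 40.0     # load effect, assumed

n = 1_000_000
g = rng.normal(mu_R, sig_R, n) - rng.normal(mu_S, sig_S, n)
pf = np.mean(g < 0.0)                      # probability of failure
beta_mc = -norm.ppf(pf)

beta_exact = (mu_R - mu_S) / np.hypot(sig_R, sig_S)
print(f"Monte Carlo: Pf = {pf:.2e}, beta = {beta_mc:.2f}")
print(f"Closed form: beta = {beta_exact:.2f}")
```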
85 Fault Diagnosis of Nonlinear Systems Using Dynamic Neural Networks
Authors: E. Sobhani-Tehrani, K. Khorasani, N. Meskin
Abstract:
This paper presents a novel integrated hybrid approach for the fault diagnosis (FD) of nonlinear systems. Unlike most FD techniques, the proposed solution simultaneously accomplishes fault detection, isolation, and identification (FDII) within a unified diagnostic module. At the core of this solution is a bank of adaptive neural parameter estimators (NPE) associated with a set of single-parameter fault models. The NPEs continuously estimate unknown fault parameters (FP) that are indicators of faults in the system. Two NPE structures, including series-parallel and parallel, are developed, each with its exclusive set of desirable attributes. The parallel scheme is extremely robust to measurement noise and possesses a simpler, yet more solid, fault isolation logic. On the contrary, the series-parallel scheme displays short FD delays and is robust to closed-loop system transients due to changes in control commands. Finally, a fault tolerant observer (FTO) is designed to extend the capability of the NPEs to systems with partial-state measurement.
Keywords: Hybrid fault diagnosis, Dynamic neural networks, Nonlinear systems.
84 Review and Classification of the Indicators and Trends Used in Bridge Performance Modeling
Authors: S. Rezaei, Z. Mirzaei, M. Khalighi, J. Bahrami
Abstract:
Bridges, as an essential part of road infrastructure, are affected by various deterioration mechanisms over time due to changes in their performance. As changes in performance can have many negative impacts on society, it is essential to be able to evaluate and measure the performance of bridges throughout their life. This evaluation includes the development or choice of appropriate performance indicators, which, in turn, are measured based on the selection of appropriate models for the existing deterioration mechanisms. The purpose of this article is a statistical study of the indicators and deterioration mechanisms of bridges in order to discover further research capacities in bridge performance assessment. For this purpose, some of the most common indicators of bridge performance, including reliability, risk, vulnerability, robustness, and resilience, were selected. The research performed on each indicator, based on the deterioration mechanisms and hazards of interest, was comprehensively reviewed. In addition, the formulation of the indicators and their relationships with each other were studied. The research conducted on the mentioned indicators was classified from the point of view of deterministic or probabilistic method, the level of study (element level, object level, etc.), and the type of hazard and deterioration mechanism of interest. For each of the indicators, a number of challenges and recommendations were presented according to the review of previous studies.
Keywords: Bridge, deterioration mechanism, lifecycle, performance indicator.
83 Renovation Planning Model for a Shopping Mall
Authors: Hsin-Yun Lee
Abstract:
In this study, the pedestrian simulation software VISWALK is integrated with a program implementing an ant algorithm to construct a schedule planning model for renovation engineering. The simulation platform models the construction site and the walking of users during the renovation; after the user delays caused by the construction are calculated, the ant algorithm searches for the schedule plan with the minimum delay time, and the loss of business from the deactivated floor area is added to the computation. Finally, the best schedule plan is selected by weighing the two different positions of the owners and the users. To assess and validate its effectiveness, the model is applied to the floor renovation of a shopping mall as a case study. The case study shows that the schedule plans obtained from the proposed model can effectively reduce the delay time and the loss of business caused by users walking through the mall, keeping the impact of the renovation work on the building's facilities and operation to a minimum.
Keywords: Pedestrian, renovation, schedule, simulation.