Search results for: proportional hazard model
15668 Study on Constitutive Model of Particle Filling Material Considering Volume Expansion
Authors: Xu Jinsheng, Tong Xin, Zheng Jian, Zhou Changsheng
Abstract:
The NEPE (nitrate ester plasticized polyether) propellant is a kind of particle filling material with a relatively high filling fraction. Experimental results show that microcracks, microvoids, and dewetting can cause stress softening of the material. In this paper, a series of mechanical tests, in conjunction with a CCD imaging technique, was conducted to analyze the evolution of internal defects in the propellant. The volume expansion function of the particle filling material was established by measuring longitudinal and transverse strains with an optical deformation measurement system. By analyzing the defects and internal damage of the material, a visco-hyperelastic constitutive model based on free-energy theory and incorporating a damage function was proposed. The proposed constitutive model accurately predicts the mechanical response in uniaxial tensile tests and tensile-relaxation tests.
Keywords: dewetting, constitutive model, uniaxial tensile tests, visco-hyperelastic, nonlinear
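The volume expansion function described above is typically built from paired longitudinal and transverse strain measurements. A minimal sketch of the underlying kinematics follows; it is not the authors' specific function, and the numbers are illustrative only:

```python
def volumetric_strain(axial, transverse):
    """Volume strain of a uniaxially stretched specimen, assuming both
    lateral directions contract equally (transverse isotropy)."""
    return (1 + axial) * (1 + transverse) ** 2 - 1

# Dewetting and void growth reduce lateral contraction, so the measured
# volume grows; a positive value signals internal damage.
dilatation = volumetric_strain(0.10, -0.03)
```

For an incompressible response (no damage), the transverse strain at 10% axial strain would be about -0.0465 and the volume strain would be near zero; the smaller contraction above yields a clearly positive dilatation.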
Procedia PDF Downloads 304
15667 Integrated Evaluation of Green Design and Green Manufacturing Processes Using a Mathematical Model
Authors: Yuan-Jye Tseng, Shin-Han Lin
Abstract:
In this research, a mathematical model for the integrated evaluation of green design and green manufacturing processes is presented. To design a product, there can be alternative options for designing the detailed components that fulfill the same product requirement. In these design alternatives, the components of the product can be designed with different materials and detailed specifications, and if several design alternatives are proposed, the different materials and specifications affect the manufacturing processes. In this paper, a new concept for integrating green design and green manufacturing processes is presented. A green design can be determined based on the manufacturing processes of the designed product by evaluating green criteria, including energy usage and environmental impact, in addition to the traditional criterion of manufacturing cost. With this concept, a mathematical model is developed to find the green design and the associated green manufacturing processes. In the mathematical model, the cost items include material cost, manufacturing cost, and green-related cost; the green-related cost items include energy cost and environmental cost. The objective is to find the green design and green manufacturing process decisions that minimize the total cost. In practical applications, the decision-making can then select a good green design case and its green manufacturing processes. An example product is illustrated, showing that the model is practical and useful for the integrated evaluation of green design and green manufacturing processes.
Keywords: supply chain management, green supply chain, green design, green manufacturing, mathematical model
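The cost-minimizing selection at the core of such a model can be sketched in a few lines. The design alternatives and cost figures below are hypothetical, not taken from the paper:

```python
def total_cost(alt):
    # Green-related cost = energy cost + environmental cost,
    # added to the traditional material and manufacturing costs.
    return alt["material"] + alt["manufacturing"] + alt["energy"] + alt["environmental"]

design_alternatives = [
    {"name": "A", "material": 120, "manufacturing": 80, "energy": 30, "environmental": 25},
    {"name": "B", "material": 100, "manufacturing": 95, "energy": 20, "environmental": 15},
    {"name": "C", "material": 140, "manufacturing": 60, "energy": 35, "environmental": 30},
]

# The green design is the alternative with the minimum total cost.
best = min(design_alternatives, key=total_cost)
```

In a full model the costs of each alternative would themselves be derived from its manufacturing process plan; the dictionary values here stand in for that evaluation step.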
Procedia PDF Downloads 809
15666 Facility Data Model as Integration and Interoperability Platform
Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes
Abstract:
Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems, particularly through increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process is described, starting from input data acquisition, through ontology concept definition, and finally ontology concept population. First, a core facility ontology was developed, representing the generic facility infrastructure and comprising the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system was analyzed as well.
Keywords: airport ontology, energy management, facility data model, ontology modeling
Procedia PDF Downloads 451
15665 Global Stability Analysis of a Coupled Model for Healthy and Cancerous Cells Dynamics in Acute Myeloid Leukemia
Authors: Abdelhafid Zenati, Mohamed Tadjine
Abstract:
The mathematical formulation of biomedical problems is an important phase in understanding and predicting the dynamics of the controlled population. In this paper, our first aim is to perform a stability analysis of a coupled model for healthy and cancerous cell dynamics in acute myeloid leukemia; our second aim is to illustrate the effect of the interconnection between healthy and cancer cells. The PDE-based model is transformed into a nonlinear distributed state-space model (a delay system). For an equilibrium point of interest, necessary and sufficient conditions for global asymptotic stability are given. We thus obtain necessary and sufficient conditions for the global asymptotic stability of the origin and of the healthy situation, and address the control of the dynamics of normal and cancerous hematopoietic stem cells during acute myeloid leukemia. Simulation studies are given to illustrate the developed results.
Keywords: distributed delay, global stability, modelling, nonlinear models, PDE, state space
Procedia PDF Downloads 253
15664 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries
Authors: Abdulrahman M. Qahtani, Gary. B. Wills, Andy. M. Gravell
Abstract:
Communicating and managing customers' requirements plays a vital role in the software development process. While this is difficult to do locally, it is even more difficult to communicate requirements across distributed boundaries and to convey them to multiple distributed customers. This paper discusses the communication of the requirements of multiple distributed customers in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). Thereafter, we evaluate the model by presenting the findings of a case study conducted with a company running customisation projects for 18 distributed customers, and then compare the outputs of the real case process with the outputs of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenges of communicating requirements across distributed organisational boundaries and shorten the delays in decision making and in the overall customisation process.
Keywords: customisation software products, global software engineering, local decision making, requirement engineering, simulation model
Procedia PDF Downloads 431
15663 Study of the Relationship between the Roughness Configuration of Channel Bottom and the Creation of Vortices at the Rough Area: Numerical Modelling
Authors: Youb Said, Fourar Ali
Abstract:
To describe the influence of bottom roughness on free-surface flows by numerical modeling, a two-dimensional model was developed. The continuity and momentum (Navier-Stokes) equations are solved by the finite volume method. We considered turbulent flow in an open channel with a rough bottom, using the k-ε turbulence model for the simulations. After setting the initial and boundary conditions and solving the equation set, we obtained the following results: vortices form in the hollows, causing substantial energy dissipation in the obstacle areas that constitute the bottom roughness. Comparison of our results with experimental ones shows good agreement in the rough area. In other areas, however, the differences were more pronounced; these occur in regions far from the bottom, especially at the free surface just after the bottom. The disagreements are probably due to the empirical constants used by the k-ε model.
Keywords: modeling, free surface flow, turbulence, bottom roughness, finite volume, k-ε model, energy dissipation
Procedia PDF Downloads 382
15662 Impact of Compost Application with Different Rates of Chemical Fertilizers on Corn Growth and Production
Authors: Reda Abdel-Aziz
Abstract:
Agricultural activities in Egypt generate around 35 million tons of waste annually. Composting is one of the most promising technologies for turning over this waste economically; for many centuries, composting has been used as a means of recycling organic matter back into the soil to improve soil structure and fertility. Field experiments were conducted in two governorates, Giza and Al-Monofia, to determine the effect of compost with different rates of chemical fertilizers on the growth and yield of corn (Zea mays L.) during the two consecutive seasons of 2012 and 2013. The experiment, laid out in a randomized complete block design (RCBD), was carried out on five farmers' fields in each governorate. The treatments were: unfertilized control; full dose of NPK (120, 30, and 50 kg/acre, respectively); compost at a rate of 20 ton/acre; and compost at a rate of 10 ton/acre combined with 25%, 50%, or 75% of the chemical fertilizer dose. Results revealed the superiority of the treatment combining compost at 10 ton/acre with 50% of NPK, which caused significant improvement in the growth, yield, and nutrient uptake of corn in both governorates during the two consecutive seasons. The results showed that agricultural waste can be composted into a value-added soil amendment that enhances the efficiency of chemical fertilizer, and that composting agricultural waste can also reduce the potential hazard of chemical fertilizers to the environment.
Keywords: agricultural waste, compost, chemical fertilizers, corn production, environment
Procedia PDF Downloads 320
15661 Use of Particle Swarm Optimization for Loss Minimization of Vector-Controlled Induction Motors
Authors: V. Rashtchi, H. Bizhani, F. R. Tatari
Abstract:
This paper presents a new online loss-minimization scheme for an induction motor drive. Among the many loss minimization algorithms (LMAs) for induction motors, particle swarm optimization (PSO) has the advantages of fast response and high accuracy. However, the performance of PSO, like that of other optimization algorithms, depends on the accuracy of the modeling of the motor drive and its losses, and in developing the loss model there is always a trade-off between accuracy and complexity. This paper presents a new online optimization to determine the optimum flux level for efficiency optimization of the vector-controlled induction motor drive. An induction motor (IM) model in d-q coordinates is referenced to the rotor magnetizing current. This transformation results in no leakage inductance on the rotor side, so the decomposition into d-q components in the steady-state motor model can be utilized in deriving the motor loss model. The suggested algorithm is simple to implement.
Keywords: induction machine, loss minimization, magnetizing current, particle swarm optimization
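The search for an optimum flux level is a one-dimensional minimization that PSO handles well. A minimal sketch follows; the swarm hyperparameters and the toy loss function (copper loss falling with flux, iron loss rising) are illustrative, not the paper's motor loss model:

```python
import random

def pso_minimize(loss, lo, hi, n_particles=20, iters=60, seed=0):
    """Minimal 1-D particle swarm optimizer (illustrative hyperparameters)."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                        # each particle's best position
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5             # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))   # clamp to bounds
            val = loss(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i], val
    return gbest, gbest_val

# Toy trade-off: total loss 1/f + 0.5*f^2 has its minimum at flux f = 1.0.
flux, total_loss = pso_minimize(lambda f: 1.0 / f + 0.5 * f * f, 0.1, 2.0)
```

An online implementation would replace the toy lambda with measured or modeled drive losses evaluated at the candidate flux level.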
Procedia PDF Downloads 634
15660 Estimation of the Length and Location of Ground Surface Deformation Caused by the Reverse Faulting
Authors: Nader Khalafian, Mohsen Ghaderi
Abstract:
Field observations have revealed many examples of structures damaged by ground surface deformation caused by faulting. In this paper, we estimate the length and location of the ground surface over which large displacements are created by reverse faulting. The research was conducted in two steps. (1) In the first step, a 2D explicit finite element model was developed using the ABAQUS software. A subroutine for the Mohr-Coulomb failure criterion with a strain-softening model was developed by the authors in order to properly model the stress-strain behavior of the soil in the fault rupture zone. The results of the numerical analysis were verified against available centrifuge experiments, and reasonable agreement was found between the numerical and experimental data. (2) In the second step, the effects of the fault dip angle (δ), the depth of the soil layer (H), the dilation and friction angles of the sand (ψ and φ), and the amount of fault offset (d) on the soil surface displacement and fault rupture path were investigated. An artificial neural network (ANN), as a powerful prediction tool, was developed to generate a general model for predicting faulting characteristics, and a properly sized database was created to train and test the network. It was found that the length and location of the zone of displaced ground surface can be accurately estimated using the proposed model.
Keywords: reverse faulting, surface deformation, numerical, neural network
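An ANN surrogate of the kind described maps the normalized inputs (δ, H, ψ, φ, d) to the faulting characteristics through one or more hidden layers. A minimal forward pass for such a network is sketched below; the architecture is shrunk to two inputs and the weights are illustrative placeholders, not the trained values from the study:

```python
import math

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer network: y = W2 * tanh(W1 x + b1) + b2."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return [sum(w * h for w, h in zip(row, hidden)) + b
            for row, b in zip(W2, b2)]

# Toy 2-input, 2-hidden, 1-output network. In the real model the inputs
# would be the normalized fault/soil parameters and the output the
# predicted rupture length or location.
y = mlp_forward([1.0, 0.0],
                W1=[[1.0, 0.0], [0.0, 1.0]], b1=[0.0, 0.0],
                W2=[[1.0, 1.0]], b2=[0.5])
```

Training (backpropagation over the finite-element database) is omitted; the sketch only shows how a trained network turns parameter vectors into predictions.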
Procedia PDF Downloads 422
15659 Information Requirements for Vessel Traffic Service Operations
Authors: Fan Li, Chun-Hsien Chen, Li Pheng Khoo
Abstract:
Operators of a vessel traffic service (VTS) center provide three types of services to vessels: information service, navigational assistance, and traffic organization. To provide these services, operators monitor vessel traffic through a computer interface and provide navigational advice based on information integrated from multiple sources, including the automatic identification system (AIS), the radar system, and the closed-circuit television (CCTV) system. This information is therefore crucial in VTS operation; however, what information the VTS operator actually needs to offer services efficiently and properly is unclear. The aim of this study is to investigate the information requirements for VTS operation. To achieve this aim, field observation was carried out to elicit the information requirements. The study revealed that the most frequent and important tasks were handling arrival vessel reports, potential conflict control, and abeam vessel reports. Current location and vessel name were used in all tasks. Hazardous cargo information was particularly required when operators handled arrival vessel reports, whereas the speed, course, and distance of two or several vessels were used only in potential conflict control. The information requirements identified in this study can be utilized in designing a human-computer interface that takes into consideration what information should be displayed, and when, and might further serve as the foundation of a decision support system for VTS.
Keywords: vessel traffic service, information requirements, hierarchy task analysis, field observation
Procedia PDF Downloads 252
15658 A Convolutional Neural Network-Based Model for Lassa Fever Virus Prediction Using Patient Blood Smear Images
Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa
Abstract:
A convolutional neural network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, alongside the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the current high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by major flaws in the existing conventional laboratory equipment for diagnosing Lassa fever (RT-PCR), as well as flaws reported in the literature in the AI-based techniques that have been used for the probing and prognosis of Lassa fever. A total of 15,679 blood smear microscopic images were collected. The proposed model was trained on 70% of the dataset and tested on the remaining 30% to avoid overfitting. A 3x3x3 convolution filter was used in the proposed system to extract features from the microscopic images. The proposed CNN-based model achieved a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in predicting and correctly classifying the images as clean or infected samples. Based on empirical evidence from the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses of Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever
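The 3x3x3 filter mentioned above convolves a 3x3 spatial window across the three colour channels of each image. The channel dimension is dropped here for brevity, but a single-channel 3x3 convolution illustrates the same feature-extraction idea; the image and kernel are toy values, not data from the study:

```python
def conv2d(img, kernel):
    """'Valid' 2-D convolution (strictly, cross-correlation, as in most
    deep-learning frameworks) of img with a small kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(img[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel responds strongly at the boundary of a bright
# region, which is the kind of feature a CNN's first layer learns.
image = [[0, 0, 1, 1]] * 4
vertical_edge = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
response = conv2d(image, vertical_edge)
```

In the actual model these kernels are learned during training rather than hand-specified, and many are applied in parallel per layer.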
Procedia PDF Downloads 122
15657 Developing Measurement Instruments for Enterprise Resource Planning (ERP) Post-Implementation Failure Model
Authors: Malihe Motiei, Nor Hidayati Zakaria, Davide Aloini
Abstract:
This study presents a method for developing a failure measurement model for ERP post-implementation. To achieve this outcome, the study first evaluates the suitability of the Technology-Organization-Environment framework for the proposed conceptual model. It then explains how to discover the constructs and subsequently how to design and evaluate them as formative or reflective; both reflective and purely formative constructs are used. The risk dimensions are then investigated to determine the instruments for examining the impact of risk on ERP failure after implementation. Two formative constructs are inadequate implementation and poor organizational decision making; six reflective constructs are technical risks, operational risks, managerial risks, top management risks, lack of external risks, and user inefficiency risks. A survey was conducted among Iranian industries to collect data: 69 responses were collected from manufacturing sectors, and the data were analyzed with the SmartPLS software. The results indicated that all measurements, comprising 39 critical risk factors, were acceptable for the ERP post-implementation failure model.
Keywords: critical risk factors (CRFs), ERP projects, ERP post-implementation, measurement instruments, ERP system failure measurement model
Procedia PDF Downloads 366
15656 The Methodology of System Modeling of Mechatronic Systems
Authors: Lakhoua Najeh
Abstract:
Aims of the work: After presenting the functionality of an example mechatronic system, a paint mixer, we present the concepts of modeling and safe operation. This paper briefly discusses how to model and protect the functioning of a mechatronic system, relying mainly on functional analysis and safe operation techniques. Methods: For the study of the example mechatronic system, we use methods of external functional analysis that illustrate the relationships between a mechatronic system and its external environment. We present the Safe Structured Analysis and Design Technique (Safe-SADT), which allows the representation of a mechatronic system, and propose a model of operating safety and automation. This model enables a functional analysis of the mechatronic system based on the GRAFCET (Graphe Fonctionnel de Commande des Etapes et Transitions: Step Transition Function Chart) method; a study of the safe operation of the mechatronic system based on the Safe-SADT method; and automation of the mechatronic system based on a software tool. Results: The expected results are a model and the safe operation of a mechatronic system. The methodology enables us to analyze the relevance of the different models based on Safe-SADT and GRAFCET in relation to the control and monitoring functions, and to study the means of exploiting their synergy. Conclusion: In order to propose a general model of a mechatronic system, a model of analysis, safe operation, and automation has been developed. We propose to validate this methodology through the case study of a paint mixer system.
Keywords: mechatronic systems, system modeling, safe operation, Safe-SADT
Procedia PDF Downloads 246
15655 Sensitivity Analysis of the ZF Model for ABC Multi-Criteria Inventory Classification
Authors: Makram Ben Jeddou
Abstract:
ABC classification is widely used by managers for inventory control. The classical ABC classification is based on the Pareto principle and considers only the criterion of annual use value. Single-criterion classification is often insufficient for close inventory control, so researchers have proposed multi-criteria inventory classification models that take other important criteria into account. Among these models, we consider the ZF model in order to perform a sensitivity analysis of the composite score calculated for each item. This score, based on a normalized average between a good and a bad optimized index, can affect the ABC classification of items. We then focus on the weights assigned to each index and propose a classification compromise.
Keywords: ABC classification, multi-criteria inventory classification models, ZF model
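One common reading of such a composite score is a weighted blend of each item's most-favourable (good) and least-favourable (bad) normalized criterion values. The sketch below follows that reading; the min-max normalization, the 0.5 blending weight, and the item data are illustrative assumptions, not the ZF model's exact optimization:

```python
def composite_scores(items, lam=0.5):
    """Blend each item's best (max) and worst (min) normalized criterion
    scores; lam weights the 'good' index against the 'bad' one."""
    n_crit = len(next(iter(items.values())))
    lo = [min(v[k] for v in items.values()) for k in range(n_crit)]
    hi = [max(v[k] for v in items.values()) for k in range(n_crit)]
    scores = {}
    for name, vals in items.items():
        norm = [(v - l) / (h - l) if h > l else 0.0
                for v, l, h in zip(vals, lo, hi)]
        scores[name] = lam * max(norm) + (1 - lam) * min(norm)
    return scores

# Hypothetical items scored on (annual use value, lead time, criticality).
items = {"i1": (9000, 2, 0.9), "i2": (400, 5, 0.1), "i3": (7500, 1, 0.5)}
scores = composite_scores(items)
ranked = sorted(scores, key=scores.get, reverse=True)  # A-items come first
```

A sensitivity analysis of the kind the abstract describes would re-run this ranking while sweeping `lam` and observing which items cross the A/B/C boundaries.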
Procedia PDF Downloads 509
15654 An Economic Order Quantity Model for Deteriorating Items with Ramp Type Demand, Time Dependent Holding Cost and Price Discount Offered on Backorders
Authors: Arjun Paul, Adrijit Goswami
Abstract:
In the present work, an economic order quantity inventory model with shortages is developed in which the holding cost is a linearly increasing function of time and the demand rate is a ramp-type function of time. The items considered in the model are deteriorating in nature, so that a small fraction of the items is depleted with the passage of time. To represent a more realistic situation, the deterioration rate is assumed to follow a continuous uniform distribution whose parameters are triangular fuzzy numbers. The inventory manager offers customers a discount if they are willing to backorder their demand during a stock-out. The optimum ordering policy and the optimum discount offered for each backorder are determined by minimizing the total cost over a replenishment interval. To better illustrate the proposed model in both the crisp and fuzzy senses, and to provide richer insights, a numerical example is cited to exemplify the policy and to analyze the sensitivity of the model parameters.
Keywords: fuzzy deterioration rate, price discount on backorder, ramp type demand, shortage, time varying holding cost
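As a crisp, constant-demand baseline for this family of models, the classic EOQ with planned backorders can be computed in closed form. This is only the textbook starting point, not the paper's ramp-demand fuzzy model; the parameter values are illustrative:

```python
import math

def eoq_with_backorders(D, K, h, p):
    """Classic EOQ with planned shortages: D = annual demand, K = ordering
    cost, h = holding cost per unit per year, p = backorder (shortage)
    penalty per unit per year. Returns (order quantity, max backorder)."""
    Q = math.sqrt(2 * D * K / h * (h + p) / p)   # optimal order quantity
    B = Q * h / (h + p)                          # optimal max backorder level
    return Q, B

Q, B = eoq_with_backorders(D=1200, K=50, h=2.0, p=8.0)
```

The paper's extensions (ramp demand, time-varying holding cost, fuzzy deterioration, and the backorder discount decision) replace this closed form with a numerical minimization of the per-cycle total cost.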
Procedia PDF Downloads 199
15653 Enhancing Model Interoperability and Reuse by Designing and Developing a Unified Metamodel Standard
Authors: Arash Gharibi
Abstract:
Mankind has always used models to solve problems. Essentially, models are simplified versions of reality; the need for them stems from having to deal with complexity, since many processes or phenomena are too complex to be described completely. A fundamental requirement is therefore that a model contains the characteristic features that are essential in the context of the problem to be solved or described. Models are used in virtually every scientific domain, and during recent decades their number has increased exponentially. Publication of models as part of original research has traditionally been in scientific periodicals, series, monographs, agency reports, national journals, and laboratory reports, which makes it difficult for interested groups and communities to stay informed about the state of the art. During the modeling process, many important decisions are made that impact the final form of the model. Without a record of these considerations, the final model remains ill-defined and open to varying interpretations. Unfortunately, the details of these considerations are often lost, and where information about a model does exist, it is likely to be written intuitively, in different layouts and at different levels of detail. To overcome these issues, different domains have attempted to implement their own approaches to preserving model information in the form of model documentation. The most frequently cited documentation approaches, however, are domain-specific, are not applicable to existing models, and do not allow evolutionary flexibility or intrinsic corrections and improvements. These issues all stem from a lack of unified standards for model documentation. As a way forward, this research proposes a new standard for capturing and managing model information in a unified way, so that interoperability and reusability of models become possible.
This standard will also be evolutionary, meaning that members of the modeling community can contribute to its ongoing development and improvement. In this paper, three of the most common metamodels are reviewed and, based on the pros and cons of each, a new metamodel is proposed.
Keywords: metamodel, modeling, interoperability, reuse
Procedia PDF Downloads 199
15652 Optical and Double Folding Model Analysis for Alpha Particles Elastically Scattered from 9Be and 11B Nuclei at Different Energies
Authors: Ahmed H. Amer, A. Amar, Sh. Hamada, I. I. Bondouk, F. A. El-Hussiny
Abstract:
Elastic scattering of α-particles from 9Be and 11B nuclei at different alpha energies has been analyzed, and the optical model parameters (OMPs) for α-particle elastic scattering by these nuclei at different energies have been obtained. In the present calculations, the real part of the optical potential is derived by folding the nucleon-nucleon (NN) interaction into the nuclear matter density distributions of the projectile and target nuclei using the computer code FRESCO. A density-dependent version of the M3Y interaction (CDM3Y6), based on the G-matrix elements of the Paris NN potential, has been used. The volume integrals of the real and imaginary potential depths (JR, JW) have been calculated and found to be energy dependent. Good agreement between the experimental data and the theoretical predictions is obtained over the whole angular range. In the double folding (DF) calculations, the obtained normalization coefficient Nr is in the range 0.70-1.32.
Keywords: elastic scattering, optical model, double folding model, density distribution
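The volume integrals JR and JW mentioned above are of the form J = 4π ∫ V(r) r² dr (conventionally divided by the product of projectile and target mass numbers for a per-nucleon-pair value). A numerical sketch with a generic Woods-Saxon shape follows; the depth, radius, and diffuseness are illustrative, not the fitted OMPs of the paper, and the per-nucleon normalization is omitted:

```python
import math

def woods_saxon(r, V0=100.0, R=3.0, a=0.65):
    """Woods-Saxon potential (MeV) at radius r (fm): -V0 / (1 + e^((r-R)/a))."""
    return -V0 / (1 + math.exp((r - R) / a))

def volume_integral(V, r_max=20.0, n=4000):
    """J = 4*pi * integral of V(r) r^2 dr, by the trapezoidal rule.
    (The 1/(Ap*At) per-nucleon-pair factor is left out for clarity.)"""
    dr = r_max / n
    total = 0.0
    for i in range(n + 1):
        r = i * dr
        w = 0.5 if i in (0, n) else 1.0   # trapezoid endpoint weights
        total += w * V(r) * r * r * dr
    return 4 * math.pi * total

J = volume_integral(woods_saxon)   # MeV fm^3, negative for an attractive well
```

Tracking how such integrals vary with bombarding energy is what reveals the energy dependence reported in the abstract.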
Procedia PDF Downloads 291
15651 Comparison of Solar Radiation Models
Authors: O. Behar, A. Khellaf, K. Mohammedi, S. Ait Kaci
Abstract:
Up to now, most validation studies have been based on the MBE and RMSE, and have therefore focused only on long- and short-term performance to test and classify solar radiation models. This traditional analysis does not take into account the quality of modeling or linearity. In our analysis, we have tested 22 solar radiation models that are capable of providing instantaneous direct and global radiation at any given location worldwide. We introduce a new indicator, which we name the Global Accuracy Indicator (GAI), to examine the linear relationship between measured and predicted values and the quality of modeling, in addition to long- and short-term performance. The quality of a model is represented by the t-statistic test, its linearity by the correlation coefficient, and its long- and short-term performance by the MBE and RMSE, respectively. An important finding of this research is that using the GAI avoids faulty validation under the traditional methodology, which might result in erroneous predictions of solar power conversion system performance.
Keywords: solar radiation model, parametric model, performance analysis, Global Accuracy Indicator (GAI)
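The four ingredients the GAI combines are standard in radiation-model validation and easy to compute. A sketch follows; the sample data are hypothetical, and the t-statistic shown is the form commonly used for radiation models, t = [(n-1)·MBE² / (RMSE² - MBE²)]^(1/2):

```python
import math

def validation_stats(obs, pred):
    """Return (MBE, RMSE, t-statistic, Pearson r) for model validation."""
    n = len(obs)
    diff = [p - o for o, p in zip(obs, pred)]
    mbe = sum(diff) / n
    rmse = math.sqrt(sum(d * d for d in diff) / n)
    t = math.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    r = cov / math.sqrt(sum((o - mo) ** 2 for o in obs)
                        * sum((p - mp) ** 2 for p in pred))
    return mbe, rmse, t, r

# Hypothetical measured vs. modeled irradiance values (arbitrary units).
obs = [1.0, 2.0, 3.0, 4.0]
pred = [1.1, 2.0, 2.9, 4.2]
mbe, rmse, t, r = validation_stats(obs, pred)
```

A GAI-style indicator would then aggregate these four numbers into a single score; the exact aggregation is the paper's contribution and is not reproduced here.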
Procedia PDF Downloads 355
15650 Text-to-Speech in Azerbaijani Language via Transfer Learning in a Low Resource Environment
Authors: Dzhavidan Zeinalov, Bugra Sen, Firangiz Aslanova
Abstract:
Most text-to-speech models cannot operate well in low-resource languages and require a great amount of high-quality training data to be considered good enough. Yet, with the improvements made in ASR systems, it is now easier than ever to collect data for the design of custom text-to-speech models. In this work, we outline how an ASR model was used to collect data for building a viable text-to-speech system for one of the leading financial institutions of Azerbaijan. NVIDIA's implementation of the Tacotron 2 model was utilized along with the HiFi-GAN vocoder. For training, the model was first trained on high-quality audio data collected from the Internet and then fine-tuned on the bank's single-speaker call center data. The results were evaluated by 50 different listeners and received a mean opinion score of 4.17, showing that our method is indeed viable. With this, we have successfully designed the first text-to-speech model in Azerbaijani and publicly shared 12 hours of audiobook data for everyone to use.
Keywords: Azerbaijani language, HiFiGAN, Tacotron 2, text-to-speech, transfer learning, whisper
Procedia PDF Downloads 47
15649 An Evaluation Model for Enhancing Flexibility in Production Systems through Additive Manufacturing
Authors: Angela Luft, Sebastian Bremen, Nicolae Balc
Abstract:
Additive manufacturing processes have entered large parts of industry, and their range of application has grown significantly over time. A major advantage of additive manufacturing is the innate flexibility of the machines, which matches the ongoing demand for highly flexible production environments. However, the potential of additive manufacturing technologies to enhance the flexibility of production systems has not yet been systematically considered and quantified. In order to determine this potential with regard to strategic flexibility design in production systems, an integrated evaluation model has been developed that allows the simultaneous consideration of both conventional and additive production resources. With the described model, an operational scope of action can be identified and quantified in terms of mix and volume flexibility, process complexity, and machine capacity, going beyond current cost-oriented approaches and offering a much broader and more holistic view of the potential of additive manufacturing. The evaluation model is presented in this paper.
Keywords: additive manufacturing, capacity planning, production systems, strategic production planning, flexibility enhancement
Procedia PDF Downloads 159
15648 Groundwater Level Modelling by ARMA and PARMA Models (Case Study: Qorveh Aquifer)
Authors: Motalleb Byzedi, Seyedeh Chaman Naderi Korvandan
Abstract:
Using annual groundwater-level statistics from the current piezometers of the Qorveh plain, both ARMA and PARMA modeling methods were applied in this study with the SAMS software. After performing the required tests, the model with the minimum Akaike information criterion was selected as the suitable model for each piezometer. These models then made it possible to estimate future fluctuations at each piezometer. According to the results, the ARMA model was better suited to modeling the aquifer. It also became clear that the eastern parts of the aquifer had more failures than other parts; it is therefore necessary to restrict exploitation in the critical parts, along with closer supervision of the abstraction rates of wells.
Keywords: Qorveh plain, groundwater level, ARMA, PARMA
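The simplest member of the ARMA family, an AR(1), already illustrates the fit-then-forecast workflow the abstract describes. The sketch below uses a lag-1 (Yule-Walker-style) estimate and hypothetical level data, standing in for what SAMS does with full ARMA/PARMA models and AIC-based selection:

```python
def fit_ar1(series):
    """Lag-1 estimate for the AR(1) model x_t = mu + phi*(x_{t-1} - mu) + e_t."""
    n = len(series)
    mu = sum(series) / n
    dev = [x - mu for x in series]
    phi = (sum(dev[t] * dev[t - 1] for t in range(1, n))
           / sum(d * d for d in dev))
    return mu, phi

def forecast(mu, phi, last, steps):
    """Multi-step forecast: each step decays toward the long-run mean."""
    out, x = [], last
    for _ in range(steps):
        x = mu + phi * (x - mu)
        out.append(x)
    return out

# Hypothetical annual groundwater levels (metres) at one piezometer.
levels = [12.1, 11.8, 11.9, 11.5, 11.6, 11.2, 11.4, 11.0]
mu, phi = fit_ar1(levels)
future = forecast(mu, phi, levels[-1], 3)
```

A PARMA model would instead fit separate (periodic) coefficients per season, which matters when the annual cycle dominates the series.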
Procedia PDF Downloads 288
15647 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects
Authors: Sami Mestiri, Abdeljelil Farhat
Abstract:
The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for the estimation of the logistic regression model with random effects are applied: the Penalized Quasi-Likelihood (PQL) method and the Gibbs sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as the ROC curve result.
Keywords: forecasting, credit risk, Penalized Quasi-Likelihood, Gibbs sampler, logistic regression with random effects, ROC curve
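The Bayesian-simulation side of this approach can be illustrated with a minimal sampler. The sketch below fits a plain fixed-effects logistic regression with a random-walk Metropolis sampler on synthetic data; it stands in for the paper's Gibbs sampler, omits the random effects, and all data and priors are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: one "financial ratio" driving default probability
n = 400
x = rng.normal(0, 1, n)
true_beta = np.array([-0.5, 1.5])            # intercept, slope
y = rng.random(n) < 1 / (1 + np.exp(-(true_beta[0] + true_beta[1] * x)))
X = np.column_stack([np.ones(n), x])

def log_posterior(beta):
    """Logistic log-likelihood plus a vague N(0, 10^2) prior."""
    z = X @ beta
    loglik = np.sum(y * z - np.log1p(np.exp(z)))
    return loglik - 0.5 * np.sum(beta ** 2) / 100.0

# Random-walk Metropolis: propose, accept with Metropolis ratio
beta, samples = np.zeros(2), []
for _ in range(5000):
    prop = beta + rng.normal(0, 0.1, 2)
    if np.log(rng.random()) < log_posterior(prop) - log_posterior(beta):
        beta = prop
    samples.append(beta.copy())

post_mean = np.mean(samples[1000:], axis=0)  # discard burn-in
print(post_mean)  # posterior mean should land near the true coefficients
```

A full treatment of the random-effects model would typically augment this with per-firm effects sampled in their own Gibbs step.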
Procedia PDF Downloads 542
15646 The Magnitude Scale Evaluation of Cross-Platform Internet Public Opinion
Abstract:
This paper introduces a model of internet public opinion waves, which describes message propagation and measures the influence of a detected event. We collect data on public opinion propagation from different platforms on the internet, including micro-blogs and news. Then, we compare the spread of public opinion to seismic waves and correspondingly define the P-wave, the S-wave, and other essential attributes and characteristics of the process. Further, a model is established to evaluate the magnitude scale of the events. In the end, a practical example is used to analyze the influence of network public opinion and to test the reasonableness and effectiveness of the proposed model.
Keywords: internet public opinion waves (IPOW), magnitude scale, cross-platform, information propagation
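A seismic-style magnitude for opinion events might look like the following sketch, which takes the base-10 logarithm of peak activity relative to a quiet-period baseline, much as earthquake magnitudes compress amplitudes onto a log scale. The formula, baseline, and data are illustrative assumptions, not the paper's calibrated model.

```python
import math

def opinion_magnitude(hourly_posts, baseline=10.0):
    """Richter-style magnitude: log10 of peak hourly activity relative
    to a quiet-period baseline. Scale and baseline are illustrative
    assumptions, not the paper's calibrated formula."""
    peak = max(hourly_posts)
    return math.log10(max(peak, baseline) / baseline)

quiet = [8, 12, 9, 11]                 # background chatter
viral = [15, 300, 4800, 2600, 900]     # an event spiking across platforms

print(opinion_magnitude(quiet))   # near 0: no detectable event
print(opinion_magnitude(viral))   # log10(480) ~ 2.68: a strong event
```

Because the scale is logarithmic, each whole unit of magnitude corresponds to a tenfold increase in peak propagation volume.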
Procedia PDF Downloads 290
15645 Degradation Model for UK Railway Drainage System
Authors: Yiqi Wu, Simon Tait, Andrew Nichols
Abstract:
Management of UK railway drainage assets is challenging due to the large number of historical assets with long life cycles. A major concern for asset managers is to maintain the required performance economically and efficiently while complying with the relevant regulation and legislation. As the majority of drainage assets are buried underground and are often difficult or costly to examine, it is important for asset managers to understand and model the degradation process in order to foresee upcoming reductions in asset performance and conduct proactive maintenance accordingly. In this research, a Markov chain approach is used to model the deterioration process of rail drainage assets. The study is based on historical condition scores and characteristics of drainage assets across the whole railway network in England, Scotland, and Wales. The model is used to examine the effect of various characteristics on the probabilities of degradation, for example, regional differences in the probabilities of degradation, and how material and shape can influence the deterioration process for chambers, channels, and pipes.
Keywords: deterioration, degradation, Markov models, probability, railway drainage
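The Markov chain idea can be sketched directly: a transition matrix over discrete condition grades is raised to a power to project the condition distribution several years ahead. The four grades and all transition probabilities below are illustrative assumptions, not the network's historical condition scores.

```python
import numpy as np

# Illustrative 4-state condition grades (1 = good ... 4 = poor); the
# transition probabilities are assumptions, not UK railway data.
P = np.array([
    [0.90, 0.08, 0.02, 0.00],   # grade 1 mostly stays in grade 1
    [0.00, 0.85, 0.12, 0.03],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],   # grade 4 is absorbing without repair
])

def condition_after(years, start=np.array([1.0, 0.0, 0.0, 0.0])):
    """Distribution over condition grades after the given number of years."""
    return start @ np.linalg.matrix_power(P, years)

dist10 = condition_after(10)
print(dist10)      # probability of each grade after a decade
print(dist10[3])   # probability the asset has reached the worst grade
```

Fitting such a model to real data amounts to estimating each row of P from observed grade-to-grade transitions, which is how characteristics such as material, shape, or region can be compared: each subgroup gets its own matrix.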
Procedia PDF Downloads 226
15644 Performance Assessment of Recycled Alum Sludge in the Treatment of Textile Industry Effluent in South Africa
Authors: Tony Ngoy Mbodi, Christophe Muanda
Abstract:
The textile industry is considered one of the most polluting sectors in terms of effluent discharge volume and wastewater composition; dyes in particular represent an environmental hazard when discharged without proper treatment. A study was conducted to investigate the use of recycled alum sludge (RAS) as an alternative treatment for the reduction of colour, chemical oxygen demand (COD), and total dissolved solids (TDS), and for pH adjustment, in dye-based synthetic textile industry wastewater. The coagulation/flocculation process was studied for alum:RAS ratios of 1:1, 2:1, 1:2, and 0:1. Experiments on treating the synthetic wastewater using membrane filtration and adsorption with corn cobs were also conducted. Results from the coagulation experiment were compared to those from the adsorption and membrane filtration experiments conducted on the same synthetic wastewater. The results of the RAS experiments were also evaluated against standard guidelines for industrial effluents treated for discharge, in order to establish their level of compliance. Based on the current results, it can be concluded that reusing alum sludge as a low-cost pretreatment material in the coagulation/flocculation process can offer advantages such as high removal efficiency for disperse dye and economic savings on the overall treatment of the industry's wastewater.
Keywords: alum, coagulation/flocculation, dye, recycled alum sludge, textile wastewater
Procedia PDF Downloads 356
15643 The Systems Theoretic Accident Model and Process (STAMP) as the New Trend to Promote Safety Culture in Construction
Authors: Natalia Ortega
Abstract:
Safety culture (SCU) involves various perceptual, psychological, behavioral, and managerial factors. It has been shown that creating and maintaining an SCU is one way to reduce and prevent accidents and fatalities. In the construction sector, safety attitude, knowledge, and a supportive environment are predictors of safety behavior. The highest possible proportion of safety behavior among employees can be achieved by improving their safety attitude and knowledge. Accordingly, top management's commitment to safety is vital in shaping employees' safety attitude; therefore, the first step in improving employees' safety attitude is the genuine commitment of top management to safety. One of the factors affecting the successful implementation of health and safety promotion programs is the construction industry's subcontracting model. The contractual model's complexity, combined with the need for coordination among diverse stakeholders, makes it challenging to implement, manage, and follow up on health and well-being initiatives. The Systems Theoretic Accident Model and Process (STAMP) concept has attracted growing global consideration and increasing research attention in recent years. STAMP focuses attention on the role of constraints in safety management. The findings reveal that the research field has grown since Leveson's definition of the model in 2004 and that it is being used across multiple domains. A systematic literature review of this novel model, originally developed to meet the safety goals of human space exploration, shows a powerful and different approach to safety management, safety-driven design, and decision-making. Around two hundred studies have been published on applying the model. However, every safety model requires time to be transformed into research and practice, to be tested and debated, and to grow and mature.
Keywords: STAMP, risk management, accident prevention, safety culture, systems thinking, construction industry, safety
Procedia PDF Downloads 83
15642 BIM Model and Virtual Prototyping in Construction Management
Authors: Samar Alkindy
Abstract:
Purpose: The BIM model has been used to support the planning of different construction projects in the industry by showing the different stages of the construction process. The model has been instrumental in identifying some of the common errors in the construction process through the spatial arrangement. The continuous use of the BIM model in the construction industry has resulted in various radical changes, such as virtual prototyping. Construction virtual prototyping is a highly advanced technology that incorporates a BIM model with realistic graphical simulations and facilitates the simulation of the project before the product is built in the factory. The paper presents virtual prototyping in the construction industry by examining its application, challenges, and benefits to a construction project. Methodology: A case study was conducted of four major construction projects that incorporate virtual construction prototyping in several stages of the construction project. In addition, interviews were administered with the project managers, engineers, and planning managers. Findings: The data collected show a positive response to virtual construction prototyping in construction, especially concerning communication and visualization. Furthermore, the use of virtual prototyping has increased collaboration and efficiency among the construction experts handling a project. During the planning stage, virtual prototyping has increased accuracy, reduced planning time, and reduced the amount of rework during the implementation stage. Although virtual prototyping is a new concept in the construction industry, the findings indicate that the approach will benefit the management of construction projects.
Keywords: construction operations, construction planning, process simulation, virtual prototyping
Procedia PDF Downloads 233
15641 Social Receptiveness of Tourists in the Batumi Population
Authors: Megi Surmanidze
Abstract:
The relationship between tourists and the local population is essential to the development of tourism. This global sector is a growing source of income and plays a huge role in a country's economic development. The sector is important for the Adjarian region and for Batumi as a tourist city. When many tourists visit the city, the relationship and attitudes between tourists and the local population become very important and play a determining role in whether visitors will return to the country or not. Receptiveness is a process that accompanies tourism as a growing business sector, and it is necessary to study its problems and the relationships around tourism. The aim of the article is to show the importance of receptiveness in the tourism industry and to measure the extent to which receptiveness appears in the Batumi population. The topic is highly relevant, as the tourism business grows day by day, yet the social factors that help or hinder it, which are closely connected to the relations between tourists and the local population, have been left out of focus; their effect is real and directly proportional to the relationship between tourists and the population. Qualitative social research was carried out, with in-depth interviews as the method of data collection. This method gave us the opportunity to become acquainted with the actors' points of view, and it was also suitable for creating favourable conditions in which respondents could be sincere during the interview and not hide their real emotions and opinions.
Keywords: tourism industry, receptiveness, cultural identity, xenophobia
Procedia PDF Downloads 110
15640 Discovering the Effects of Meteorological Variables on the Air Quality of Bogota, Colombia, by Data Mining Techniques
Authors: Fabiana Franceschi, Martha Cobo, Manuel Figueredo
Abstract:
Bogotá, the capital of Colombia, is its largest city and one of the most polluted in Latin America due to the fast economic growth of the last ten years. Bogotá has been affected by high-pollution events that led to high concentrations of PM10 and NO2, exceeding the local 24-hour legal limits (100 and 150 µg/m³, respectively). The most important pollutants in the city are PM10 and PM2.5 (which are associated with respiratory and cardiovascular problems), and it is known that their concentrations in the atmosphere depend on local meteorological factors. Therefore, it is necessary to establish the relationships between the meteorological variables and the concentrations of atmospheric pollutants such as PM10, PM2.5, CO, SO2, NO2, and O3. This study aims to determine the interrelations between meteorological variables and air pollutants in Bogotá using data mining techniques. Data from 13 monitoring stations were collected from the Bogotá Air Quality Monitoring Network for the period 2010-2015. The Principal Component Analysis (PCA) algorithm was applied to obtain the primary relations between all the parameters, and afterwards the K-means clustering technique was implemented to corroborate the relations found previously and to find patterns in the data. PCA was also applied per shift (morning, afternoon, night, and early morning) to check for variation in the previous trends, and per year to verify that the identified trends remained throughout the study period. Results demonstrated that wind speed, wind direction, temperature, and NO2 are the factors with the most influence on PM10 concentrations. Furthermore, it was confirmed that high-humidity episodes increased PM2.5 levels. It was also found that there are directly proportional relationships between O3 levels and wind speed and radiation, while there is an inverse relationship between O3 levels and humidity.
Concentrations of SO2 increase with the presence of PM10 and decrease with wind speed and wind direction. The results also showed a decreasing trend in pollutant concentrations over the last five years, and in rainy periods (March-June and September-December) some precipitation-related trends were stronger. Results obtained with K-means demonstrated that it was possible to find patterns in the data; they also showed similar conditions and data distributions among the Carvajal, Tunal, and Puente Aranda stations, and between Parque Simón Bolívar and Las Ferias. It was verified that the aforementioned trends prevailed during the study period by applying the same technique per year. It was concluded that the PCA algorithm is useful for establishing preliminary relationships among variables, and K-means clustering for finding patterns in the data and understanding its distribution. The discovery of patterns in the data allows these clusters to be used as input to an artificial neural network prediction model.
Keywords: air pollution, air quality modelling, data mining, particulate matter
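The PCA-then-K-means pipeline described above can be sketched with NumPy alone: standardize the records, project them onto the leading principal components via SVD, and cluster the scores. The synthetic two-regime data below (wind speed, temperature, PM10) is an illustrative stand-in for the monitoring-station records, not the Bogotá network data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two latent regimes: high wind / low PM10 vs. low wind / high PM10
clean = rng.normal([4.0, 15.0, 40.0], [0.5, 1.0, 5.0], (100, 3))
polluted = rng.normal([1.0, 18.0, 110.0], [0.5, 1.0, 5.0], (100, 3))
X = np.vstack([clean, polluted])

# PCA via SVD of the standardized data
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T  # projection onto the first two principal components

def kmeans(data, k, iters=50):
    """Minimal Lloyd's algorithm; a center stays put if its cluster empties."""
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([data[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels

labels = kmeans(scores, 2)
print(labels[:5], labels[-5:])  # the two regimes separate along PC1
```

In practice one would use a library implementation, but the sketch shows why PCA is a natural preprocessing step: clustering in the reduced score space follows the directions of greatest joint variation among the meteorological and pollutant variables.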
Procedia PDF Downloads 259
15639 Epistemic Uncertainty Analysis of Queue with Vacations
Authors: Baya Takhedmit, Karim Abbas, Sofiane Ouazine
Abstract:
Vacation queues are often employed to model many real situations, such as computer systems, communication networks, manufacturing and production systems, and transportation systems. These queueing models are solved at fixed parameter values. However, the parameter values themselves are determined from a finite number of observations and hence have uncertainty associated with them (epistemic uncertainty). In this paper, we consider the M/G/1/N queue with server vacations and exhaustive service discipline, where we assume that the vacation parameter values are uncertain. We use the Taylor series expansion approach to estimate the expectation and variance of the model output, due to epistemic uncertainties in the model input parameters.
Keywords: epistemic uncertainty, M/G/1/N queue with vacations, non-parametric sensitivity analysis, Taylor series expansion
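The Taylor series expansion approach can be illustrated on a simpler closed-form measure. The sketch below propagates an uncertain utilization parameter through the M/M/1 mean-number-in-system formula (an illustrative stand-in for the paper's M/G/1/N vacation model), using a second-order expansion for the mean of the output and a first-order expansion for its variance.

```python
def mean_in_system(rho):
    """M/M/1 mean number in system, L(rho) = rho / (1 - rho)."""
    return rho / (1.0 - rho)

def taylor_moments(f, mu, sigma, h=1e-5):
    """Second-order Taylor estimate of E[f(theta)] and first-order
    estimate of Var[f(theta)] for theta with mean mu and std sigma:
        E[f] ~ f(mu) + 0.5 * f''(mu) * sigma^2
        Var[f] ~ f'(mu)^2 * sigma^2
    Derivatives are taken by central finite differences."""
    d1 = (f(mu + h) - f(mu - h)) / (2 * h)
    d2 = (f(mu + h) - 2 * f(mu) + f(mu - h)) / h ** 2
    return f(mu) + 0.5 * d2 * sigma ** 2, d1 ** 2 * sigma ** 2

# Utilization estimated as 0.7 with epistemic standard deviation 0.05
m, v = taylor_moments(mean_in_system, 0.7, 0.05)
print(m, v)
```

Note that the expected performance measure (about 2.43 here) exceeds the plug-in value L(0.7) ≈ 2.33: because L is convex, ignoring epistemic uncertainty systematically understates the expected congestion.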
Procedia PDF Downloads 435