Search results for: Gaussian process classification model with multiclass
11081 Processes Simulation Study of Coal to Methanol Based on Gasification Technology
Authors: Po-Chuang Chen, Hsiu-Mei Chiu, Yau-Pin Chyou, Chiou-Shia Yu
Abstract:
This study presents a simulation model for converting coal to methanol, based on gasification technology, using the commercial chemical process simulator Pro/II® V8.1.1. The methanol plant consists of an air separation unit (ASU), a gasification unit, a gas clean-up unit, and a methanol synthesis unit. The clean syngas is produced by the first three operating units, and the model has been verified against reference data from the United States Environmental Protection Agency. The liquid phase methanol (LPMEOH™) process is adopted in the methanol synthesis unit. Clean syngas passes through the gas handling section to meet the reaction requirements, the reactor loop/catalyst to generate methanol, and methanol distillation to reach a purity above 99.9 wt%. The ratio of the total energy of the methanol and dimethyl ether products to that of the feed coal is 78.5% (gross efficiency). The net efficiency is 64.2% when internal power consumption is taken into account, assuming an electricity generation efficiency of 40%.
Keywords: Gasification, Methanol, LPMEOH, System-level simulation.
11080 Developing a Model for the Relation between Heritage and Place Identity
Authors: A. Arjomand Kermani, N. Charbgoo, M. Alalhesabi
Abstract:
Given the great acceleration of change and the need for new developments in cities on the one hand, and conservation and regeneration approaches on the other, place identity and its relation to the heritage context have taken on new importance. This relation is generally a mutual and complex one. The significant point is that the process of identifying something as heritage, rather than merely as a historical phenomenon, brings that which may be inherited into the realm of identity. In planning and urban design, as well as in environmental psychology and phenomenology, place identity and its attributes and components have been studied and discussed. However, the relation between the physical environment (especially heritage) and identity has been neglected in the planning literature. This article reviews the knowledge in this field and develops a model of the influence and relation of these two major concepts (heritage and identity). To build this conceptual model, we draw on the available literature in environmental psychology and planning on place identity and the heritage environment, using a descriptive-analytical methodology to understand how they can inform planning strategies and governance policies. A cross-disciplinary analysis is essential to understand the nature of place identity and the heritage context and to develop a more holistic model of their relationship that can be employed in the planning process and decision making. Moreover, this broader and more holistic perspective would enable both social scientists and planners to learn from one another's expertise for a fuller understanding of community dynamics. The result indicates that a combination of these perspectives can provide a richer understanding, not only of how planning shapes our experience of place, but also of how place identity can influence community planning and development.
Keywords: heritage, Inter-disciplinary study, Place identity, planning
11079 User-Driven Product Line Engineering for Assembling Large Families of Software
Authors: Zhaopeng Xuan, Yuan Bian, C. Cailleaux, Jing Qin, S. Traore
Abstract:
Traditional software engineering allows engineers to offer their clients multiple specialized software distributions assembled from a shared set of software assets. Managing these assets, however, requires a trade-off between client satisfaction and the software engineering process. Clients find it increasingly difficult to locate a distribution or components matching their needs across all the distributed repositories.
This paper proposes a software engineering approach for a user-driven software product line in which engineers define a Feature Model but users drive the actual software distribution on demand. This approach makes the user the final actor, acting as a release manager in the software engineering process, which increases product satisfaction and simplifies the operations needed to find the required components. In addition, it provides a way for engineers to manage and assemble large software families.
As a proof of concept, a user-driven software product line is implemented for Eclipse, an integrated development environment. An Eclipse feature model is defined and exposed to users on a cloud-based build platform from which clients can download individualized Eclipse distributions.
Keywords: Software Product Line, Model-driven Development, Reverse Engineering and Refactoring, Agile Method
11078 Comparison of Material Constitutive Models Used in FEA of Low Volume Roads
Authors: Lenka Ševelová, Aleš Florian
Abstract:
Probabilistic models used in reliability analyses are an appropriate and progressive tool for analyzing the behavior of low volume roads. A necessary part of the probabilistic model is a deterministic model of the structural behavior. The FE model of low volume roads is created in the ANSYS software. It is able to determine the state of stress and deformation at any point of the structure and thus generate the data required for the reliability analysis. The paper compares two material constitutive models used for modeling the unbound non-homogeneous materials found in low volume roads. The first is a linear elastic model according to Hooke's theory (H model), the second a nonlinear elastic-plastic Drucker-Prager model (D-P model).
Keywords: FEA, FEM, geotechnical materials, low volume roads, material constitutive models, pavement.
11077 Attention Multiple Instance Learning for Cancer Tissue Classification in Digital Histopathology Images
Authors: Afaf Alharbi, Qianni Zhang
Abstract:
The identification of malignant tissue in histopathological slides holds significant importance in both clinical settings and pathology research. This paper presents a methodology aimed at automatically categorizing cancerous tissue through the utilization of a multiple instance learning framework. This framework is specifically developed to acquire knowledge of the Bernoulli distribution of the bag label probability by employing neural networks. Furthermore, we put forward a neural network-based permutation-invariant aggregation operator, equivalent to attention mechanisms, which is applied to the multi-instance learning network. Through empirical evaluation on an openly available colon cancer histopathology dataset, we provide evidence that our approach surpasses various conventional deep learning methods.
Keywords: Attention Multiple Instance Learning, Multiple Instance Learning, transfer learning, histopathological slides, cancer tissue classification.
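As an illustration of the attention-based, permutation-invariant pooling described in this abstract, the following minimal NumPy sketch aggregates a bag of instance embeddings into a single Bernoulli bag-label probability. The weight matrices are randomly initialized stand-ins for trained parameters, and all shapes and names are assumptions rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_mil_pool(H, V, w, W_cls, b_cls):
    """Pool a bag of instance embeddings H (n_instances x d) into one
    Bernoulli bag-label probability using softmax attention weights."""
    scores = np.tanh(H @ V.T) @ w          # attention score per instance
    a = np.exp(scores - scores.max())
    a /= a.sum()                           # permutation-invariant weights
    z = a @ H                              # weighted bag embedding
    logit = z @ W_cls + b_cls              # linear bag-level classifier
    return 1.0 / (1.0 + np.exp(-logit))    # P(bag label = 1)

# Toy bag of 5 instance embeddings (e.g. patch features) of dimension 16.
H = rng.normal(size=(5, 16))
V = rng.normal(size=(8, 16))               # attention projection matrix
w = rng.normal(size=8)                     # attention vector
W_cls = rng.normal(size=16)
print("bag probability:", attention_mil_pool(H, V, w, W_cls, 0.0))
```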
11076 Algorithm Design and Performance Evaluation of Equivalent CMOS Model
Authors: Parvinder S. Sandhu, Iqbaldeep Kaur, Amit Verma, Inderpreet Kaur, Birinderjit S. Kalyan
Abstract:
This work proposes an equivalent CMOS model, for which an algorithm has been created and its performance evaluated. For comparison of all the vital features, another commonly used model, the ZSTT (Zero Switching Time Transient) model, is chosen, and the results for the proposed equivalent CMOS model are promising. Excerpts of the created algorithm are also included.
Keywords: Dual Capacitor Model, ZSTT, CMOS, SPICE Macro-Model.
11075 Adaptive Kalman Filter for Fault Diagnosis of Linear Parameter-Varying Systems
Authors: Rajamani Doraiswami, Lahouari Cheded
Abstract:
Fault diagnosis of Linear Parameter-Varying (LPV) systems using an adaptive Kalman filter is proposed. The LPV model comprises scheduling parameters and emulator parameters. The scheduling parameters are chosen such that they can track variations in the system model resulting from changes in the operating regimes. The emulator parameters, on the other hand, simulate variations in the subsystems during the identification phase and have a negligible effect during the operational phase. The nominal model and the influence vectors, which are the gradients of the feature vector with respect to the emulator parameters, are identified off-line from a number of emulator-parameter-perturbed experiments. A Kalman filter is designed using the identified nominal model. As the system varies, the Kalman filter model is adapted using the scheduling variables. The residual is employed for fault diagnosis. The proposed scheme is successfully evaluated on a simulated system as well as on a physical process control system.
Keywords: Identification, linear parameter-varying systems, least-squares estimation, fault diagnosis, Kalman filter, emulators.
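The residual-based diagnosis above rests on a standard Kalman filter whose model is then adapted with the scheduling variables. A bare predict/update cycle for a linear state-space model is sketched below in NumPy; the system matrices, noise covariances, and measurements are invented for illustration, and the LPV adaptation and emulator-based identification are not reproduced.

```python
import numpy as np

def kalman_step(x, P, u, y, A, B, C, Q, R):
    """One predict/update cycle; returns the new state estimate, its
    covariance, and the residual used for fault diagnosis."""
    x_pred = A @ x + B @ u                      # predict
    P_pred = A @ P @ A.T + Q
    r = y - C @ x_pred                          # innovation / residual
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x_pred + K @ r                      # update
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new, r

# Illustrative 2-state, 1-input, 1-output system (hypothetical values).
A = np.array([[1.0, 0.1], [0.0, 0.95]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])
x, P = np.zeros(2), np.eye(2)
rng = np.random.default_rng(0)
for t in range(50):
    u = np.array([1.0])
    y = np.array([0.01 * t]) + 0.1 * rng.normal(size=1)   # fake measurement
    x, P, r = kalman_step(x, P, u, y, A, B, C, Q, R)
print("final residual:", r)
```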
11074 Yield Prediction Using Support Vectors Based Under-Sampling in Semiconductor Process
Authors: Sae-Rom Pak, Seung Hwan Park, Jeong Ho Cho, Daewoong An, Cheong-Sool Park, Jun Seok Kim, Jun-Geol Baek
Abstract:
It is important to predict yield in the semiconductor test process in order to increase it. In this study, yield prediction means effectively identifying defective dies, wafers, or lots. The semiconductor test process consists of several test steps, and each test includes various test items. In other words, the test data are large and complex. They are also disproportionately distributed, as the number of samples belonging to the FAIL class is extremely low. Because of these inherent properties of test data, general data mining techniques are of limited use for yield prediction without data preprocessing. Therefore, this study proposes an under-sampling method using a support vector machine (SVM) to eliminate the class imbalance. To evaluate performance, a random under-sampling method is compared with the proposed method using actual semiconductor test data. The results show that the SVM-based sampling method is effective in generating a robust model for yield prediction.
Keywords: Yield Prediction, Semiconductor Test Process, Support Vector Machine, Under Sampling
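One plausible reading of the SVM-based under-sampling step, not spelled out in the abstract, is to keep all FAIL samples and only those PASS samples that end up as support vectors of an SVM fitted to the full data, since these lie near the decision boundary. The scikit-learn sketch below illustrates that interpretation on synthetic data; it is not the authors' procedure.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Synthetic imbalanced test data: 950 PASS (0) vs. 50 FAIL (1) samples.
X = np.vstack([rng.normal(0.0, 1.0, size=(950, 4)),
               rng.normal(1.5, 1.0, size=(50, 4))])
y = np.array([0] * 950 + [1] * 50)

# Fit an SVM on the full data and keep only the majority-class samples
# that are support vectors (they carry the boundary information).
svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
majority_sv = [i for i in svm.support_ if y[i] == 0]
keep = np.concatenate([np.where(y == 1)[0], majority_sv])

X_bal, y_bal = X[keep], y[keep]
print("original class counts:   ", np.bincount(y))
print("after SVM under-sampling:", np.bincount(y_bal))
```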
11073 Foot Recognition Using Deep Learning for Knee Rehabilitation
Authors: Rakkrit Duangsoithong, Jermphiphut Jaruenpunyasak, Alba Garcia
Abstract:
Foot recognition can be applied in many medical fields, such as gait pattern analysis and the knee exercises of patients in rehabilitation. Generally, a camera-based foot recognition system captures patient images in a controlled room and background to recognize the foot from limited views. However, such a system can be inconvenient for monitoring knee exercises at home. To overcome these problems, this paper proposes a deep learning method using Convolutional Neural Networks (CNNs) for foot recognition. The results are compared with a traditional classification method using LBP and HOG features with kNN and SVM classifiers. According to the results, the deep learning method provides better accuracy, but with higher complexity, in recognizing foot images from online databases than the traditional classification method.
Keywords: Convolutional neural networks, deep learning, foot recognition, knee rehabilitation.
11072 A Fuzzy Model and Tool to Analyze SIVD Diseases Using TMS
Authors: A. Faro, D. Giordano, M. Pennisi, G. Scarciofalo, C. Spampinato, F. Tramontana
Abstract:
The paper proposes a methodology for processing the signals coming from Transcranial Magnetic Stimulation (TMS) in order to identify the pathology and evaluate the therapy used to treat patients affected by dementia. In particular, a fuzzy model is developed to identify dementia in patients affected by Subcortical Ischemic Vascular Dementia and to measure the positive effect, if any, of repetitive TMS on their motor performance. A tool is also presented to support this analysis.
Keywords: TMS, SIVD, Electromyography, Fuzzy Logic.
11071 Robust Batch Process Scheduling in Pharmaceutical Industries: A Case Study
Authors: Tommaso Adamo, Gianpaolo Ghiani, Antonio D. Grieco, Emanuela Guerriero
Abstract:
Batch production plants give rise to a wide range of scheduling problems. In the pharmaceutical industry a batch process is usually described by a recipe, consisting of an ordering of tasks to produce the desired product. In this research work we focus on pharmaceutical production processes requiring the culture of a microorganism population (i.e., bacteria, yeasts, or antibiotics). Several sources of uncertainty may influence the yield of the culture processes, including (i) low performance and quality of the cultured microorganism population or (ii) microbial contamination. For these reasons, robustness is a valuable property in the considered application context. In particular, a robust schedule will not collapse immediately when a cell of microorganisms has to be thrown away due to microbial contamination. Indeed, a robust schedule should change only locally and in small proportions, and the overall performance measure (e.g., makespan, lateness) should change little, if at all. In this research work we formulate a constraint programming optimization (COP) model for the robust planning of antibiotics production. We develop a discrete-time model with a multi-criteria objective, ordering the different criteria and performing a lexicographic optimization. A feasible solution of the proposed COP model is a schedule of a given set of tasks onto the available resources. The schedule has to satisfy task precedence constraints, resource capacity constraints, and time constraints. In particular, the time constraints model task due dates and resource availability time windows. To improve schedule robustness, we model the concept of (a, b) super-solutions, where a and b are input parameters of the COP model. An (a, b) super-solution is one in which, if a variables (i.e., the completion times of a culture tasks) lose their values (i.e., the cultures are contaminated), the solution can be repaired by assigning new values to these variables (i.e., the completion times of backup culture tasks) while changing at most b other variables (i.e., delaying the completion of at most b other tasks). The efficiency and applicability of the proposed model are demonstrated by solving instances taken from a real-life pharmaceutical company. Computational results show that the determined super-solutions are near-optimal.
Keywords: Constraint programming, super-solutions, robust scheduling, batch process, pharmaceutical industries.
11070 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management
Authors: M. Graus, K. Westhoff, X. Xu
Abstract:
Climate change affects all aspects of society. While the expansion of renewable energies proceeds, general studies on the potential of demand-side management have not convinced industry to reinforce smart grid considerations in its operational business. In this article, a procedure model for case-specific, data-driven decision support in industrial energy management, based on a holistic data analytics approach, is presented. The model is executed on the example of a strategic decision problem: integrating renewable energies into industrial energy management. This question arises from considerations of changing the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of data modeling, analysis, simulation, and optimization steps. This procedure helps to quantify the potential of sustainable production concepts based on data from a factory. The model is validated with data from a printer, serving as an analogue of a simple production machine. The overall goal is to establish smart grid principles for industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.
Keywords: Data analytics, green production, industrial energy management, optimization, renewable energies, simulation.
11069 Upsetting of Tri-Metallic St-Cu-Al and St-Cu60Zn-Al Cylindrical Billets
Authors: Isik Cetintav, Cenk Misirli, Yilmaz Can
Abstract:
This work investigates the upsetting of tri-metallic cylindrical billets both experimentally and analytically at a reduction ratio of 30%. Steel, brass, and copper are used for the outer and outermost rings and aluminum for the inner core. Two different models have been designed to show the material flow and the cavity that forms at the two interfaces during forming at this reduction ratio. In each model, the outermost ring is steel. In Model 1, the outer ring between the outermost ring and the solid core is copper, while in Model 2 it is brass. The solid core is aluminum in each model. Billets were upset in a press using parallel flat dies. The upsetting load was recorded and compared for the models and for single billets. To extend the tests to a wider range of inner-core and outer-ring geometries and compare them with the experimental procedure, a finite element model was developed. The ABAQUS software was used for the simulations. The aim is to show how contact between the outermost ring, the outer ring, and the inner core evolves throughout the upsetting process. The results show that, as the height changes, the outermost ring, outer ring, and inner core in Model 1 and Model 2 interacted very well, and the contact surfaces of the models exhibited various interface behaviors. It is also observed that tri-metallic billets have lower weight but better mechanical properties than single-material ones, which suggests using and producing these new materials for different purposes.
Keywords: Tri-metallic, upsetting, copper, brass, steel, aluminum.
11068 Application of Feed-Forward Neural Networks Autoregressive Models in Gross Domestic Product Prediction
Authors: E. Giovanis
Abstract:
In this paper we present an autoregressive model, based on neural network modeling and trained with the standard error backpropagation algorithm, in order to predict the gross domestic product (GDP) growth rate of four countries. Specifically, we propose a kind of weighted regression, usable for econometric purposes, in which the initial inputs are multiplied by the neural network's final optimum input-to-hidden-layer weights obtained after the training process. The forecasts are compared with those of the ordinary autoregressive model, and we conclude that the forecasting results of the proposed regression significantly outperform those of the autoregressive model in the out-of-sample period. The idea behind this approach is to propose a parametric regression with weighted variables in order to test the statistical significance and the magnitude of the estimated autoregressive coefficients and simultaneously to produce the forecasts.
Keywords: Autoregressive model, Error back-propagation, Feed-forward neural networks, Gross Domestic Product.
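To make the underlying network concrete, a compact NumPy sketch of a feed-forward autoregressive model trained by plain error backpropagation on a toy growth-rate series is shown below; the lag order, hidden-layer size, and learning rate are illustrative assumptions, and the weighted-regression step built on the trained input-to-hidden weights is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy growth-rate series and AR(4) input/target pairs.
series = 2.0 + 0.5 * np.sin(np.arange(120) / 6.0) + 0.1 * rng.normal(size=120)
lags = 4
X = np.array([series[t - lags:t] for t in range(lags, len(series))])
y = series[lags:]

# One tanh hidden layer trained by batch error backpropagation.
n_hidden, lr = 6, 0.01
W1 = 0.1 * rng.normal(size=(lags, n_hidden))
b1 = np.zeros(n_hidden)
W2 = 0.1 * rng.normal(size=n_hidden)
b2 = 0.0

for epoch in range(3000):
    H = np.tanh(X @ W1 + b1)                   # hidden activations
    err = H @ W2 + b2 - y                      # output error
    gW2 = H.T @ err / len(y)
    gb2 = err.mean()
    dH = np.outer(err, W2) * (1.0 - H ** 2)    # backpropagated error
    gW1 = X.T @ dH / len(y)
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print("in-sample RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```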
11067 Advances on LuGre Friction Model
Authors: Mohammad Fuad Mohammad Naser, Faycal Ikhouane
Abstract:
The LuGre friction model is an ordinary differential equation that is widely used to describe the friction phenomenon in mechanical systems. The importance of this model comes from the fact that it captures most of the friction behavior that has been observed, including hysteresis. In this paper, we study some aspects of the hysteresis behavior induced by the LuGre friction model.
Keywords: Hysteresis, LuGre model, operator, (strong) consistency.
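For reference, the LuGre model is a first-order differential equation in an internal bristle-deflection state z; the sketch below integrates it with a forward Euler scheme under a sinusoidal velocity input using illustrative parameter values, so that the resulting force-velocity hysteresis loop can be inspected. The parameters are not taken from the paper.

```python
import numpy as np

# LuGre parameters (illustrative values).
sigma0, sigma1, sigma2 = 1e5, 300.0, 0.4   # bristle stiffness, damping, viscous term
Fc, Fs, vs = 1.0, 1.5, 0.01                # Coulomb level, static level, Stribeck velocity

def g(v):
    """Stribeck curve: steady-state friction level divided by sigma0."""
    return (Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2)) / sigma0

# Forward-Euler simulation under a sinusoidal velocity input.
dt = 1e-4
t = np.arange(0.0, 2.0, dt)
v = 0.02 * np.sin(2 * np.pi * t)           # relative velocity
z, F = 0.0, np.empty_like(t)
for k, vk in enumerate(v):
    dz = vk - abs(vk) * z / g(vk)          # bristle-state dynamics
    F[k] = sigma0 * z + sigma1 * dz + sigma2 * vk
    z += dt * dz                           # Euler step

print("max friction force:", F.max())      # plotting F against v shows the hysteresis loop
```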
11066 A Methodology for Creating a Conceptual Model Under Uncertainty
Authors: Bogdan Walek, Jiri Bartos, Cyril Klimes
Abstract:
This article deals with conceptual modeling under uncertainty. First, a classification of information systems, together with their definitions, is described, focusing on those for which the construction of a conceptual model is suitable for designing the future information system database. Furthermore, the disadvantages of the traditional approach to creating a conceptual model and designing a database are analyzed. A comprehensive methodology for creating a conceptual model, based on an analysis of client requirements and the selection of a suitable domain model, is proposed. The article presents the expert system used for constructing the conceptual model, which is a suitable tool for database designers.
Keywords: Conceptual model, conceptual modeling, database, methodology, uncertainty, information system, entity, attribute, relationship, conceptual domain model, fuzzy.
11065 Using Swarm Intelligence for Improving Accuracy of Fuzzy Classifiers
Authors: Hassan M. Elragal
Abstract:
This paper discusses a method for improving the accuracy of fuzzy-rule-based classifiers using particle swarm optimization (PSO). Two different fuzzy classifiers are considered and optimized. The first classifier is based on the Mamdani fuzzy inference system (M_PSO fuzzy classifier). The second classifier is based on the Takagi-Sugeno fuzzy inference system (TS_PSO fuzzy classifier). The parameters of the proposed fuzzy classifiers, including the premise (antecedent) parameters, the consequent parameters, and the structure of the fuzzy rules, are optimized using PSO. Experimental results show that higher classification accuracy can be obtained with a lower number of fuzzy rules by using the proposed PSO fuzzy classifiers. The performance of the M_PSO and TS_PSO fuzzy classifiers is compared to that of other fuzzy-based classifiers.
Keywords: Fuzzy classifier, Optimization of fuzzy system parameters, Particle swarm optimization, Pattern classification.
11064 Complex Dynamics of Bertrand Duopoly Games with Bounded Rationality
Authors: Jixiang Zhang, Guocheng Wang
Abstract:
A dynamic Bertrand duopoly game is analyzed, in which players use different production methods and choose their prices with bounded rationality. The equilibria of the corresponding discrete dynamical system are investigated. The stability conditions of the Nash equilibrium under a local adjustment process are studied. As some parameters of the model are varied, the loss of stability of the Nash equilibrium gives rise to complex dynamics such as higher-order cycles and chaos. On this basis, we find that an increase in the adjustment speed of a boundedly rational player can drive the Bertrand market into a chaotic state. Finally, the complex dynamics, bifurcations, and chaos are illustrated by numerical simulation.
Keywords: Bertrand duopoly model, Discrete dynamical system, Heterogeneous expectations, Nash equilibrium.
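The abstract does not give the map explicitly; a common form of such a bounded-rationality Bertrand game uses a gradient ("myopic") price adjustment with linear differentiated demand. The sketch below iterates one such map with hypothetical parameter values; raising the adjustment speeds destabilizes the Nash equilibrium and eventually produces cycles and chaos.

```python
# Linear differentiated demand q_i = a - b*p_i + d*p_j, constant unit costs.
a, b, d = 10.0, 1.0, 0.5
c1, c2 = 1.0, 1.2
alpha1, alpha2 = 0.08, 0.10                  # adjustment speeds of the bounded-rational players

def step(p1, p2):
    """Myopic gradient adjustment: p_i <- p_i + alpha_i * p_i * dPi_i/dp_i."""
    dpi1 = a - 2 * b * p1 + d * p2 + b * c1  # marginal profit of player 1
    dpi2 = a - 2 * b * p2 + d * p1 + b * c2  # marginal profit of player 2
    return p1 + alpha1 * p1 * dpi1, p2 + alpha2 * p2 * dpi2

p1, p2 = 2.0, 2.0
for t in range(500):
    p1, p2 = step(p1, p2)

# With these speeds the map settles at the Nash equilibrium; larger
# alpha1/alpha2 destabilize it and produce cycles and chaos.
print("long-run prices:", round(p1, 3), round(p2, 3))
```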
11063 Investigating Activity Recognition Using 9-Axis Sensors and Filters in Wearable Devices
Authors: Jun Gil Ahn, Jong Kang Park, Jong Tae Kim
Abstract:
In this paper, we analyze the major components of activity recognition (AR) in wearable devices with 9-axis sensors and sensor fusion filters. 9-axis sensors commonly include a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer. We chose the Kalman filter and the Direction Cosine Matrix (DCM) filter as sensor fusion filters. We also construct sensor fusion data from the sensor data of each activity and evaluate AR classification accuracy using Naïve Bayes and SVM. According to the classification results, the DCM filter and a specific combination of sensing axes are more effective for AR in wearable devices when classifying walking, running, ascending, and descending.
Keywords: Accelerometer, activity recognition, direction cosine matrix filter, gyroscope, Kalman filter, magnetometer.
11062 Designing Information Systems in Education as Prerequisite for Successful Management Results
Authors: Vladimir Simovic, Matija Varga, Tonco Marusic
Abstract:
This research paper presents matrix technology models and examples of information systems in education (in the Republic of Croatia and in Germany) in support of business, education (learning and teaching), and e-learning. We researched and described the aims and objectives of the main process in education and technology, together with the main matrix classes of data. The paper gives an example of matrix technology with a detailed description of the processes related to specific data classes in education, and an example module that supports the processes 'Filling in the directory and the diary of work' and 'Evaluation'. Also, at the lower level of the processes, we researched and described all activities that take place within the lower-level process in education, as well as the characteristics and functioning of the modules 'Fill in the directory and the diary of work' and 'Evaluation'. For the analysis of the affinity between the aforementioned processes and/or sub-processes, we used our application model created in Visual Basic, which was based on an algorithm for analyzing the affinity between the observed processes and/or sub-processes.
Keywords: Designing, education management, information systems, matrix technology, process affinity.
11061 A Tuning Method for Microwave Filter via Complex Neural Network and Improved Space Mapping
Authors: Shengbiao Wu, Weihua Cao, Min Wu, Can Liu
Abstract:
This paper presents an intelligent tuning method for microwave filters based on a complex neural network and improved space mapping. The tuning process consists of two stages: initial tuning and fine tuning. At the beginning of the tuning, the return loss of the filter is transferred to the passband via the phase error. During the fine tuning, the phase shift caused by the transmission line and the higher-order modes is removed by curve fitting. Then, a Cauchy method based on the admittance parameter (Y-parameter) is used to extract the coupling matrix. The influence of the resonant cavity loss is eliminated during the parameter extraction. Using processed data pairs (the amount of screw variation and the corresponding variation of the coupling matrix), a tuning model is established with the complex neural network. With the improved space mapping algorithm, the mapping relationship between the actual model and the ideal model is established, and the amplitude and direction of the tuning are constantly updated. Finally, a tuning experiment on an eighth-order coaxial cavity filter shows that the proposed method performs well in terms of tuning time and tuning precision.
Keywords: Microwave filter, scattering parameter (S-parameter), coupling matrix, intelligent tuning.
11060 Agreement Options in Multi-person Decision on Optimizing High-Rise Building Columns
Authors: Christiono Utomo, Arazi Idrus, Madzlan Napiah, Mohd. Faris Khamidi
Abstract:
This paper presents a conceptual model of agreement options for negotiation support in multi-person decisions on optimizing high-rise building columns. The decision is complicated since many parties are involved in choosing a single alternative from a set of solutions, and there are different concerns caused by differing preferences, experiences, and backgrounds. The candidate building columns are referred to as agreement options, which are determined by identifying the possible decision-maker groups and then determining the optimal solution for each group. The group in this paper is based on the preferences of three decision makers: the designer, the programmer, and the construction manager. Decision techniques are applied to determine the relative value of the alternative solutions for performing the function. The Analytical Hierarchy Process (AHP) is applied to the decision process, and a game-theory-based agent system is used for coalition formation. An n-person cooperative game is represented by the set of all players, and the proposed coalition formation model enables each agent to individually select its allies or coalition. The model further emphasizes the importance of performance evaluation in the design process and of value-based decisions.
Keywords: Agreement options, coalition, group choice, game theory, building columns selection.
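To make the AHP step concrete, the snippet below derives priority weights from a pairwise comparison matrix via its principal eigenvector and computes Saaty's consistency ratio; the 3x3 judgments are invented for illustration and do not come from the paper.

```python
import numpy as np

# Pairwise comparison matrix for three criteria (illustrative judgments).
A = np.array([
    [1.0,    3.0,    5.0],
    [1 / 3., 1.0,    2.0],
    [1 / 5., 1 / 2., 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # priority weights of the criteria

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)       # consistency index
RI = 0.58                                  # Saaty's random index for n = 3
print("weights:", np.round(w, 3), " consistency ratio:", round(CI / RI, 3))
```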
11059 Probe of Crack Initiation at the Toe of Concrete Gravity Dam Using Numerical Analysis
Authors: M. S. Salimi, H. Kiamanesh, N. Hedayat
Abstract:
In this study, the process of crack propagation at the toe of a concrete gravity dam is investigated by applying the principles and criteria of linear elastic fracture mechanics. Earthquake conditions are simulated for three dam models with different geometrical conditions, with an empty reservoir under plane stress, using the fracture mechanics software FRANNC2D [1] to determine the fracture mechanics criteria. The results showed that, contrary to initial expectations, the simultaneous existence of fillets in both the toe and heel areas (model 3) did not decrease the maximum principal stress; on the contrary, the maximum principal stress increased, which raised the stress intensity factors, an undesirable outcome. On the other hand, the dam with a heel fillet showed the best behavior, because it decreased the maximum and minimum principal stresses and also decreased the stress intensity factors for the 1st and 2nd modes of the model.
Keywords: Stress intensity factor, concrete gravity dam, numerical analysis, geometry of toe.
11058 Software Product Quality Evaluation Model with Multiple Criteria Decision Making Analysis
Authors: C. Ardil
Abstract:
This paper presents a software product quality evaluation model based on the ISO/IEC 25010 quality model. The evaluation characteristics and sub-characteristics were identified from the ISO/IEC 25010 quality model. The multidimensional structure of the quality model is based on characteristics such as functional suitability, performance efficiency, compatibility, usability, reliability, security, maintainability, and portability, and their associated sub-characteristics. Random numbers are generated to establish the decision maker's importance weights for each sub-characteristic. Random numbers are also generated to establish the decision matrix of the decision maker's final scores for each software product against each sub-characteristic. Thus, objective criteria importance weights and index scores for the datasets are obtained from the random numbers. In the proposed model, five different software product quality evaluation datasets under three different weight vectors are applied to the multiple criteria decision analysis method preference analysis for reference ideal solution (PARIS) for comparison, together with a sensitivity analysis procedure. This study contributes to a better understanding of the application of MCDMA methods and the ISO/IEC 25010 quality model guidelines in the software product quality evaluation process.
Keywords: ISO/IEC 25010 quality model, multiple criteria decision making, multiple criteria decision making analysis, MCDMA, PARIS, Software Product Quality Evaluation Model, Software Product Quality Evaluation, Software Evaluation, Software Selection, Software.
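The experimental setup described above, random importance weights and a random decision matrix over the sub-characteristics, can be mimicked in a few lines; since the PARIS method itself is not detailed in the abstract, the sketch below substitutes a simple max-normalized weighted-sum aggregation as a stand-in for the ranking step.

```python
import numpy as np

rng = np.random.default_rng(3)
n_products, n_subchars = 5, 8              # five products, eight sub-characteristics

# Random importance weights and decision matrix, as described in the abstract.
w = rng.random(n_subchars)
w /= w.sum()                               # normalized weight vector
scores = rng.uniform(1, 10, size=(n_products, n_subchars))

# Max normalization of the decision matrix, then weighted-sum aggregation.
norm = scores / scores.max(axis=0)
overall = norm @ w
print("overall scores:", np.round(overall, 3))
print("ranking (best first):", np.argsort(-overall))
```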
11057 Phytoadaptation in Desert Soil Prediction Using Fuzzy Logic Modeling
Authors: S. Bouharati, F. Allag, M. Belmahdi, M. Bounechada
Abstract:
In the context of forecasting the ecological effects of desertification, the purpose of this study is to develop a predictive model of the growth and adaptation of species under arid environmental and bioclimatic conditions. The impact of climate change and desertification results from the combined effects of the magnitude and frequency of these phenomena. Since the data involved in the phytopathogenic process and in bacterial growth in arid soil arise in an uncertain environment because of their complexity, a suitable methodology for analyzing these variables is necessary. The basic principles of fuzzy logic are perfectly suited to this purpose. As input variables, we consider the physical parameters, soil type, nature of the bacteria, and the plant species concerned. The output variable is the adaptability of the species, expressed by its growth rate or extinction. In conclusion, we present possible adaptation strategies, with or without shifting plantation areas and selecting adequate natural vegetation.
Keywords: Climate changes, dry soil, Phytopathogenicity, Predictive model, Fuzzy logic.
11056 Day/Night Detector for Vehicle Tracking in Traffic Monitoring Systems
Authors: M. Taha, Hala H. Zayed, T. Nazmy, M. Khalifa
Abstract:
Recently, traffic monitoring has attracted the attention of computer vision researchers, and many algorithms have been developed to detect and track moving vehicles. In fact, vehicle tracking in daytime and in nighttime cannot be approached with the same techniques, due to the extremely different illumination conditions. Consequently, traffic-monitoring systems need a component that differentiates between daytime and nighttime scenes. In this paper, an HSV-based day/night detector is proposed for traffic monitoring scenes. The detector employs the hue histogram and the value histogram of the top half of the image frame. Experimental results show that extracting the brightness features along with the color features within the top region of the image is effective for classifying traffic scenes. In addition, the detector achieves high precision and recall rates and is feasible for real-time applications.
Keywords: Day/night detector, daytime/nighttime classification, image classification, vehicle tracking, traffic monitoring.
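A minimal version of the brightness test on the top half of the frame can be written with NumPy alone; here the HSV value channel is approximated as max(R, G, B) and a fixed threshold on its mean stands in for the trained decision rule, so the threshold and histogram binning are assumptions, not the paper's parameters.

```python
import numpy as np

def day_or_night(rgb_frame, threshold=0.45, bins=32):
    """Classify a frame as 'day' or 'night' from the value (brightness)
    histogram of its top half; rgb_frame is HxWx3 with values in [0, 1]."""
    top = rgb_frame[: rgb_frame.shape[0] // 2]
    value = top.max(axis=2)                            # HSV value channel
    hist, _ = np.histogram(value, bins=bins, range=(0.0, 1.0), density=True)
    label = "day" if value.mean() > threshold else "night"
    return label, hist

# Synthetic bright and dark frames as a sanity check.
bright = np.clip(0.7 + 0.1 * np.random.rand(240, 320, 3), 0.0, 1.0)
dark = 0.15 * np.random.rand(240, 320, 3)
print(day_or_night(bright)[0], day_or_night(dark)[0])
```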
11055 Analyzing Periurban Fringe with Rough Set
Authors: Benedetto Manganelli, Beniamino Murgante
Abstract:
The distinction among urban, periurban, and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis, and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining large amounts of data to build complex knowledge about the territory. Rough Set theory may be a useful method in this field. It represents a different mathematical approach to uncertainty, capturing indiscernibility: two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied in a case study, comparing the results achieved with the Map Algebra technique and with Spatial Rough Sets. The case study area, Potenza Province, is particularly suitable for the application of this theory, because it includes 100 municipalities with different numbers of inhabitants and morphological features.
Keywords: Land Classification, Map Algebra, Periurban Fringe, Rough Set, Urban Planning, Urban Sprawl.
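The core rough-set operations behind such a classification, indiscernibility classes and lower/upper approximations, fit in a few lines of plain Python; the toy municipality records below are invented solely to show the mechanics and do not come from the Potenza Province data.

```python
from collections import defaultdict

# Toy information table: municipality -> (population class, built-up density class).
table = {
    "A": ("small", "low"), "B": ("small", "low"), "C": ("small", "high"),
    "D": ("large", "high"), "E": ("large", "high"), "F": ("large", "low"),
}
periurban = {"B", "C", "F"}          # target set to describe, e.g. 'periurban'

# Indiscernibility classes: objects with identical attribute values.
classes = defaultdict(set)
for obj, attrs in table.items():
    classes[attrs].add(obj)

lower, upper = set(), set()
for c in classes.values():
    if c <= periurban:
        lower |= c                   # class entirely inside the target set
    if c & periurban:
        upper |= c                   # class overlapping the target set

print("indiscernibility classes:", list(classes.values()))
print("lower approximation (certainly periurban):", lower)
print("upper approximation (possibly periurban): ", upper)
```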
11054 Trajectory-Based Modified Policy Iteration
Abstract:
This paper presents a new problem-solving approach that is able to generate an optimal policy for finite-state stochastic sequential decision-making problems with high data efficiency. The proposed algorithm iteratively builds and improves an approximate Markov Decision Process (MDP) model along with approximate cost-to-go values by generating finite-length trajectories through the state space. The approach creates a synergy between the evolving approximate model and the approximate cost-to-go values to produce a sequence of improving policies that finally converges to the optimal policy through an intelligent and structured search of the policy space. The approach modifies the policy update step of policy iteration so as to achieve speedy and stable convergence to the optimal policy. We apply the algorithm to a non-holonomic mobile robot control problem and compare its performance with other Reinforcement Learning (RL) approaches, e.g., (a) Q-learning, (b) Watkins' Q(λ), and (c) SARSA(λ).
Keywords: Markov Decision Process (MDP), Mobile robot, Policy iteration, Simulation.
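The method builds on classical policy iteration over an (approximate) MDP model. For reference, a textbook policy-iteration loop on a tiny, fully known MDP is sketched below in NumPy; the trajectory-based model building and the modified update step of the paper are not reproduced, and the transition and reward values are invented.

```python
import numpy as np

# Tiny known MDP: transition tensor P[a, s, s'] and rewards R[a, s].
P = np.array([
    [[0.8, 0.2, 0.0], [0.0, 0.6, 0.4], [0.1, 0.0, 0.9]],   # action 0
    [[0.1, 0.9, 0.0], [0.3, 0.0, 0.7], [0.0, 0.2, 0.8]],   # action 1
])
R = np.array([[1.0, 0.0, 2.0],
              [0.5, 1.5, 0.0]])
gamma = 0.95
n_actions, n_states = R.shape

policy = np.zeros(n_states, dtype=int)
for _ in range(100):
    # Policy evaluation: solve (I - gamma * P_pi) V = R_pi exactly.
    P_pi = P[policy, np.arange(n_states)]
    R_pi = R[policy, np.arange(n_states)]
    V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, R_pi)
    # Policy improvement: greedy with respect to the action values.
    Q = R + gamma * P @ V
    new_policy = Q.argmax(axis=0)
    if np.array_equal(new_policy, policy):
        break
    policy = new_policy

print("optimal policy:", policy, " state values:", np.round(V, 2))
```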
11053 Kinetic Model and Simulation Analysis for Propane Dehydrogenation in an Industrial Moving Bed Reactor
Authors: Chin S. Y., Radzi, S. N. R., Maharon, I. H., Shafawi, M. A.
Abstract:
A kinetic model for propane dehydrogenation in an industrial moving bed reactor is developed based on the reported reaction scheme. The kinetic parameters and activity constant are fine-tuned with several sets of balanced plant data. Plant data at different operating conditions are used to validate the model, and the results show good agreement between the model predictions and plant observations in terms of the amount of the main product, propylene. A simulation analysis of key variables affecting process performance, such as the inlet temperature of each reactor (Tinrx) and the hydrogen to total hydrocarbon ratio (H2/THC), is performed to identify the operating condition that maximizes propylene production. Within the range of operating conditions applied in the present study, the operating condition that maximizes propylene production at the same weighted average inlet temperature (WAIT) is ΔTinrx1 = -2, ΔTinrx2 = +1, ΔTinrx3 = +1, ΔTinrx4 = +2, and ΔH2/THC = -0.02. Under this condition, the surplus propylene produced is 7.07 tons/day compared with the base case.
Keywords: Kinetic model, dehydrogenation, simulation, modeling, propane.
11052 Wavelet Feature Selection Approach for Heart Murmur Classification
Authors: G. Venkata Hari Prasad, P. Rajesh Kumar
Abstract:
Phonocardiography is important in the appraisal of congenital heart disease and pulmonary hypertension, as it reflects the duration of right ventricular systoles. The systolic murmur in patients with an intra-cardiac shunt decreases as pulmonary hypertension develops and may eventually disappear completely as the pulmonary pressure reaches systemic level. Phonocardiography and auscultation are non-invasive, low-cost, and accurate methods to assess heart disease. In this work, an objective signal processing tool that extracts information from the phonocardiography signal using wavelets is proposed to classify a murmur as normal or abnormal. Since the feature vector is large, a Binary Particle Swarm Optimization (PSO) with mutation is proposed for feature selection. The extracted features improve the classification accuracy and were tested across various classifiers, including Naïve Bayes, kNN, C4.5, and SVM.
Keywords: Phonocardiography, Coiflet, Feature selection, Particle Swarm Optimization.
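A minimal binary PSO with bit-flip mutation for feature selection could look like the sketch below; the fitness is 3-fold kNN accuracy on synthetic stand-in features with a small penalty on the number of selected features, since the actual Coiflet wavelet features of phonocardiograms are not available here, and all hyperparameters are illustrative.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
# Synthetic stand-in for wavelet features: only the first 6 of 30 are informative.
n, d = 200, 30
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d))
X[:, :6] += 1.5 * y[:, None]

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(5), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.mean()            # small penalty on feature count

n_particles, iters, w, c1, c2, pm = 20, 30, 0.7, 1.5, 1.5, 0.02
pos = rng.integers(0, 2, size=(n_particles, d))
vel = rng.normal(scale=0.1, size=(n_particles, d))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, d)), rng.random((n_particles, d))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))          # sigmoid transfer function
    pos = (rng.random((n_particles, d)) < prob).astype(int)
    flip = rng.random((n_particles, d)) < pm   # bit-flip mutation
    pos = np.where(flip, 1 - pos, pos)
    fits = np.array([fitness(p) for p in pos])
    better = fits > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fits[better]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", np.flatnonzero(gbest), " fitness:", round(pbest_fit.max(), 3))
```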