Search results for: Approximation Distribution Reductions in Multigranulation Rough Set Model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9129

4209 An Efficient Adaptive Thresholding Technique for Wavelet Based Image Denoising

Authors: D. Gnanadurai, V. Sadasivam

Abstract:

This framework describes a computationally efficient and adaptive threshold estimation method for image denoising in the wavelet domain, based on Generalized Gaussian Distribution (GGD) modeling of subband coefficients. In the proposed method, the threshold estimate is obtained by analysing statistical parameters of the wavelet subband coefficients such as the standard deviation, arithmetic mean and geometric mean. The noisy image is first decomposed into several levels to obtain different frequency bands. Soft thresholding is then used to suppress the noisy coefficients, with the optimum threshold value fixed by the proposed method. Experimental results on several test images show that the method yields significantly superior image quality and better Peak Signal to Noise Ratio (PSNR). To demonstrate its efficiency in image denoising, the method is compared with various denoising approaches such as the Wiener filter, average filter, VisuShrink and BayesShrink.
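As a concrete illustration of wavelet-domain soft thresholding, the sketch below applies a BayesShrink-style subband threshold (not the authors' statistical-parameter estimator) using the PyWavelets package; the wavelet choice and decomposition level are illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets

def denoise(image, wavelet="db8", level=3):
    coeffs = pywt.wavedec2(image, wavelet, level=level)      # [cA, (cH, cV, cD), ...]
    # Noise std estimated from the finest diagonal subband (median absolute deviation).
    sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    new_coeffs = [coeffs[0]]                                  # keep the approximation band
    for (cH, cV, cD) in coeffs[1:]:
        bands = []
        for band in (cH, cV, cD):
            # BayesShrink-style threshold T = sigma_n^2 / sigma_x for each detail subband.
            sigma_x = np.sqrt(max(band.var() - sigma_n**2, 1e-12))
            T = sigma_n**2 / sigma_x
            bands.append(pywt.threshold(band, T, mode="soft"))
        new_coeffs.append(tuple(bands))
    return pywt.waverec2(new_coeffs, wavelet)
```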

Keywords: Wavelet Transform, Gaussian Noise, Image Denoising, Filter Banks and Thresholding.

4208 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan

Abstract:

The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities persists as a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process because there is no standardization of IoT products and little or no historical data are stored on the devices. This paper highlights why IoT forensics is a unique undertaking and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects in IoT forensics. The model analyses the effectiveness of the forensic investigation process versus the admissibility and integrity of the evidence, taking into account user privacy and the providers’ compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes, and hence resolve the issues of data protection (privacy and confidentiality).

Keywords: Cloud forensics, data protection laws, GDPR, IoT forensics, machine learning.

4207 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru

Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar

Abstract:

Nowadays, Heritage Building Information Modeling (HBIM) is considered an efficient tool to represent and manage information on Cultural Heritage (CH). The basis of this tool relies on a 3D model generally obtained from a Cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired Level of Development (LOD), Level of Information (LOI) and Grade of Generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural elements (e.g., regular walls, columns, floors, wall openings) and architectural elements (e.g., cornices, moldings and other minor details) using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills and domes. Finally, semantic information (e.g., materials, typology, state of conservation) and pathologies are added within the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple-to-apply process that ensures adequate LOD, LOI and GOG levels. In addition, the easy implementation of the method and the use of only one BIM software package, with its respective plugin for the scan-to-BIM modeling process, mean that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.

Keywords: Cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit.

4206 Real-Time Specific Weed Recognition System Using Histogram Analysis

Authors: Irshad Ahmad, Abdul Muhamin Naeem, Muhammad Islam

Abstract:

Information on weed distribution within the field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be feasible. This paper deals with the development of an algorithm for a real-time specific weed recognition system based on histogram analysis of an image, which is used for weed classification. The algorithm is specifically developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed system has been tested on weeds in the lab, and the tests have shown the system to be very effective in weed identification. Further, the results show very reliable performance on images of weeds taken under varying field conditions. The analysis of the results shows over 95 percent classification accuracy over 140 sample images (broad and narrow), with 70 samples from each category of weeds.
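The sketch below is only an illustrative stand-in for the paper's histogram analysis: it segments vegetation with an excess-green index and labels the image broad- or narrow-leaf from the per-column pixel-count histogram; both threshold values are hypothetical.

```python
import numpy as np

def classify_weed(rgb, green_excess_thresh=20, width_thresh=15.0):
    r, g, b = (rgb[..., k].astype(float) for k in range(3))
    mask = (2 * g - r - b) > green_excess_thresh      # excess-green vegetation mask
    col_hist = mask.sum(axis=0)                       # vegetation pixels per image column
    occupied = col_hist[col_hist > 0]
    if occupied.size == 0:
        return "no weed"
    # Broad-leaf weeds give thick blobs (many pixels per occupied column);
    # narrow-leaf grasses give thin strips (few pixels per occupied column).
    return "broad" if occupied.mean() > width_thresh else "narrow"
```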

Keywords: Image Processing, real-time recognition, Weed detection.

4205 Bayesian Inference for Phase Unwrapping Using Conjugate Gradient Method in One and Two Dimensions

Authors: Yohei Saika, Hiroki Sakaematsu, Shota Akiyama

Abstract:

We investigated the statistical performance of Bayesian inference using maximum entropy and MAP estimation for several models that approximate wave-fronts in remote sensing using SAR interferometry. Using Monte Carlo simulation for a set of wave-fronts generated by an assumed true prior, we found that the method of maximum entropy realized the optimal performance around the Bayes-optimal conditions when using the model of the true prior and the likelihood representing the optical measurement due to the interferometer. We also found that MAP estimation, regarded as a deterministic limit of maximum entropy, achieved almost the same performance as the Bayes-optimal solution for the set of wave-fronts. We then clarified that MAP estimation carried out phase unwrapping perfectly without using prior information, and also that it realized accurate phase unwrapping using the conjugate gradient (CG) method, provided the model of the true prior was assumed appropriately.
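As a minimal sketch of CG-based phase unwrapping (the unweighted least-squares formulation, which corresponds to MAP estimation with a quadratic smoothness prior rather than the authors' exact setup), the code below unwraps a synthetic 2-D wrapped phase by solving the resulting Poisson-type system with SciPy's conjugate gradient solver.

```python
import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import cg

def wrap(p):
    return (p + np.pi) % (2 * np.pi) - np.pi

def unwrap_ls(psi):
    ny, nx = psi.shape
    dx = wrap(np.diff(psi, axis=1))                 # wrapped phase gradients
    dy = wrap(np.diff(psi, axis=0))
    rho = np.zeros_like(psi)                        # divergence of the wrapped gradients
    rho[:, :-1] += dx;  rho[:, 1:] -= dx
    rho[:-1, :] += dy;  rho[1:, :] -= dy
    def lap1d(n):                                   # 1-D Laplacian with Neumann ends
        main = -2.0 * np.ones(n); main[0] = main[-1] = -1.0
        return diags([np.ones(n - 1), main, np.ones(n - 1)], [-1, 0, 1])
    A = kron(identity(ny), lap1d(nx)) + kron(lap1d(ny), identity(nx))
    phi, _ = cg(-A, -rho.ravel(), atol=1e-8)        # CG on the (PSD) normal equations
    return phi.reshape(psi.shape)

true_phase = np.add.outer(np.linspace(0, 12, 64), np.linspace(0, 8, 64))
residual = unwrap_ls(wrap(true_phase)) - true_phase
print(np.ptp(residual))                             # ~0: recovered up to an additive constant
```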

Keywords: Bayesian inference using maximum entropy, MAP estimation using conjugate gradient method, SAR interferometry.

4204 A Study on Early Prediction of Fault Proneness in Software Modules using Genetic Algorithm

Authors: Parvinder S. Sandhu, Sunil Khullar, Satpreet Singh, Simranjit K. Bains, Manpreet Kaur, Gurvinder Singh

Abstract:

The fault-proneness of a software module is the probability that the module contains faults. To predict the fault-proneness of modules, different techniques have been proposed, including statistical methods, machine learning techniques, neural network techniques and clustering techniques. The aim of the proposed study is to explore whether metrics available in the early lifecycle (i.e., requirement metrics), metrics available in the late lifecycle (i.e., code metrics), and the combination of the two can be used to identify fault-prone modules using a Genetic Algorithm technique. This approach has been tested with real-time defect datasets of NASA software projects written in the C programming language. The results show that the fusion of requirement and code metrics gives the best prediction model for detecting faults, compared with the commonly used code-based model.
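A bare-bones GA sketch in this spirit (illustrative encoding, not the authors' one): each chromosome holds the weights and threshold of a linear classifier over the combined requirement and code metrics, and fitness is classification accuracy on training data; the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(ind, X, y):
    pred = (X @ ind[:-1] > ind[-1]).astype(int)      # last gene is the decision threshold
    return (pred == y).mean()                        # training classification accuracy

def ga_train(X, y, pop_size=40, gens=100, mut_sigma=0.1):
    n_genes = X.shape[1] + 1
    pop = rng.normal(size=(pop_size, n_genes))
    for _ in range(gens):
        fit = np.array([fitness(ind, X, y) for ind in pop])
        new_pop = [pop[fit.argmax()].copy()]                     # elitism
        while len(new_pop) < pop_size:
            parents = []
            for _ in range(2):                                   # binary tournament selection
                i, j = rng.integers(pop_size, size=2)
                parents.append(pop[i] if fit[i] >= fit[j] else pop[j])
            cut = rng.integers(1, n_genes)                       # one-point crossover
            child = np.concatenate([parents[0][:cut], parents[1][cut:]])
            child += rng.normal(scale=mut_sigma, size=n_genes)   # Gaussian mutation
            new_pop.append(child)
        pop = np.array(new_pop)
    fit = np.array([fitness(ind, X, y) for ind in pop])
    return pop[fit.argmax()]

# Synthetic example: 200 modules, 5 requirement + 5 code metrics, hypothetical labels.
X = rng.normal(size=(200, 10))
y = (X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=200) > 0).astype(int)
best = ga_train(X, y)
print("training accuracy:", fitness(best, X, y))
```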

Keywords: Genetic Algorithm, Fault Proneness, Software Fault and Software Quality.

4203 Proportionally Damped Finite Element State-Space Model of Composite Laminated Plate with Localized Interface Degeneration

Authors: Shi Qi Koo, Ahmad Beng Hong Kueh

Abstract:

In the present work, a finite element formulation is developed to investigate the effects of localized interfacial degeneration on the dynamic behavior of a [90°/0°] laminated composite plate, employing the state-space technique. The stiffness of the laminate is determined by assembling the stiffnesses of sub-elements. This includes the introduction of an interface layer adopting a virtually zero-thickness formulation to model the interfacial degeneration. In addition, the kinematically consistent mass matrix and proportional damping have been formulated to complete the free-vibration governing expression. To simulate the interfacial degeneration of the laminate, the degenerated areas are defined from the center propagating outwards in a localized manner. It is found that the natural frequency, damped frequency and damping ratio of the plate decrease as the degenerated area of the interface increases. On the contrary, the loss factor increases correspondingly.
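A minimal sketch of the state-space step described above, assuming proportional (Rayleigh) damping C = αM + βK: the assembled FE matrices are cast into a first-order state matrix and the damped frequencies and damping ratios are read off its eigenvalues. The matrices and coefficients below are placeholders, not the paper's laminate model.

```python
import numpy as np

def modal_properties(M, K, alpha=0.05, beta=1e-4):
    C = alpha * M + beta * K                         # proportional damping matrix
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K,        -Minv @ C]])    # state matrix of x' = A x
    lam = np.linalg.eigvals(A)
    lam = lam[np.imag(lam) > 0]                      # one of each complex-conjugate pair
    wn = np.abs(lam)                                 # undamped natural frequencies (rad/s)
    zeta = -np.real(lam) / wn                        # modal damping ratios
    wd = np.imag(lam)                                # damped frequencies (rad/s)
    return wn, wd, zeta

# Tiny 2-DOF placeholder system.
M = np.diag([2.0, 1.0])
K = np.array([[600.0, -200.0], [-200.0, 200.0]])
print(modal_properties(M, K))
```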

Keywords: Dynamic finite element, localized interface degeneration, proportional damping, state-space modeling.

4202 The Pressure Losses in the Model of Human Lungs

Authors: Michaela Chovancova, Pavel Niedoba

Abstract:

For the treatment of acute and chronic lung diseases, it is preferable to deliver medicaments by inhalation, so that the drug reaches the tracheobronchial tree directly. This route delivers the medicament straight to the site of action, providing rapid onset of action and maximum efficiency. The transport of aerosol particles into a particular part of the lung is influenced by their size, the anatomy of the lungs, the breathing pattern and the airway resistance. This article deals with the calculation of airway resistance in the Horsfield lung model. It solves the problem of determining the pressure losses at bifurcations and thus defines the pressure drop at a given location in the bronchial tree. The obtained data will be used as boundary conditions for the transport of aerosol particles in the central part of the bronchial tree, realized by a Computational Fluid Dynamics (CFD) approach. The results obtained from the CFD simulation will allow us to provide information on the required particle size and the optimal inhalation technique for particle transport into a particular part of the lung.
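A minimal sketch of the pressure-loss calculation described above, using the laminar Poiseuille resistance of each airway; it assumes a symmetric-branching simplification rather than the full asymmetric Horsfield geometry, and the airway dimensions and flow rate are illustrative.

```python
import numpy as np

MU = 1.8e-5                                      # dynamic viscosity of air [Pa s]

def poiseuille_resistance(length, radius):
    """Laminar-flow resistance of one cylindrical airway, R = 8*mu*L/(pi*r^4)."""
    return 8.0 * MU * length / (np.pi * radius**4)

def tree_pressure_drop(lengths, radii, flow):
    """Pressure drop [Pa] down a tree where generation g holds 2**g identical branches in parallel."""
    dp = 0.0
    for g, (L, r) in enumerate(zip(lengths, radii)):
        R_single = poiseuille_resistance(L, r)
        dp += (R_single / 2**g) * flow           # parallel branches of a generation share the flow
    return dp

# Trachea plus three generations, 0.5 L/s inspiratory flow (illustrative dimensions in metres).
print(tree_pressure_drop(lengths=[0.12, 0.048, 0.019, 0.008],
                         radii=[0.009, 0.0061, 0.0042, 0.0028],
                         flow=0.5e-3))
```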

Keywords: Human lungs, bronchial tree, pressure losses, airway resistance, flow, breathing.

4201 Optimized Fuzzy Control by Particle Swarm Optimization Technique for Control of CSTR

Authors: Saeed Vaneshani, Hooshang Jazayeri-Rad

Abstract:

Fuzzy logic control (FLC) systems have been tested in many technical and industrial applications as a useful modeling tool that can handle the uncertainties and nonlinearities of modern control systems. The main drawback of FLC methodologies in the industrial environment is the difficulty of selecting the optimum tuning parameters. In this paper, a method is proposed for finding the optimum membership functions of a fuzzy system using the particle swarm optimization (PSO) algorithm. A synthetic algorithm combining fuzzy logic control and PSO is used to design a controller for a continuous stirred tank reactor (CSTR) with the aim of achieving accurate and acceptable results. To exhibit the effectiveness of the proposed algorithm, it is used to optimize the Gaussian membership functions of the fuzzy model of a nonlinear CSTR system as a case study. The optimized membership functions (MFs) provide better performance than a fuzzy model of the same system in which the MFs are heuristically defined.
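A bare-bones PSO sketch of the tuning step: the particles encode the centres and widths of the controller's Gaussian membership functions, and `cstr_tracking_cost` is a hypothetical stand-in for the closed-loop CSTR simulation that would supply the real objective (e.g., integral of squared tracking error).

```python
import numpy as np

rng = np.random.default_rng(1)

def cstr_tracking_cost(params):
    # Placeholder objective: in practice this would simulate the fuzzy-controlled CSTR
    # with the given MF parameters and return the integral of squared tracking error.
    return np.sum((params - 1.0) ** 2)

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-3, 3, size=(n_particles, dim))     # particle positions = MF parameters
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = np.array([cost(p) for p in x])
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

best_mf_params = pso(cstr_tracking_cost, dim=6)   # e.g. 3 Gaussian MFs x (centre, width)
print(best_mf_params)
```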

Keywords: Continuous stirred tank reactor (CSTR), fuzzy logic control (FLC), membership function (MF), particle swarm optimization (PSO).

4200 Model Parameters Estimating on Lyman–Kutcher–Burman Normal Tissue Complication Probability for Xerostomia on Head and Neck Cancer

Authors: Tsair-Fwu Lee, Hui-Min Ting, Pei-Ju Chao, Jing-Chuan Jiang, Min-Yuan Chao, Wen-Cheng Chen, Long-Chang Chen, Jia-Ming Wu

Abstract:

The purpose of this study is to derive parameter estimates for the Lyman–Kutcher–Burman (LKB) normal tissue complication probability (NTCP) model of the parotid gland (xerostomia), using analysis of scintigraphy assessments and quality of life (QoL) questionnaires. In total, 31 patients with head-and-neck (HN) cancer were enrolled. Salivary excretion factor (SEF) and EORTC QLQ-H&N35 questionnaire datasets were used for the NTCP modeling to describe the incidence of grade 4 xerostomia. Assuming n = 1, the fitted NTCP parameters are TD50 = 43.6 Gy and m = 0.18 for the SEF analysis, and TD50 = 44.1 Gy and m = 0.11 for the QoL measurements. The SEF and QoL datasets validate the Quantitative Analyses of Normal Tissue Effects in the Clinic (QUANTEC) guidelines well, resulting in negative predictive values (NPVs) of 100% for both datasets, and suggest that the QUANTEC 25/20 Gy gland-sparing guidelines are suitable for clinical use in this HN cohort to effectively avoid xerostomia.
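For reference, a minimal sketch of the LKB NTCP formula with the SEF-fitted values quoted above (n = 1, TD50 = 43.6 Gy, m = 0.18); the dose-volume histogram input is illustrative.

```python
import numpy as np
from math import erf, sqrt

def lkb_ntcp(doses, volumes, n=1.0, td50=43.6, m=0.18):
    """doses [Gy] and (relative) volumes of the DVH bins for the parotid gland."""
    volumes = np.asarray(volumes, dtype=float) / np.sum(volumes)
    geud = np.sum(volumes * np.asarray(doses, dtype=float) ** (1.0 / n)) ** n  # generalized EUD
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))           # standard normal CDF of t

print(lkb_ntcp(doses=[10, 25, 40, 55], volumes=[0.3, 0.3, 0.2, 0.2]))
```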

Keywords: HN, NTCP, SEF, QoL, QUANTEC

4199 Software Engineering Inspired Cost Estimation for Process Modelling

Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller

Abstract:

Up to this point, business process management projects in general, and business process modelling projects in particular, have not been able to rely on a practical and scientifically validated method to estimate cost and effort. In particular, the model development phase is not covered by a cost estimation method or model; later phases of business process modelling, starting with implementation, are covered by initial solutions discussed in the literature. This article proposes a method to fill this gap by deriving a cost estimation method from available methods in a similar domain, namely software development or software engineering. As we show, software development is closely similar to process modelling. After the proposition of this method, different ideas for further analysis and validation of the method are proposed. We derive the method from COCOMO II and Function Point analysis, which are established methods of effort estimation in the domain of software development. For this we lay out similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle.
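For orientation, a short sketch of the COCOMO II effort equation the authors take as a starting point, PM = A · Size^E · ΠEM with E = B + 0.01·ΣSF and the published constants A = 2.94, B = 0.91; mapping the size measure onto a process-model size is the article's proposal, and the numbers in the example are purely illustrative.

```python
def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    """COCOMO II post-architecture effort in person-months."""
    E = B + 0.01 * sum(scale_factors)       # exponent from the five scale factors
    pm = A * (ksloc ** E)
    for em in effort_multipliers:           # cost drivers
        pm *= em
    return pm

# Illustrative inputs: 10 KSLOC-equivalent size, roughly nominal ratings.
print(cocomo2_effort(ksloc=10, scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                     effort_multipliers=[1.0, 1.1, 0.9]))
```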

Keywords: Cost Estimation, Effort Estimation, Process Modelling, Business Process Management, COCOMO.

4198 A Fuzzy Swarm Optimized Approach for Piece Selection in Bit Torrent Like Peer to Peer Network

Authors: M. Padmavathi, R. M. Suresh

Abstract:

Every machine plays the roles of client and server simultaneously in a peer-to-peer (P2P) network. Though a P2P network has many advantages over traditional client-server models regarding efficiency and fault-tolerance, it also faces additional security threats. Users and IT administrators should be aware of risks from malicious code propagation, the legality of downloaded content, and vulnerabilities in P2P software. Security and preventative measures are a must to protect networks from potential sensitive information leakage and security breaches. BitTorrent is a popular and scalable P2P file distribution mechanism which successfully distributes large files quickly and efficiently without overloading the origin server. BitTorrent achieved excellent upload utilization according to measurement studies, but it also raised many questions regarding utilization in settings other than those measured, fairness, and the choice of BitTorrent's mechanisms. This work proposes a block (piece) selection technique using fuzzy ACO, with the optimal rules selected using ACO.

Keywords: Ant Colony Optimization (ACO), Bit Torrent, Download time, Peer-to-Peer (P2P) network, Performance.

4197 Depth-Averaged Modelling of Erosion and Sediment Transport in Free-Surface Flows

Authors: Thomas Rowan, Mohammed Seaid

Abstract:

A fast finite volume solver for multi-layered shallow water flows with mass exchange and an erodible bed is developed. This enables the user to solve a number of complex sediment-based problems including (but not limited to) dam-break over an erodible bed, recirculation currents and bed evolution, as well as levee and dyke failure. This research develops methodologies crucial to the understanding of multi-sediment fluvial mechanics and waterway design. In this model mass exchange between the layers is allowed and, in contrast to previous models, sediment and fluid are able to transfer between layers. In the current study we use a two-step finite volume method to avoid the solution of the Riemann problem. Entrainment and deposition rates are calculated for the first time in a model of this nature. In the first step the governing equations are rewritten in a non-conservative form and the intermediate solutions are calculated using the method of characteristics. In the second stage, the numerical fluxes are reconstructed in conservative form and are used to calculate a solution that satisfies the conservation property. This method is found to be considerably faster than comparable finite volume methods, and it also exhibits good shock capturing. For most entrainment and deposition equations a bed-level concentration factor is used; this leads to inaccuracies in both near-bed concentration and total scour. To account for diffusion, as no vertical velocities are calculated, a capacity-limited diffusion coefficient is used. The additional advantage of this multilayer approach is that, unlike in single-layer models, the bottom-layer fluid velocity varies; this dramatically reduces erosion, which is often overestimated in simulations of this nature using single-layer flows. The model is used to simulate a standard dam break. In the dam-break simulation, as expected, the number of fluid layers used creates variation in the resultant bed profile, with more layers giving a higher deviation in fluid velocity. These results showed a marked variation in erosion profiles from standard models. Overall, the model provides new insight into the problems presented, at minimal computational cost.

Keywords: Erosion, finite volume method, sediment transport, shallow water equations.

4196 Preparation and Investigation of Photocatalytic Properties of ZnO Nanocrystals: Effect of Operational Parameters and Kinetic Study

Authors: N. Daneshvar, S. Aber, M. S. Seyed Dorraji, A. R. Khataee, M. H. Rasoulifard

Abstract:

ZnO nanocrystals with a mean diameter of 14 nm have been prepared by a precipitation method and examined as a photocatalyst for the UV-induced degradation of the insecticide diazinon, as a representative organic pollutant, in aqueous solution. The effects of various parameters, such as illumination time, the amount of photocatalyst, initial pH and initial concentration of insecticide, on the photocatalytic degradation of diazinon were investigated to find the desired conditions. The desired parameters were also tested for the treatment of real water containing the insecticide. The photodegradation efficiency of diazinon was compared between commercial and prepared ZnO nanocrystals. The results indicated that the UV/ZnO process using the prepared nanocrystalline ZnO offered better electrical energy efficiency and quantum yield than commercial ZnO. The present study, on the basis of the Langmuir-Hinshelwood mechanism, yielded a pseudo first-order kinetic model with a rate constant of surface reaction equal to 0.209 mg l-1 min-1 and an adsorption equilibrium constant of 0.124 l mg-1.
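A minimal sketch applying the Langmuir-Hinshelwood rate law with the constants reported above (k_r = 0.209 mg l-1 min-1, K = 0.124 l mg-1) to predict the decay of the diazinon concentration; the initial concentration and the simple Euler time stepping are illustrative choices.

```python
import numpy as np

def lh_decay(c0, k_r=0.209, K=0.124, t_end=150.0, dt=0.1):
    """Integrate -dC/dt = k_r*K*C / (1 + K*C); returns (times [min], concentrations [mg/L])."""
    times = np.arange(0.0, t_end + dt, dt)
    c = np.empty_like(times)
    c[0] = c0
    for i in range(1, times.size):
        rate = k_r * K * c[i - 1] / (1.0 + K * c[i - 1])
        c[i] = max(c[i - 1] - rate * dt, 0.0)    # explicit Euler step, clipped at zero
    return times, c

t, conc = lh_decay(c0=20.0)                      # 20 mg/L initial diazinon (illustrative)
print(conc[-1])
```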

Keywords: Zinc oxide nanopowder, Electricity consumption, Quantum yield, Nanoparticles, Photodegradation, Kinetic model, Insecticide.

4195 Thermohydraulic Performance of Double Flow Solar Air Heater with Corrugated Absorber

Authors: S. P. Sharma, Som Nath Saha

Abstract:

This paper deals with the analytical investigation of the thermal and thermohydraulic performance of double flow solar air heaters with corrugated and flat plate absorbers. A mathematical model of the double flow solar air heater is presented, and a computer program in the C++ language is developed to estimate the outlet air temperature for the evaluation of thermal and thermohydraulic efficiency by solving the governing equations numerically, using relevant correlations for the heat transfer coefficients. The results obtained from the mathematical model are compared with the available experimental results, and the agreement is found to be reasonably good. The results show that double flow solar air heaters have higher efficiency than the conventional solar air heater, and that the double flow heater with a corrugated absorber is superior to the flat plate double flow solar air heater. It is also observed that the thermal efficiency increases with increasing mass flow rate; however, the thermohydraulic efficiency increases with mass flow rate only up to a certain limit, attains its maximum value, and thereafter decreases sharply.
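A minimal sketch of the two performance measures being compared: thermal efficiency, and an effective (thermohydraulic) efficiency that discounts the pumping power. The conversion factor C_f ≈ 0.18 is a commonly used value rather than one taken from this paper, and all numbers in the example are illustrative.

```python
def thermal_efficiency(m_dot, T_in, T_out, I, area, cp=1005.0):
    """Useful heat gain divided by incident solar radiation on the collector."""
    return m_dot * cp * (T_out - T_in) / (I * area)

def thermohydraulic_efficiency(m_dot, T_in, T_out, I, area, dp,
                               rho=1.1, cp=1005.0, C_f=0.18):
    q_useful = m_dot * cp * (T_out - T_in)   # useful heat gain [W]
    p_pump = m_dot * dp / rho                # fan/pumping power [W]
    return (q_useful - p_pump / C_f) / (I * area)

# Illustrative operating point: 0.03 kg/s, 25 K rise, 900 W/m^2 on 2 m^2, 60 Pa pressure drop.
print(thermal_efficiency(0.03, 300.0, 325.0, 900.0, 2.0))
print(thermohydraulic_efficiency(0.03, 300.0, 325.0, 900.0, 2.0, dp=60.0))
```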

Keywords: Corrugated absorber, double flow, solar air heater, thermohydraulic efficiency.

4194 Mathematical Model of Smoking Time Temperature Effect on Ribbed Smoked Sheets Quality

Authors: Rifah Ediati, Jajang

Abstract:

The quality of Ribbed Smoked Sheets (RSS) is primarily based on color, dryness, and the presence or absence of fungus and bubbles. This quality is strongly influenced by the drying and fumigation process, namely the smoking process. Smoking at high temperature for a long time results in scorched, dark brown sheets, whereas a temperature that is too low, or too slow a drying rate, results in less mature sheets and the growth of fungus. It is therefore necessary to find the time and temperature for optimum sheet quality. Moreover, unmonitored heat and mass transfer during the smoking process leads to high losses in the energy balance. This research aims to generate a simple empirical mathematical model describing the effect of smoking time and temperature on RSS quality in terms of color, water content, fungus and bubbles. The second goal of the study was to analyze the energy balance during the smoking process. An experimental study was conducted by measuring the temperature, residence time and quality parameters of 16 sheet samples in smoking rooms. Data for the energy consumption balance, such as the mass of fuel wood, mass of sheets being smoked, construction temperature, ambient temperature and relative humidity, were taken directly throughout the smoking process. The mathematical model correlating smoking temperature and time with color was found to be Color = -169 - 0.184 T4 - 0.193 T3 - 0.160 T2 + 0.405 T1 + 0.388 t1 + 3.11 t2 + 3.92 t3 + 0.215 t4, with an R-squared of 50.8%, and with moisture, Moisture = -1.40 - 0.00123 T4 + 0.00032 T3 + 0.00260 T2 - 0.00292 T1 - 0.0105 t1 + 0.0290 t2 + 0.0452 t3 + 0.00061 t4, with an R-squared of 49.9%. The smoking room energy analysis found that the useful energy was 27.8% and the energy stored in the construction material was 7.3%, while losses in wood combustion conversion, ventilation and others amounted to 16.6%. The energy flowing out through contact of the construction material with the ambient air was found to be the largest contributor to energy losses, reaching 48.3%.

Keywords: RSS quality, temperature, time, smoking room, energy

4193 The Effect of Failure Rate on Repair and Maintenance Costs of Four Agricultural Tractor Models

Authors: Fatemeh Afsharnia, Mohammad Amin Asoodar, Abbas Abdeshahi

Abstract:

In the economic evaluation literature, although the combination of variables such as repair and maintenance costs and accumulated hours of use has been widely considered in determining the optimum life of a tractor, no investigation has examined the influence of failure rate on repair and maintenance costs. In this study, the owners of three hundred tractors, comprising Massey Ferguson, John Deere and Universal models, were interviewed across five regions of Khouzestan Province. A regression model was used to predict the tractors' annual repair and maintenance costs based on failure rate. Results showed that the largest share of annual repair and maintenance costs occurred in engine parts for the MF285, JD3140 and U650 tractors, while for the MF399 tractor the costs of tires, rings, ball bearings and the operator seat were higher than those of its other systems. According to the regression results, an increase in failure rate leads to an increase in annual repair and maintenance costs for all tractors; of all the tractors, however, the repair and maintenance costs of the JD3140 were the most strongly affected by an increase in failure rate.
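A minimal sketch (with hypothetical data, not the survey values) of the kind of regression used here: fitting annual repair and maintenance cost as a linear function of failure rate and reporting the fit quality.

```python
import numpy as np

failure_rate = np.array([0.8, 1.1, 1.5, 1.9, 2.4, 3.0])     # failures per 100 h (hypothetical)
rm_cost = np.array([210.0, 265, 340, 415, 520, 640])        # annual R&M cost (hypothetical units)

slope, intercept = np.polyfit(failure_rate, rm_cost, deg=1)  # least-squares line
predicted = np.polyval([slope, intercept], failure_rate)
ss_res = np.sum((rm_cost - predicted) ** 2)
ss_tot = np.sum((rm_cost - rm_cost.mean()) ** 2)
print(slope, intercept, 1 - ss_res / ss_tot)                 # slope, intercept, R^2
```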

Keywords: Failure rate, tractor, annual repair and maintenance costs, regression model, Khouzestan.

4192 Iraqi Short Term Electrical Load Forecasting Based On Interval Type-2 Fuzzy Logic

Authors: Firas M. Tuaimah, Huda M. Abdul Abbas

Abstract:

Accurate Short Term Load Forecasting (STLF) is essential for a variety of decision making processes. However, forecasting accuracy can drop due to the presence of uncertainty in the operation of energy systems or unexpected behavior of exogenous variables. The Interval Type-2 Fuzzy Logic System (IT2 FLS), with its additional degrees of freedom, provides an excellent tool for handling uncertainties and improves the prediction accuracy. The training data used in this study cover the period from January 1, 2012 to February 1, 2012 for the winter season and the period from July 1, 2012 to August 1, 2012 for the summer season. The actual load forecasting period runs from January 22 to 28, 2012 for the winter model and from July 22 to 28, 2012 for the summer model. The real data are for the Iraqi power system, which belongs to the Ministry of Electricity.

Keywords: Short term load forecasting, prediction interval, type 2 fuzzy logic systems.

4191 Numerical Investigation for External Strengthening of Dapped-End Beams

Authors: A. Abdel-Moniem, H. Madkour, K. Farah, A. Abdullah

Abstract:

The reduction in the depth of dapped-end beams near the supports tends to produce stress concentrations and hence results in shear cracks if the dapped end does not have adequate reinforcement detailing. This study numerically investigates the efficiency of applying different external strengthening techniques to the dapped end of such beams. A two-dimensional finite element model was built to predict the structural behavior of dapped ends strengthened with different techniques. The techniques included external bonding of a steel angle at the re-entrant corner, unbonded bolt anchoring, external steel plate jacketing, exterior carbon fiber wrapping and/or stripping, and external inclined steel plates. The FE analysis results are presented in terms of ultimate load capacities, load-deflection curves and crack patterns at failure. The results showed that the FE model, at various stages, was comparable to the available test data. Moreover, it enabled the failure progress to be captured with acceptable accuracy, which is very difficult to achieve in a laboratory test.

Keywords: Dapped-end beams, finite element, shear failure, strengthening techniques, reinforced concrete, numerical investigation.

4190 Collision Detection Algorithm Based on Data Parallelism

Authors: Zhen Peng, Baifeng Wu

Abstract:

Modern computing technology has entered the era of parallel computing, with a trend towards sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend. It is able to gather more and more computing ability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of increasingly large amounts of data. Data parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example. We demonstrate a model for constructing a data parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unparalleled with respect to traditional algorithms.
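A small data-parallel sketch in the spirit described above: after complex objects are decomposed into sets of simple axis-aligned bounding boxes, every box pair is tested at once with vectorized (SIMD-friendly) array operations instead of an explicit pairwise loop; the box data are randomly generated for illustration.

```python
import numpy as np

def aabb_collisions(mins_a, maxs_a, mins_b, maxs_b):
    """Boolean matrix [i, j] = True when box i of set A overlaps box j of set B."""
    # Broadcast to shape (n_a, n_b, 3) and test overlap on all three axes at once.
    overlap = (mins_a[:, None, :] <= maxs_b[None, :, :]) & \
              (maxs_a[:, None, :] >= mins_b[None, :, :])
    return overlap.all(axis=2)

rng = np.random.default_rng(2)
mins_a = rng.random((1000, 3)); maxs_a = mins_a + 0.05   # 1000 simple boxes from object A
mins_b = rng.random((800, 3));  maxs_b = mins_b + 0.05   # 800 simple boxes from object B
print(aabb_collisions(mins_a, maxs_a, mins_b, maxs_b).sum(), "colliding pairs")
```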

Keywords: Data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability.

4189 Malaria Prone Zones of West Bengal: A Spatio-Temporal Scenario

Authors: Meghna Maiti, Utpal Roy

Abstract:

In India, malaria is still considered one of the significant infectious diseases. In most cases, regional geographical factors are the principal elements that give places their unique identity, and the incidence and intensity of infectious diseases affect different places differently across the nation. The present study aims to identify spatial clusters of hot spots and cold spots of malaria incidence and their seasonal variation during the three periods 2012-2014, 2015-2017 and 2018-2020 in the state of West Bengal in India. As malaria is a vector-borne disease, the numbers of positive test results are reported by the laboratories to the Department of Health, West Bengal (through the National Vector Borne Disease Control Programme). Data on block-wise monthly malaria-positive cases were collected from the Health Management Information System (HMIS), Ministry of Health and Family Welfare, Government of India. Moran’s I statistic is used to assess the spatial autocorrelation of malaria incidence. Spatial statistical analyses, mainly Local Indicators of Spatial Autocorrelation (LISA) clusters and Local Geary clusters, are applied to find the spatial clusters of hot spots and cold spots and the seasonal variability of malaria incidence over the three periods. The results indicate that the spatial distribution of malaria is clustered during each of the three periods. The analysis shows that, in all cases, high-high clusters are primarily concentrated in the western (Purulia, Paschim Medinipur districts), central (Maldah, Murshidabad districts) and northern parts (Jalpaiguri, Kochbihar districts), while low-low clusters are found mainly in the lower Gangetic plain (central-south) and parts of the north of West Bengal during the stipulated period. Apart from this seasonal variability, inter-year variation is also visible. The results from the different methods of this study indicate significant variation in the spatial distribution of malaria incidence in West Bengal, with high-incidence clusters persistently concentrated in the western part during 2012-2020, along with a strong seasonal pattern peaking in the rainy season and autumn. By applying the different techniques to identify the different degrees of malaria incidence across West Bengal, specific pockets, or malaria hotspots, are identified where incidence rates remain consistently high over the different periods. From this analysis, it is clear that malaria is not distributed uniformly across the state; some specific pockets are more prone to be affected in particular seasons of each year. Disease ecology and spatial patterns must be considered when explaining the real factors behind the higher incidence within the affected districts. A further study, mainly applying an empirical approach, is needed to discern the relationship between this communicable disease and other associated factors.
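A minimal sketch of the global Moran's I statistic used in this analysis, with a row-standardised contiguity matrix; the block incidence values and neighbourhood structure below are illustrative, not the West Bengal data.

```python
import numpy as np

def morans_i(x, W):
    """x: incidence per spatial unit; W: spatial weights (w_ij > 0 for neighbours)."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    W = W / W.sum(axis=1, keepdims=True)        # row standardisation
    n = x.size
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

# Four blocks on a line: 1-2, 2-3 and 3-4 are neighbours.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i([12, 15, 3, 2], W))              # positive value indicates clustering
```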

Keywords: Malaria, infectious diseases, spatial statistics, spatial autocorrelation, LISA.

4188 Customers 50+ Behavior in the Financial Market in the Czech Republic

Authors: K. Matušínská, H. Starzyczná, M. Stoklasa

Abstract:

The paper deals with the behaviour of the 50+ segment in the financial market in the Czech Republic. This segment represents a strong market force and can be a crucial business potential for financial institutions. The main objective of this paper is to analyse the behaviour of customers aged 50-60 in the financial market in the Czech Republic and to propose a suitable marketing approach to satisfy their demands in the areas of product, price, distribution and marketing communication policy. The paper is based on data from one part of a primary marketing research study. It outlines the basic problem areas as well as the definition of financial services marketing, and defines the primary research problem, hypotheses and primary research methodology. Finally, a suitable marketing approach to the selected sub-segment aged 50-60 is proposed according to the marketing research findings.

Keywords: Population aging in the Czech Republic, Segment 50-60 years, Financial services marketing, Marketing research, Marketing approach.

4187 Topology Optimization of Structures with Web-Openings

Authors: D. K. Lee, S. M. Shin, J. H. Lee

Abstract:

The topology optimization technique utilizes constant element densities as design parameters. The optimal distribution contours of the material densities between voids (0) and solids (1) in the design domain then represent the resulting topology: regions with high element density values become occupied by solids, while regions with no density values contain only void phases. Therefore, the void regions of topology optimization results provide design information for deciding appropriate positions of web-openings in a structure. In contrast to the basic objective of the topology optimization technique, which is to obtain the optimal topology of a structure, the present study proposes the new idea that topology optimization results can also be utilized to decide proper web-opening positions. Numerical examples of linear elastostatic structures demonstrate the efficiency of the methodological design process that uses topology optimization to determine the proper placement of web-openings.

Keywords: Topology optimization, web-opening, structure, element density, material.

4186 Regional Aircraft Selection Using Preference Analysis for Reference Ideal Solution (PARIS)

Authors: C. Ardil

Abstract:

The paper presents a multiple criteria decision making analysis process to determine the most suitable regional aircraft type according to a set of evaluation criteria. The main purpose of this study is to use different decision making methods to determine the most suitable regional aircraft for aviation operators. In this context, nine regional aircraft types were analyzed using multiple criteria decision making analysis methods. Preference analysis for reference ideal solution (PARIS) was used in the regional aircraft selection process. The findings of the proposed model show that the ranking results of the multiple criteria decision making models are consistent with one another, that the proposed method is efficient, and that the results are valid. Finally, the Embraer E195-E2 regional aircraft is chosen as the most suitable aircraft type.

Keywords: aircraft, regional aircraft selection, multiple criteria decision making, multiple criteria decision making analysis, mean weight, entropy weight, MCDMA, PARIS

4185 Level of Service Based Methodology for Municipal Infrastructure Management

Authors: Z. Khan, O. Moselhi, T. Zayed

Abstract:

The development of levels of service in a municipal context is a flexible vehicle to assist in performing quality-cost trade-off analysis for municipal services. This trade-off depends on the willingness of a community to pay as well as on the condition of the assets. The community's perspective on the performance of an asset from a service point of view may be quite different from the municipality's perspective on the performance of the same asset from a condition point of view. This paper presents a three-phased, level-of-service based methodology for water mains that consists of: 1) development of an analytical hierarchy model of level of service, 2) development of a fuzzy weighted sum model of a water main condition index, and 3) derivation of a fuzzy-logic based function that maps level of service to the asset condition index. This mapping will assist asset managers in quantifying the condition improvements required to meet service goals and in making more informed decisions on interventions and related priorities.
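A minimal sketch of the first phase only (the analytical hierarchy step), assuming the standard eigenvector method: the level-of-service weights are taken as the normalised principal eigenvector of a pairwise comparison matrix. The three-criterion judgement matrix below is hypothetical.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights = normalised principal eigenvector of the comparison matrix."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Three service criteria compared on Saaty's 1-9 scale (hypothetical judgements).
A = [[1,     3,     5],
     [1 / 3, 1,     2],
     [1 / 5, 1 / 2, 1]]
print(ahp_weights(A))
```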

Keywords: Asset Management, Level of Service, Condition Index, Analytical Hierarchy, Fuzzy Logic.

4184 A Decision Support Tool for Evaluating Mobility Projects

Authors: H. Omrani, P. Gerber

Abstract:

Success is a European project that will implement several clean transport offers in three European cities and evaluate their environmental impacts. The goal of these measures is to improve urban mobility, i.e., the movement of residents inside cities, through offers such as park and ride, electric vehicles, hybrid buses and bike sharing. A list of 28 criteria and 60 measures has been established for the evaluation of these transport projects. The evaluation criteria can be grouped into transport, environment, social, economic and fuel consumption categories. This article proposes a decision support system that encapsulates a hybrid approach based on fuzzy logic, multicriteria analysis and belief theory for the evaluation of the impacts of urban mobility solutions. A web-based tool called DeSSIA (Decision Support System for Impacts Assessment) has been developed that treats complex data. The tool has several functionalities, starting from data integration (import of data), continuing with the evaluation of projects, and finishing with a graphical display of the results. The tool development is based on the MVC (Model, View, Controller) concept, a design model suited to the creation of software that imposes a separation between data, their treatment and their presentation. Effort has been put into the ergonomic aspects of the application. Its code is compatible with the latest standards (XHTML, CSS) and has been validated by the W3C (World Wide Web Consortium). The main ergonomic focus is on the usability of the application and its ease of learning and adoption. Through the use of technologies such as AJAX (asynchronous JavaScript and XML), the application is faster and more user-friendly. The strength of our approach is that it treats heterogeneous data (qualitative, quantitative) from various information sources (human experts, surveys, sensors, models, etc.).

Keywords: Decision support tool, hybrid approach, urban mobility.

4183 One Hour Ahead Load Forecasting Using Artificial Neural Network for the Western Area of Saudi Arabia

Authors: A. J. Al-Shareef, E. A. Mohamed, E. Al-Judaibi

Abstract:

In recent years, load forecasting has become one of the major areas of research in electrical engineering, and most traditional forecasting models and artificial-intelligence neural network techniques have been tried for this task. Artificial neural networks (ANN) have lately received much attention, and a great number of papers have reported successful experiments and practical tests. This article presents the development of an ANN-based short-term load forecasting model with an improved generalization technique for the Regional Power Control Center of the Saudi Electricity Company, Western Operation Area (SEC-WOA). The proposed ANN is trained with weather-related data and historical electric load data from the calendar years 2001 to 2004. The model was tested for one week in each of five different seasons, namely winter, spring, summer, Ramadan and fall, and the mean absolute error for one-hour-ahead load forecasting was found to be 1.12%.
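A compact sketch (synthetic data and a hypothetical feature set, not SEC-WOA records) of a backpropagation-trained MLP for one-hour-ahead load forecasting, using scikit-learn; the previous-hour load, hour of day and temperature stand in for the historical-load and weather inputs.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
hours = np.arange(24 * 365)
temp = 30 + 10 * np.sin(2 * np.pi * hours / (24 * 365)) + rng.normal(0, 1, hours.size)
load = 5000 + 80 * temp + 600 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 50, hours.size)

# Features for hour t: load at t-1, hour of day, temperature at t; target: load at t.
X = np.column_stack([load[:-1], hours[1:] % 24, temp[1:]])
y = load[1:]

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20,), solver="lbfgs",
                                   max_iter=5000, random_state=0))
model.fit(X[:-168], y[:-168])                       # hold out the final week for testing
pred = model.predict(X[-168:])
mape = np.mean(np.abs((y[-168:] - pred) / y[-168:])) * 100
print(f"one-week test MAPE: {mape:.2f}%")
```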

Keywords: Artificial neural networks, short-term load forecasting, back propagation.

4182 Resources and Strategies towards the Development of a Sustainable Construction Materials Industry in Botswana

Authors: G. Malumbela, E. U. Masuku

Abstract:

The economy of Botswana has grown extensively since independence. In contrast to this growth, the construction industry, which is one of the key indicators of a developing nation, continues to be highly dependent on imported building material products from the neighbouring countries of South Africa, Namibia, Zimbabwe, and Zambia. Only two companies in the country currently blend cement, and even then the overwhelming majority of raw materials used in the blends are imported. Furthermore, there are no glass manufacturers in Botswana, and the ceramic industry is limited to the manufacture of clay bricks, notwithstanding a few studios producing crockery and sanitary ware, which nonetheless use imported clay. This paper presents natural resources and industrial waste products in Botswana that can be used for the development of sustainable building materials. It also investigates the distribution and cost of other widely used building materials in the country. Finally, the paper looks at projects and national strategies aimed at the country-wide development of a sustainable building materials industry, together with their successes and setbacks.

Keywords: Botswana construction industry, construction materials, natural resources, sustainable materials.

4181 Optimization of Enzymatic Hydrolysis of Manihot Esculenta Root Starch by Immobilized α-Amylase Using Response Surface Methodology

Authors: G. Baskar, C. Muthukumaran, S. Renganathan

Abstract:

Enzymatic hydrolysis of starch from natural sources finds potential application in the commercial production of alcoholic beverages and bioethanol. In this study, the effects of starch concentration, temperature, time and enzyme concentration were studied and optimized for the hydrolysis of cassava (Manihot esculenta) starch powder (of mesh 80/120) into glucose syrup by α-amylase immobilized in polyacrylamide gel, using a central composite design. The experimental results on the enzymatic hydrolysis of cassava starch were subjected to multiple linear regression analysis using MINITAB 14 software. Positive linear effects of starch concentration, enzyme concentration and time on the hydrolysis of cassava starch by α-amylase were observed. The statistical significance of the model was validated by the F-test for analysis of variance (p < 0.01). The optimum values of starch concentration, temperature, time and enzyme concentration were found to be 4.5% (w/v), 45 °C, 150 min and 1% (w/v) enzyme, respectively. The maximum glucose yield under the optimum conditions was 5.17 mg/mL.

Keywords: Enzymatic hydrolysis, Alcoholic beverage, Central composite design, Polynomial model, glucose yield.

4180 Designing Software Quality Measurement System for Telecommunication Industry Using Object-Oriented Technique

Authors: Nor Fazlina Iryani Abdul Hamid, Mohamad Khatim Hasan

Abstract:

A number of software quality measurement systems have been implemented over the past few years, but none of them focuses on the telecommunication industry. A software quality measurement system for the telecommunication industry is a system that can calculate the quality value of measured software with a full focus on that industry. Before designing the system, quality factors, quality attributes and quality metrics were identified based on a literature review and a survey. Then, using the identified quality factors, quality attributes and quality metrics, a quality model for the telecommunication industry was constructed. Each identified quality metric has its own formula. The quality value of the software is measured based on the quality metrics and aggregated by referring to the quality model, which classifies the quality level of the software based on a Net Satisfaction Index (NSI). The system was designed using an object-oriented approach in a web-based environment. The existence of such a software quality measurement system is thus important to both developers and users in order to produce high quality software products for the telecommunication industry.

Keywords: Software Quality, Quality Measurement, Object-oriented Approach, Net satisfaction Index.
